
559 points cxr | 14 comments
weinzierl ◴[] No.44477561[source]
I get why you would hide interface elements to use the screen real estate for something else.

I have no idea why some interfaces hide elements and leave the space they'd taken up unused.

IntelliJ does this, for example, with the icons above the project tree. There is this little target disc that moves the selection in the project tree to the file currently open in the active editor tab. You have to know the secret spot on the screen where it is hidden and if you move your mouse pointer to the void there, it magically appears.

Why? What is the rationale behind going out of your way to implement something like this?

replies(8): >>44477624 #>>44477657 #>>44477684 #>>44477720 #>>44477854 #>>44478558 #>>44480234 #>>44484094 #
autobodie ◴[] No.44477657[source]
Intellij on Windows also buries the top menus into a hamburger icon and leaves the entire area they occupied empty! Thankfully there is an option to reverse it deep in the settings, but having it be the default is absolutely baffling.
replies(1): >>44477724 #
DidYaWipe ◴[] No.44477724[source]
Microsoft pulls the same BS. Look at Edge. Absolute mess. No menu. No title bar. What application am I even using?

This stupidity seems to have spread across Windows. No title bars or menus... now you can't tell what application a window belongs to.

And you can't even bring all of an application's windows to the foreground... Microsoft makes you hover over it in the task bar and choose between indiscernible thumbnails, one at a time. WTF? If you have two Explorer windows open to copy stuff, then switch to other apps to work during the copy... you can't give focus back to Explorer and see the two windows again. You have to hover, click on a thumbnail. Now go back and hover, and click on a thumbnail... hopefully not the same one, because of course you can't tell from a thumbnail what the difference between two lists of files is.

And Word... the Word UI is now a clinic on abject usability failure. They have a menu bar... except WAIT! Microsoft and some users claim that those are TABS... except that it's just a row of words, looking exactly like a menu.

So now there's NO menu and no actual tabs... just a row of words. And if you go under the File "menu" (yes, File), there are a bunch of VIEW settings. And in there you can add and remove these so-called "tabs," and when you do remove one, the functionality disappears from the entire application. You're not just customizing the toolbar; you're actually disabling entire swaths of features from the application.

It's an absolute shitshow of grotesque incompetence, in a once-great product. No amount of derision for this steaming pile is too much.

replies(5): >>44478065 #>>44478385 #>>44478435 #>>44479181 #>>44481690 #
int_19h ◴[] No.44478435[source]
This isn't just a Windows thing. Look at Gnome for another example. macOS of late also likes to take over the title bar for random reasons, although there at least the menu bar is still present regardless.
replies(3): >>44478993 #>>44479299 #>>44481233 #
1. DidYaWipe ◴[] No.44479299[source]
I've always considered the Mac's shared menu bar a GUI 1.0 mistake that should have been fixed in the transition to OS X. Forcing all applications to share a single menu that's glued to the top of the screen, and doesn't switch back to the previous application when you minimize the one you're working with, is dumb.

Windows and Unix GUIs had it right: Put an application's menu where it belongs, on the application's main frame.

But now on Windows... NO menu? Oh wait, no... partial menus buried under hamburger buttons in arbitrary locations, and then others buried under other buttons.

replies(2): >>44479616 #>>44487001 #
2. danaris ◴[] No.44479616[source]
...The Mac menu bar is what it is for a very good reason. Being at the top of the screen makes it an infinitely-tall target.

All you have to do to get to it is move your mouse up until you can't move it up any more.

This remains a very valuable aspect to it no matter what changes in the vogue of UIs have come and gone since.

The fact that you think that you've "minimized the application" when you minimized a window just shows that you are operating on a different (not better, not worse, just different) philosophy of how applications work than the macOS designers are.

replies(3): >>44479883 #>>44481637 #>>44487855 #
3. layer8 ◴[] No.44479883[source]
This argument never made much sense to me, although I do subscribe to Fitts' Law. With desktop monitors at the sizes they've been for 20+ years, the distance you have to travel, together with the visual disconnect between the application and the menu bar, negates the easier targetability. And with smaller screen sizes, you would generally maximize the application window anyway, resulting in the same targetability.

The actual historical rationale for the top menu bar was different, as explained by Bill Atkinson in this video: https://news.ycombinator.com/item?id=44338182. The problem was that due to the small screen size, non-maximized windows often weren't wide enough to show all menus, and there often wasn't enough space vertically below the window's menu bar to show all menu items. That's why they moved the menus to the top of the screen, so that there was always enough space, despite the drawback, as Atkinson notes, of having to move the mouse all the way to the top. This drawback was significant enough that it made them implement mouse pointer acceleration to compensate.

So targetability wasn't the motivation at all, that is a retconned explanation. And the actual motivation doesn't apply anymore on today's large and high-resolution screens.

replies(1): >>44480264 #
4. danaris ◴[] No.44480264{3}[source]
> With desktop monitor sizes since 20+ years ago, the distance you have to travel, together with the visual disconnect between application and the menu bar, negates the easier targetability.

Try it on a Mac; the way its mouse acceleration works makes it really, really easy to just flick either a mouse or a finger on a trackpad and get all the way across the screen.

replies(2): >>44480895 #>>44487005 #
5. layer8 ◴[] No.44480895{4}[source]
I’m not saying it’s necessarily harder to reach a menu bar at the top of the screen, given suitable mouse acceleration. But you also have to move the mouse pointer back to whatever you are doing in the application window, and moving to the top menu bar is not that much (if at all) easier, which hardly justifies the cognitive and visual separation. If that were the case, then as many application controls as possible should be moved to the border of the screen.
6. gmueckl ◴[] No.44481637[source]
There are videos out there in which the CHM interviewed Bill Atkinson. In one part he goes over old Polaroids of Lisa interface drafts. There, he justifies the menu bar at the top of the screen differently: they couldn't figure out what to do when the menu was too wide for the window after the user made it narrow.
replies(1): >>44493273 #
7. int_19h ◴[] No.44487001[source]
I fully agree with you that the menu bar placement in macOS is really weird and confusing and rather inconvenient (regardless of any claimed benefits per Fitts' Law). It's ironic that it ended up being a benefit in the age of UX enshittification solely because it forces apps to have a menu in the first place (although I increasingly see apps that do the bare minimum there and hide the rest behind hamburger menus anyway).
replies(1): >>44487773 #
8. int_19h ◴[] No.44487005{4}[source]
Mac is my primary desktop these days and has been for over a year now, and it's still annoying.
replies(1): >>44487925 #
9. DidYaWipe ◴[] No.44487773[source]
Exactly. The single menu causes quite a few problems, both obvious and subtle.

But yeah... now I'm relieved when I go home from work and get back on my Mac. I waste so much time hunting for stuff on Windows now... it's just incredible.

Pompous pedants used to trot out "Fitts' Law" in defense of the Mac's dumb menu all the time, when in fact it contraindicates it:

"Fitts’ law states that the amount of time required for a person to move a pointer (e.g., mouse cursor) to a target area is a function of the distance to the target divided by the size of the target. Thus, the longer the distance and the smaller the target’s size, the longer it takes."

Right, so where should an application's menu go? ON ITS WINDOW. Not way up at the top of the screen. It's as if the people citing this "law" don't even read it.
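[Editor's note] The disagreement in this subthread can be made concrete by plugging numbers into the Shannon formulation of Fitts' law, T = a + b·log2(D/W + 1). The sketch below is illustrative only: the constants a and b are device-dependent and would normally be measured, and the pixel distances are made-up assumptions, not data from either commenter.

```python
import math

def movement_time(distance, width, a=0.1, b=0.15):
    """Shannon formulation of Fitts' law: T = a + b * log2(D/W + 1).

    a and b are device/user-dependent constants; the defaults here
    are illustrative placeholders, not measured values.
    """
    return a + b * math.log2(distance / width + 1)

# In-window menu bar: short travel (say ~80 px) to a ~20 px-tall target.
in_window = movement_time(distance=80, width=20)

# Screen-edge menu bar: long travel (say ~500 px), but the cursor stops
# at the edge, so the effective target height is very large.
screen_edge = movement_time(distance=500, width=500)

print(f"in-window menu:   {in_window:.3f}")
print(f"screen-edge menu: {screen_edge:.3f}")
```

With these particular numbers the edge target comes out faster, because the screen edge stopping the cursor inflates the effective target size; shrink that assumed effective width (e.g. if users don't actually slam the cursor against the edge, as argued below) or grow the travel distance, and the in-window menu wins. The whole dispute reduces to which assumptions you accept.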

10. DidYaWipe ◴[] No.44487855[source]
Ah yes, this old argument. Except nobody slams his cursor against the top of the screen in real life, assuming that the menu bar is "infinitely tall." Watch real users interact with a Mac's menu, and you simply won't see this behavior. Not to mention that it doesn't work if you're using a laptop and a second monitor positioned behind and above it.

And we're talking about a GUI here, so when I minimize an application's GUI then yes, I expect that I've minimized the application. And again, I think you'll find that the vast majority of users work under this M.O.

But your observation raises another usability issue caused by the single menu: Instead of an "infinite" desktop, the Mac reduces the entire screen to a single application's client area... so, historically, Mac applications treated it that way... littering it with an armada of floating windows that you had to herd around.

The problem is that turning the whole screen into one application's client area fails because you can see all the other crap on your desktop and all other open applications' GUIs THROUGH the UI of the app you're trying to use. It's stupid.

So, to users' relief, the floating-window nonsense has been almost entirely abandoned over the last couple of decades, and single-window applications have become the norm on the Mac as they have been on Windows forever. Oh wait, hold on... here comes Apple regressing to "transparent" UI with "liquid glass," a failed idea from 20+ years ago.

Full circle, sadly.

11. DidYaWipe ◴[] No.44487925{5}[source]
I've been on Mac for 20 years and it's still annoying as hell.

Another side effect is the uselessness of the Help menu. What help am I looking at? The application owns the menu, so where's the OS help?

Oh right, it's just all mixed together. When I'm searching for information in some developer tool I'm using, I really enjoy all the arbitrary hits from the OS help about setting up printers, sending E-mail, whatever.

12. DidYaWipe ◴[] No.44493273{3}[source]
Huh. I wonder why they thought this was such a big deal. I mean... the user caused the problem, and could easily fix it by enlarging the window. Other windowing GUIs handle this just fine.
replies(1): >>44506895 #
13. gmueckl ◴[] No.44506895{4}[source]
I'm not sure if it was user tested, but IIRC part of the problem was that there was no visual indication that the menu bar was cut off and some of the commands were inaccessible. We need to remember that this was new - they were introducing a GUI to the masses for the first time ever. Everything had to be extremely clear.
replies(1): >>44514868 #
14. DidYaWipe ◴[] No.44514868{5}[source]
Good point, and surely a valid concern... one that Apple has forgotten in some places today. Look at the utterly useless icon/thumbnail view in Finder: Contents don't wrap within the window. There could be dozens or hundreds of files off-screen in limbo, and you'll never know.

There are, of course, ways to indicate "more controls this way" with an arrow or other affordance when there's a toolbar or menu overflow, though.

Anyway, the point is that by the time OS X came along, other platforms had solved the problems but Apple rejected those widely-accepted solutions.