Just look, for instance, at FPGAs: almost all the tooling is proprietary, very expensive, and very buggy too. Or look at PCB design: Altium still seems to be the standard, despite KiCad having made huge advances and by most accounts being as good or even better. It took decades (KiCad started in 1992) for the FOSS alternatives to really catch on, and only because PCBs became cheap enough for hobbyists to design and build their own (mainly thanks to Chinese PCB companies), and because CERN contributed some resources.
I'm not sure what the deal is with engineers hating collaboratively developed, freely available software, but it's a real thing in my experience. It's like someone told them that FOSS is "socialism" and they just reflexively dismiss or hate it.
Most of the robust, heavily used open source packages are sponsored by large tech companies. The embedded space just hasn't had that kind of sponsorship.
Anyone who sincerely thinks GIMP can replace Photoshop, or that it's otherwise any good, will never understand why professionals eschew open source software when there's work that needs doing.
One of the reasons chefs rarely have anything to do with the cookbooks they write past the initial set of recipes is that it’s really hard to see things from an inexpert perspective. People ask us things like “how long do I cook [something]?” and we often have no idea how to answer that question. Knowing how much that can change depending on the heat source, initial ingredient temperature, how long it’s been unrefrigerated, the water content of the pieces you’ve got, the shape of the pieces, etc. etc. etc., we just say “uh, until it’s done?” But it takes a lot of skill and experience to recognize when most things you need to cook are done, so recipe developers and cookbook writers do a ton of testing to figure out about how long it takes, to get you 80% of the way there, and then give some simple ways to approximately gauge doneness in that context. The expert instinct is “if they’d just learn a few simple things that ‘aren’t that hard’, they’d have precise, bang-on results like I do, every time.” But unless you cook the same things all the time, you’d need to repeat that learning across all of the different cooking scenarios that require specialized knowledge.

Chefs run into this because people want us to tell them how to cook things all the time, so the skill gap is apparent, and we see the value in someone who knows how to bridge it. It was never really shown to me like that as a developer, so I see why so many get stuck in the “come on, it’s not that hard” mindset.
Interface design is conceptually harder, because you need to really consider many skill levels that have different needs. The answer isn’t developers reading some article about how to “make nicer-looking interfaces” or “dumb things down”, which will just piss people off in the end, and many of the people pissed off will be developers assuming it’s an interface designer’s fault. The answer is to deliberately enfranchise designers into the FOSS process to figure out who would benefit from the software, and to make an interface that can serve everyone’s needs: inexpert and advanced users alike, if that makes sense. You do not have to remove advanced functionality to make software useful to non-developer users.
So the first step is to put aside the dev nerd machismo for a minute and recognize that designers serve a crucial purpose that isn’t “dumbing things down” or “making things look nice” and that most developers have no idea how to do it themselves. Once that’s a thing, figuring out how to enfranchise designers into FOSS will be the next step.
That's true, but also remember that not all designers are actually competent, or in agreement with your preferences. I'm definitely no MS fan, but look for instance at Windows 7, or better yet, Windows Vista: IMO, Vista was the peak of the Windows UI (forget about the performance problems it had at the time). It was pretty, but also pretty easy to use most of the time, with a highly discoverable interface. Now look at modern GNOME, which its backers tell us is created by "UI experts". It reminds me of Scientologists calling themselves "mental health experts". Calling yourself an "expert designer" doesn't make you one, and even within any professional field, there's frequently wide disagreement. A lot of design is subjective anyway, so you can't just point to one self-appointed "designer's" opinion and take it as gospel. Even back in MS land, the "experts" somehow went from the highly attractive Vista UI to the butt-ugly flat "Metro" UI in Win8, and that's at the very same company!
Moving the Start menu from the corner, or otherwise wasting corner or edge space, is one of them.