
Be Aware of the Makefile Effect

(blog.yossarian.net)
431 points by thunderbong | 84 comments
1. mianos ◴[] No.42664066[source]
I have an alternate theory: about 10% of developers can actually start something from scratch because they truly understand how things work (not that they always do it, but they could if needed). Another 40% can get the daily job done by copying and pasting code from local sources, Stack Overflow, GitHub, or an LLM—while kinda knowing what’s going on. That leaves 50% who don’t really know much beyond a few LeetCode puzzles and have no real grasp of what they’re copying and pasting.

Given that distribution, I’d guess that well over 50% of Makefiles are just random chunks of copied and pasted code that kinda work. If they’re lifted from something that already works, job done—next ticket.

I’m not blaming the tools themselves. Makefiles are well-known and not too verbose for smaller projects. They can be a bad choice for a 10,000-file monster—though I’ve seen some cleanly written Makefiles even for huge projects. Personally, it wouldn’t be my first choice. That said, I like Makefiles and have been using them on and off for at least 30 years.

replies(7): >>42664103 #>>42664461 #>>42664526 #>>42664536 #>>42664757 #>>42672850 #>>42676540 #
2. sebazzz ◴[] No.42664103[source]
> That leaves 50% who don’t really know much beyond a few LeetCode puzzles and have no real grasp of what they’re copying and pasting.

Who likely wouldn't have a job if it weren't for LLMs.

replies(1): >>42664197 #
3. raziel2p ◴[] No.42664197[source]
pretty sure we've made this complaint about a subset of developers since way before chatgpt and the like.
replies(1): >>42664544 #
4. huijzer ◴[] No.42664461[source]
> That leaves 50% who don’t really know much beyond a few LeetCode puzzles and have no real grasp of what they’re copying and pasting.

Small nuance: I think people often don’t know because they don’t have the time to figure it out. There are only so many battles you can fight during a day. For example, if I’m a C++ programmer working on a ticket, how many layers of the stack should I know? Should I know what the CPU registers are called? And what should an AI researcher who works entirely in Jupyter know? I completely encourage anyone to learn as much about the tools and stack as possible, but there is only so much time.

replies(6): >>42664760 #>>42664847 #>>42665008 #>>42665319 #>>42666573 #>>42670611 #
5. Loic ◴[] No.42664526[source]
I like Makefiles, but just for me. Each time I create a new personal project, I add a Makefile at the root, even if the only target is the most basic of the corresponding language. This is because I can't remember all the variations of all the languages and frameworks build "sequences". But "$ make" is easy.
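For what it's worth, such a root Makefile can be nothing more than a few phony targets wrapping whatever the language's real build sequence is. A minimal sketch (the cargo commands are only an example of "the most basic target of the corresponding language", and recipe lines must start with a tab in a real Makefile):

  # Root Makefile: `make` with no arguments runs the first target (all).
  .PHONY: all test clean

  all:
          cargo build

  test:
          cargo test

  clean:
          cargo clean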
replies(2): >>42664751 #>>42664822 #
6. f1shy ◴[] No.42664536[source]
I would just change the percentages, but it's about as true as it gets.
replies(1): >>42665361 #
7. f1shy ◴[] No.42664544{3}[source]
And that happens not only with developers, but in any profession, which gives me shivers when I go to the doctor!
8. choeger ◴[] No.42664751[source]
You're probably using the wrong tool and should consider a simple plain shell script (or a handful of them) for your tasks. test.sh, build.sh, etc.
replies(2): >>42666987 #>>42667376 #
9. adrian_b ◴[] No.42664757[source]
Actually it is trivial to write a very simple Makefile for a 10,000 file project, despite the fact that almost all Makefiles that I have ever seen in open-source projects are ridiculously complicated, far more complicated than a good Makefile would be.

In my opinion, it is almost always a mistake when a Makefile contains an individual rule for building a single file.

Normally, there should be only generic building rules that should be used for building any file of a given type.

A Makefile should almost never contain lists of source files or of their dependencies. It should contain only a list with the directories where the source files are located.

Make should search the source directories, find the source files, classify them by type, create their dependency lists and invoke appropriate building rules. At least with GNU make, this is very simple and described in its user manual.

If you write a Makefile like this, it does not matter whether a project has 1 file or 10,000 files, the effort in creating or modifying the Makefile is equally negligible. Moreover, there is no need to update the Makefile whenever source files are created, renamed, moved or deleted.
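A rough sketch of that style in GNU Make (directory names and flags are invented for illustration; a real version would also have the compiler emit header dependencies):

  # The Makefile lists only the source directories, never the source files.
  SRCDIRS  := src src/util drivers
  BUILDDIR := build

  SRCS := $(foreach d,$(SRCDIRS),$(wildcard $(d)/*.c))   # find the sources
  OBJS := $(patsubst %.c,$(BUILDDIR)/%.o,$(notdir $(SRCS)))

  vpath %.c $(SRCDIRS)                 # where make should look for sources

  prog: $(OBJS)                        # one generic link rule
          $(CC) -o $@ $^

  $(BUILDDIR)/%.o: %.c | $(BUILDDIR)   # one generic compile rule for any .c
          $(CC) $(CFLAGS) -c -o $@ $<

  $(BUILDDIR):
          mkdir -p $@

Adding, renaming or deleting a source file then never touches the Makefile, although in this naive sketch basenames do have to be unique across the listed directories.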

replies(2): >>42665406 #>>42672054 #
10. silver_silver ◴[] No.42664760[source]
We can’t really call the field engineering if this is the standard. A fundamental understanding of what one’s code actually makes the machine do is necessary to write quality code regardless of how high up the abstraction stack it is
replies(3): >>42664887 #>>42665077 #>>42681706 #
11. 1aqp ◴[] No.42664822[source]
I'd say: you are absolutely using the right tool. :-)
12. silveraxe93 ◴[] No.42664847[source]
This is the 40% that OP mentioned. But there's a proportion of people/engineers who are just clueless and incapable of understanding code. I don't know the proportion so I can't comment on the 50% number, but they definitely exist.

If you never worked with them, you should count yourself lucky.

13. cudgy ◴[] No.42664887{3}[source]
Sure, if you are doing embedded programming in C. But how does one do this in web development, where there are hundreds of dependencies that get updated monthly, while still shipping functionality and keeping their job?
replies(2): >>42665083 #>>42665438 #
14. kragen ◴[] No.42665008[source]
If you spend 80% of your time (and mental energy) applying the knowledge you already have and 20% learning new things, you will very quickly be able to win more battles per day than someone who spends 1% of their time learning new things.

Specifically for the examples at hand:

- at 20%, you will be able to write a Makefile from scratch within the first day of picking up the manual, rather than two or three weeks if you only invest 1%.

- if you don't know what the CPU registers are, the debugger won't be able to tell you why your C++ program dumped core; seeing that usually lets you resolve the ticket in a few minutes (because most segfaults are stupid problems that are easy to fix once you see what the problem is, though the memorable ones are much hairier). Without knowing how to use the disassembly in the debugger, you're often stuck debugging by printf or even binary search, incrementally tweaking the program until it stops crashing, incurring a dog-slow C++ build after every tweak. As often as not, a fix thus empirically derived will merely conceal the symptom of the bug, so you end up fixing it two or three times, taking several hours each time.

Sometimes the source-level debugger works well enough that you can just print out C++-level variable values, but often it doesn't, especially in release builds. And for performance regression tickets, reading disassembly is even more valuable.

(In C#, managed C++, or Python, the story is of course different. Until the Python interpreter is segfaulting.)

How long does it take to learn enough assembly to use the debugger effectively on C and C++ programs? Tens of hours, I think, not hundreds. At 20% you get there after a few dozen day-long debugging sessions, maybe a month or two. At 1% you may take years.

What's disturbing is how many programmers never get there. What's wrong with them? I don't understand it.

replies(2): >>42666717 #>>42667133 #
15. kragen ◴[] No.42665077{3}[source]
Steam engines predate the understanding of not just the crystalline structure of steel but even the basics of thermodynamics by quite a few decades.
replies(2): >>42665397 #>>42668316 #
16. kragen ◴[] No.42665083{4}[source]
Maybe switch to less frequently updated dependencies and rewrite the easy ones in-house?
replies(1): >>42665537 #
17. oweiler ◴[] No.42665319[source]
That's why suggestions like RTFM! are stupid. I just don't have time to read the reference documentation of every tool I use.
replies(3): >>42667634 #>>42667658 #>>42667793 #
18. mianos ◴[] No.42665361[source]
I’d be curious to hear your ratio. It really varies. In some small teams with talented people, there are hardly any “fake” developers. But in larger companies, they can make up a huge chunk.

Where I am now, it’s easily over 50%, and most of the real developers have already left.

PS: The fakes aren’t always juniors. Sometimes you have junior folks who are actually really good—they just haven’t had time yet to discover what they don’t know. It’s often absolutely clear that certain juniors will be very good just from a small contribution.

replies(1): >>42665716 #
19. silver_silver ◴[] No.42665397{4}[source]
Yes and they’re far less efficient and require far more maintenance than an equivalent electric or even diesel engine, where equivalent power is even possible
replies(2): >>42665482 #>>42667136 #
20. mianos ◴[] No.42665406[source]
If everything in your tree is similar, yes. I agree that's going to be a very small Makefile.

While this is true, for much larger projects that have lived for a long time, you will have many parts, all with slight differences. For example, over time the language flavour of the day comes and goes. Structure changes in new code. Often different subtrees are there for different platforms or environments.

The Linux kernel is a good, maybe extreme, but clear example. There are hundreds of Makefiles.

replies(2): >>42666711 #>>42671248 #
21. silver_silver ◴[] No.42665438{4}[source]
The current state of web development is unfortunately a perfect example of this quality crisis. The tangle of dependencies either directly causes or quickly multiplies the inefficiency and fragility we’ve all come to expect from the web. The solution is unrealistic because it involves design choices which are either not trendy enough or precluded by the platform
22. BlueTemplar ◴[] No.42665482{5}[source]
Why do you assume that the same doesn't apply to electric and diesel engines ?
replies(1): >>42671235 #
23. pdimitar ◴[] No.42665537{5}[source]
Yes, and I should overrule half the business decisions of the company while I am at it. Oh, and I'll push back on "we need the next feature next week" and I'll calmly respond "we need to do excellent engineering practices in this company".

And everybody will clap and will listen to me, and I will get promoted.

...Get real, dude. Your comments come across as a bit tone-deaf. I am glad you are in a privileged position, but you seem to have fallen for the filter bubble effect and are unaware of how most programmers out there have to work if they want to pay the bills.

replies(2): >>42666493 #>>42667215 #
24. f1shy ◴[] No.42665716{3}[source]
My personal experience:

- 5% geniuses. These are people who are passionate about what they do and always up to date. Typically humble, not loud people.

- 15% good, can do it properly. Not passionate, but at least they have a strong sense of responsibility. They want to do “the right thing” or do it right. Sometimes average intelligence, but really committed.

- 80% I would not hire. People who talk a lot and know very little. Probably do the work just because they need the money.

That applies to doctors, contractors, developers, taxi drivers, just about anything and everything. Those felt percentages have been consistent across 5 countries, 3 continents and half a century of life.

PS: results are corrected for seniority. Even at the apprentice level I could tell who was in each category.

replies(1): >>42669625 #
25. ori_b ◴[] No.42666493{6}[source]
Yes, sometimes things are unfixably broken, and it's impossible to build anything good.

For everything else, there's MasterCard.

replies(3): >>42667991 #>>42668906 #>>42671852 #
26. ajross ◴[] No.42666573[source]
> I completely encourage anyone to learn as much about the tools and stack as possible, but there is only so much time.

That seems like a weird way to think about this. I mean, sure, there's no time today to learn make to complete your C++ ticket or whatever. But yesterday? Last month? Last job?

Basically, I think this matches the upthread contention perfectly. If you're a working C++ programmer who's failed to learn the Normal Stable of Related Tools (make, bash, python, yada yada) across a ~decade of education and experience, you probably never will. You're in that 50% of developers who can't start stuff from scratch. It's not a problem of time, but of curiosity.

replies(1): >>42666690 #
27. Joker_vD ◴[] No.42666690{3}[source]
> I mean, sure, there's no time today to learn make to complete your C++ ticket or whatever. But yesterday? Last month? Last job?

That seems like a weird way to think about this. Of course there was no time in the past to learn this stuff, if you still haven't learned it by the present moment. And even if there were, trying to figure out whether there perhaps was some free time in the past is largely pointless, as opposed to trying to schedule things in the future: you can't change the past anyhow, but the future is somewhat more malleable.

replies(1): >>42667032 #
28. adrian_b ◴[] No.42666711{3}[source]
Different platforms and environments are handled easily by Make "variables" (actually constants), which have platform-specific definitions, and which are sequestered into a platform-specific Makefile that contains only definitions.

Then the Makefiles that build a target file, e.g. executable or library, include the appropriate platform-specific Makefile, to get all the platform-specific definitions.

Most of my work is directed towards embedded computers with various architectures and operating systems, so multi-platform projects are the norm, not the exception.

A Makefile contains 3 kinds of lines: definitions, rules and targets (typical targets may be "all", "release", "debug", "clean" and so on).

I prefer to keep these in separate files. If you parametrize your rules and targets with enough Make variables to allow their customization for any environment and project, you must almost never touch the Makefiles with rules and targets. For each platform/environment, you write a file with appropriate definitions, like the names of the tools and their command-line options.

The simplest way to build a complex project is to build it in a directory with a subdirectory for each file that must be created. In the parent directory you put a Makefile that is the same for all projects, which just invokes all the Makefiles from the subdirectories that it finds below, passing any CLI options.

In the subdirectory for each generated file, you just put a minimal Makefile with only a few lines. It includes the Makefiles with generic rules and targets and the Makefile with platform-specific definitions, and it adds only the information that is special for the generated file: what kind of file it is (e.g. executable, static library, dynamic library), a list of directories where to search for source files, and the strings that should be passed to compilers for their include-directory list and their preprocessor-definition list. Optionally and infrequently you may override some Make definitions, e.g. to provide special tool options when you generate multiple object files from a single source.
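For concreteness, a per-target subdirectory Makefile in such a scheme might be as small as this (all file and variable names are made up for illustration):

  # Subdirectory Makefile for one generated file, here an executable.
  TARGET_TYPE := executable
  TARGET_NAME := sensor-daemon
  SRCDIRS     := ../../src/common ../../src/sensor
  INCDIRS     := ../../include
  DEFINES     := USE_RTOS=1

  include ../../mk/platform-$(PLATFORM).mk   # platform-specific tool names and options
  include ../../mk/rules.mk                  # generic rules and targets (all, clean, ...)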

29. icameron ◴[] No.42666717{3}[source]
That’s an insightful comment, but there is a whole universe of programmers who never have to work directly in C/C++ and are productive in safe languages that usually can’t segfault. Admittedly we are a little jealous of those elite bitcrashers who unlock the unbridled power of the computer with C++… but yeah a lot of day jobs pay the bills with C#, JavaScript, or Python and are considered programmers by the rest of the industry
replies(1): >>42667139 #
30. poincaredisk ◴[] No.42666987{3}[source]
I disagree. Make is, at its simplest, exactly a "simple plain shell script" for your tasks, with some very nice bonus features like dependency resolution.

Not the parent, but I usually start with a two-line makefile and add new commands/variables/rules when necessary.

31. ajross ◴[] No.42667032{4}[source]
To be clear: I'm not suggesting a time machine, and I'm not listing any particular set of skills everyone must have. I'm saying that excusing the lack of core job skills by citing immediate time pressure is a smell. It tells me that that person probably won't ever learn weird stuff. And in software development, people who don't learn weird stuff end up in that 50% bucket posited upthread.
replies(2): >>42668260 #>>42672049 #
32. remus ◴[] No.42667133{3}[source]
You make it sound easy, but I think it's hard to know where to invest your learning time. For example, I could put some energy into getting better at shell scripting, but realistically I don't write enough of it for the knowledge to stick, so I don't think it'd be a good use of my time.

Perhaps in learning more shell scripting I'd have a breakthrough and realise I can do lots of things I couldn't before, and overnight I could do 10% more, but again it's not obvious in advance that this will happen.

replies(4): >>42667199 #>>42667752 #>>42669793 #>>42670902 #
33. kragen ◴[] No.42667136{5}[source]
Steam engines currently power most of the world's electrical grid. The main reason for this is that, completely contrary to what you said, they are more efficient and more reliable than diesel engines. (Electric motors of course are not a heat engine at all and so are not comparable.)

Steam engines used to be very inefficient, in part because the underlying thermodynamic principles were not understood, but also because learning to build safe ones (largely a question of metallurgy) took a long time. Does that mean that designing them before those principles were known was "not engineering"? That seems like obvious nonsense to me.

replies(1): >>42670147 #
34. kragen ◴[] No.42667139{4}[source]
Yeah, I write most things in Python or JavaScript because it's much more practical.
replies(1): >>42670219 #
35. kragen ◴[] No.42667199{4}[source]
I agree. And there's no infallible algorithm. I think there are some good heuristics, though:

- invest more of your time in learning more about the things you are currently finding useful than in things that sound like they could potentially be useful

- invest more of your time in learning skills that have been useful for a long time (C, Make) than in skills of more recent vintage (MobX, Kubernetes), because of the Lindy Effect

- invest more of your time in skills that are broadly applicable (algorithms, software design, Python, JavaScript) rather than narrowly applicable (PARI/GP, Interactive Brokers)

- invest your time in learning to use free software (FreeCAD, Godot, Postgres) rather than proprietary software (SolidWorks, Unity, Oracle), because sooner or later you will lose access to the proprietary stuff.

- be willing to try things that may not turn out to be useful, and give them up if they don't

- spend some time every day thinking about what you've been doing. Pull up a level and put it in perspective

replies(3): >>42670922 #>>42671161 #>>42674002 #
36. kragen ◴[] No.42667215{6}[source]
I know a lot of people have terrible jobs at profoundly dysfunctional companies. I've had those too. That situation doesn't improve unless you, as they say, have the serenity to accept the things you cannot change, the courage to change the things you can, and the wisdom to know the difference.

Not everyone has a position where they have the autonomy to spend a lot of effort on paying down technical debt, but some people do, and almost every programmer has a little.

I think it's important to keep in view both your personal incentive system (which your boss may be lying to you about) and the interests of the company.

replies(1): >>42667971 #
37. nrclark ◴[] No.42667376{3}[source]
(not the parent)

Make is - at its core - a tool for expressing and running short shell-scripts ("recipes", in Make parlance) with optional dependency relationships between each other.

Why would I want to spread out my build logic across a bunch of shell scripts that I have to stitch together, when Make is a nicely integrated solution to this exact problem?
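For instance, a task-runner style Makefile is just a few recipes whose prerequisites encode the ordering (the script names below are placeholders):

  .PHONY: build test deploy

  build:
          ./scripts/build.sh

  test: build                # `make test` builds first, then tests
          ./scripts/run-tests.sh

  deploy: test               # `make deploy` runs the whole chain
          ./scripts/deploy.sh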

replies(1): >>42668877 #
38. kccqzy ◴[] No.42667634{3}[source]
I feel like your working environment might be to blame: maybe your boss is too deadline-driven so that you have no time to learn; or maybe there is too much pressure to fix a certain number of tickets. I encourage you to find a better workplace that doesn't punish people who take the time to learn and improve themselves. This also keeps your skills up to date and is helpful in times of layoffs like right now.
39. prerok ◴[] No.42667658{3}[source]
Seriously? Yes, you should read the docs of every API you use and every tool you use.

I mean, it's sort of ok if you read somewhere how to use it and you use it in the same way, but I, for one, always check the docs and more often even the implementation to see what I can expect.

40. BurningFrog ◴[] No.42667752{4}[source]
One simple approach is that the second, or at least third, time you deal with something, you invest time to learn it decently well. Then each time you come back to it, go a bit deeper.

This algorithm makes you learn the things you'll need quite well without having to understand and/or predict the future.

41. aulin ◴[] No.42667793{3}[source]
you don't have the time because you spend it bruteforcing solutions by trial and error instead of reading the manual and doing them right the first time
42. pdimitar ◴[] No.42667971{7}[source]
The serenity in question boils down to "I'll never make enough money to live peacefully and being able to take a two years sabbatical so let's just accept I'll be on the hamster wheel for life and I can never do anything about it".

No. I'll let my body wither and get spent before my spirit breaks. I refuse to just "accept" things. There's always something you can do.

BTW is that not what HN usually preaches? "Change your job to a better one" and all that generic motivational drivel [that's severely disconnected from reality]? Not throwing shade at you here in particular, just being a bit snarky for a minute. :)

RE: your final point, I lost the desire to keep view of both my personal and my company's incentive systems. Most "incentive systems" are basically "fall in line or GTFO".

Before you ask, I am working super hard to change my bubble and get a bit closer to yours. To say it's not easy would be an understatement on the order of comparing a written description of a lightning strike with actually enduring one. But as I said above, I am never giving up.

But... it's extremely difficult, man. Locality and your own marketing matter a lot, and when you have been focused on technical skills all your life and marketing is as foreign to you as are the musical notes of an alien civilization... it's difficult.

replies(1): >>42668158 #
43. pdimitar ◴[] No.42667991{7}[source]
Any golden MasterCards with $50M one-time limit you could offer for free? I can think of a few things to fix with those.

RE: unfixably broken, well, not necessarily in concept but de facto you are sadly correct. Most people resist even the provably good changes.

44. kragen ◴[] No.42668158{8}[source]
I can't recommend others follow my path. Some of the results have been pretty bad. Hopefully your path works out well. We all die in the end.
45. n_ary ◴[] No.42668260{5}[source]
> I'm saying that excusing the lack of core job skills by citing immediate time pressure is a smell. It tells me that that someone probably won't ever learn weird stuff. And in software development, people who don't learn weird stuff end up in that 50% bucket posited upthread.

Or the whole chain of work culture is bad and people do not have adequate down time or brain juice to pursue these. Additionally, how many of these tools do you want to learn? I have dealt with Makefiles, then recently someone decided to introduce Taskfile, then someone else wanted to use build.please, someone tried to rewrite a lot of CI pipelines in Python because shell scripting is too arcane, and someone else decided that CI was super slow and must be hosted on premises using their favourite system (was it Drone? I forget). Eventually, things become so many and so chaotic that your brain learns to copy-paste what works and hope for the best, because the tool you spent time learning will be replaced in a few months.

replies(3): >>42668493 #>>42668586 #>>42669874 #
46. foobarchu ◴[] No.42668316{4}[source]
I don't consider that an equal comparison. Obviously an engineer can never be omniscient and know things nobody else knows either. They can, and should, have an understanding of what they work with based on available state of the art, though.

If the steam engine was invented after those discoveries about steel, I would certainly hope it would be factored into the design (and perhaps used to make those early steam engines less prone to exploding).

replies(1): >>42670231 #
47. skydhash ◴[] No.42668493{6}[source]
A lot of these technologies share a common base that can be pretty small. Once you learn Make and concepts like targets, recipes, dependencies, etc., it's easier to learn Ansible or GitHub Actions, even though they don't solve the same problem. It's the same with programming languages and whatever tool of the week. But that requires spending a bit of effort to go under the hood and understand the common abstractions instead of memorizing patterns and words.
48. ajross ◴[] No.42668586{6}[source]
> Or the whole chain of work culture is bad and people do not have adequate down time or brain juice to pursue these.

And... again, I have to say that that kind of statement is absolutely of a piece with the analysis upthread. Someone who demands a "work culture" that provides "down time" or "brain juice" to learn to write a makefile... just isn't going to learn to write a makefile.

I mean, I didn't learn make during "downtime". I learned it by hacking on stuff for fun. And honed the skills later on after having written some really terrible build integration for my first/second/whatever job: a task I ended up doing because I had already learned make.

It all feeds back. Skills are the components you use to make a career, it doesn't work if you expect to get the skills like compensation.

49. dwaltrip ◴[] No.42668877{4}[source]
Any modern attempts to do this better than make? I often write small “infra” bash scripts in my projects, maybe I could use a tool like that.
replies(3): >>42669142 #>>42671292 #>>42675659 #
50. klibertp ◴[] No.42668906{7}[source]
> For everything else, there's...

...the very definition of brokenness :D Not much of a (good) choice there...

51. TheTaytay ◴[] No.42669142{5}[source]
I haven’t used them yet, but I keep seeing people touting alternatives, and in particular “just”: https://github.com/casey/just

This is primarily aimed at a “task runner” replacement rather than a “compilation with automatic file timestamp comparison replacement”

Others I stumbled across: Taskfile, Mage, XcFile.

None of them have tempted me enough to move away from a set of bash scripts or scripts written in the language of my repo (yet).

52. mianos ◴[] No.42669625{4}[source]
From my 40 years in the field, I see much the same trend. I wouldn’t call 5% of developers “genius”—maybe 1% are true geniuses. Those folks can be an order of magnitude better at certain tasks—doing things no one else can—but only within a limited sphere. They also bring their own baggage, like unique personalities. Still, I believe there’s always room for genius on a big team, even with all the complications.

Typically, upper management wants smooth, steady output. But the better your people are, the bumpier that output gets—and those “one-percenters” can produce some pretty extreme spikes. If you think of it like a graph, the area under the curve (the total productivity) can be way bigger for a spiky output than for a flat, low-level one. So even if those spikes look messy, they often deliver a ton of long-term value.

53. jppittma ◴[] No.42669793{4}[source]
If it’s a tool you use every day, it’s worth understanding on a deeper level. I’ve used the shell probably every day in my professional career, and knowing how to script has saved me and my team countless hours of tedious effort with super simple one liners.

The other thing that’s worth learning is that if you can find tools that everybody uses regularly, but nobody understands, then try to understand those, you can bring enormous value to your team/org.

54. jppittma ◴[] No.42669874{6}[source]
Skill issue. Time spent learning make will help you understand bazel etc.
55. PaulHoule ◴[] No.42670147{6}[source]
Steam engines are thoroughly obsolete in the developed world where there are natural gas pipeline networks.

People quit building coal-burning power plants in North America at the same time they quit building nuclear power plants, for the same reason. The power density difference between gas turbines and steam turbines is enough that the capital cost difference is huge. It would be hard to afford steam turbines even if the heat were free.

Granted people have been building pulverized coal burning power plants in places like China where they'd have to run efficient power plants on super-expensive LNG. They thought in the 1970s it might be cheaper to gasify coal and burn it in a gas turbine but it's one of those technologies that "just doesn't work".

Nuclear isn't going to be affordable unless they can perfect something like

https://www.powermag.com/what-are-supercritical-co2-power-cy...

If you count the cost of the steam turbine plus the steam generators plus the civil works to enclose those, nuclear just can't be competitive.

replies(2): >>42670237 #>>42670458 #
56. bluGill ◴[] No.42670219{5}[source]
Both have strong limits for writing complex code. TypeScript is one attempt at an answer because, bad as JavaScript is for large programs, the web forces it. I prefer a million lines of C++ to 100k lines of Python - but if 5k lines of Python will do it, C++ is way too much overhead. (Rust likely plays better than C++ for large problems started from scratch, but most large problems have existing answers and throwing something else in would be hard.)
replies(2): >>42670464 #>>42681675 #
57. bluGill ◴[] No.42670231{5}[source]
A lot of materials science was developed to make cannons not explode - that then went into making steam engines possible. The early steam engines in turn prompted their own study of efficiency.
58. bluGill ◴[] No.42670237{7}[source]
Gas is often used to power the boilers because steam is so much better.
59. kragen ◴[] No.42670458{7}[source]
There is some truth in what you say. Though steam engines still power most of the power grid (especially in the "developed world") their capital costs are indeed too high to be economically competitive.

However, there are also some errors.

In 02022 24% of total US electrical power generation capacity was combined-cycle gas turbines (CCGT), https://www.eia.gov/todayinenergy/detail.php?id=54539 which run the exhaust from a gas turbine through a boiler to run a steam turbine, thus increasing the efficiency by 50–60%. So in fact a lot of gas turbines are installed together with a comparable-capacity steam turbine, even today.

Syngas is not a technology that "just doesn't work". It's been in wide use for over two centuries, though its use declined precipitously in the 20th century with the advent of those natural-gas pipeline networks. The efficiency of the process has improved by an order of magnitude since the old gasworks you see the ruins of in many industrial cities. As you say, though, that isn't enough to make IGCC plants economically competitive.

The thing that makes steam engines economically uncompetitive today is renewable energy. Specifically, the precipitous drop in the price of solar power plants, especially PV modules, which are down to €0.10 per peak watt except in the US, about 15% of their cost ten years ago. This combines with rapidly dropping prices for batteries and for power electronics to undercut even the capex of thermal power generation rather badly, even (as you say) if the heat was free, whereas typically the fuel is actually about half the cost. I don't really understand what the prospects are for dramatically cheaper steam turbines, but given that the technology is over a century old, it seems likely that its cost will continue to improve only slowly.

replies(1): >>42670784 #
60. kragen ◴[] No.42670464{6}[source]
I agree with all of this.
61. godelski ◴[] No.42670611[source]
Funny enough I'm an A̶I̶ML researcher and started in HPC (High Performance Computing).

  >  if I’m a C++ programmer ... should I know what the CPU registers are called?
Probably.

Especially with "low level"[0] languages knowing some basics about CPU operations goes a long way. You can definitely get away without knowing these things but this knowledge will reap rewards. This is true for a lot of system based information too. You should definitely know about things like SIMD, MIMD, etc because if you're writing anything in C/C++ these days it should be because you care a lot about performance. There's a lot of stuff that should be parallelized that isn't. Even stuff that could be trivially parallelized with OpenMP.

  > what should an AI researcher working always in Jupyter know?
Depends on what they're researching. But I do wish a lot more knew some OS basics. I see lots of things in papers where they're like "we got 10x" performance on some speed measurement but didn't actually measure it correctly (e.g. you can't use time.time and be accurate because there's lots of asynchronous operations). There's lots of easy pitfalls here that are not at all obvious and will look like they are working correctly. There's things about GPUs that should be known. Things about math and statistics. Things about networking. But this is a broad field so there are of course lots of answers here. I'd at least say anyone working on AI should read at least some text on cognitive science and neuroscience because that's a super common pitfall too.

I think it is easy to not recognize that information is helpful until after you have that information. So it becomes easy to put off as not important. You are right that it is really difficult to balance everything though but I'm not convinced this is the problem with those in that category of programmers. There's quite a number of people who insist that they "don't need to" learn things or insist certain knowledge isn't useful based on their "success."

IMO the key point is that you should always be improving. Easier said than done, but it should be the goal. At worst, I think we should push back on anyone insisting that we shouldn't be (I do not think you're suggesting this).

[0] Quotes because depends who you talk to. C++ historically was considered a high level language but then what is Python, Lua, etc?

62. PaulHoule ◴[] No.42670784{8}[source]
Yeah, and people are talking about renewables as if the storage is free. Or people quote case 17 out of

https://www.eia.gov/analysis/studies/powerplants/capitalcost...

as if 1.5 hours of storage was going to cut it. I've been looking for a detailed analysis of the generation + storage + transmission costs of a reliable renewable grid (one less than 20 years old, covering a whole year) and I haven't seen one yet.

replies(1): >>42670903 #
63. aragilar ◴[] No.42670902{4}[source]
There's always the option of asking coworkers/friends/mentors/etc. what they found useful.
replies(1): >>42670914 #
64. kragen ◴[] No.42670903{9}[source]
I haven't seen one either.

To be honest, I don't think anyone has any idea yet (other than crude upper bounds) because it depends a lot on things like how much demand response can help. Demand response doesn't have to mean "rolling blackouts"; it could mean "running the freezer during the day when electricity is free". Will people heat their houses in the winter with sand batteries? Will desiccant air conditioning pan out? Can nickel–iron batteries compete economically with BYD's gigafactories? What about sodium-ion? Nobody has any idea.

I was pleased to calculate recently that the EV transition, if it looks something like replacing each ICE vehicle with the BYD equivalent of a Tesla Model Y, would add several hours of distributed grid-scale storage, if car owners choose to sell it back to the grid. But that's still a far cry from what you need for a calm, cloudy week. Maybe HVDC will be the key, because it's never cloudy across all of China.

Sensible-heat seasonal thermal stores for domestic climate control (in some sense the most critical application) have been demonstrated to be economically feasible at the neighborhood scale. PCM or TCES could be an order of magnitude lower mass, but would the cost be low enough?

65. kragen ◴[] No.42670914{5}[source]
I'm embarrassed I didn't mention this. Yes, collective mental effort can be orders of magnitude better than individual.
66. skydhash ◴[] No.42670922{5}[source]
> spend some time every day thinking about what you've been doing. Pull up a level and put it in perspective

That's the most useful point. Sometimes just a few why-s can give you a clearer view on your activities and a direction for your efforts.

67. moregrist ◴[] No.42671161{5}[source]
> invest your time in learning to use free software (FreeCAD, Godot, Postgres) rather than proprietary software (SolidWorks, Unity, Oracle), because sooner or later you will lose access to the proprietary stuff.

I think you have a solid point with Postgres v Oracle, and I haven’t followed game dev in a while, but your FreeCAD recommendation is so far from the industry standard that I don’t think it’s good advice.

If you need to touch CAD design in a professional setting, learn SolidWorks or OnShape. They’re what every MechE I’ve ever worked with knows and uses, and they integrate product lifecycle aspects that FreeCAD does not.

replies(2): >>42672230 #>>42685694 #
68. jmb99 ◴[] No.42671235{6}[source]
We don’t have to assume, because we know. We can calculate and measure the efficiency of gasoline and diesel engines, and electric motors. We know that electric motors are highly efficient, and ICE engines are not.
replies(1): >>42672614 #
69. ori_b ◴[] No.42671248{3}[source]
Engineering is deciding that everything in your source tree will be designed to be similar, so you can have a small makefile.
70. ori_b ◴[] No.42671292{5}[source]
That depends. Why do you think make does this poorly?
71. inkyoto ◴[] No.42671852{7}[source]
> For everything else, there's MasterCard.

I'm pretty sure that the original meme was «In God we trust; for everything else, there’s American Express» (with or without cocaine).

72. imtringued ◴[] No.42672049{5}[source]
You don't seem to understand that make is not a core skill.

In an ideal world build tools would just work and people wouldn't have to think about them at all.

replies(1): >>42672351 #
73. imtringued ◴[] No.42672054[source]
Sure, but this requires you to know how to tell the compiler to generate your Makefile header dependencies, and if you make a mistake, it will cause silent failures.
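For reference, the usual pattern with GCC or Clang looks roughly like this (a sketch; -MMD/-MP are the common flag spellings, and the leading '-' on the include is exactly where mistakes go quiet):

  OBJS := $(patsubst %.c,%.o,$(wildcard *.c))
  DEPS := $(OBJS:.o=.d)

  # -MMD writes foo.d next to foo.o, listing the headers foo.c pulled in;
  # -MP adds dummy targets so deleted headers don't break the build.
  %.o: %.c
          $(CC) $(CFLAGS) -MMD -MP -c -o $@ $<

  # '-include' ignores missing .d files (fine on a first build), but it also
  # silently ignores a misspelled DEPS variable: the failure mode mentioned above.
  -include $(DEPS)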
74. kragen ◴[] No.42672230{6}[source]
You could have said the same thing about Postgres vs. Oracle 30 years ago. But it's true that these heuristics are not always in agreement.
75. computerfriend ◴[] No.42672351{6}[source]
Someone has to write the makefile.
76. BlueTemplar ◴[] No.42672614{7}[source]
We're not talking about efficiency, we are talking about engineering developing before science.
77. afiori ◴[] No.42672850[source]
Being able to set up things and truly understanding how they work are quite different imo.

I agree with the idea that a lot of productive app developers would not be able to set up a new project ex novo, but often it is not about true understanding so much as knowing the correct set of magic rules and incantations to make many tools work well together.

78. mkrajnak ◴[] No.42674002{5}[source]
I agree. An additional perspective that I have found useful came from a presentation I saw by one of the pragmatic programmers.

They suggested thinking about investing in skills like financial investments. That is, investments run on a spectrum from low risk, low return to high risk, high return.

Low risk investments will almost always pay out, but the return is usually modest. Their example: C#

High risk investments often fail to return anything, but sometimes yield large returns. Their example: learning a foreign language.

Some key ideas I took away:

- Diversify.

- Focus on low risk to stay gainfully employed.

- Put some effort into high risk, but keep expectations safe.

- Your mix may vary based on your appetite for risk.

79. nrclark ◴[] No.42675659{5}[source]
`Just` is also popular in this space, but tbh I think Make is a better choice.

Make is included in most Linux distros, including the ones available for WSL. It's also included with Apple's developer tools. It's been used for decades by millions of people, and is very mature.

If you use it in a simple way, Make is almost identical to `Just`. And when/if you want Make's powerful features, they're there and ready for you. Make's documentation is also exceptional.

I've used a bunch of these kinds of tools over the years, and I've never found one that I like more than Make.

80. znpy ◴[] No.42676540[source]
> They can be a bad choice for a 10,000-file monster

Whether they are a bad choice really depends on what the alternatives are, though.

81. retros3x ◴[] No.42681675{6}[source]
Many devs never even encounter challenging projects like that. For a large part of the dev community, it's just simple webapps for most of their professional life.
replies(1): >>42685384 #
82. retros3x ◴[] No.42681706{3}[source]
The problem is that software is much more forgiving than a real-life engineering project. You can't build a skyscraper with duct tape. With software, especially the simple webapps most devs work on, you don't NEED good engineering skills to get it running. It will suck, of course, but it will not fall apart immediately. So of course most "engineers" will take the path of least resistance and never leave the higher abstractions to dive deep into concrete fundamentals.
83. kragen ◴[] No.42685384{7}[source]
That was never a good strategy. Even before GPT-4 and Claude, those folks were losing their jobs to Wordpress, Google Sheets, Wix, SquareSpace, etc.
84. jononor ◴[] No.42685694{6}[source]
10 years ago people said the same about KiCAD and Blender :) No guarantee that FreeCAD will be the same - but you can be pretty confident it will never go away.

However, if the goal is to be a full-time employee doing CAD, I would of course go with one of the most established tools. Potentially even get some certifications.