
In Defense of C++

(dayvster.com)
185 points by todsacerdoti | 2 comments
fouronnes3 No.45268145
This is a good article but it only scratches the surface, as is always the case when it comes to C++.

When I made a meme about C++ [1] I was purposeful in choosing the iceberg format. To me it's not quite satisfying to say that C++ is merely complex or vast. A more fitting word would be "arcane", "monumental" or "titanic" (get it?). There's a specific feeling you get when you're trying to understand what the hell an xvalue is, why std::move doesn't move, or why std::remove doesn't remove.

The Forrest Gump C++ meme is another one that captures this feeling very well (not by me) [2].

What it comes down to is developer experience (DX), and C++ has a terrible one. From the syntax all the way up to package management, a C++ developer feels stuck in a time before they were born. At least we have a lot of time to think about all that while our code compiles. But that might just be the price for all the power it gives you.

[1] https://victorpoughon.github.io/cppiceberg/

[2] https://mikelui.io/img/c++_init_forest.gif

jandrese No.45268251
In Linuxland you at least have pkg-config to help with package management. It's not perfect but neither is any other package management solution.

If I'm writing a small utility or something the Makefile typically looks something like this:

    CC=clang
    PACKAGES=libcurl libturbojpeg
    CFLAGS=-Wall -pedantic --std=gnu17 -g $(shell pkg-config --cflags $(PACKAGES))
    LDLIBS=$(shell pkg-config --libs $(PACKAGES))

    ALL: imagerunner

    imagerunner: imagerunner.o image_decoder.o downloader.o
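Make's built-in rules fill in the rest. With the variables above, each object and the final link expand to roughly this (a sketch; the real implicit recipes pull in a couple more variables):

    # implicit rule for each .o file:
    #   $(CC) $(CFLAGS) -c imagerunner.c -o imagerunner.o
    # implicit rule linking the imagerunner target from its .o files:
    #   $(CC) $(LDFLAGS) imagerunner.o image_decoder.o downloader.o $(LDLIBS) -o imagerunner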
duped No.45268768
Consider that to do this you must:

- Use a build system like make, you can't just `c++ build`

- Understand that C++ compilers by default have no idea where most things are, you have to tell them exactly where to search

- Use an external tool that's not your build system or compiler to actually inform the compiler what those search paths are

- Oh also understand the compiler doesn't actually output what you want, you also need a linker

- That linker also doesn't know where to find things, so you need the external tool to use it

- Oh and you still have to use a package manager to install those dependencies to work with pkg-config, and it will install them globally. If you want to use it in different projects you better hope you're ok with them all sharing the same version.

Now you can see why things like IDEs became the default tools for teaching students how to write C and C++, because there's no "open a text editor and then `c++ build file.cpp` to get output" for anything except hello world examples.
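Spelled out by hand, that "simple" path looks something like this for the project upthread (a sketch; file and package names taken from that Makefile, exact flags vary by system):

    # compile each translation unit; pkg-config supplies the header search paths
    cc $(pkg-config --cflags libcurl libturbojpeg) -c imagerunner.c
    cc $(pkg-config --cflags libcurl libturbojpeg) -c image_decoder.c
    cc $(pkg-config --cflags libcurl libturbojpeg) -c downloader.c
    # separate link step; pkg-config again, this time for the library flags
    cc imagerunner.o image_decoder.o downloader.o $(pkg-config --libs libcurl libturbojpeg) -o imagerunner
That's compiler, linker, pkg-config, and the system package manager that installed the .pc files, all before any build system enters the picture.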

jandrese No.45277859
You also don't do "rustc build". Cargo is a build system too.

The whole point of pkg-config is to tell the compiler where those packages are.

I mean yeah, that's the point of having a tool like that. It's fine that the compiler doesn't know that, because its job is turning source into executables, not being the OS glue.
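In practice it's just a lookup tool; a hypothetical session (exact output depends on the distro and where the packages put their headers and libraries):

    $ pkg-config --cflags libcurl libturbojpeg   # prints any -I flags the headers need (often empty)
    $ pkg-config --libs libcurl libturbojpeg     # prints the linker flags, e.g. -lcurl -lturbojpeg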

I'm not sure "having a linker" is a weakness? What are we talking about?

It is true that you need to use the package manager to install the dependencies. This is more effort than having a package manager download them for you automatically, but on the other hand you don't end up in a situation where you need virtual environments for every application because they've all downloaded slightly different versions of the same packages. It's a bit of a philosophical argument as to what is the better solution.

The argument that it is too hard for students seems a bit overblown. The instructions for getting this up and running are:

    1. apt install build-essential
    2. extract the example files (Makefile and c file), cd into the directory
    3. type "make"
    4. run your program with ./programname
I'd argue that is fewer steps than setting up almost any IDE. The Makefile is 6 lines and is easy to adapt to any similar size project. The only major weakness is headers, in which case you can do something like:

    HEADERS=headerA.h headerB.h headerC.h

    file1.o: $(HEADERS)
    file2.o: $(HEADERS)
    file3.o: $(HEADERS)
If you change any header it will trigger a full rebuild, but on C projects that's fine for a long time. It's just annoying that you have to add a new entry for every .c file you add to the project instead of being able to tell make to apply it to every object automatically. I suspect there is a very arcane way to do this, but I try to keep it as simple as possible.
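For anyone curious, the arcane way is to let the compiler write the dependency lists itself; a minimal sketch for GNU make with gcc or clang, reusing the file names from the example above:

    SRCS = imagerunner.c image_decoder.c downloader.c
    OBJS = $(SRCS:.c=.o)
    CFLAGS += -MMD -MP        # every compile also emits a .d file listing the headers it included
    -include $(OBJS:.o=.d)    # pull those .d files in on later runs; missing ones are silently skipped
With that, editing a header only rebuilds the objects that actually include it, and there's nothing to keep in sync by hand.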
steveklabnik No.45278648
I'm not your parent, but the overall point of this kind of thing is that all of these individual steps are more annoying and error-prone than one command that just takes care of it. `cargo build` is all you need to build the vast majority of Rust projects. No need to edit the Makefile for those headers, or to remember which commands install the various dependencies, name them individually, and figure out which name maps to your distro's naming scheme, etc. It's not just "one command vs five", it's "one command for every project vs five commands that differ slightly per project and per platform". `make` can come close to this, which is why people love `./configure; make`, and there's no inherent reason it couldn't paper over a few more differences and become near universal, but that still only gets you Unix platforms.

> but on the other hand you don't end up in a situation where you need virtual environments for every application because they've all downloaded slightly different versions of the same packages.

The real downside here is that if you need two different programs with two different versions of packages, you're stuck. This is often mitigated by things like foo vs foo2, but I have been in a situation where two projects both relied on different versions of foo2 and could not be unified. The per-project dependency strategy handles this with ease; the global strategy cannot.