https://docs.carbon-lang.dev/docs/project/roadmap.html
What _is_ interesting is that I get the impression that Carbon is being workshopped with the C++ community, rather than the wider PLT community -- I worry that they won't benefit from the broader perspectives that'll help it avoid well-known warts elsewhere.
That all-consonant keyword always makes it seem like I'm reading Hungarian notation when reading Rust, for instance. Another option I've seen, for instance in Pony, is "fun", which is already an English word with a completely different meaning.
Even the "function" from JavaScript seems fine to me.
Compatibility with C++ is fine, but so far Carbon's safety story seems to be entirely a wishlist rather than anything concrete. Seems like Carbon might be more of a place to demonstrate features to C++ committees than a real language?
Personally, I have had it up to here with lousy programming languages that make it easy for me to write bugs.
In practice everybody just uses class, because who has the time to type the full keyword, and signature declarations in C++ are already unwieldy as it is.
Basically there should be a 1.0 sometime towards the end of 2026.
https://github.com/carbon-language/carbon-lang/blob/trunk/do...
This is a talk from last year's CppNorth; there should be one this year as well.
A "function" keyword often exists just to help the parser. C3, for example, to simply the parser of its language that's a superset of C, adds a "fn" keyword for this very purpose of disambiguation.
https://docs.carbon-lang.dev/docs/project/roadmap.html
Even on the submitted page, the oldest you could claim it represents is 2024. But I stand by my earlier remark. When linking to an active project's documentation or home page, unless it's to a specifically dated version of it, a date doesn't make sense. For instance, linking to something specific in Python 2.6 documentation, maybe add a date. But if it's just to python.org, it would be absurd to tag it with [1991].
I like the use of [] though; it reminds me of Scala, which I liked before they did the Scala 3 fork.
If they can't get safety right at the design stage, they'll never get it right. We already have D and Zig in this space.
But then defining a type constructor itself still uses `()`, like `class UnsafeAllowDelete(T:! Concrete) { ... }`. It does seem somewhat inconsistent.
Such small things as using __ __ in Python, and small inconveniences (Lua's 1-based indexing instead of 0), really have a lot of people, what can I say... yeah, polarized on this matter.
I don't even watch Fireship anymore. I actively resist the urge to. There are some other, better channels like typecraft or primagen or dreams of code and so many other enthusiasts; there is this one bash guy that I watch who's having fun in life doing side quests like going to the gym and gardening, and I am all for that too.
As to "getting it right" - things are not so simple. The emphasis on memory-safety soundness is based on some empirical hypotheses, some better founded than others, and it's unclear what "getting it right" means.
From a software correctness perspective, the road to sound memory safety is as follows: 1. We want to reduce the amount of costly bugs in software as cheaply as possible, 2. Memory unsafe operations are a common cause of many costly bugs, 3. Some or all memory bugs can be eliminated cheaply with sound language guarantees.
The problem is that 1. memory safety refers to several properties that don't all contribute equally to correctness (e.g. out-of-bounds access causes more serious bugs than use-after-free [1]), and 2. soundly guaranteeing different memory safety properties has different costs. It gets more complicated than that (e.g. there are also unsound techniques that have proven very effective to consider), but that's the overview.
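To make that distinction concrete, here is a minimal C++ sketch (illustrative only, not taken from any real codebase) with one instance of each property class:
--------
#include <vector>

int main() {
    std::vector<int> v = {1, 2, 3};

    int oob = v[10];          // out-of-bounds access: undefined behaviour

    int* p = &v[0];
    v = std::vector<int>();   // the old buffer is freed here
    int uaf = *p;             // use-after-free: also undefined behaviour

    return oob + uaf;
}
--------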
It is, therefore, as of yet unclear which memory safety properties are worth it to soundly guarantee in the language, and the answer may depend on the language's other goals (and there must be other goals that are at least as important, because the empty language guarantees not only all memory safety properties but all (safety [2]) correctness properties, yet nobody uses it as it's useless, while a language like ATS can be used to write many useful programs, but few use it because it's just too costly to use well). The goal is always to find the right balance.
For example, Java soundly guarantees lack of use-after-free at the cost of increased memory footprint; that may be "getting it right" for some programs but not all. Rust soundly guarantees lack of use-after-free at the cost of imposing strong and elaborate typesystem constraints (that, as is often the case, are more constraining than the property they guarantee); that, too, may be "getting it right" for some programs, though not all. Zig guarantees lack of out-of-bounds access in a simple language at the cost of not guaranteeing lack of use-after-free, and that may also be "getting it right" for some programs but not all.
So what "getting it right" means always depends on constraints other than safety (Rust and Zig want to consume less memory than Java; Java and Zig want to be simpler than Rust; Java and Rust want to guarantee more memory safety properties than Zig). If Carbon wants to be more interoperable with C++ than Java, Rust, or Zig, then it will have to figure out what "getting it right" means for Carbon.
[1]: https://cwe.mitre.org/top25/archive/2024/2024_cwe_top25.html
[2]: https://en.wikipedia.org/wiki/Safety_and_liveness_properties
Whether it ever goes beyond that remains to be seen.
The Carbon team is the first to point out that anyone doing green-field development should reach for Rust or any managed language that fits the project scope.
The remaining people driving where the language goes have other priorities in mind like reflection.
The profiles that were supposed to be so much better than the Safe C++ proposal? None of them made it into C++26, and it remains to be seen whether we will ever see a sensible preview implementation for C++29.
Carbon exists so that it's possible to migrate a large C++ code base, like Chrome, from C++ to something saner, incrementally.
The most important attribute of Carbon is not the specifics of the syntax but the fact that it's designed to be used in a mixed C++ / Carbon code base and comes with tooling to convert as much of C++ as possible to Carbon.
That's what makes Carbon different from any other language: D, Zig, Nim, Rust etc.
It's not possible to port a multi-million-line C++ code base, like Chrome, to another language, so large C++ projects are stuck with an objectively pretty bad language and are forced to continue to use C++ even though a better language might exist.
That's why Carbon is designed for incremental adoption in large C++ projects: you can add Carbon code to existing C++ code and incrementally port C++ over to Carbon until only Carbon code exists.
Still a very large investment but at least possible and not dissimilar to refactoring to adopt newer C++ features like e.g. replacing use of std::string with std::string_view.
That's why it's a rational project for Google. Even though it's a large investment, it might pay off if they can write new software in Carbon instead of C++ and refactor old code into Carbon.
Carbon is not a programming language (sort of) - https://news.ycombinator.com/item?id=42983733 - Feb 2025 (97 comments)
Ask HN: How is the Carbon language going? - https://news.ycombinator.com/item?id=40480446 - May 2024 (1 comment)
Will Carbon Replace C++? - https://news.ycombinator.com/item?id=34957215 - Feb 2023 (321 comments)
Carbon Programming Language from Google - https://news.ycombinator.com/item?id=32250267 - July 2022 (1 comment)
Google Launches Carbon, an Experimental Replacement for C++ - https://news.ycombinator.com/item?id=32223270 - July 2022 (232 comments)
Carbon Language: An experimental successor to C++ - https://news.ycombinator.com/item?id=32151609 - July 2022 (504 comments)
Carbon: high level programming language that compiles to plain C - https://news.ycombinator.com/item?id=4676789 - Oct 2012 (39 comments)
Also, FWIW, it is very ergonomic for Nim to call C (though the reverse is made complex by GC'd types). { I believe similar can be said for other PLangs you mention, but I am not as sure. } It's barely an inconvenience. Parts of Nim's stdlib still use libc and many PLangs do that for at least system calls. You can also just convert C to Nim with the c2nim program, though usually that requires a lot of hand editing afterwards.
Maybe they should write a C++2carbon translator tool? That would speed things up for them. Maybe they already have and I just haven't heard of it? I mean the article does say "some level of source-to-source translation", but I couldn't find details/caveats poking around for a few minutes.
Superficial details matter - people who stayed on C++ instead of transitioning to flashy new languages have type-before-name as part of their programming identity.
You can have all the features in the world (and be recognized for them), but if the code doesn't _look_ like C++, then it's of no interest.
Mind you, I'm not saying that your solution doesn't work. Just that it doesn't work for the GP.
https://en.wikipedia.org/wiki/Generic_programming - Worth studying up on if you're unfamiliar with it.
I don't think it will reach the same distribution as other languages, as the niche is "large C++ projects that want to transition to something else without a rewrite"; for anybody else there is a huge number of alternatives.
If WG21 were handling Rust instead, f64 would implement Ord, and people would just write unsafe blocks with no explanation in the implementations of supposedly "safe" functions. Rust's technology doesn't care, but their culture does.
Beyond that though, the profiles idea is dead in the water because it doesn't deliver composition. Rust's safety composes. Jim's safe Activity crate, Sarah's safe Animals crate and Dave's safe Networking crate compose to let me work with a safe IPv6-capable juggling donkey even though Jim, Sarah and Dave have never met and had no idea I would try that.
A hypothetical C++ 29 type safe Activity module, combined with a thread safe Animals module, and a resource leak safe Networking module doesn't even get you something that will definitely work, let alone deliver any particular safety.
Some parts of it want C++ to be Rust, with a focus on compile-time safety. Others take "C++" literally as "C with extra stuff" and value performance over safety.
Companies like Google are likely to be in the former camp, as for what they are doing, security is critical. Unsurprisingly, Carbon is a Google project.
Video game companies, on the other hand, are likely to be in the latter camp. Most of the time, security is not as critical, especially for offline games, and memory corruption usually doesn't go further than a game crash. Tight memory management, however, is critical, and it often involves raw pointers and custom allocation schemes.
Thus there is an opening for a faster language. And also for a safer one. And for one that's easier to use. So all C++ has going for it is inertia. It's moribund unless the committee reconsiders its stance on intentionally losing the performance competition.
That would either be a wholesale conversion or emitting a translation shim style thing at the boundary between legacy c++ and the new language.
I'm not sure Carbon is necessary to achieve such a conversion.
It means eliminating undefined behavior, and unplanned interaction between distant parts of the program.
I know you can compile C++ files to object files, pass them to the D compiler, and have them call eachothers' functions. I've never tried it though.
--------
g++ -c foo.cpp
dmd bar.d foo.o -L-lstdc++
--------
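For illustration, here is what a hypothetical foo.cpp side could look like (the names are made up; the D file bar.d would declare the function as extern(C++) int foo(); to get matching mangling):
--------
// foo.cpp -- a hypothetical C++ side for the commands above.
// bar.d would declare it as: extern(C++) int foo();
#include <iostream>

int foo() {
    std::cout << "hello from C++\n";
    return 42;
}
--------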
_Incrementally_: a C++ project can also be incrementally made more sane by settling on constructs to avoid and constructs to use once the problem domain is confined. In the past, I successfully carried out this quest for three different, fairly large C++ projects. This is not a strong selling point for Carbon.
Don't get me wrong - less undefined behaviour is better, but drawing a binary line between some and none makes for a convenient talking point, yet isn't necessarily the sweet spot for the complicated and context-dependent series of tradeoffs that is software correctness.
Hypothetically you could importcpp fns, classes, etc when compiling with nim cpp
importcpp what you need. exportcpp for the other way around
Honestly, while I find the syntax terse, I welcome more low level languages able to push performance.
You could do this with Nim, Nim 2’s ARC model is compatible with c++’s RAII. Nim supports moves, destructors, copies, etc. see https://nim-lang.org/docs/destructors.html
You can import C++ classes, member functions, free functions, etc. easily with importcpp
importcpp for the code you are incrementally porting over. You could write a libclang script to do this for you. exportcpp for any code that has been ported but still has dependents in C++ land.
My best guess is they want C++ compatibility and a new language due to preferences, more control over the compiler, etc. which are all valid reasons
function add(a: i32, b: i32): i32 {
return a + b;
}
Than the example you provided, and it is approximately the same length. I used to use arrow functions everywhere in TS/JS, and it made the code difficult to read IME, with zero benefit. They are fine for things like event handlers, promise chains, etc. But I'd rather just use function when I don't have to worry about the value of this.

A major role that C plays today is being the common protocol all languages speak[0]. C++ can't fill this role, and neither can Rust.
There is a huge opportunity for some language to become the next common protocol, the common ABI, that all languages share in common.
(Maybe Rust could do this, but they haven't stabilized their ABI yet, and I don't know the details.)
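As a sketch of what "speaking the C protocol" means in practice (all names here are hypothetical), a C++ library typically exposes a flat extern "C" facade that any language with a C FFI can bind to:
--------
// widget_c_api.cpp -- hypothetical C facade over a C++ class.
#include <cstdint>

class Widget {
public:
    int32_t frob(int32_t x) { return x * 2; }
};

extern "C" {
    // Opaque handle plus free functions: the lowest-common-denominator ABI.
    Widget* widget_new()                      { return new Widget(); }
    void    widget_delete(Widget* w)          { delete w; }
    int32_t widget_frob(Widget* w, int32_t x) { return w->frob(x); }
}
--------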
Maybe the page was updated recently, but there is a "why" link near the top:
https://docs.carbon-lang.dev/#why-build-carbon
What I would like to see is more documentation on the "why not" that summarizes why other languages and proposals are not sufficient. For example, Safe C++ proposal[1] appears to satisfy all requirements, but I can't find any reference to it.
It keeps adding keywords, and it has become way harder to keep the language in your head. It's over 220 keywords at this point. Don't take my word for it: Swift's creator doesn't agree with its current direction either.
Second, I would be surprised if the static analyses in the tool are precise enough for real-world Zig programs. For example, it is undecidable to determine whether a function “takes ownership” of an argument pointer. In particular, if you want to avoid false negatives, the “free after transfer” case needs to be conservative, but then you almost certainly will flag false positives.
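A minimal sketch of why (the function and names are made up): whether ownership transfers can depend on a runtime value, so a static tool must approximate in one direction or the other.
--------
// Hypothetical: does f "take ownership" of p? No static yes/no answer exists,
// because it depends on a runtime value.
void f(int* p, bool take_ownership) {
    if (take_ownership) {
        delete p;        // ownership transferred on this path only
    }
}

void caller(bool flag) {
    int* p = new int(7);
    f(p, flag);
    delete p;            // double free iff flag was true; fine otherwise
}
--------
A conservative analysis has to flag that final delete even when flag is always false (a false positive), or stay silent and miss the double free (a false negative).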
One other use case I could think of is gaming, where there is an incredible amount of load-bearing C++ code that's never realistically going to be rewritten, and strict memory safety is not necessarily a sine qua non in the way it is in other fields.
For all of C++'s faults, it is an extremely stable and vendor-independent language. The kind of organisation that's running on some C++ monolith from 1995 is not going to voluntarily let Apple become a massive business risk in return for marginally nicer DX.
(Yes, Swift is OSS now, but Apple pays the bills and sets the direction, and no one is seriously going to maintain a fork.)
But we need to get the language and interop into good shape to be able to thoroughly test and evaluate the migration.
FWIW, the biggest challenge with Safe C++ is that WG21 rejected[1] that direction. And it was developed without building a governance model or way to evolve outside of WG21, and so doesn't seem to have a credible path forward.
[1]: FWIW, some members of WG21 don't agree with this characterization, but both the author's impression and the practical effect were to reject the direction.
FWIW, we're working hard whenever looking at an aspect of the language to look at other languages beyond C++ and learn any and everything we can from them. Lots of our design proposals cite Swift, Rust, Go, TypeScript, Python, Kotlin, C#, Java, and even Scala.
I believe that getting WG21 to actually say "No" was very useful to have non-technical leadership people understand that C++ can't be the solution they need.
It would be nice if there was a somewhat higher level ABI that languages could use though. The C ABI is very low level and tedious.
I find it hard to trust Google to maintain any software, or to write software that is maintainable by a community. They write software for themselves and themselves alone.
You wouldn't get idiomatic code out but with some effort you'd get rust/d/c/other which clang compiles to the same IR as the original.
How much refactoring is warranted afterwards would depend on how much effort you put in to recreating templates / header files / modules etc on the fly.
I'm not sure I'd choose to do this myself if I was in Google's position but it would be tempting.
But Rust allows pattern matching on floats.
https://play.rust-lang.org/?version=stable&mode=debug&editio...
Rust Zulip is C++ WG21 confirmed?
The point of Carbon is that you can incrementally migrate your C++ program to it in place, and the migrated code will end up easier to maintain than the original C++.
One good aspect about C++ is its backwards compatibility or stability. Also a drawback, but companies not having to spend huge amounts of time, expertise and money rewriting their whole codebases all the time is something they appreciate.
Rust is often somewhat stable, but not always.
https://internals.rust-lang.org/t/type-inference-breakage-in...
https://github.com/rust-lang/rust/issues/127343
300 comments on Github.
https://github.com/NixOS/nixpkgs/pull/332176
Rust has editions, but it's a feature that it will probably take years to really be able to evaluate.
What kind of compatibility story will Carbon have? What features does it have to support compatibility?
As to things ABI prevents:
- scoped_lock was added to avoid breaking ABI by modifying lock_guard (a sketch follows this list)
- int128_t has never been standardized because modifying intmax_t is an ABI break. Although if you ask me, intmax_t should just be deprecated.
- unique_ptr could fit in register with language modifications, which would be needed to make it zero-overhead, compared to a pointer
- Many changes to error_code were rejected because they would break ABI
- status_code raised ABI concerns
- A proposal to add a filter to recursive_directory_iterator was rejected because it was an ABI break
- A proposal to make most of <cstring> constexpr (including strlen) will probably die because it would be an ABI break.
- Adding UTF-8 support to regex is an ABI break
- Adding support for realloc or returning the allocated size is an ABI break for polymorphic allocators
- Making destructors implicitly virtual in polymorphic classes
- Return type of push_back could be improved with an ABI break
- Improving shared_ptr would be an ABI break
- [[no_unique_address]] could be inferred by the compiler should we not care at all about ABI
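To illustrate the scoped_lock item above: rather than change lock_guard's frozen single-mutex shape, C++17 shipped a new variadic type alongside it. A minimal sketch:
--------
#include <mutex>

std::mutex a, b;

void critical_section() {
    // lock_guard's template signature couldn't be changed without an ABI
    // break, so C++17 added scoped_lock instead: variadic, with deadlock
    // avoidance across all the mutexes it locks.
    std::scoped_lock lk(a, b);
    // ... work on state guarded by both mutexes ...
}
--------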
https://github.com/rust-lang/rust/issues/41620#issuecomment-...
https://github.com/rust-lang/rust/pull/84045#issuecomment-82...
Elaborating on this cross-talk, any academic taxonomy says reference counting is a kind of GC. { See, the subtitle or table of contents of Jones 1996 "Garbage Collection: Algorithms for Automatic Dynamic Memory Management", for example. } Maybe you & I (or Nim's --mm?) can personally get the abbreviation "AMM" to catch on? I doubt it, but we can hope!! :) Sometimes I think I should try more. Other times I give up.
Before the late 90s, people would say "tracing GC" or "reference counting GC" and just "GC" for the general idea, but somehow early JavaVM GC's (and their imitators) were so annoying to so many that "The GC" came to usually refer, not just to the abstract idea of AMM, but to the specific, concrete separate tracing GC thread(s). It's a bit like if "hash table" had come to mean only a "separately chained linked list" variant because that's what you need for delete-in-the-middle-of-iterating like C++ STL wants and then only even the specific STL realization to boot { only luckily that didn't happen }.
If it (purportedly?) exists so that Google can move multi-million line code bases from C++ to something better bit-by-bit, because it's otherwise infeasible to do so, why would Google drop it after they have ported the first million?
You can simply wait to see if Chrome adopts it.
I guess, some Mac apps? In that case I think most platform independent "guts" would be in C or C++, and the Obj-C++ part is tied to the frameworks, so the devs would have to rewrite it anyway.
So that's maybe a bad example. In the same way I think it's fine that "Structured programming" is about the need to use structured control flow, not the much later idea of structured concurrency even though taken today you might say they both have equal claim to this word "structured".
In contrast it is weird that people decided somehow "Object oriented" means the features Java has, rather than most of what OO was actually about when it was invented. I instinctively want to blame Bjarne Stroustrup but can't think of any evidence.
Printing, as in the example from Carbon's GitHub repository, does not work: 'Print("Test");' gives a complaint about not finding 'Print'.
I can imagine the thought process behind the designers of the language went as follows:
"It's not possible to improve C++ without breaking backwards compatibility"
"That's correct, but if we're going to break backwards compatibility anyways, why not use this as an opportunity to change a bunch of things?"
aka the Python 3 mentality, where necessary changes were combined with unnecessary changes that caused pointless migration costs. The fallacy derives from treating the break in backwards compatibility as a massive fixed cost (since libraries have to be updated anyway), so that adding small incremental costs supposedly won't meaningfully increase the overall cost. In reality, the fixed cost of breaking backwards compatibility can be reduced massively if proper care is taken, which means all the "just because" changes that were thrown in as a bonus end up representing a much larger share of the migration cost than initially anticipated.
Not sure what to think of this one. Either one introduces a new keyword to opt out (not great), or all public destructors of an abstract base class are implicitly marked virtual (not great and another "hidden" language feature like threadsafe-statics).
After all, an abstract base class does not need its destructor to be public.
But D and C++ have just enough differences to make extern(C++) not be automatic. It can take some pretty arcane metaprogramming to get things to work, and some things are impossible.
It's also worth pointing out that D isn't trying to be fully compatible with C++.
Titus correctly predicts the committee's actual response and highlights its danger, summarised in the title. You can pick now, as Corentin desires. You can pick never, which Corentin despairs at - not unreasonably because it means you're giving up performance. But Titus highlights the third option, the committee can instead dither forever never having the courage to announce an ABI break but also lacking courage to declare absolute stability with the encouragement that brings for legacy systems.
ABI Now is - at least in performance - competing with Rust. An ABI swap can make some C++ have the same performance as the analogous Rust which wasn't possible with the old ABI and that matters for outfits like Google.
ABI Never is a different niche. Guaranteed ABI stability gives C++ certainty. It makes C++ a stronger contender for some applications where today it can't go and nor can Rust because people don't think "Just recompile" is a reasonable choice, whether they are correct or not.
ABI Dither is neither of these things. There is no certainty: just because the committee is dithering today doesn't mean they won't make a decision tomorrow, or next year. But meanwhile you're not competitive with the best-in-class alternatives.
It might seem as though incrementing a signed integer past its maximum can't be as problematic as a use after free even though both are Undefined Behaviour, but nah, in practice in real C++ compilers today they can both result in remote code execution.
There is a place for Unspecified results, for example having it be unspecified whether a particular arithmetic operation rounds up or down may loosen things up enough that much faster machine code is generated and well, the numbers are broadly correct still. But that's not what Undefined behaviour does.
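A classic sketch of the difference: because signed overflow is undefined rather than unspecified, the optimizer may assume it never happens at all, not merely that the result is some unknown number.
--------
#include <climits>

// GCC and Clang at -O2 typically fold this whole function to "return false":
// since signed overflow is UB, the compiler assumes x + 1 never wraps, so
// the comparison can never be true -- even when x == INT_MAX.
bool will_overflow(int x) {
    return x + 1 < x;
}
--------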
> https://internals.rust-lang.org/t/type-inference-breakage-in...
> https://github.com/rust-lang/rust/issues/127343
> 300 comments on Github.
> https://github.com/NixOS/nixpkgs/pull/332176
It might be worth noting that this change technically doesn't violate Rust's stability guarantees, since type inference changes and/or adding new impls are exempt. Of course, that doesn't really help with the question of whether this change should have been made in the given timeframe (as opposed to the socket struct change IIRC?), but that ship has long sailed.
Anyway, like the "major" modes of hash collision resolution, reference counted GC has also been around concurrently (haha) with ref tracing GC since the dawn of modern computing. Unix hard-links (& other things) codify ref counting into filesystems.. Python has always had ref-counted GC, older Lisp more focused on tracing GC, etc., etc. Popularity measures are notoriously difficult.
Mostly people like to abbreviate { like having a search $PATH instead of using /bin/foo everywhere }. The whole point of abstraction is to neglect details. Neglect naturally leads to forgetting (or never learning/knowing). Ignorance leads people to cross-talk (or worse willfully misinterpret/project). Cross-talk leads to suffering. Yoda out. ;-)
EDIT: Also, speaking of abbreviation & clarity, in Nim "arc" has, at least until this writing, always stood for Automatic Reference Counting, not Atomic Ref Counting as seems the more rusty terminology and is vaguely suggested by @miguel_martin, to whom I originally replied with an "arc/atomicArc", though it seems like, in Nim 3, it may become both Automatic & Atomic, but probably not changing its abbreviation to "AARC".
[1] https://github.com/carbon-language/carbon-lang/?tab=readme-o...
Technically speaking the clauses on either side of the "or" aren't mutually exclusive. You can have a "full, correct, fully compliant, reference implementation" that is also a closed-source implementation!
Well, unless the implication is that Circle isn't "full, correct, [and] fully compliant", in which case I feel I should ask "with respect to what?" and "why do you need those requirements?"
Like, Python and Javascript both have many "implementations", and those are some of the most popular languages. Python does not have an ISO specification. But Javascript does have an Ecma standard, ECMAScript.
Rust is getting another implementation in the form of gccrs. And there is work on a specification for Rust https://rustfoundation.org/media/ferrous-systems-donates-fer... . Arguably not a standard, but still helpful.
Eh, bit of a mixed bag, I think, depending on the context in which the words are used. "Circle" can refer to the compiler/toolchain or the set of C++ extensions the compiler implements, whereas Safe C++ is either the proposal or the extensions the proposal describes. As a result, you can say that you can compile Safe C++ using Circle, and you can also describe Safe C++ as a subset of the Circle extensions. I wouldn't exactly describe the lines as well-defined, for what it's worth.
> There are presumably differences between them, and I do not know what those differences are, and I do not know if those differences were documented somewhere.
They're sort of documented indirectly, as far as I can tell. Compare the features in the Safe C++ proposal and the features described in the Circle readme [0]. That'll get you an approximation at least, albeit somewhat shaded by the old docs (understandable given the one-man show).
> I cannot find any occurrences of "reference implementation" in the Safe C++ draft.
The exact words "reference implementation" may not show up, but I think this bit qualifies (emphasis added):
> Everything in this proposal took about 18 months to design and implement in Circle.
[0]: https://github.com/seanbaxter/circle/blob/master/new-circle/...
As for Zig being specified: well, it's pre-1.0, and the authors have, I believe, specifically called out specification as being "the first priority after 1.0".
In addition, as I mentioned I don't think this is the first time Rust has had to navigate this kind of wide-ranging technically-allowed-but-still-breaking change. The Rust devs first created a PR to change its internal representation for IP addresses in November 2020 [0], but multiple major libraries (including mio, which tokio depends on) incorrectly assumed that the representation for Rust's IP address type was the same as libc's representation and basically type punned between the two, so the change would result in UB. The Rust devs could have pushed out the change anyways, as the change didn't violate the backwards compatibility guarantee due to just being an internal implementation detail change, but the PR didn't actually land until July 2022 [1] because the Rust devs wanted to give the ecosystem time to migrate.
More discussion at [2].
[0]: https://github.com/rust-lang/rust/pull/78802
[1]: https://github.com/rust-lang/rust/pull/78802#event-709670882...
[2]: https://old.reddit.com/r/rust/comments/wcw93o/a_major_refact...
I have done C++ for a living, and it is not the easiest language, but there is tooling, and warnings-as-errors catch a lot of the errors before you even notice the mistake.
It is true that packaging is more challenging, but it is also true that it is very configurable to squeeze out as much performance as possible (which is one of C++'s niches). And by squeezing I mean beyond setting a release build: you could, for example, decide to go with LTO + PGO, remove position-independent code, and do static linking for all dependencies.
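For instance, a two-step PGO + LTO build with static linking might look roughly like this (the flags are illustrative and vary by toolchain):
--------
g++ -O3 -flto -fprofile-generate -o app main.cpp
./app < training_input.txt
g++ -O3 -flto -fprofile-use -static -no-pie -o app main.cpp
--------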
You can do virtually anything that no other language can do, and when you need that, believe me, it is useful.
But you can still write everyday code with your lambdas, ranges, smart pointers and virtual interfaces.
I understand C++ has some baggage, but it is very far from being an "objectively bad language" in my opinion. More so if you take into account its performance and library availability, which is second to none for almost any task, except maybe for the typical enterprise-like Java app or web stuff, but now C++26 will include reflection and annotations, so this could be a game changer.
At the same time, I was affected by this breakage, and it took me all of ten minutes to fix. So I both understand the outrage, and agree with it in general, but also, it was a tad overblown, I think.
Should they have done a slower rollout, like the IpAddr change? Probably. Is it the end of the world that they made a mistake? Nah. But if it happens more often, that's cause for concern.
In particular, the committee’s unwillingness to make ABI-breaking changes to the language, or more abstractly, to consider the needs of organizations with huge active code bases at least as seriously as those with huge legacy code bases.
Furthermore, an unbounded blast radius isn't itself the direct problem. A bug that with some probability causes your program to crash and your disk to be deleted is far less dangerous than a bug that allows a remote attacker to relatively easily steal all your secrets. UBs also differ on that front.
And again, virtually all programs are not provably without UB. For example, a Java program still interacts with an OS or with some native library that might suffer from a UB. So clearly we do tolerate some probability of UB, and we clearly do not think that eliminating any possibility of UB is worth any price.
When a program is just code on the screen, it's just a mathematical object, and then it's easy to describe a UB - the loss of all program meaning - as the most catastrophic outcome. But software correctness goes beyond the relatively simple world of programming language semantics, and has to consider what happens when a program is running, at which point it is no longer a mathematical object but a physical one. If a remote attacker steals all our secrets, we don't care if it's a result of some bug in the program itself (due to UB or otherwise), in other software the program interacts with, some fault or weakness in the hardware, or human operator error. The probability of any of these things is never zero, and we have to balance the cost of addressing each of these possibilities.
To give an example in the context of Carbon, we know that old code tends to suffer from fewer severe bugs than new code. So, if we want to reduce the probability of bugs, it is possible that it may be more worthwhile to invest - say, in terms of language complexity budget - in interop with C++ code than in eliminating every possible kind of UB, including those that are less likely to appear, sneak past testing, and cause an easily exploitable vulnerability.
Nah, this is in the context of C++ so UB isn't the most catastrophic situation. Undefined Behaviour is a behaviour, which means we might well be able to avoid invoking the behaviour and then it's fine. For example if your C++ program has a use-after-free in the code invoked only by the "Add file" feature but otherwise works fine, we can just ensure operators never use "Add file" and the UB, no matter what it might be, won't happen.
C++ has IFNDR, clauses which say this is "Ill-Formed, No Diagnostic Required". The program itself wasn't actually a valid C++ program, it has absolutely no defined meaning, there may be no indication of a problem from your compiler but alas it's entirely meaningless. This isn't behavioural, it's an invisible status of the entire program.
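The canonical example is an ODR violation, sketched here with made-up names:
--------
// a.cpp
inline int limit() { return 10; }

// b.cpp, linked into the same program
inline int limit() { return 20; }

// Two different definitions of the same inline function: the program is
// ill-formed, no diagnostic required. Nothing obliges the compiler or
// linker to complain, and which body you actually get is unknowable:
// the program as a whole has no defined meaning.
--------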
- Our newsletter / announcements via GitHub Discussions[1] email[2] or RSS[3]
- The "Last Week in Carbon" posts via GitHub[4] or RSS[5]
- The Discord server: https://discord.gg/ZjVdShJDAs
[1]: https://github.com/carbon-language/carbon-lang/discussions/c...
[2]: https://groups.google.com/a/carbon-lang.dev/g/announce
[3]: https://github.com/carbon-language/carbon-lang/discussions/c...
[4]: https://github.com/carbon-language/carbon-lang/discussions/c...
[5]: https://github.com/carbon-language/carbon-lang/discussions/c...
If you want, you can think of UB like a mathematical singularity in some physical theory. The theory of the language has nothing to say about what happens in such a program. But that doesn't mean that we can't reasonably talk about what happens not using that theory [1]. Indeed, one of the reasons that UB are of concern is that some of them are frequent causes of security exploits - and that's the thing we ultimately care about, not that the program loses its semantics. But again, not all of them are equally common causes of such outcomes, and not all of them are equally hard to avoid in the first place.
[1]: In fact, this is easy to explain in software: the programming language can say nothing about the meaning of a program with UB - indeed, it has no meaning in that language - but because we do have an executable, the program still has a well-defined meaning in the machine language it compiled to (machine language has no UB, or, even if some machine architecture does declare that some instruction stream is UB, most programs with UB in some programming language still do not compile to a program with UB in machine code). So the program that has no meaning in C++ still has meaning in machine code, and as that is the program we ultimately run and care about, we can talk about which UBs are more or less likely to result in which machine code behaviour.
Maybe a hybrid approach should be taken, but breaking ABI in such a core, widely-depended-upon language can cascade into so many places.
People say that Rust is great because of that, but that is just a trade-off, and anyway there are plenty of dependencies like Boost, Abseil or others that can play that role anyway.
UB is a behaviour, it's unbounded, so it's an immediate disaster, and "time travel" UB can make this harder to reason about, because the as-if rule can mean that although it didn't in some sense "happen" yet the behaviour has consequences earlier. But if we avert the behaviour it won't happen. It is not correct to say that UB means the entire program had no meaning.
You give the "mathematical singularity" analogy, consider division. We doubtless agree than 6 divided by 3 is 2. And 6 divided by 2 is 3. But how about 6 divided by 0? This is not defined, we cannot perform such an operation. But division is not as a result somehow entirely without meaning, it just has this well understood limitation. Likewise for software with UB that we can avert.
IFNDR is a catastrophe because it truly does render the entire software without meaning.
You're right that incrementally rewriting isn't much of an advantage over C++ itself, but I think you're missing the point that the emphasis on "incremental" is to highlight the advantage in rewriting C++ code in Carbon over alternatives that don't provide as much compatibility (with Rust being somewhat notorious for being suggested by outside parties as a target for rewriting in whenever discussions about C++ codebases happen). The argument for Carbon over C++ isn't specifically that it can be rewritten incrementally, but that it's just a better language, which has benefits _after_ the rewrite. To be clear, I'm sure that someone could come up with reasonable objections to that claim as well, but I think it's distinct from the part you're objecting to, and it's worth treating as a separate concern.
In the language. I.e., the language assigns no meaning to the source program, which is, indeed, the "catastrophic" impact of UB (or IFNDR) within the theory of the language. But since a running program takes the form of an executable, and the executable always has a defined meaning in another language (machine code), the fact that C++ has nothing to say about what such programs do (i.e. that's the end of the helpfulness of that theory) doesn't mean we can't talk about or care about the meaning of the executable.
An executable that crashes and an executable that leaks all your secrets have very different consequences, and while the C++ spec says absolutely nothing about the relationship between different UBs and these behaviours, that doesn't mean that these relationships don't exist.
A mathematical singularity in a physical theory means that that particular theory has nothing to say about the physics of that situation, not that there's no actual physics going on, and the physics that is actually going on could be more or less "catastrophic" depending on what we mean by that.
I'm not sure about the second sentence either? Circle (supposedly?) implements everything in the Safe C++ proposal, so in that respect Safe C++ exists. Alternatively, you can say Safe C++ doesn't exist because major compilers don't implement it, but that's kind of the point of the Safe C++ proposal (and many (most?) other C++ language proposals, for that matter) - it's describing new features that don't currently exist but might be worth adding to the standard.
> people might be interested in alternatives that they expect might be available sooner.
This is also a bit funny because this was one of the more contentious points of debate in the Safe C++ vs. profiles discussion, and the impression I got is that between the two Safe C++ was generally considered to be closer to "might be available sooner" than profiles.
> and I think it would be fair to consider alternatives at this point
I would assume those who could consider alternatives already have, and that those (still) interested in safe(r) C++ do so because the alternatives are insufficient for one reason or another.