The analogy of OS as cars (Windows is a station wagon, Linux is a tank) is brought up in the recent Acquired episode on Microsoft, where Vista was a Dodge Viper but Windows 7 was a Toyota Camry, which is what users actually wanted.
Putting it in a browser window gives it bad odds. You can also listen to it:
Begins at 01:30, 25 minutes.
In his book The Life of the Cosmos, which everyone should read, Lee Smolin gives the best description I've ever read of how our universe emerged from an uncannily precise balancing of different fundamental constants. The mass of the proton, the strength of gravity, the range of the weak nuclear force, and a few dozen other fundamental constants completely determine what sort of universe will emerge from a Big Bang. If these values had been even slightly different, the universe would have been a vast ocean of tepid gas or a hot knot of plasma or some other basically uninteresting thing--a dud, in other words. The only way to get a universe that's not a dud--that has stars, heavy elements, planets, and life--is to get the basic numbers just right. If there were some machine, somewhere, that could spit out universes with randomly chosen values for their fundamental constants, then for every universe like ours it would produce 10^229 duds.
Though I haven't sat down and run the numbers on it, to me this seems comparable to the probability of making a Unix computer do something useful by logging into a tty and typing in command lines when you have forgotten all of the little options and keywords. Every time your right pinky slams that ENTER key, you are making another try. In some cases the operating system does nothing. In other cases it wipes out all of your files. In most cases it just gives you an error message. In other words, you get many duds. But sometimes, if you have it all just right, the computer grinds away for a while and then produces something like emacs. It actually generates complexity, which is Smolin's criterion for interestingness.
Not only that, but it's beginning to look as if, once you get below a certain size--way below the level of quarks, down into the realm of string theory--the universe can't be described very well by physics as it has been practiced since the days of Newton. If you look at a small enough scale, you see processes that look almost computational in nature.
I think that the message is very clear here: somewhere outside of and beyond our universe is an operating system, coded up over incalculable spans of time by some kind of hacker-demiurge. The cosmic operating system uses a command-line interface. It runs on something like a teletype, with lots of noise and heat; punched-out bits flutter down into its hopper like drifting stars. The demiurge sits at his teletype, pounding out one command line after another, specifying the values of fundamental constants of physics:
universe -G 6.672e-11 -e 1.602e-19 -h 6.626e-34 -protonmass 1.673e-27....
and when he's finished typing out the command line, his right pinky hesitates above the ENTER key for an aeon or two, wondering what's going to happen; then down it comes--and the WHACK you hear is another Big Bang.
We should move to the Command Table.
As a conceptual framework http://augmentingcognition.com/assets/Kay1977.pdf
Whenever I see new coders struggle, it usually is because they:
- Don't know the context of what they are executing
- Don't know about the concept of input and output
On the command line, the context is obvious. You are in the context. The working dir, the environment, everything is the same for you as it is for the thing you execute via ./mything.py.

Input and output are also obvious. Input is what you type, output is what you see. Using pipes to redirect it comes naturally.
Not being natively connected to context, input and output is often at the core of problems I see even senior programmers struggle with.
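The shared context is easy to see with a throwaway script; a minimal sketch (the paths and the GREETING variable are just for illustration):

```shell
# A child process inherits the shell's context: same working
# directory, same environment variables.
mkdir -p /tmp/ctx-demo && cd /tmp/ctx-demo
export GREETING="hello"
cat > show_ctx.sh <<'EOF'
#!/bin/sh
# This script sees the same context as the shell that launched it.
echo "cwd: $(pwd)"
echo "env: $GREETING"
EOF
chmod +x show_ctx.sh
./show_ctx.sh
# Output is just as visible, and pipes redirect it naturally:
./show_ctx.sh | grep '^env:'
```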
CorelDraw feels more efficient because one quickly has what looks like a beautiful, colorful house on the screen. And then one does not understand why the doors don't work correctly.
If one starts with GUIs and doesn't really understand what is behind, then all kinds of trouble happen.
So I guess, as with any tool, understanding is key.
Bullhorn: "But if you accept one of our free tanks we will send volunteers to your house to fix it for free while you sleep!"
Did Linux distros actually offer support at some point? (By what I assume would be some project contributor ssh-ing into your machine)
My impression was always the arguments were more like "Well yes, but we have this literal building full of technical manuals that describe every bolt and screw of your tank - and we can give you a copy of all of them for free! And think about it - after you have taken some modest effort to read and learn all of them by heart, you'll be able to fix and even modify your tank all on your own! No more dependence on crooked car dealers! And if you need help, we have monthly community meetups you can attend and talk with people just as tank-crazy as you are! (please only attend if you're sufficiently tank-crazy, and PLEASE only after you read the manuals)"
(This was decades ago, the situation has gotten significantly better today)
While I share the opinion that the command line is _the one true way_ of computing, I don't think this is really all that true. Computers are alien. GUIs are alien. CLIs are alien. Everything is learned. Everything is experience. Everything is culture. Learning, experience, and culture blind us from the experience of the novice. This is "expert blindness".
Why Western Designs Fail in Developing Countries https://youtu.be/CGRtyxEpoGg
https://scholarslab.lib.virginia.edu/blog/novice-struggles-a...
I think Windows very much wanted to be something different with COM. Instead of starting a shell in the context of the program, you'd connect some external "shell" into the very object graph of your program to inspect it. It turned out to be very difficult, and Windows has largely retreated to a command-line-centric architecture, but I think there was some essential attempt at foundational innovation there.
I would argue that the commandline has very much proven to be the best trade-off between being useful and being simple, but there is no saying if there exists some alternative.
I don't think that was the intended implication. I think the analogy is more akin to: "If you send us a bug report, we'll fix it and ship a new version that you can download and use for free." In the olden days, you'd have to buy a new version of commercial software if it didn't work for your machine; complimentary patches were rare.
That link is only part 1 (of 7). It's still around 2 and a bit hours of listening in total. https://www.youtube.com/@robertreads4323/videos
But CLI contexts are only obvious if the computer user is already familiar with the CLI which biases the learned mind to perceive things as obvious when they really are not.
A lot of CLI command syntax is based on position instead of explicit argument names.
E.g. creating file system links via CLI has opposite syntax positions in Linux vs Windows:
- Linux: ln srcfile targetlink
- Windows : mklink targetlink srcfile
If one goes back & forth between the 2 operating systems, it's easy to type the wrong syntax because the CLI doesn't make it obvious. On the other hand, using a GUI file browser like Ubuntu Nautilus or Windows Explorer lets a novice create links ("shortcuts") without memorizing CLI syntax.

This gap of knowledge is also why there is copy&paste cargo-culting of inscrutable ffmpeg, git, rsync, etc. commands.
E.g. using ffmpeg to convert a DVD to mp4 by manually concatenating *.VOB files has very cumbersome and error-prone syntax. It's easier to use a visual GUI like Handbrake to click and choose the specific title/chapters to convert.
CLI vs GUI superiority depends on the task and the knowledge level of the user.
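The ln/mklink argument-order trap mentioned above is easy to check on the Linux side; a small sketch (file names are arbitrary):

```shell
# Linux: ln [-s] TARGET LINK_NAME -- the existing file comes FIRST.
mkdir -p /tmp/ln-demo && cd /tmp/ln-demo
echo "original contents" > srcfile
ln -s srcfile targetlink
readlink targetlink      # prints: srcfile
cat targetlink           # prints: original contents
# Windows reverses the order: mklink targetlink srcfile
```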
Yes, the command line suffers from poor discoverability, and different applications (such as ln/mklink) may not be consistent.
It is one of the bigger problems (imho) of the cli but it doesn't go against GPs point.
The command line does have a learning curve (partly because of the above), but it is also quite rewarding.
Wow, that seems quite fundamental. Computing 101.
I'm not a "coder" and I spend "99%" of time on the command line. Because I prefer it. Ever since the 80s when I first used a VAX.
wget -O - https://web.stanford.edu/class/cs81n/command.txt | nroff | less
Just in userspace you have:
dmesg -w
tail -f /var/log/messages
There's also dbus to monitor on Linux systems, and a lot of kernel hook tricks you can use to get a message to pop up if an event happens.

It gets annoying to have a process splurge notification stuff into a term you are working in; that's why you have info-bars, which many terminal emulators support.
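The "follow" behavior of dmesg -w and tail -f is easy to try on a scratch file; a sketch (the log path and message are made up, and `timeout` is GNU coreutils):

```shell
# dmesg -w and tail -f both "follow": they block and print lines
# as they arrive. Reproduce the mechanism on a throwaway file:
log=/tmp/event-demo.log
: > "$log"                                           # create/truncate
( sleep 1; echo "event: disk mounted" >> "$log" ) &  # a future event
timeout 3 tail -f "$log" | head -n 1                 # prints: event: disk mounted
```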
Outside of the syntax (which seems to live forever), you have things like non-sane defaults, obscurantist man pages ... the list goes on.
But this is precisely the same as what is lost in the transition from the
command-line interface to the GUI.
Why are we rejecting explicit word-based interfaces, and embracing
graphical or sensorial ones--a trend that accounts for the success of both
Microsoft and Disney?
But we have lost touch with those intellectuals, and with anything like
intellectualism, even to the point of not reading books any more, though we
are literate.
Elsewhere [0] I have called this concept "post-literacy," and this theme pervades much of Stephenson's work - highly technologically advanced societies outfitted with molecular assemblers and metaverses, populated by illiterate masses who mostly get by through the use of pictographs and hieroglyphic languages (emoji, anyone?). Literacy is for the monks who, cloistered away in their monasteries, still scribble ink scratchings on dead trees and ponder "useless" philosophical quandaries.

The structure of modern audiovisual media lends itself to the immediate application of implicit bias. On IRC, in the days of 56k before bandwidth and computer networks had developed to the point of being able to deliver low-latency, high definition audio and video, perhaps even for "real-time" videoconferencing, most of your interactions with others online were mediated through the written word. Nowhere here, unless some party chooses to disclose it, do race, gender, accent, physical appearance, or otherwise, enter into the picture and possibly cloud your judgment of who a person is - or, more importantly, the weight of their words, and whether or not they are correct, or at least insightful; consider Turing's "Computing Machinery and Intelligence" paper which first introduced what is now called the "Turing test," and how it was designed to be conducted purely over textual media as a written conversation, so as to avoid influencing through other channels the interrogator's judgment of who is the man, and who is the machine.
The only real problem is that anyone who has no culture, other than this
global monoculture, is completely screwed. Anyone who grows up watching TV,
never sees any religion or philosophy, is raised in an atmosphere of moral
relativism, learns about civics from watching bimbo eruptions on network TV
news, and attends a university where postmodernists vie to outdo each other
in demolishing traditional notions of truth and quality, is going to come
out into the world as one pretty feckless human being.
Moreover, the confusion of symbols for reality, the precession of digitized, audiovisual content from a mere representation to more-than-real, digital hyperreality (since truth and God are all dead and everything is merely a consensual societal hallucination), leads people to mistake pixels on a screen for actual objects; narrative and spin for truth; influencers, videos, and YouTube personalities for actual people; or words from ChatGPT as real wisdom and insight - much in the same way that Searle's so-called "Chinese room" masquerades as an actual native speaker of Mandarin or Cantonese: "What we're really buying is a system of metaphors. And--much more important--what we're buying into is the underlying assumption that metaphors are a good way to deal with the world."

So many ignorant people could be dangerous if they got pointed in the wrong
direction, and so we've evolved a popular culture that is (a) almost
unbelievably infectious and (b) neuters every person who gets infected by
it, by rendering them unwilling to make judgments and incapable of taking
stands.
It simply is the case that we are way too busy, nowadays, to comprehend
everything in detail.
The structure of modern short-form, upvote-driven media, lends itself to the production of short-form messages and takes with laughably small upper bounds on the amount of information they can contain. In a manner reminiscent of "you are what you eat," you think similarly to the forms of media you consume - and one who consumes primarily short-form media will produce short-form thoughts bereft of nuance and critical thinking, and additionally suffer from all the deficits in attention span we have heard of as the next braindead 10-second short or reel robs you of your concentration, and the next, and the next...

Beyond the infectious slot machine-like dopamine gratification of the pull-to-refresh and the infinite doomscroll, the downvote has become a frighteningly effective means of squashing heterodoxy and dissent; it is only those messages that are approved of and given assent to by the masses that become visible on the medium. Those who take a principled stand are immediately squashed down by the downvote mob, or worse, suffer from severe censure and invective at the hands of those zealous enforcers of orthodoxy. The downvote mechanism is reminiscent of the three "filters" Chomsky wrote of when he was discussing the mass media in "Manufacturing Consent," and the way advertisers, government, and capital all control and filter what content is disseminated to media consumers.
The message of modern, audiovisual, short-form, upvote-driven social media is bias and group compliance bereft of nuance. If you want to produce and consume novel ideas you are better served by media based on the written word.
Evidently, financial capitalism worked for a certain period but does not work anymore. So why keep committing suicide? We started the suicide with WWI. We kept going with WWII, and we continue now.
We are still the leader in IT, and we know what works: the classic FLOSS/at-least-open IT model. The one where some sell iron, not bits; where customers own their systems and bend them as they wish; where communication is not owned by some walled gardens but open, like on Usenet and classic mail (as opposed to webmail, which hides the decentralized ability from most users, who do not own the WebMUA).

To continue the China comparison: I've recently bought some battery tools, very cheap crap but enough for domestic usage, and I've observed that the batteries have a standard connector; I can swap them between different brands without issue. Batteries and chargers are "standard". I also own some high-end domestic battery tools which happen to have a damn NFC tag inside the battery tying it to the device, even though inside the battery are classic connected li-ion cells. I observed the same for BEVs: some friends and I have three Chinese BEVs from different brands/models, and they have a gazillion common parts. So "open/standard pays back": yes, it might erode SOME OEMs' margins, but it pays back society as a whole and, as a result, the OEM itself.

The same is valid in software terms. My desktop is Emacs/EXWM; I use consult/orderless/vertico as a search&narrow framework, and they simply "plug in" to any other packages, because the system is a single application, end-user programmable at runtime. You do not need to "specifically support" a package to integrate it. You do not need third parties to explicitly support your package to use it together with theirs.

And what do we do instead? Our best to create walled gardens. We had XMPP and SIP/RTP, and now most are on Zoom/Teams/Meet/Slack/Skype/*, all walled gardens. Many even push to substitute email with some new colorful walled garden. Similarly, any app tries to add features someone else has, since it's impossible to just use the other one, like a sexp in a buffer.
As a result, modern IT has regressed from Unix, where the user could at least combine simple tools with some IPC in a script, to being limited to cut&paste, drag&drop, and not much more. Everything is damn complicated because there is no shared data structure in a shared environment; every "app" does everything on its own, often with a gazillion dependencies that have a gazillion deps of their own, with a big load of parsers of every kind. Even "standards to pass data" (like JSON) try to emerge but are still not a shared data structure in a single integrated environment.
All of this is the old Greenspun's tenth rule, and it is actually killing our IT innovation, killing one of the last sectors where we still rule, for the same issues that have essentially killed our industry.
I've seen people think they have a specific Python environment active just because they were in their project's directory on the command line.
I've seen people not understand that "python -m pip" is a command and even if they are in a directory which has "python" in its name, they still have to type "python" for that command.
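Both confusions come down to how the shell resolves commands: via $PATH, never via the current directory. A small sketch (the fake-venv path is purely illustrative):

```shell
# Being inside a project directory changes nothing about which
# "python3" runs -- the shell walks $PATH, not the cwd.
mkdir -p /tmp/path-demo/python-project && cd /tmp/path-demo/python-project
command -v python3              # whatever is first on $PATH

# An "activated" environment is essentially just a PATH entry:
mkdir -p /tmp/path-demo/fake-venv/bin
printf '#!/bin/sh\necho "fake python"\n' > /tmp/path-demo/fake-venv/bin/python3
chmod +x /tmp/path-demo/fake-venv/bin/python3
PATH="/tmp/path-demo/fake-venv/bin:$PATH" command -v python3
# prints: /tmp/path-demo/fake-venv/bin/python3
```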
PS: The command line might even be an emperor. And the emperor could be naked...
This is definitively not true for macOS.
Who would want that?
"Stay away from my house, you freak!" would be the normal reaction. Unless some serious trust is developed, I would not let people into my house while I sleep.
Also the actual usual reaction would have been more like: "hey it is open source, you can fix anything on your tank yourself"
You need a new module to connect with your other devices, just build it yourself, no big deal!
Ever wonder how Red Hat became a billion-dollar company before it was bought by IBM, and now makes up a huge segment of IBM's revenue stream?
Have you noticed SuSE is still around?
Have you ever speculated on how Canonical keeps its lights on?
Paid support, my naive friend. Linux support is big business and is what keeps the popular distros alive.
Why add this?
Whats the solution then?
We have already given so much in the western world.
As an aside, but I think relevant and you might find it interesting:
A decade or so I discovered Oberon, the last masterwork of the great genius of programming languages Niklaus "Bucky" Wirth. A complete OS, UI and compiler, in about four and a half thousand lines of code.
I have it running in various forms.
I introduced it to the Squeak Smalltalk community, and when I told them what I was looking for:
« a relatively mature, high-performance, small OS underneath, delivering some degree of portability -- something written in a type-safe, memory-managed language, with multiprocessor support and networking and so on already present, so that these things do not have to be implemented in the higher-level system. »
That is how I found Oberon. They told me such a thing did not and could not exist.
I gave them some links and people were amazed.
It seems deeply obscure in, as you say, the West.
But I have found an active community working on it and developing it. It is in Russia.
It may be that in other countries now the target of Western sanctions, we may inadvertently be fostering some very interesting tech developments...
The tank people offer to send someone to look into the car (rsp. tank) but the buyer rejects them from entering their house.
That's significant, because a car is much less private than a house. In the real world, if my car had an issue, it would be perfectly reasonable to give it into the hands of a mechanic, even if I don't know them personally. (And evidently the reputation of the dealership isn't the deciding factor either, otherwise all the independent repair shops wouldn't exist)
On the other hand, I'd be much more wary to let strangers into my house without supervision, because I have far more private and valuable possessions there than in my car.
So the question is whether computers are more like cars or like houses. I'd argue, they sort of blur the line and have definitely moved closer to "house" in the last decades. But it might have been different back then.
I’m not really religious about anything, but I often end up going back to the CLI for a lot of things because it’s just less of an annoyance.
I wrote python-wool as a simple wrapper to python to make that true because it's just easier that way. Direnv can also be configured to do that as well.
"I embraced OS X as soon as it was available and have never looked back. So a lot of 'In the beginning was the command line' is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."
https://slashdot.org/story/04/10/20/1518217/neal-stephenson-...
But people still dredge this quarter century old apocrypha up and use it to pat themselves on the back for being Linux users. "I use a Hole Hawg! I drive a tank! I'm not like those other fellows because I'm a real hacker!"
Talk confidently when you have experience.
When you’re typing a lot, you really don’t want to do a lot of typing for each step. And the shell scripts were for automating some process, not solving a particular problem (you use a programming language for that). The workflow is to have the references ready in case you forgot something.
That brings up the man pages, which can vary in quality but, for most software I've used, tend to be comprehensive. But they assume that you're knowledgeable. If you're not, take some time to read a book about system administration (users, processes, file permissions, ...).
I also primarily use Windows and don't have a dog in the fight you mentioned. I might actually dislike Linux more than OSX, though it has been quite a while since I've seriously used the one-button OS.
And I still think that we can improve. More over, we ought to improve.
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
The cli excels because it is extremely flexible, with far more options available than a set of buttons could ever display. But discoverability rounds down to 0, and there are footguns. It seems like spreadsheet software has found an almost drop in ui that would greatly enhance the cli.
Files are not a good abstraction for events. Signals are broken in many ways. And DBus is both extremely clunky to use and non-portable.
There isn’t a built-in event paradigm similar to how streams and pipes are an integral part of the Unix-style CLI.
But then again, if you're working in AutoCAD, you'd never say "I used to work in CLI only, now I use GUIs more and more".
Clearly they meant GUIs that have CLIs behind, or at least CLI alternatives.
Another example could be qemu and the GUIs that we have nowadays. One final example would be simply drag and dropping files via Finder instead of using cp/mv
I think DEC had one or two. And you could find someone who would meet you somewhere to help you out, it was an exciting time. Also there were lots of install fests for Linux.
Most activity took place on USENET, so getting help was rather easy.
For example, I had asked how I could connect 2 monitors to my 386SX, one controlled by a VGA card, the other via a mono-card, each monitor with a couple of VTs. That was doable with Coherent on install. A day later I got a patch.
Things moved very quickly back then :)
And if you were using AutoCAD in the 80's you can say exactly that you used to use the CLI only!
That complicated series of commands you just ran? Copy and paste them into the Jira ticket so the junior employee who makes half your salary can run them next time.
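One step better than pasting commands into the ticket is pasting them into a script, so the next run is a single command; a sketch (the script name and steps are invented for illustration):

```shell
# Capture the "complicated series of commands" once, as a script:
cat > fix-widget.sh <<'EOF'
#!/bin/sh
set -eu                      # stop at the first failing step
echo "step 1: rebuild the cache"
echo "step 2: restart the service"
EOF
chmod +x fix-widget.sh
./fix-widget.sh              # the junior employee runs just this
```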
Why is that? On the low level everything is a state of electronic cells. Files address those cells in a suitable fashion. Modern programming abstractions such as async/await are very simple, but fail miserably if you need something really complex and efficient.
https://www.opengroup.org/openbrand/register/apple.htm
The GUI is built on top of the Unix foundation and does not stand alone or work without it.
The 'FoxTrot' comic made a big deal about this not long after Mac OS X was released:
I was a kid at the time, but did many people actually buy windows? I know about the ad-thing where the cast of Friends or whatever bought windows 95, but as I recall even back then the OS just came with the device. The only exception was OSX, which was a “Big Deal,” even non-technical people downloaded it.
Anyway, it is funny to see this in retrospect. Nowadays, operating systems have become so commoditized that you can’t even make a business selling them.
I love Linux but his description is quite optimistic.
It's a 3-part problem: available commands, their options, their syntax. Part one would need to capture prompt input before enter is hit, using solutions similar to those found at [1]; perhaps the most useful but least complete one there is the one that uses apropos, so something like `apropos -s 1 '' | sort | grep calc | less`. Similar solutions would be required for parts two and three. The roughest and easiest prototype would probably be two tabs in a split-screen view, which would allow for selection of displayed matches to then modify the prompt creating those matches in the other tab. But Calc-style popups directly attached to your cursor would be more useful still.
[1] https://stackoverflow.com/questions/948008/linux-command-to-...
Meanwhile, Windows has become those cars with two 27" screens as a dashboard: a bad user experience, and full of advertisements.
In broader terms, I do not pay much attention to a specific programming language, even if clearly an OS-as-single-application is tied to a specific programming language. There are many, and there are many factions loving one and hating the others. The point is offering something usable at the user level, like "just type a sexp and execute it", also from an email, because integration also means an immense small-scale diversity and heavy dialogue, and so innovation. With such a model we can keep our supremacy and, more importantly, we can't lose it, because essentially everything floats in a common sea.
The main obstacle to reaching such a goal, I think, is the general culture of the masses. Today most people, many programmers included, think that IT means computers, which is like saying that astronomy is the science of telescopes. Of course computers are like pen and paper, an essential tool, but they are a tool: the purpose of IT is information, and that's not a specific technical task but a broad field involving essentially all disciplines. Until this is clear to everyone, there is little hope that people understand the need, power, and issues of IT; they'll keep looking at the finger pointing at the Moon instead of at the Moon.
The rest came after, even the most basic computer skills came after because to learn we need to be motivated, learning "just because you have to" as a social rule is not productive.
it can be a single make command
Quick outline of the course, in case anyone wants a starting point:
* Introduction
Quick history of Unix
* When you log in
Difference between login and interactive shells
System shell files vs user shell files
.zshenv for environment variables like PATH, EDITOR, and PAGER
.zprofile for login shells, we don't use it
.zshrc for interactive shells
Your login files are scripts, and can have anything in them
* Moving Around
** Where am I?
pwd = "print working directory"
stored in variable $PWD
Confusingly, also called current working directory, so you may see CWD or cwd mentioned
** What is here?
ls
ls -al
ls -alt
. prefix to filenames makes them hidden
. is also the current directory!
.. means the parent directory
file tells you what something is
cat displays a file
code opens it in vscode
** Finding my way around
cd
cd -
dirs | sed -e $'s/ /\\\n/g'
** Getting Help From The Man
man 1 zshbuiltins
manpage sections
** PATH
echo $PATH | sed -e $'s/:/\\\n/g'
zshenv PATH setting
which tells you what will be run
** Environment Variables
env | sort
EDITOR variable
** History
ctrl-r vs up arrow
** Permissions
Making something executable
** Prompts
zsh promptinit
zsh prompt -l
** Pipes and Redirection
Iterate to show how pipes work
cat ~/.zshrc | grep PATH
ls -al > ~/.tmp/ls-output.txt
** Commands
*** BSD vs GNU commands
BSD are supplied by Apple, and Apple often uses old versions
GNU are installed via homebrew, and match those commands available in Linux

Or, more likely: so that you yourself can remember what you did the next time the problem arises. Or your colleague, who is senior, but does not know this part of the codebase or infra well. Heck, you can even write a shell script, automate things and have your productivity increased!
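A minimal sketch of the login files the outline describes (the specific values are illustrative, not recommendations):

```shell
# ~/.zshenv -- read by every zsh; keep it to environment variables
export EDITOR=vim
export PAGER=less
export PATH="$HOME/bin:$PATH"

# ~/.zshrc -- read by interactive shells only: aliases, prompt, history
alias ll='ls -alt'
HISTSIZE=10000
autoload -Uz promptinit && promptinit
```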
These things will be just as true and useful in some communist FOSS context as they are in a capitalist system.
Repeating the same actions in a CLI with no readline is an exercise in frustration, but ... that's not what happens most of the time.
Yep! Windows did indeed come with machines, but the upgrades were always a big seller. I remember when Windows 3.1 hit the shelves and seemed to be everywhere. Same with Windows 95 but that one was a tougher upgrade because of the increased system requirements.
For one thing, UNIX != command line.
In the same vein, Windows NT is not based on DOS anymore, even if it has a command line which resembles (parts of) DOS.
It's a close analogy, because the Comet was actually the next model of Edsel. They just changed the branding. Same with Vista to 7.
Today new OS versions aren't such a big deal, but when Windows 95 came out, and then XP, they were huge, with total interface redesigns.
On the other hand, I don't think people went out of their way as much to buy smaller upgrades like Windows 98.
Screenshots along with text, then use Microsoft Paint to mark up the screen shots. For example, circling the appropriate menu option in a thick red line. Sadly, I do not know how to graphically convey the double-click operation.
It's a time-consuming and error-prone process, best used if the buggy GUI application is abandonware. Otherwise, the new version will have the same bug or non-intuitive process but the GUI will be unrecognizable. Probably because some manager, somewhere, read too many HN articles and decided a user-facing refactor was the most important thing they could possibly do.
Linux and the UNIX derivates are not even cousins. Not related. Not even the same species. They just both look like crabs a la https://en.wikipedia.org/wiki/Carcinisation.
It's kind of ironic that you're using a post from 20 years ago to invalidate an essay from 25 years ago, about an OS that's been substantially dumbed down in the last 10 years.
Bad corporate blood will tell.
Setting aside the "more BSD/Mach than Linux", OS X pressed a lot of the same buttons that BeOS did: a GUI system that let you drop to a Unix CLI (in Be's case, Posix rather than Unix, if we're going to be persnickety), but whose GUI was sufficiently complete that users rarely, if ever, had to use the CLI to get things done. Folks who love the CLI (hi, 99% of HN!) find that attitude baffling and shocking, I'm sure, but a lot of people really don't love noodling with text-based UIs. I have friends who've used the Mac for decades -- and I don't mean just use it for email and web browsing, but use it for serious work that generates the bulk of their income (art, desktop publishing, graphic design, music, A/V editing, etc.) -- who almost never open the Terminal app.
> though it has been quite a while since I've seriously used the one-button OS
Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)
So when Apple started making workstations, I got one. I've been a satisfied customer ever since.
I have no idea whatsoever what dumbing down you're referring to. The way I use macOS has barely changed in the last ten years. In fact, that's a major part of the appeal.
Input maybe, although realizing that instead of typing, you can pipe, is a major conceptual breakthrough when new users are learning to work the command line.
But output? The existence of stdout and stderr as two sources of text with no visible distinction is highly nonobvious and continues to trip me up in practical situations to this very day.
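A minimal sketch of the trap described above: on the terminal the two streams are interleaved with no visible distinction, and only redirection reveals which line came from where.

```shell
# Both lines appear on screen, visually identical:
sh -c 'echo "to stdout"; echo "to stderr" >&2'

# Redirecting each stream separately makes the distinction visible:
sh -c 'echo "to stdout"; echo "to stderr" >&2' >out.txt 2>err.txt
cat out.txt    # contains only the stdout line
cat err.txt    # contains only the stderr line
```

This is also why a pipe silently drops error messages: `cmd | grep foo` only sees stdout, and you need `2>&1` before the pipe to capture both.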
macOS as an operating system has been "completed" for about 7 years. From that point, almost all additions to it have been either focused on interoperation with the iPhone (good), or porting of entire iPhone features directly to Mac (usually very bad).
Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.
And since MacOS 8 before that...
Not anymore. https://justine.lol/ape.html
Long ago:
If the solution to a problem is as simple as “copy and paste this command”, I want to hand that off to a junior employee so I can go do other things that require more creativity.
This also applies to Windows, by the way (except it’s more like 20-30 years ago).
Definitely. A new boxed OS version would often be the only update anyone ever applied to their system. Even if you had Internet access, dial-up speeds and limited disk space meant downloading OS updates was often impractical. Even relatively small updates took forever to download.
There was also the relative cost of a computer. A $2,000 computer in 1995 would be about $4,000 in today's dollars. Buying an OS update was a relatively inexpensive way to upgrade the capabilities of your expensive computer without completely replacing it. Going from some Windows 3.1 release to Windows 95 would have been a nice upgrade in system stability for many people. Certainly not everyone, but many.
It's just that if I'm going to rely on someone else, I'd prefer for that someone else to be one of the people who has developed and applied their own expertise to that problem domain, rather than a statistical model that only actually knows how likely words are to appear in proximity to each other, and has no capacity to validate its probabilistic inferences against the outside world.
People whose first exposure to computing was in the mid-'90s or later seem to have less depth of understanding of the fundamentals and less experience tinkering and using the process of elimination to solve problems. I started encountering people who didn't understand certain fundamental concepts and had no exposure to the CLI, but still had CS degrees and were applying for developer positions, about 15 years ago.
That said, the best late 90s expression of the core advantage of the GUI over the TUI/CLI is that it demands less of the user:
"recognize and click" vs
"remember and type"
That seems very fundamental to me.
I have not seen as succinct an expression of the tradeoffs for V(oice)UIs or L(LM)UIs.
This is markedly different from how it was in the past when they needed people to get up, go to a store, and buy the disc containing the new version of the OS.
Maybe the chat interface does away with the first half of the GUI/CLI schemes, skipping over learning the affordance part of the interface.
I assume if anyone associated with Microsoft compared Vista to anything other than an abject failure, it's because they are - at best - broken or defective people who were involved in the creation of Vista, and therefore not objective and not to be trusted in any way.
Dodge Viper? WTF?
It’s a zsh shell with BSD utils. 99% of my shell setup/tools on Linux just work on macOS. I can easily install the gnu utils if I want 99.9% similarity.
I very happily jump between macOS and Linux, and while the desktop experience is always potentially the best on Linux (IMO nothing compares to hyprland), in practice macOS feels like the most polished Linux distro in existence.
Do people just see, like, some iOS feature and freak out? This viewpoint always seems so reactionary. Whereas in reality, the macOS of the past that you’re pining for is still right there. Hop on a Snow Leopard machine and a Ventura machine and you’ll see that there are far, far more similarities than differences.
Mainly the slow hiding of buttons, options, and menus that used to be easily accessible; they now require holding a function key, re-enabling in settings, or using the terminal to bring them back.
{ select[i]@dropdown:states > click@button:submit }
The fact that we don't have this (yet) does not mean it is not possible. In fact, given that the current darling of tech, LLMs, can 'compute visually' based on tokens (text), it should be clear that any 'representation' can ultimately be encoded in text. So a 'record' feature on GUI 'a' can create an action encoding that can be emailed to your peer looking at GUI 'b' and 'pasted'.
At the time I was an embedded developer at Microsoft and had been a Windows programmer in the mid-'90s. It was pretty clear that there was some Dunning-Kruger going on here. Neal knew enough about tech to be dangerous, but not really enough to be talking with authority.
If you wish Apple supported computers longer, fine. I’d personally disagree because I’ve had wonderful luck with them supporting my old hardware until said hardware was so old that it was time to replace it anyway, but would respect your different opinion. Don’t exaggerate it to make a point though.
In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=37314225 - Aug 2023 (2 comments)
In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=29373944 - Nov 2021 (4 comments)
In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=24998305 - Nov 2020 (64 comments)
In the beginning was the command line (1999) [pdf] - https://news.ycombinator.com/item?id=20684764 - Aug 2019 (50 comments)
In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=16843739 - April 2018 (13 comments)
In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=12469797 - Sept 2016 (54 comments)
In the beginning was the command line - https://news.ycombinator.com/item?id=11385647 - March 2016 (1 comment)
In the Beginning was the Command Line, by Neal Stephenson - https://news.ycombinator.com/item?id=408226 - Dec 2008 (12 comments)
In the beginning was the command line by Neil Stephenson - https://news.ycombinator.com/item?id=95912 - Jan 2008 (5 comments)
In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=47566 - Aug 2007 (2 comments)
(Reposts are fine after a year or so; links to past threads are just to satisfy extra-curious readers. In the case of perennials like this one, it's good to have a new discussion every once in a while so newer user cohorts learn what the classics are.)
The usual interpretation is that scripts glue together system commands that don't otherwise incorporate one another. This is tremendously useful and is by itself a huge advantage over GUIs.
But the second sense is that vast libraries of scripts keep developers bound to their previous UI promises. You don't capriciously change command-line options without raising all kinds of ire. I've seen GNU tools (tar comes to mind) in which options have been "deprecated" for decades, yet are still available. I seem to recall a very small number of cases where options had their senses entirely inverted, to much protest, and ultimately abandonment of those tools. In other cases you've got generations of programs which support backwards-compatible options with precursors: exim and postfix with sendmail, ssh with rsh, alpine with pine, off the top of my head. Developer headspace and script compatibility are valuable.
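A tiny illustration of the first sense of glue: none of these programs knows anything about the others, yet a pipe composes them into a one-off report. (The sample data here is made up; on a real system you'd feed it `/etc/passwd` directly.)

```shell
# Which login shell is most common? Four unrelated tools, one pipeline.
printf 'alice:/bin/bash\nbob:/bin/zsh\ncarol:/bin/bash\n' \
  | cut -d: -f2 \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -1
# top line: "2 /bin/bash" (count first, with leading spaces)
```

And this, of course, is exactly the contract that makes the second sense matter: the moment a pipeline like this lives in a script, `cut`'s `-d` and `uniq`'s `-c` become promises their maintainers can't casually break.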
On that note, whilst I find the *BSD's extreme conservatism annoying myself, I can appreciate that it does lead to increased consistency over time as compared with GNU userland.
Some recent discussion on the border phone search ruling:
There is an implicit superiority in the text which is just as cringey now as it was at the time, but I think it's still a good analogy about the different preferences and relationships different people have with their computers.
I'm not sure about the analogy though, they might have been thinking of later Viper versions where the complaints would be more about cost, gas mileage, or general impracticality for daily use.
<https://en.wikipedia.org/wiki/Expect>
Much of modern operating systems is events-based and relies on polling for specific actions, devices connecting / disconnecting, etc. Linux device management is based on this, for example. There are numerous scripts which fire as a service is enabled or disabled (networking "ifup" and "ifdown" scripts, for example), or services are started / stopped. Systemd extends these capabilities markedly.
And of course there's a whole slew of conditional processing, beginning with init scripts, [], test, &&, ||, and specific conditional logic within shells, scripting, and programming languages.
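The building blocks named above fit in a few lines. `[` is itself just another command (a synonym for `test`), and its exit status is what drives `&&`, `||`, and `if`:

```shell
# && runs the next command only on success; || only on failure.
[ -d /tmp ] && echo "/tmp exists"
[ -d /no/such/dir ] || echo "missing, so this runs"

# The same exit-status logic, spelled out with test(1) and if:
if test -e /etc/passwd; then
    echo "found /etc/passwd"
else
    echo "no passwd file here"
fi
```

The uniformity is the point: a service-up check, a device-presence check, and an init-script guard all reduce to "run a command, branch on its exit status".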
Underneath it was just Windows, but the interface ruined it.
I'm typing this on a 12-year-old MacBook Pro running Debian whose hardware is perfectly fine, but which hasn't been supported by Apple in years.
FWIW, Debian supports it fine, though NVidia recently dropped support for the GPU in their Linux drivers.
I'm going to miss it when it dies, too. Plastic Lenovos just can't compare.
However, and unfortunately, I feel your last statement is spot tf on! Our only hope, I guess, is that they have incurred enough tech debt to be unable to enshittify themselves.
For those not in the know, Apple is an OG hacker company; their first product was literally a blue box! Why this matters, why the GP is correct, and what Stephenson was getting at with the Batmobile analogy, is that traditionally, if hackers built something consumer-facing, they couldn't help themselves but bake in the Easter eggs.
The trouble with the original MacOS was that the underlying OS was a cram job to fit into 128KB of RAM, plus a ROM. It didn't even have a CPU dispatcher, let alone memory protection. So it scaled up badly. That was supposed to be fixed in MacOS 8, "Copland", which actually made it out to some developers. But Copland was killed so that Apple could hire Steve Jobs, for which Apple had to bail out the NeXT failure.
Want to know how to copy and paste quickly? It's right there in the menu where you found the action. Don't know it yet? Alt+E (the E is underlined in Edit), then some other key to jump to the action in the list; or, now that you've seen the list, abort the sequence with Esc and start memorizing the new shortcuts, Ctrl+C and Ctrl+V.
Of course, this belief probably had no downsides or negative consequences, other than hurting my brain, which they probably did not regard as a significant problem.
https://youtu.be/WdtK9Sj8ADw?list=PLoTU9_iCGa6go3vsnxlNn1wZS...
> We invite specialists to take part in a professional discussion to identify the causes of Oracle DBMS failure
> System logs and a more complete description of the situation will be posted in the blog. To participate in the discussion, you must fill out the registration form and wait for an invitation to your email.
https://web.archive.org/web/20120716225650/http://www.sbrf.r...
Doesn't work so well for GUIs, which was sort of the original point.
Neal said the essay was quickly obsolete, especially in regards to Mac, but I'll always remember this reference about hermetically sealed Apple products. To this day, Apple doesn't want anyone to know how their products work, or how to fix them, to the point where upgrading or expanding internal hardware is mostly impossible.
I feel like it could be done if it was really a goal from day one, and there were things like "record this set of actions as a script" built into the toolkit. Even Applescript was still an afterthought, I think, albeit a very well supported one.
In the meantime, given the comprehensive failure of UI toolkits to support this style, it is completely fair for people to act and speak as if it doesn't exist.
If someone makes a GUI that does specifically what you want, you're in luck. If not, you're S.O.L.
With CLIs, you can almost always find some way to string things together to get what it is you want.
The hard case is working in domains which aren't inherently textual: graphics, audio, video, etc. Even there, you can often get much more accomplished from the terminal than you'd think, though setting up the process can be challenging at times. Not so useful for one-offs, but invaluable for repetitive processes.
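For the repetitive-process case, the shell mechanics are the same whatever tool does the real work. A sketch, with `cp` standing in for a media tool such as ffmpeg or ImageMagick's convert:

```shell
mkdir -p demo && cd demo
touch track1.wav track2.wav

# Process every .wav file; ${f%.wav} strips the suffix.
# Swap cp for e.g. `ffmpeg -i "$f" "${f%.wav}.mp3"` for real transcoding.
for f in *.wav; do
    cp "$f" "${f%.wav}.processed"
done

ls    # now lists track1.processed and track2.processed alongside the originals
```

One-off edits to a single image are faster in a GUI; running the same transform over five hundred files is where the loop wins.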
So, "zero shared lineage" seems like a very strong statement.
Many of those ideas came from NeXT, so more like 30 years ago.
https://wiki.archlinux.org/title/Arch_Linux
> Arch developers remain unpaid, part-time volunteers, and there are no prospects for monetizing Arch Linux
I was amazed.
As for Apple, their openness comes and goes. The Apple II was rather open, the early Macintosh was not. Macintosh slowly started opening up with early NuBus machines through early Mac OS X. Since then they seem to be closing things up again. Sometimes it was for legitimate reasons (things had to be tightened up for security). Sometimes it was for "business" reasons (the excessively tight control over third-party applications for iOS and the incredible barriers to repair).
As for the author's claims about their workings being a mystery, there wasn't a huge difference between the Macintosh and other platforms. On the software level: you could examine it at will. At the hardware level, nearly everyone started using custom chips at the same time. The big difference would have been IBM compatibles, where the chipsets were the functional equivalent of custom chips yet were typically better documented simply because multiple hardware and operating system vendors needed to support them. Even then, by 1999, the number of developers who even had access to that documentation was limited. The days of DOS, where every application developer had to roll their own hardware support were long past. Open source developers of that era were making a huge fuss over the access to documentation to support hardware beyond the most trivial level.
- The settings app is now positively atrocious, "because iPhone"
- SIP is an absolute pox to deal with.
- "Which version of Python will we invoke today" has become a fabulous game with multiple package managers in the running
- AppCompat games.
- Continued neglect for iTunes (which is now a TV player with a "if we must also provide music, fine" segment added - but it still thinks it should be a default client for audio files)
- iCloud wedging itself in wherever it can
Yes, all of those can be overcome. That's because the bones are still good, but anything that Apple has hung off those since Tim Cook is at best value neutral, and usually adds a little bit more drag for every new thing.
Don't get me wrong, I still use it - because it's still decent enough - but there's definitely a trajectory happening.
Is locking down the System folder any more problematic than app armor, and any less useful for system integrity? Putting everything from brew under /opt follows UNIX conventions perfectly fine, definitely more than using snaps in Ubuntu for basic command line utilities. And installing whatever you want on macOS is just as easy as it is on Ubuntu.
This sort of complaint just gets so boring and detached from reality, and I’m not saying that you don’t use macOS but it reads like something from someone who couldn’t possibly be using it day-to-day. For me it’s a great compromise in terms of creating an operating system where I can do anything that I would do in Linux with just as much ease if not more, but also not have to provide tech support on for my elderly parents.
[1] https://github.com/systemd/systemd/issues?q=is%3Aissue+is%3A...
[2] https://www.reddit.com/r/linux/comments/18kh1r5/im_shocked_t...
To do that in a GUI-centric fashion, the only tools we actually have are textual commands that direct GUI action. Essentially it's a middleman step: if we are already writing our AppleScript, say, we might as well just invoke directly the actual commands that our AppleScript is trying to trigger through the GUI system.
I have used UNIX/Linux on a daily basis for over 30 years, and OSX/MacOS daily for over 15 years. I know how UNIX systems work and where things traditionally are located. And until a few years ago MacOS was a reasonable UNIX that could be used more or less like a friendly UNIX system -- but it is becoming increasingly less so.
On the command line, the context is obvious
Hardly. If you're lucky, you might know what your current directory is.
More often than not, at any particular point, your command line is paused in the middle of some likely ad hoc multi-step process. A process with a bunch of state stored as opaque blobs of data scattered across the file system. Exacerbated all the more in my case, as those files are likely cleverly named x, x1, x2.
Modern systems benefit from things like command history, scroll back buffers, and similar constructs that can be leveraged to help you, as the user, restore the current context. But for a very long time, many simply returned to a $ and a soulless, cold, blinking cursor callously expecting that you recall you know where you are and what you’re doing.
The tools are there to help you dig and perhaps restore the current context (current directory, latest files, etc.) but that’s a far cry from “obvious”. Lots of folks return, blow their internal call stack, and just start over when they come back from lunch (if practical).
- Windows user: hell.
- Ubuntu/Arch/any distro without full offline mirrors: hell too.
I do not advise skimming.
I've been a full-time tech journalist for two and a half years now (I was one in the 1990s as well, but the 21st century is very different), and I find the majority of readers who angrily disagree with my articles did not in fact understand the article, because they tried to skim it and didn't get the gist.
(In a previous job I was a TESOL/TEFL English teacher. "Skimming for gist" is a skill we test for, and many people don't have it and don't know they don't have it. I am not accusing you here -- but you did mention your own English in negative terms.
For example, I was at a talk at FOSDEM in February -- https://fosdem.sojourner.rocks/2024/event/3113 -- and it seemed to me that most of the audience angrily arguing about what the GPL meant and implied had not really genuinely read and understood all 6 pages of the GPL.)
Executive summary of Oberon:
https://ignorethecode.net/blog/2009/04/22/oberon/
13 page academic assessment, but very readable and accessible:
"Oberon – The Overlooked Jewel" https://dcreager.net/remarkable/Franz2000.pdf
If you don't want SIP, it will take you a few minutes to reboot and switch it off permanently (or perhaps until the next OS upgrade). This is really the only one in the list which has to be "overcome", and personally I think that SIP enabled by default is the right choice. Anyone who needs SIP disabled can work out how to do that quickly - but it is years since I've had a reason to do it even temporarily, so I suspect the audience for this is small.
Multiple package managers and Python: that sounds like a problem caused by running multiple third party package managers.
If you want games, x86 or console is the preferred choice. An issue for some, decidedly not for others. I'd much rather have the Mx processor than better games support.
iTunes - I can't comment, I don't use it.
iCloud - perfectly possible to run without any use of iCloud, and I did for many years. I use it for sync for couple of third party apps, and it's nice to have that as an available platform. It doesn't force its way in, and the apps that I use usually support other platforms as well.
A system should be heavily locked down and secure by default unless you really know what you are doing and choose to manually override that.
Modern MacOS features add an incredible level of security- it won't run non-signed apps unless you know what you're doing and override it. Even signed apps can only access parts of the filesystem you approve them to. These things are not a hassle to override, and basically make it impossible for hostile software to do things that you don't want it to.
I agree there is some conceptual inconsistency- which I see on almost all OSs nowadays, but Windows 8 being the most egregious example, where you are mixing smartphone and traditional desktop interface elements in a confusing way.
That's very right, but modern life is complicated, so studying something accurately demands much time, and quickly seeing the concepts might help. And, well, the concept of a textual UI is definitely not alien to me, since my desktop is EXWM, with almost all my digital life in org-mode and org-roam-managed notes. It's still very different from the Oberon (or Plan 9) desktop, but the textual concept, and org-mode links that can execute sexps on click (a feature I use much, for instance to link specific mail/threads in notes and create interactive presentations), are similar. The 2D "spaced" desktop concept is something I saw in the far past, in Sun's Looking Glass LG3D concept desktop https://en.wikipedia.org/wiki/Project_Looking_Glass. And yes, while the above help, they still can't tell me what's inside the package, meaning what's behind the UI concept and the language grammar. I still miss the architecture.
However, I suppose from what I've seen so far that it's essentially not really usable in real life, so it's a nice-to-know project but it stops there, like Lisp Machine Genera or Plan 9. Emacs at least can be used for real today. It's sad the IT industry has pushed what I call the glorification of ignorance, but beyond preserving knowledge for a more civilized world I think there is little we can do.
So far, over the last decades, most of the old valid ideas have been accepted anyway; for instance, widget-based UIs have essentially failed and are more and more substituted by WebUIs, which are read-only DocUIs, or NotebookUIs, which are limited 2D CLIs, something close to a DocUI. They are mostly text-based as well. So, well, maybe in 10+ years we will finally have something like an Oberon or LispM desktop, surely returned with many anti-user aspects but still offering something of the past glory, and those memories will help to keep correcting the aim and reducing the loss. Anyway, until people realize the substantial importance and role of IT, there is little hope for a more civilized era...
No, it's not viable as a general-purpose OS these days. At one time it was and was deployed to non-technical staff inside ETH.
The last development in the line, not from Wirth himself, has a zooming GUI, resizable overlapping windows, SMP, a TCP/IP stack, an email client and a very basic HTTP only web browser. It is closer than you might expect.
I believe the core OS is on the order of 8000 LOC.
You may enjoy my FOSDEM talks if you're interested in this kind of thing.
I did one involving rebooting the local OS stack based on Oberon and Smalltalk, or maybe Newspeak:
https://archive.fosdem.org/2021/schedule/event/new_type_of_c...
I turned it into an article recently:
https://www.theregister.com/2024/02/26/starting_over_rebooti...
And this year a more Linux centric one based around 9front:
https://fosdem.org/2024/schedule/event/fosdem-2024-3095-one-...
That became an article series:
As an aside, I think a great foundation for a GUI application is a "server" API that the GUI interacts with. You can automate a lot of testing this way and you can give power users the ability to automate workflows. Web apps are already built this way, but if you're making a standalone app you even have the privilege of making a very chatty API, which makes testing even easier.
For starters, it is much less annoying from a security/notification standpoint; you can tell it to fuck off and let you do your thing if you know what you are doing.
macOS isn't too bad yet, but it is clearly lagging behind. Apple is unwilling to meaningfully improve some parts and seems to refuse to sherlock some apps because it clearly goes against their business interests: they make more money earning the commission on additional software sales from the App Store, a clear conflict of interest. They got complacent, just like Valve with all the money from running its marketplace.
For the most part the macOS user is of the religious-zealot type, and they barely know how to do the basics, far worse than your average seasoned Windows user, even though in principle macOS should be easier to handle (in practice it's not exactly true, but still...).
People here who seem to think otherwise really live in the reality distortion field, and it seems to be linked to the mythical Silicon Valley "hacker". At first I drank the kool-aid on that definition, but it actually seems pretty disrespectful to "real" hackers; but whatever, I guess.
The openness and freedom to modify it like an open UNIX was a major selling point; losing all that for "security" features that mostly appeal to the corporate crowd is not great. Those features also need to be proven useful, because as far as I'm concerned it's all theory; in practice I think they are irrelevant.
The notification system is as annoying and dumb as in iOS, and the nonstop "security" notifications and password prompts are just a way to sell you on the usefulness of biometrics; which Apple, like the big morons they are, didn't implement in a FaceID way in the place where it made the most sense to begin with: laptops/desktops. Oh, but they have a "nice", totally not useless notch.
Many of the modern apps are ports of their iOS versions, which makes them feel almost as bad as webapps (worse, if we are talking about webapps on Windows), and they are in general lacking in many ways, both from a feature and a UI standpoint.
Apple Music is a joke of a replacement for iTunes, and I could go on and on.
The core of the system may not have changed that much (well, except that your data is less and less accessible, forcibly stored in their crappy obscure iCloud folders/DBs, rarely with decent export functions), but as the article hinted very well, you don't really buy an OS, just like nobody really buys solely an engine. A great engine is cool and all, but you need a good car around it to make it valuable, and it's exactly the same for an OS. It used to be that macOS was a good engine with a great car around it, in the form of free native apps that shipped with it or third-party ones. Nowadays, unless you really need the benefits of design/video apps very optimized for Apple platforms, it increasingly is not a great car.
Apps around the system aren't too bad but they are very meh, especially for the price you pay for the privilege (and the obsolescence problem already mentioned above).
It's not really that macOS has regressed a lot (although it has in some ways during the iOSification process), but also that it didn't improve a whole lot, while prices and other penalty factors increased a lot.
But I doubt you can see the light, you probably are too far in your faith.
An idea already implemented to varying degrees of success in places like BeOS, Haiku (operating system), and to a much lesser extent, AppleScript! You could also throw in the COM interop of Windows and OS/2.
I have less than 50 hours of use on my Windows 11 machine, a midgrade Lenovo P358 rig I bought renewed because it had plenty of memory and an Nvidia T1000 card. Yet it taught me that the test of an operating system is how quickly you can navigate around, and how well it can find things given only clues. Windows 11 is just snappier, quicker, than the latest macOS running on a new M3 Mac.
___
[1]: Ironically, the heavy state subsidies for buying an electric car are, like anything else in Norway, financed by... Oil money.
He returns to it rather explicitly in _Fall; or, Dodge in Hell_, with the odyssey through the "Facebooked" wastelands of -- where was it, Idaho / Montana / the Dakotas? Something like that -- where the MAGAfied, barely-literate natives hound the scientist/tech-type heroes.
There is some software that I find nice and convenient in macOS but it has gotten really hard to justify the price of the hardware considering the downsides.