Later on, I was able to do some computational work on an Altix 3700 with 256 sockets and 512G of RAM spread over four full-height cabinets (with the nest of NUMAlink cables at the back), at the time running SuSE Linux. It was wild seeing all 256 sockets printed out by a cat /proc/cpuinfo. Now the same capabilities are available in a 4U machine.
The corporate lineage story is just as interesting as the hardware they made. Acquisition, spinoff, acquisition, rename, acquisition, shutter; now perhaps a few books, binders, and memories among the few remaining personnel at HPE are all that's left (via Cray, via Tera, via SGI, via Cray Research).
RIP SGI
I remember the colors as being very different from the photos, though.
The Personal Iris was a deep purplish-brown, and the Indigo was ... indigo.
Jim Clark sounds like my kinda guy. I made a hash of my teenage years and barely squeaked in with a GED, myself. It has all worked out OK in the end, though.
One of the neatest things was that they let us (Trilogy/pcOrder) put together a sand volleyball team to compete in their company intramurals.
Their cafeteria was also top notch.
Similarly, it feels like Silicon Graphics is a case where they really should have become more standard. Now, unlike Amiga, they were too expensive to catch on with regular consumers, but I feel like they should have become and stayed the "standard" for workstation computers.
Irix was a really cool OS, and 4Dwm was pretty nice to use and play with. It makes me sad that they were beaten by Apple.
Incredible, though, how the relatively cheaper Windows NT machines and 3dfx cards and graphics software just killed them. I was a little sad when I wandered around the campus of an employer in Mountain View and noticed the fading sign that had what was left of the SGI logo.
I think it's a combination of two things: the skillset/culture needed to create a paradigm shift isn't the same one needed to compete with others on a playing field you built, and complacency. It happens over and over. We saw it happen with RIM, and we're watching it happen right now with Prusa Research.
I think you highlighted very correctly there, though, why SGI lost. It turned out there were cheaper options, which while not on par with SGI workstations initially, just improved at a faster rate than SGI and eventually ended up with a much better cost/functionality profile. I feel like SGI just bet wrong. The article talks about how they acquired Cray, which were originally these awesome supercomputers. But it turned out supercomputers essentially got replaced by giant networks of much lower cost PCs.
SGI's hardware was cutting-edge and exotic. IRIX was killer (sorry Solaris). Cray was a subdivision. My coworkers used emacs, too. They put an O2 on my desk!
The dream didn't last long. Major layoffs hit just a few months after I started full time. I wrote about the experience here: https://davepeck.org/2009/02/11/the-luckiest-bad-luck/
I co-oped for SGI onsite in sales/marketing/support for a major ISP of the day back in the late 90s. The buzz around the office was that the company (at this point experimenting with overpriced Windows NT boxes and generic Linux servers) was experiencing a massive brain drain to some brand-new startup that was going to make something called a "GeForce" card for cheap PCs, one that would avoid the pitfalls of the then-popular Voodoo cards. Apparently the engineers were unhappy with the direction the company was taking under the new leadership and thought that there was still an interest in graphics acceleration.
As an outsider--cause I didn't live and work in California--this was the go-go atmosphere of such companies back then where they thought they could do no wrong. And the after work parties were wild (how the heck do you break off half a toilet bowl?).
One of the buildings had plastic over the windows cause that's where they were working on the plugin GL card for the PC. (Ssh! No one's supposed to know that!)
Being the first system engineer in St Louis, my eyes lit up when my manager told me he had ordered a 16-core machine for my office--just for me!
I was hired as a video expert. The company re-org'ed and my new boss decided he needed a Fortran expert so that was the end of my job with SGI.
Along the same lines, there is an alternate timeline where the Sharp X68000 took over the world: https://www.youtube.com/watch?v=OepeiBF5Jnk
See the dominance of Threadripper in workstations, which is built on top of the mainstream desktop and server parts bin. Or look at the Epyc-based supercomputers, rumored to be the only supercomputers to turn a net profit for the suppliers, thanks to leveraging a lot of existing IP.
I guess it's just kind of impossible to predict the future. I don't think it's an incompetent decision to try and focus entirely on the workstation world; there are lots of businesses that make no attempt to market to consumers, and only market to large companies/organizations, since the way budgeting works with big companies is sort of categorically different than consumer budgets.
But you're absolutely right. Apple and Windows computers just kept getting better and better, faster and faster, and cheaper and cheaper, as did 3D modeling and video editing software for them. I mean, hell, as a 12 year old kid in 2003, I had both Lightwave 3D (student license) and Screenblast Movie Studio (now Vegas) running on my cheap, low-spec desktop computer, and it was running fast enough to be useful (at least for standard definition).
In this case, I think the market "chose right" - and the reason the cheaper options won is that they were just better for the customer: better upgradability, better compatibility, and better competition among companies inside the ecosystems.
One of the most egregious things I point to when discussing SGI/Sun is how they were both so incredibly resistant to something as simple as the ATX/EATX standard for motherboard form factors. They just had to push their own form factors (which could vary widely from product to product) and allowed almost zero interoperability. This is just one small example but the attitude permeated both companies to the extent that it basically killed them.
They leapt out ahead of the competition with an advanced OS, purpose-built for graphics and sound in a way that PCs and Macs weren't.
Which was great. But they weren't really better than the competition. They were just doing something the competition wasn't. And when the competition actually started doing those things they got eclipsed in a hurry.
I wonder if Tesla will suffer the same fate. They were obviously around a decade ahead of the established players when it came to electric cars. But once the other established players actually got serious about electric cars, Tesla largely stopped being special, and upstarts like Lucid and Rivian are neck and neck with them (in terms of compelling products, not sales) as well.
At the time. A brief moment in time, and then they had no path forward and were rapidly steamrolled. Nothing was "chosen wrong" in this aspect.
What smaller businesses are using will tend to be what takes over in the future, just due to natural processes. When smaller businesses grow, they would generally rather fund the concurrent growth of existing vendors they like using than switch to the established "industrial-grade" vendor.
At the same time, larger organizations that can afford to start with the industrial-grade vendors are only as loyal as they are locked in.
Amiga was only better 1985-1988.
I still have my original Amiga and A2000. I was an Amiga user for a decade. They were very good. I was platform agnostic, caring only to get work done as quickly and easily as possible so I was also an early Macintosh user as well as Sun and PA-RISC. And yes, I still have all of those dinosaurs too.
By 1987 PC and Mac caught up and never looked back.
But by 1988 the PS/2 with a 386 and VGA was out and the A2000 was shipping with a 7MHz 68000 and ECS.
By 1990 the 486s were on the market and Macs were shipping with faster 030s and could be equipped with NuBUS graphics cards that made Amiga graphics modes look like decelerated CGA.
After the A2000 the writing was on the wall.
Note: my perspective is of someone who has always used computers to do work, with ALMOST no care for video games so all of the blitter magic of Amiga was irrelevant to me. That being said when DOOM came out I bought a PC and rarely used my Amigas again.
What I can confidently assert is that I upgraded my A2000 many times and ran into the absolute configuration nightmare that is the Amiga architecture and the problems with grafting upgrades onto a complex system with multiple tiers of RAM and close OS integration with custom chips.
One more bit of heresy is that I always considered Sun's platform to be superior to SGI's.
As much as I loved my O2 (my first professional computer), it was underpowered for the time for anything other than texture manipulation. The closed-source nature of that time period and the hardware sales motion meant that you were paying through the nose for compilers on top of already very expensive hardware. The Cray-linked Origin 200s ran the Netscape web server with ease, but that's a lot of hardware in a time period when everything went out of date very quickly (we donated ours!). Irix still looks better than the new Mac OS UIs IMO, but losing Motif is a small price to pay for far cheaper access to SDKs. Also, Irix was hilariously insecure, due in part to its closed-source nature. https://insecure.org/sploits_irix.html
This means there are products out there today with futuristic features that will later be seen as requirements for everything going forward, while right now those features are niche elements of some product.
The Amiga was a fantastic device but not a general purpose device. Lots of things are fantastic at a niche but not general, and those almost always fail.
Is this also the "worse is better" truism?
I think a more useful explanation is that people rate the value of avoiding vendor lockin extraordinarily high, to the extent that people will happily pick worse technology if there's at least two competing vendors to choose from. The IBM PCs were not good, but for convoluted legal reasons related to screwups by IBM their tech became a competitive ecosystem. Bad for IBM, good for everyone else. Their competitors did not make that "mistake" and so became less preferred.
Microsoft won for a while despite being single vendor because the alternative was UNIX, which was at least sorta multi-vendor at the OS level, except that portability between UNIXen was ropey at best in the 90s and of course you traded software lockin for hardware lockin; not really an improvement. Combined with the much more expensive hardware, lack of gaming and terrible UI toolkits (of which Microsoft was the undisputed master in the 90s) and then later Linux, and that was goodbye to them.
Of course after a decade of the Windows monopoly everyone was looking for a way out and settled on abusing an interactive document format, as it was the nearest thing lying around that was a non-Microsoft specific way to display UI. And browsers were also a competitive ecosystem so a double win. HTML based UIs totally sucked for the end users, but .... multi-vendor is worth more than nice UI, so, it wins.
See also how Android wiped out every other mobile OS except iOS (nobody cares much about lockin for mobile apps, the value of them is just not high enough).
https://en.wikipedia.org/wiki/Silicon_Graphics_International
https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
https://vizworld.com/2009/04/what-led-to-the-fall-of-sgi-cha...
https://vizworld.com/2009/05/what-led-to-the-fall-of-sgi-epi...
Driving to a Phish show at Shoreline, we passed the low-slung office buildings of SGI which seemed like the sexiest place to work. When I graduated, I thought I was "too dumb in CS" to get a job in Mountain View and went to grad school in biophysics instead.
By the time I was a few years into grad school, I worked in a computer graphics lab outfitted with Reality Monsters and Octanes and other high end SGIs (when you maxxed out an SGI's graphics and RAM, they were really fast). I was porting molecular graphics code to Linux using Mesa (much to the derision of the SGI fans in the lab). When we got a FireGL2 card it had a linux driver and could do reasonable molecular graphics in real time and the SGI folks looked real scared (especially because the SGI Visual Workstation had just come out and was a very expensive turkey).
Less than a decade after that I was working in those very buildings for Google. Google took over SGI's old HQ (Jeff Dean told me there was a period where Google and SGI overlapped in the Googleplex, and the SGI folks looked very sad as they paid for their lunches while the Googlers got free food). There was still plenty of SGI signage strewn about. And now Google has gone dumb and also built their own HQ next door (note the correlation between large SV companies building overly fancy HQs and then going out of business).
Such is the cycle of sexy tech.
There's other stuff too; they had better color graphics in the 80s, while DOS was still dealing with CGA and EGA, and decent sound hardware. Even by 1990, the Video Toaster was released, well before it got any port to DOS.
[1] I'm sure it got better, my first exposure to it was System 7 and that thing was an unholy mess. I didn't touch macOS again until OS X.
Oh indubitably! I don't think even the most committed Amiga fan, even the ones that speculate about alternate histories, would deny that at all.
The thing is, though, that only happened because Commodore essentially decided that, since it had so much of a head start, it could just rest on its laurels and not really innovate or improve anything substantially, instead of constantly pushing forward like all of its competitors did. Eventually, the linear or even exponential improvement curves of the other hardware manufacturers outpaced its essentially flat one. So it doesn't seem like IBM PCs, and eventually even Macs, outpacing the power of Amiga hardware was inevitable or inherent from the start.
If they had instead continued to push their lead — actually stuck with the advanced Amiga chipset they were working on before it was canceled and replaced with ECS, for instance — I can certainly see them keeping up with other hardware, eventually transitioning from 2D acceleration chips to 3D acceleration chips when that shift happened in the console world, perhaps even making the Amiga line the first workstation line to have GPUs, further cementing their lead while maintaining everything that made Amiga great.
Speculating even further: as we are currently seeing with the Apple M-series, a computer architecture composed of a ton of custom-made, special-purpose chips is actually an extremely effective way of doing things. What if Amiga still existed in this day and age and had a head start in that direction, with a platform that had a history of being extremely open, well documented, and extensible being the first to do this kind of architecture, instead of Apple?
Of course there may have been fundamental technical flaws with the Amiga approach that made it unable to keep up with other hardware even if Commodore had had the will; I have seen some decent arguments to that effect, namely that since it was using custom vendor-specific hardware instead of commodity hardware used by everyone, it couldn't take advantage of cross-vendor compatibility like IBM PCs could, and also couldn't take advantage of economies of scale like Intel could. But who knows!
One day, someone wheeled this approximately 3x3-foot box to my door and asked me if I wanted it. It was an SGI Onyx with a giant monitor sitting on top, with a keyboard and mouse.
I plugged it in and it sounded like an airplane taking off. It immediately heated up my entire tiny office. It was the 4th Unix I had ever played with (Ultrix, NeXT and A/UX were the previous ones). It had some cool games on it, but beyond that, at the time, I had no use for it because A/UX on my Quadra 950 was so much more fun to play with.
I don't even think I ever opened it up to look at it. I don't know what I was thinking. lol.
After realizing it did not have much going for it, I ended up just turning it on when the office was cold and using it as a foot rest.
Oh yea, found a video... https://www.youtube.com/watch?v=Bo3lUw9GUJA
Those buildings represented that change to me. I can remember coming to concerts at the Shoreline in the 90s and looking at those Silicon Graphics buildings: they looked so cool, and they represented the cutting edge of technology (at the time). And yet...it all disappeared.
Same goes for the Sun campus which is where Meta/Facebook is now. Famously, the Facebook entrance sign is literally the same old Sun sign, just turned around! [0]
So I always cautioned co-workers: this too, shall pass. Even Google.
[0] https://www.businessinsider.com/why-suns-logo-is-on-the-back...
When these RISC-based workstations were initially released their performance, especially at graphics, was well beyond what a PC could do, and justified their high prices. A "workstation" was in a class by itself, and helped establish the RISC mystique.
However, eventually Intel caught up with the performance, at a lower price, and that was pretty much the end. Sun lived on for a while based on their OS and software ecosystem, but eventually that was not enough especially with the advent of Linux, GCC, etc, as a free alternative.
How is that in any way different from Apple today, with its ARM SoCs, soldered SSDs, and an OS that requires "entitlements" from Apple to "unlock" features and develop on?
When I was in high school we had a lab full of SGI machines. They also never got used. Hundreds of thousands of dollars of computing equipment, and probably that much again in software licenses (at the commercial rate), just sitting there doing nothing. It was heartbreaking.
On a happy note, the SGI bus (a semi-trailer full of SGI machines demoing their capabilities) came to school one time. As a teenage nerd, getting to play with a refrigerator-sized Onyx2 was a good day.
But I think you're broadly right.
[1] Yes I know about OpenFOAM, I know I could use that if I really wanted.
So 20 years in, the current batch of senior devs will be retiring, and the current noobies will have become senior devs.
*Whatever language is easy to learn today will be a big deal in 20 years*
That's how PHP, Python, and JavaScript won. Since JavaScript got so much money poured on it to make it fast, secure, easy, with a big ecosystem, I say JS (or at least TS) will still be a big deal in 20 years.
The latest batch of languages know this, and that's why there are no big minimal languages. Rust comes with a good package manager, unit tester, linter, self-updater, etc., because a language with friction for noobies will simply die off.
One might ask how we got stuck with the languages of script kiddies and custom animated mouse cursors for websites. There's no other way it could turn out, that's just how people learn languages.
For Fortran? My memory is hazy but at NASA NAS a bunch of us were using gcc/g++ starting ~1990. g++ was... an adventure. Building my own (fine!) compiler for free(!) got me hooked on OSS to the point that when Linux/FreeBSD launched I jumped in as fast as I could.
I really loved my various SGI boxen. Magical times. I was a NASA "manager" so I had the Macintosh "manager" interface box, a problem I solved by keeping it turned off.
There was a window in the mid-90s where it would have been possible for SGI to develop a PC 3D accelerator for the consumer market using their GE technology, but nobody in the C-Suite had the stomach to make a new product that would undercut the enormous profit margins on their core product. It's the classic corporate trap. Missing out on the next big thing because you can't see past next quarter's numbers. Imagine basically an N64 on a PCI card for $150 in 1996. The launch versions could be bundled with a fully accelerated version of Quake. The market would have exploded.
In 91 I was a dedicated Atari ST user convinced of the superiority of the 68k architecture, running a UUCP node off my hacked up ST. By the end of 92 I had a grey-box 486 running early releases of Linux and that was that. I used to fantasize over the photos and screenshots of workstations in the pages of UnixWorld and similar magazines... But then I could just dress my cheap 486 up to act like one and it was great.
> 80s while DOS was still dealing with CGA and EGA, and decent sound hardware.
And then the 80s ended. What point did I make that you are contradicting?
> Even by 1990, the video toaster was released,
And if you wanted to do CAD? Would you use an Amiga? Probably not. What about desktop publishing? Pointing out that Amiga had carved out a niche (in video editing) when that was the norm back in those days doesn't make any strong comment about the long term superiority or viability of the platform.
Also, I don't buy into the idea that just because a company had something "superior" for a short period of time with no further company direction that they didn't lose fair and square. That Amiga had something cool in the 80s but didn't or couldn't evolve isn't because the market "chose wrong". Commodore as a company was such a piece of shit it made Apple of the 80s look well run. Suffering a few more years with the occasional bomb on System 7 was not a market failure.
> Macs had bizarre cooperative multitasking
What was bizarre about it, compared to any other cooperative multitasking system of the time? Also you seem to be fixated on preemptive multitasking to the neglect of things like memory protection.
I wonder if the uni is so locked down now that students can't sit in the lab all night.
Being a bit pragmatic about getting my actual thesis done, I discovered that, all of a sudden, there were a lot more resources available on one of the (older) Sun servers.
It saved me days if not weeks.
Absolutely, they could have been where Nvidia is now!
I'm on board for this project?
I started with Unix on a Personal IRIS as an undergrad working in a physics lab which used it for imaging capture and analysis. I was the nominal sys admin, with one semester of Minix under my belt and just enough to be dangerous. (I once removed /bin/cc because I thought it was possible to undelete, like on DOS. I had to ask around the meteorology department for a restore tape.)
The summer before grad school I got a job at the local supercomputing center to work on a parallelization of CHARMm, using PVM. I developed it on that PI, and on a NeXT. That's also when I learned about people at my future grad school working on VR for molecular visualization, in a 1992 CACM article. So when I started looking for an advisor, that's the lab I chose, and I became the junior co-author and eventual lead developer of VMD.
With a Crimson as my desktop machine, a lab full of SGIs and NeXTs, and the CAVE VR setup elsewhere in the building. Heady times.
I visited SGI in 1995 or so, on holiday, thinking that would be a great place to work. They even had an Inventor plugin for molecular visualization, so I thought it would be a good lead. I emailed and got invited to visit, where the host kindly told me that they were not going to do more in molecular visualization because they wanted to provide the hardware everyone uses, and not compete in that software space.
In the early 1990s SGIs dominated molecular modeling (replacing Evans & Sutherland), so naturally the related tools, like molecular dynamics codes, also ran on them. But we started migrating to distributed computing, where it didn't make sense to have 16 expensive SGIs, leaving them more as the head node, which, as you pointed out, was soon able to run just fine on a Linux machine.
I was there around 97 (?) and remember everyone in the company being asked to read the book "The Innovator's Dilemma", which described exactly this situation: a high-end company being overtaken by worse but cheaper competitors that improve year by year until they take the entire market. The point being that the company was extremely aware of what was happening. It was not taken by surprise. But in spite of that, it was still unable to respond.
The Saturn V was clearly a technical marvel better than any plane, and it'd get you anywhere much faster.
If you spare no expense, you get a better product. Sure. I'm also not surprised that a $100k BMW is more comfortable than a Renault Clio.
I had a couple of Indigos that I supported while an undergraduate (I had a student job with the University's Unix group in their computing center), and the SGIs felt to me exactly like the Amiga: really cool, but kind of lopsided. I tended to do most of my work on the SPARCstations and ignore the SGIs unless I specifically wanted to play with the graphics stuff.
I actually still have an Indigo XS24 that I collected at one point over the years. Tried to get it to boot a bit ago but it's dead, unfortunately.
The variety in enclosures matched the novelty in architectures of the period. Exciting times to be part of.
My first PC was acquired in 1992, and still only had a lousy beeper, on a 386SX.
Kodak was not actually in a position to be big in digital. And, of course, the digital camera manufacturers mostly got eclipsed by smartphones anyway a decade or so later.
Regarding computing cycles, boom/bust: I recently re-read The Soul of a New Machine and was struck by how much the world has NOT changed. Sure, we're not talking about micro/mini-computers and writing micro-coded assembly, but the whole "the market is pivoting and we need to ride this wave" and "work like a dog to meet some almost unobtainium goal" seems to still underpin being an engineer in "tech" today.
There were times when Java ran better on Intel than on Solaris.
Really interesting article that goes into depth regarding the SGI products. I didn't know Clark basically invented the GPU.
Speaking of exotic hardware, I'm actually sitting next to an SGI O2 (currently powered off). A beautiful machine!
In my opinion, Mesa played a more significant role because it first allowed people to port OpenGL software to run on software-only cheap systems running Linux, and later provided the framework for full OpenGL implementations coupled with hardware acceleration.
Of course, I still greatly enjoyed running Quake on Windows on my 3dfx card with OpenGL.
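For anyone who never touched it, this is roughly what "OpenGL with no GPU and no X server" looked like via Mesa's off-screen (OSMesa) interface. Just a minimal sketch from memory, not any particular project's code; the buffer size and the single triangle are made up for illustration:

    /* Minimal off-screen render with Mesa's software rasterizer (OSMesa).
       Build on a box with OSMesa installed, e.g.: cc osdemo.c -lOSMesa
       Sketch only: error handling and real geometry omitted. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <GL/osmesa.h>
    #include <GL/gl.h>

    int main(void) {
        const int width = 256, height = 256;

        /* A pure-software GL context: no X server, no GPU required. */
        OSMesaContext ctx = OSMesaCreateContext(OSMESA_RGBA, NULL);
        if (!ctx) { fprintf(stderr, "OSMesaCreateContext failed\n"); return 1; }

        /* Render into an ordinary malloc'd RGBA buffer. */
        unsigned char *buf = malloc((size_t)width * height * 4);
        if (!OSMesaMakeCurrent(ctx, buf, GL_UNSIGNED_BYTE, width, height)) {
            fprintf(stderr, "OSMesaMakeCurrent failed\n");
            return 1;
        }

        /* The same immediate-mode calls an IRIX-era GL app would make. */
        glClearColor(0.1f, 0.2f, 0.3f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);
            glColor3f(1, 0, 0); glVertex2f(-0.5f, -0.5f);
            glColor3f(0, 1, 0); glVertex2f( 0.5f, -0.5f);
            glColor3f(0, 0, 1); glVertex2f( 0.0f,  0.5f);
        glEnd();
        glFinish();

        printf("rendered a %dx%d RGBA frame entirely in system memory\n", width, height);

        free(buf);
        OSMesaDestroyContext(ctx);
        return 0;
    }

The point being that the very same GL calls could later be pointed at a hardware-accelerated driver, which is a big part of why porting graphics code off IRIX was feasible at all.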
What happened was Intel: they made great decisions, like automating the design of their processors, and this let them improve at an incredible pace. The Amiga depended on a different processor that stagnated.
Chip RAM, fast RAM, CPU RAM, expansion-board RAM, or slow RAM? Did too much RAM force your Zorro card into the slooooooooooow RAM address space (mine did)? Tough cookies, bucko!
Macintosh, pounding on table: "RAM is RAM!"
Is there any significance really to Foot Locker basically being a reorganized Woolworth's as opposed to being a brand-new company?
If you're big enough and have some product lines that still bring in a lot of money and didn't totally collapse like IBM you can sometimes pull it off. But it's hard.
"Literally something that should not exist" is the perfect way of putting it. In 1990, lots of people needed boutique workstation vendors. In 2000, nobody did.
At its price point, SGI technology was a financially flawed model for the growing market, even if it was more capable than the flawed performance of the low-cost technology market.
Did anyone at SGI try to simply buy the low-tech products, play with them a bit, and see about slowly integrating your tech to make that low-tech product just a little better than the competition and cost-effective for the market?
Then Intel introduced dual core (or maybe just two chips in one housing sharing a bus), and that generated a lot of buzz. So he wrote a follow-up titled "Pecked To Death By Ducks With Two Bills".
I don't recall the timing, though, how it related to the timing of asking everyone to read The Innovator's Dilemma. But at least some of the time, there was a pretty deep denial (or at least a pretty deep effort to keep the customers in denial).
There were all kinds of toys, though. There was a dedicated classroom setup for video-based remote learning some 30 years before COVID - that got used for one semester, from what I gather (was never used while I was there). The school was even host to a dialup ISP at one point.
The administrators were all in on technology. The teachers, not so much...
Eventually, in my last year, the government changed the funding model and the party ended.
(I tried some early NT graphics cards on a Pentium Pro machine. This was before gamer GPUs; these were pro cards from tiny operations. Fujitsu tried unsuccessfully to get into that business, with a small business unit in Silicon Valley. At one point they loaned me a Fujitsu Sapphire graphics card prototype. When I went back to their office to return it, the office had closed.)
Also, there was a bad real estate deal. SGI owned a lot of land where Google HQ is now. They sold it to Goldman Sachs in a sale and lease-back transaction, selling at the bottom of the market. That land, the area north of US 101 in Mountain View had, and has, a special property tax break. It's the "Shoreline Regional Park Community", set up in 1969. The area used to be a dump. Those hills near Google HQ are piles of trash. So there was a tax deal to get companies to locate there. That made the land especially valuable.
Yeah, fair. I do wonder if a port like the SNES version would have been possible had id greenlit it, but that's a "what if" universe. Alien Breed 3D would run on a 1200, but IIRC it ran pretty poorly on that.
> And then the 80s ended. What point did I make that you are contradicting?
I mean, yes, VGA cards and Sound Blaster cards were around in 1990, but they didn't really become standard until several years later.
> And if you wanted to do CAD? Would you use an Amiga? Probably not. What about desktop publishing? Pointing out that Amiga had carved out a niche (in video editing) when that was the norm back in those days doesn't make any strong comment about the long term superiority or viability of the platform.
Also fair. I'll acknowledge my view is a bit myopic, since I don't really do CAD or desktop publishing, but I do some occasional video editing, and I do think Amigas were quite impressive on that front. You're right in saying it was a "niche" though.
> Commodore as a company was such a piece of shit it made Apple of the 80s look well run.
No argument here. Still think that the hardware was pretty cool though.
> What was bizarre about it
I guess "bizarre" was the wrong word. It was just really really unstable, and System 7 would constantly freeze for seemingly no reason and I hated it.
> Also you seem to be fixated on preemptive multitasking to the neglect of things like memory protection.
I feel like if Commodore had been competently run, they could have done work to get proper protected memory support, but again that's of course a "what if" universe that we can't really know for sure.
I guess what frustrates me is that it did genuinely feel like Commodore was really ahead of the curve. I think the fact that they had something pretty advanced like preemptive multitasking (edit: fixed typo) in the mid 80s was a solid core to build on, and I do kind of wish it had caught on and iterated. I see no reason why the Amiga couldn't have eventually gotten decent CAD and Desktop publishing software. I think Commodore didn't think they had to keep growing.
This, I think, is the premise that you and people like me who think Amiga could have gone on to do great things disagree on. Most Amiga fans would say that it totally had a path forward, or at least that there is no evidence it didn't, and that the failure to follow that path therefore wasn't an inherent technical problem, but a problem of politics and management. Do you have any evidence to the contrary?
Which is one of the all-time greats IMHO. I'd keep it around too.
The article has this: ''As Bob Bishop took the reigns of SGI, things looked dark. AMD announced their 64 bit architecture in October, PC graphics had made massive strides while remaining significantly less expensive than SGI’s offerings, NT was proving to be a solid and less expensive competitor to UNIX, Linux was eating away at traditional UNIX market segments, and Itanium still hadn’t launched.''
I can agree with almost all of that statement but I object to the ''NT was proving to be a solid and less expensive competitor to UNIX'' part as mostly false in any mixed OS environment over which I'd ever been admin.
This is conventional wisdom (and thus, usually correct).
However, it's always interesting to look at counterexamples: Beretta, for example (in business for 500 years).
https://www.albertcory.io/lets-do-have-hindsight
or the IBM PC, which cannibalized IBM's business, at least in IBM's mind. Thus, they screwed it up and let Wintel make the real billions. So it worked, until they took your advice and decided that had to stop.
I demonstrated IRIX to younger colleagues and the reaction was: OK, so it's alright I guess, like anything else we have today? Yep... but the contemporary world was NOT like that.
I had an Octane as well, heavy and loud beast. All in storage now waiting for move. In late 90s I worked a lot on SGI machines (vfx).
Sun Microsystems was a company like no other. The last of a dying breed of "family" technology companies.
By the early 90's the Amiga just wasn't competitive. The chip set barely evolved since 1985. ECS barely added anything over the original chip set. By around 1992 or 1993, 386 systems with SVGA and Soundblaster cards were cheap. Amiga AGA was too little, too late. Also consider the low end AGA system (Amiga 1200) was totally crippled with only 2 megs of slow "chip" RAM.
I was an Amiga fan until 1993. Started with an A500, then A3000. Eventually I moved on to a 486 clone w/Linux. Later on I had a Sun SparcStation 10 at home, so I agree with you on Sun and SGI.
But the reality is the Commodore 64 kept Commodore going during most of that period rather than Amiga sales. It's similar to Apple where the Apple 2 kept Apple afloat during the 80s and 90s until Steve returned.
Took me awhile to find a copy on the net, https://www.seriss.com/people/erco/sgi-irix-bloat-document.t...
Here's a formatted-for-HTML version: http://www.art.net/%7Ehopkins/Don/unix-haters/tirix/embarras...
By '97 or so SGI actually had essentially given up competing when they shut down the team that was developing the successor to InfiniteReality.
In a sense though, Silicon Graphics did become more standard, in that their original 3D framework was Iris GL, which then evolved into OpenGL, which became the main 3D graphics standard for many years.
In case anyone's interested, their graphics card (GE1, the world's first ever hardware 3D graphics card?) looks like:
https://matrix-client.matrix.org/_matrix/media/v3/download/m...
...and the PM2 68k processor card mentioned in the post looks like:
https://matrix-client.matrix.org/_matrix/media/v3/download/m...
...and one of the machines itself looks like:
https://matrix-client.matrix.org/_matrix/media/v3/download/m...
Suffice it to say that I have a very soft spot for these machines :)
Interesting. How cheap? Never used Macs, only Windows and Unix and Linux.
By the late 80s the "microcomputer" hobby/games market was dead and systems like the ST and Amiga (or Acorn Archimedes, etc.) were anachronisms. You had to be a PC-compat or a Mac or a Unix workstation or you were dead. Commodore and Atari both tried to push themselves into that workstation tier by selling cheaper 68030 machines than Sun, etc, but without success.
I was around to witness the tail end of that office space transition (on the incoming side at Google). It was surreal to be sitting in the physical carcass of a company I had long fantasized about (in part due to their marketing via Hollywood).
In retrospect it was ironic because a company that was based on selling very expensive high performance compute (SGI) was being physically replaced by a company selling (albeit indirectly) very cheap high performance compute.
Not only are virtually all organizational processes and incentives fundamentally aligned against effectively responding, the best practices, patterns and skill sets of most managers at virtually every level are also counter to what they must do to effectively respond. Having been a serial tech startup founder for a couple decades, I then sold one of my startups to a valley tech giant and ended up on the senior leadership team there for a decade. I'd read Innovator's Dilemma in the 90s and now I've now seen it play out from both sides, so I've thought about it a lot. My key takeaway is that an incumbent's lack of effective response to disruption isn't necessarily due to a lack of awareness, conviction or errors in execution. Sure, there are many examples where that's the case but the perverse thing about I.D. is that it can be nearly impossible for the incumbent to effectively respond - even if they recognize the challenge early, commit fully to responding and then do everything within their power perfectly.
I've even spent time sort of "theory crafting" how a big incumbent could try to "harden" themselves in advance against potential disruption. The fundamental challenge is that you end up having to devote resources and create structures which actually make the big incumbent less good at being a big incumbent far in advance of the disruptive threat appearing. It's hard enough to start hardcore, destructive chemo treatment when you actually have cancer. Starting chemo while you're still perfectly healthy and there's literally no evidence of the threat seems crazy. It looks like management incompetence and could arguably be illegal in a publicly traded company ("best efforts to maximize/preserve shareholder value" etc).
My take: that’s just fine. Tightly crafted code is not a lost art, and is in fact getting easier to write these days. You’re just not forced into scrabbling for every last byte and cpu cycle anymore just to get acceptable results.
I used it for a while earlier at work, and don't remember many problems with it. One did have to apply OS patches fairly regularly to it, but IIRC, that process was somewhat smooth.
I never really said that they were "taken by surprise", but part of it felt like (from the outside) management had been a little blinded by their past success and the profit margins from their workstations, combined with no clear path forward for the whole industry. Nvidia could have very easily been just a curiosity of the past, but they managed to strike it lucky standing on the shoulders of others.
If SGI had always been a company that could provide graphics workstations that worked with x86/Windows PCs early on, for example, maybe they would have fared better. They would have gone with the flow of technology at the time rather than fighting uphill, no matter the potential technical brilliance. But being saddled with their MIPS processors and custom OS meant that once people left, they almost never came back. One can have the best tech and still fail.
SGI also never had a presence in business critical applications which gave some of the other vendors more momentum (HP-UX/PA-RISC, VMS/Alpha, Solaris/SPARC).
Yes, but the team that did that also left SGI, then worked directly with Nintendo on the GameCube and was acquired by ATI. I'm not sure how SGI managed to not support that effort within itself.
With money to burn, and SGI being a childhood brand, legends in 3D... such wonderful memories. 15k on a desktop setup was loose change; it shows how clueless I was back then. Still, I felt like I'd "arrived".
SGI with Windows NT - lol - I wrote my first OpenGL game in Visual Basic... I've always been somewhat of an outlier ;-) God help me.
The point? My personal experience says something about the strength of the SGI brand - even in the face of what was happening at the time (3DFX and so on - my previous company was one of the few 3DFX api devs - illustrating how clueless I was...)... it all happened so quickly... I'm not surprised SGI couldn't respond - or more importantly understand the strength of Microsoft/OpenGL/DirectX in the boiling pot of 3DFX / Nvidia and the rest... From memory it took three years and SGI was done - shared memory architecture? No longer worth the cost. :-(
Looking back, I was such a kid - a complete fool.
Greybeard advice: bet on the monopoly. Be smart. Brands like SGI are nothing in the face of install base. Think about how crazy it was to spend 15k on a desktop SGI back then... nostalgia is insanity, vanity.
It was a dream company for pretty much every siggraph person at that time. I was in grad school, eagerly awaiting a very popular 3-semester course in computer graphics. It had been devised and taught by a young promising professor who had published some pioneering siggraph papers. I signed up for the course. On the first day of class, the head of the department walked in and said the professor had been recruited by his dream company SGI for an ungodly sum of money to work on some Jewish director's movie about a dinosaur theme park. I thought ok, whatever, someone else will teach the course. The bastards scrapped the entire 3-series computer graphics module because there wasn't anyone else who could teach it. So we had to pick from one of the usual dumb options - databases, OS, Networks, Compilers. Since then I've always held a grudge against sgi.
Fall 95 I entered freshman year, and we had Indys and IBM RS/6000s as the main workstations on campus. Really great setup where you could sit at any workstation and all your stuff just worked and your whole environment seamlessly migrated. The only catch was that if you were compiling your own stuff, you'd have to recompile it for the machine you sat down at.
SGI brought a demo truck to campus in the spring of my Freshman year (Spring 96) and blew us all away. They were there for interviews, obviously I was a freshman but we all went to check it out.
Summer 96 I got an internship and for kicks they gave me an Indy with a 21" CRT (huge at the time) and the silly video camera that was like 10+ years ahead of its time.
Fall 96 we got labs full of O2s.
Fall 1997 I bought a 3DFX card. MS/Intel somehow made a donation to the school and got them to start phasing out the Unix workstations. The windows NT setup was terrible, they never had the printing and seamless movement of files down till after I graduated. Video games in the Fall of 1997 on the 3DFX were basically as impressive as the demos on the $100k refrigerator sized machine SGI showed in 1995.
Probably fall 1998 I remember my Dad got a computer with an Nvidia Riva 128.
Spring 99 I graduated and that fall I rebuilt my PC with a Geforce 256.
I'm not sure when I last saw an SGI, but I did briefly use one of their NT machines IIRC.
Last time I had a Sun machine at work was probably 2004. I remember maybe 2007-2008 at work deciding for the first time we were going to support Linux, then by 2010-11 we had dropped support for Sun.
Most of the commercial Unix workstations had tons of Unix annoyances I never found Linux to have. Irix was maybe the best. HP-UX was super annoying I remember. I didn't use DEC Unix and Tru64 much. Closed source PC Unix like SCO I remember being horrible.
I remember a talk by Clayton Christensen talking specifically about Intel and how they set up the Celeron division to compete with themselves (based on his advice).
A key property of tech, in economics lingo, is that it is a "natural monopoly": all fixed cost and no variable cost.
This creates these winner-take-all games. In this case Intel, SGI, and others all knew the rules, and it just ended up with Intel taking the prize and it all becoming Wintel for a decade or so - basically until the smartphone allowed enough capital to be accrued to challenge the old monopoly.
Most Hollywood effects were done on SGI systems before the slow migration to Linux. RenderMan and Maya were both SGI-first programs.
Also SGI made huge advances in NUMA and machines with dozens of CPUs/processors before most other companies ventured into this space.
But not business critical like IBM CICS or Java.
1. https://en.wikipedia.org/wiki/NUMAlink
2. https://www.cs.ucr.edu/~bhuyan/CS213/2004/numalink.pdf
3. https://cseweb.ucsd.edu/classes/fa12/cse260-b/Lectures/Lec17...
3Dfx grew up in the arcade market. They were always consumer-focused.
Prices start at $3,495
https://www.1000bit.it/js/web/viewer.html?file=%2Fad%2Fbro%2...
I was at Apple when we worked with engineers from Kodak who were working to change various format standards to allow digital photos. This was in the late 1980s or early 1990s.
I'm not a particular Kodak apologist but suggesting that a company should have been able to anticipate and correct for their business collapsing by 90% in a decade or so seems to need a lot of particulars.
And if memory serves, the Bible (https://www.goodreads.com/book/show/603263.Advanced_Programm...) didn’t cover it, which was a problem.
There is no way to do this for an IBM z16, which is the kind of vendor lock in that people are saying Apple doesn’t have.
Even displacing the big Japanese camera manufacturers, who by then had dominated high-end photography, would have required reversing decades of a shift away from high-end cameras like the Retina line.
I don't doubt there was company DNA against digital photography but it's not like non-smartphone photography, especially beyond relatively niche pro/prosumer level, has had such a good run recently either.
When you switched to Intel in 1992, PCs had already existed since 1981. PCs didn't wipe out most other home computers overnight.
Nice.
I once worked at a startup that had a Cobalt Qube in the server room, and the Cobalt was ... cobalt blue.
https://en.m.wikipedia.org/wiki/File:Cobalt_Qube_3_Front.jpg
Kodak could have spun off a consumer electronics or semiconductor manufacturing company. But it's not clear why that is actually a better model than someone else just spinning up similar entities.
I don't need all the chemical engineers and a lot of other people connected with the old business anyway. And I'm sure not turning them into semiconductor experts.
So you're one of the 10% of employees in HR who snuck through to the other side. Is that really a big deal?
They were as dead as SGI in the same timeframe.
The graphics demos looked like trash, basically just untextured and badly shaded plain colored objects rotating on the screen. For reference I was playing Quake III around the time which had detailed textures and dynamic lighting.
I asked the SGI presenter what one of his Indigo workstations cost. He said $40,000, not including the graphics card! That’s extra.
I laughed in his face and walked out.
Worked at a university in the early 90s.
Maybe irix was okay to use if you were just sitting in front of it doing rando user / graphics things, but administering it was unbearable. The license fees to get OS updates were exorbitant; you'd have to get wacky new licenses to enable NFS or NIS and you'd need new kernels for just about anything.
As far as I could tell they were a cursed company that hated their users. "Here's a pretty thing that does one thing well but is otherwise insane and will ruin you when you need it most."
Good riddance.
For team building, we launched potato cannons into NASA Moffett Field, and blew up or melted Sun machines for fun with thermite and explosives. Lots of amazing people and fond memories for a kid getting started.
This was their downfall, trying to scale out adoption with esoteric hardware.
I remember being quoted $18k-ish for a memory upgrade on an O2 or Origin, the same amount of memory I had just bought for $500 for an Intel Linux box at home.
Sure, it wasn’t apples to apples, but I remember thinking very clearly that this wasn’t going to end well for SGI.
SGI could double down on their servers and supercomputers, which they did for a while, but without entry-level options their product lines become the domain of legacy clients who are too afraid (or too smart) to port to cheaper platforms. And being legacy in a highly dynamic segment like HPC is a recipe for disaster. IBM survived because their IBM i (the descendant of the AS/400) and mainframe lines are very well defended by systems that are too risky to move, tied to hardware that's not that much more expensive than a similarly capable cluster of generic and less capable machines. As the market was being disrupted from under them, they retreated up and still defend their hill very effectively.
The other movement they could do was to shift downwards, towards the PC, and pull the rug from under their workstation line. By the time Microsoft acquired Softimage and had it ported to NT, it was already too late for SGI to even try that move, as NT was solidified as a viable competitor in the visual computing segment, running on good-enough machines much, much cheaper than anything SGI had.
https://www.linkedin.com/posts/jeremyallison_wither-google-f...
More importantly, the things that made Quake III so great were state-of-the-art for gaming. But those things couldn't render lines quickly and well (a mainstay of CAD at the time), or render at very high resolution (which IIRC was 1280x1024 in that era).
Here's what Carmack said about the SGIs a few years before: """SGI Infinite reality: ($100000+) Fill rate from hell. Polygons from hell. If you don’t trip up on state changes, nothing will come within shouting distance of this system. You would expect that.""" SGI was also key for map builds before PCs were capable.
But yes, 1999-2000 was just around the cusp of when SGI went from "amazing" to "meh".
Obviously the observation has a confirmation bias.
Intergraph started making PCs with high-end graphics at one point, when they abandoned CLIX and gave up on their (Fairchild's, really) Clipper processor. It didn't work for them either. SGI did their own "Visual Workstation" that ran Windows and had a Pentium, but that too was a huge disappointment.
On the upside, I've never known anyone else who had even heard of Durrell.
From 1991 when I first saw SunOS I wanted a SPARCstation. I started college in 1993 and the school was full of DEC Ultrix machines, Suns, HP PA-RISC, and a handful of IBM RS/6000 and SGIs.
I just thought DOS/Windows PCs were such garbage. Single user, no preemptive multitasking, no memory protection. Then Linux came out and it changed everything. I bought a PC just to run Linux. My dream of a Unix RISC workstation faded away.
My roommate in 1996 bought a DEC Alpha. Not the cheaper Multia but an Alpha that could run OSF/1 Digital Unix. He actually ran NetBSD on it.
In 1997 I took the computer graphics class and we used SGIs. There was just one lab of them reserved for that class and grad students. I was so excited and it was really cool but I didn't think I could ever afford one. It's still really cool though that you had one.
Later came the large Altix NUMA systems, with core counts of unprecedented size (and problems booting due to lock contention ;)
And of course their donation of the XFS filesystem to the linux world!
These years later, while the innovator's dilemma thesis describes the what, there's still little treatment of the why and how.
I keep wanting someone to account for the roles of investment and finance.
Amazon's innovation was lower cost of capital. They convinced investors to wait for returns. And they got a massive tax holiday. (How could they not succeed?)
Ditto Tesla, with its savvy moves like govt loans, prepurchases, tax incentives, and selling direct.
That cheap capital was necessary, but not sufficient. Both still had to create products customers wanted.
I keep coming back to Apple. How'd Apple avoid the trap? Despite their terrible position. My guess is better financial strategy (if that's the right phrase). Apple focused on margins (and monopsony) instead of market share. And then leveraged their war chest to implement monopsony.
Fuji is interesting; they weren't that successful with their first digital cameras, but now have some interesting mirrorless ones. They still make film.
SGI had invested in building significant strengths and competency in its sales and distribution structure. This was one of their key competitive moats. Unfortunately, not only did the shift in economics make this strength irrelevant, it turned it into a fundamental weakness. All that workstation-centric sales, distribution, service and support infrastructure dramatically weighed down their payroll and opex. This was fine as long as they could count on the higher margins of their existing business. While it's easy to say they should "just layoff all those people and relaunch as a desktop company" that can't be done in one quarter or even one year. It requires fundamentally different structures, processes, systems and skill sets. Hiring, training and integrating all that while paying for massive layoffs and shutting down offices, warehouses etc takes time and costs a lot of money.
Worse, once their existing workstation customers saw them shutting down the SGI that those customers had bought workstations and service contracts from, to become a different kind of company entirely, sales revenue would have taken an overnight nosedive. SGI's stock would also have tanked far more immediately than it did, as the fickle stock market investors sold stock they'd bought because SGI offered a specific risk/return expectation which had just become much more "risk" and much less "return" (at least in the near term). In making such a dramatic move SGI would have effectively dumped much of their current quarterly revenue and the value of one of their core strengths - all at the same moment. Thus turning them into one of their emerging startup competitors, with all of a startup's disadvantages (no big ongoing revenue streams, no big cash pile (or high stock valuation to leverage for cash)) yet none of a startup's strengths (nimble, lower-paid staff and more patient venture investors).
The point of my earlier post was mainly that a true disruptive market shift is nearly impossible for a large, established incumbent to successfully survive because they basically have to rapidly turn into someone else almost overnight. How can a champion sumo wrestler survive a shift so dramatic that their sport quickly turns into a track meet? Even seeing it coming doesn't help. How does one even prepare for such a shift since losing mass turns you into a bad sumo wrestler long before you even start being a viable sprinter? As Christensen observed, such disruptions are often enabled by technology but the actual cause of incumbent death is often due to the shift turning an incumbent's own strengths into weaknesses almost overnight.
It turned out that double precision was a mistake that was sold as a “professional” feature. By sharing edges correctly and using the correct rounding modes, single precision provides pixel-perfect rendering. Efficiencies like this allowed the consumer GPUs to run circles around SGI hardware.
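To make that concrete, here's a toy sketch of the idea (not SGI's or any vendor's actual pipeline): two triangles that share an edge, rasterized with the same single-precision edge function and a top-left style fill rule, so every pixel along the shared edge is claimed by exactly one triangle; no cracks, no double hits, no doubles needed. The coordinates and the 5x5 grid are made up purely for illustration:

    #include <stdio.h>

    /* Signed area test: > 0 means p lies to the left of edge a->b.
       Evaluated in plain single precision for both triangles. */
    static float edge(float ax, float ay, float bx, float by, float px, float py) {
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
    }

    /* Fill rule for the e == 0 case: a pixel exactly on an edge belongs to
       a triangle only if that edge is a "top" or "left" edge, so a shared
       edge is owned by exactly one of its two triangles. */
    static int inside(float e, float ax, float ay, float bx, float by) {
        if (e > 0.0f) return 1;
        if (e < 0.0f) return 0;
        return ((ay == by) && (bx < ax)) || (by < ay);  /* top edge or left edge */
    }

    int main(void) {
        /* Two counterclockwise triangles sharing the diagonal (0,0)-(4,4). */
        float t1[3][2] = {{0,0},{4,0},{4,4}};
        float t2[3][2] = {{0,0},{4,4},{0,4}};

        for (int y = 0; y < 5; y++) {
            for (int x = 0; x < 5; x++) {
                float px = x + 0.5f, py = y + 0.5f;   /* sample at pixel centers */
                int in1 = 1, in2 = 1;
                for (int i = 0; i < 3; i++) {
                    int j = (i + 1) % 3;
                    in1 &= inside(edge(t1[i][0], t1[i][1], t1[j][0], t1[j][1], px, py),
                                  t1[i][0], t1[i][1], t1[j][0], t1[j][1]);
                    in2 &= inside(edge(t2[i][0], t2[i][1], t2[j][0], t2[j][1], px, py),
                                  t2[i][0], t2[i][1], t2[j][0], t2[j][1]);
                }
                /* '!' would mean both triangles claimed the pixel; it never prints. */
                putchar(in1 && in2 ? '!' : in1 ? '1' : in2 ? '2' : '.');
            }
            putchar('\n');
        }
        return 0;
    }

Because both triangles evaluate the identical expression for the shared edge, the sign (or exact zero) comes out the same on both sides, and the fill rule breaks the tie deterministically; that consistency is what makes single precision sufficient.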
That's pretty wild, I had completely forgotten about that. It was too late of course by then.
I was working at a CAD/CAE company which leased SGI workstations in the early 2000s. I was just a fresh grad wondering why they were leasing them; then I realized the machines were more expensive than some cars.
Some of the developers were starting to use NT workstations with 3D graphics; they were not cheap, but they could run circles around Octanes and Indigos at a fraction of the cost. They were rushing to port everything to NT, as the writing had been on the wall for a while by then.
Competitors such as Sun did port their proprietary Unix flavors to the x86 PC platform but never achieved traction. It was impossible to compete with free Linux, and lack of device drivers was always an obstacle.
I don't know whether this is available online, but I can recommend it as a pleasant programme, with lovely scenery, interesting storylines, and engaging actors.
The SSA says that Social Security tax was a mere 1% before 1950, and would remain below 4% until the 1970s. [1]
A 1949 Form 1040 [2] suggests that a single filer with 3 dependent children, earning $2700 that year, would have a federal income tax liability of only $7. Not 7 percent, 7 dollars.
From what I've seen their biggest current challenge is their mailer. Netflix spent a lot of time designing their signature red envelopes, working with the USPS on ensuring they would survive a trip through the postal sorting machines. DVDInBox has yet to reproduce it - their envelopes sometimes arrive fairly mangled, or have not arrived at all (lost to the bowels of the machines).
IBM also came into the game more vertically integrated. Having your own fab is expensive, but if you read about what Sun and SGI had to do in order to get chips, that route also wasn't great.
In the beginning there was a chance that Sun and SGI could have merged; the reason it didn't happen was mostly that leadership couldn't agree on who would lead the company. Between them they duplicated a lot of technology while sitting right next to each other: both doing their own RISC chips, Sun at times doing its own graphics cards, competing in low-priced and mid-priced workstations, incompatible Unix developments, and competing against each other in the large SMP market. If they had been together, things could have been different: a larger install base and more investment in the architecture might have given them a better chance.
Why did they cancel it, money running out? It’s sad to think they were close to a new architecture but then just kept selling IR for years (and even sold a FireGL-based “Onyx” by the end).
Also was it a separate team working on the lower-end graphics like VPro/Odyssey?
The real failure is not picking up new business along the way. With the N64 they showed they could design a consumer product, but beyond that they were in no future consoles. 3dfx and ArtX both came from former SGI people. You don't need to stop selling workstations just because you also make chips for consoles and other such devices. Nvidia barely survived and might not have if not for consoles. There are other markets where their expertise could have been applied.
Surviving changes like that often requires finding other markets. And when it comes to making hard choices, you need to cut the part of the business that is unprofitable. But this is really hard to do, and in some ways it goes against the 90s US corporate philosophy of focusing only on the 'core' business. DEC, for example, sold profitable business units that might have been helpful to keep: it had a printer business and the potential for a database business. Oracle calls RDB one of its best acquisitions.
I just finished reading 'LIFE UNDER THE SUN: My 20-Year Journey at Sun Microsystems' that talks about Sun and the storage businesses a bit. Sun was never happy with how well they did in storage. Sun was part of defining Fibre Channel.
For some reason that still doesn't make sense to me, they bought StorageTek for an incredible $4 billion at a time when Sun wasn't exactly flying high. The explanation from the CEO given in the book mentioned above is baffling.
Edit:
Seems they bought:
1999: MAXSTRAT Corporation, a company in Milpitas, California selling Fibre Channel storage servers.
Never heard of them. Not sure how comparable it was.
Tesla was very capital efficient compared to how hard the task was, raising only just enough to get the next product out. By the time they were making big losses, you could see the Model 3 was going to turn things around once they reached scale. There was always a clear story of: if we do X next, that means Y.
> I keep coming back to Apple. How'd Apple avoid the trap?
I think by changing CEO radically at the right time. Whatever was going to work for Apple, it wasn't any of the typical things you would think of. I'm not at all a Jobs fan, but I don't think many people could have turned it around.
Jobs had the ability to come back and radically shift the company. They also had the benefit of 20 years of loyal customers; maybe the biggest asset Apple had was its loyal customer base (or captured market). Personally I can say my dad was a complete Apple fan, he would just buy Apple. He simply didn't care about higher performance or any of the other things.
A “budget” SGI card cost more than my parents’ car. I bought three different GPUs by that point with my pocket money.
It was actually 9-bit RAM, but the CPU could only see 8 of the bits!
https://albertcory50.substack.com/p/whats-missing-with-ameri...
There are a few exceptions, but for the most part, "professional management" just does not survive for very long.
This is even a major point of discussion in the book. The incumbents always see it coming a mile away. They can't respond because doing so breaks their business model. Employees can't go to managers and say, "We need to enter this low-margin market for low-end products." Doing so is a good way to get sidelined or fired.
The "dilemma" is that either traditional option, compete with the upstarts or move further up-market, leads to the same outcome - death.
If you're the incumbent, a paradigm shift usually forces you to intentionally cannibalize your existing revenue base in service of the not-yet-proven new thing. That's if you want to survive.
It's incredibly difficult, rare, nearly impossible, to pull such a thing off within a company system.
Imagine being Blockbuster, knowing that to survive, you'd need to transition to a content streaming company.
That's a ton to unwind before you can even buy servers, let alone hire the people to -lead the industry- in such a shift. All to simply remain alive.
Remember the Lavarand[1]? Random number generator based on an array of lava lamps?
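For anyone who never saw it: the idea was to point a camera at something chaotic and distill the frames into seeds. A toy sketch of the concept, with FNV-1a as a stand-in hash; the real systems, as I recall, fed actual camera captures through cryptographic hashing into a PRNG:

    /* Toy sketch of the Lavarand idea, not the original implementation:
       hash a "photograph" of something chaotic and use the digest as a seed. */
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* FNV-1a: a simple non-cryptographic hash, used here only for illustration. */
    static uint64_t fnv1a(const uint8_t *buf, size_t len) {
        uint64_t h = 0xcbf29ce484222325ULL;     /* FNV offset basis */
        for (size_t i = 0; i < len; i++) {
            h ^= buf[i];
            h *= 0x100000001b3ULL;              /* FNV prime */
        }
        return h;
    }

    int main(void) {
        static uint8_t frame[640 * 480];        /* stand-in for a camera frame */
        memset(frame, 0, sizeof frame);         /* real entropy would come from
                                                   photographing the lamps */
        uint64_t seed = fnv1a(frame, sizeof frame);
        printf("seed: %016llx\n", (unsigned long long)seed);
        return 0;
    }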
I assume you mean Steven Spielberg and one of the Jurassic Park films?
If so, why can't you just say so? Why are you referring to Steven Spielberg, one of the most famous directors of all time, as "some Jewish director?" Do you think people won't recognize the name? I promise people know who Steven Spielberg is.
The Fibre Channel XIO boards were really needed back then for that application as PCI was still way too slow.
I was sad to know when I left that job that the SGI server was being replaced and the support personnel at SGI were going to lose their jobs too.
Oracle threatened to not support us when I used an unprivileged Xvfb instance instead.
Still stupid but not that uncommon back then.
There's no way the shareholders win this lawsuit. Violating fiduciary duty involves doing things you know won't provide value. Long term planning would be a defense for this.
The shareholders could absolutely oust those executives, though. And they may very well do so.
After all that training, still nobody did it (ok I did and one other guy). That company couldn't change anything. It was amazing.
They had a project to make a department more proactive than reactive. The solution was to create a lot of bureaucracy around being proactive. As you can imagine, bureaucracy about being proactive was really just institutionalizing ... not being proactive.
I eventually left and work at a smaller company now. It's been refreshing for years now when we can decide "this process doesn't work, let's not do it anymore" and it just happens. Even just new coworker: "I won't be in tomorrow, who do I tell?", me: "you just did, have a great time off" seems revolutionary after being at a big company for so long.
I'm convinced that as the sheer number of humans in a company increases, the friction to making real change increases, and there's not much you can do. Fundamental change to respond to real disruption? Nigh impossible.
I think the real problem here is that PC workstations, with Windows NT (later Linux) and an Intel chip, could do 90% of what the SGI workstations could for a fraction of the price. By workstation I mean an honest workstation with a high-end graphics card, 10k RPM SCSI drives, and hundreds of megs of RAM.
The price of SGI workstations was "call us", which translates to tens of thousands of dollars per workstation. PC workstations didn't and couldn't replace all uses of SGI workstations. What SGI was not able to handle was their customers suddenly having a viable option besides paying them tens of thousands of dollars per workstation. Even their Visual Workstations weren't a real improvement cost-wise, as those were still overpriced compared to competitors' PC workstations.
Yes there were separate teams working on the lower-end graphics.
ARM is the only one left standing from that era, which with hindsight seems so unlikely.
Is there at least some truthiness to it? Or has this just become Silicon Valley urban legend in my head?
I’m so sick of dancing around this topic. Managers and business types destroy companies. It never stops. Company after company. How many times do we have to pretend this isn't the case before we see the truth?
Innovators make things worth selling. Business types drop it on the floor.
Commodore tried for a play where their - still much lower end - newest generation chipset would have scaled (with the same chips) from a full low-end computer, console, or set-top box (it had a PA-RISC core on chip, so it could run standalone), to a high-end graphics card for PCs, to the chipset for a higher-end Amiga, all at the same time.
They ran out of money - short maybe a few million - before being able to test if that would have worked as a way to widen their market.
I wish we'd have gotten to see how that would have unfolded, though Commodore's deep management dysfunction probably still would have made it fail.
ARM had the advantage in that space of starting from the very low end, and being able to squeeze margins instead of being squeezed.
It is wild to think that in games for instance, we went from Quake in 1996 running software rendering to Quake 3 requiring a GPU only 3 years later and that becoming the standard in a matter of months.
I remember trying to explain to SGI reps that while we loved the SGI hardware, Linux on commodity x86 was the increasingly more effective choice for our projects. We wanted to buy SGI x86 boxes but they were pushing NT.
It was very apparent that SGI salesmen knew which way the wind was turning and they were milking the last bits of the commission gravy train.
Even when everyone understands “The Innovator’s Dilemma”, the incentives of salesmen and shareholders can be to optimize for extracting value out of the dying company. If I am a shareholder in a company being out innovated, it might make sense to want them to maximize short term profits while reallocating capital to the innovators instead of trying to improve the dying company.
During good times focus on your core and then peter out. Just follow a nice parabolic arc.
A lot of success is just timing, and attempting to fight all threats looks foolish. And remember, for every up-and-comer that “we all saw coming” there were lots that didn't make it. If you waste time fighting those, you wasted money.
See: The Blue LED https://youtu.be/AF8d72mA41M?feature=shared
So it’s not clear that the company knows better. Feels like educated guesses, but a lot of luck involved.
The company was going to shit after the Amiga launched, it took a competent manager to save the company and turn the Amiga around into a moderate success.
Commodore didn't really have the money to keep up chip development. They also had their own fab; they would have needed to upgrade that as well, or drop it somehow.
Another example of that is the Acorn Archimedes. Absolutely fucking incredible hardware for the price. Like crushing everything in price vs performance. But ... it literally launched with a de-novo operating system with 0 applications. And it was a small company in Britain.
The dream scenario is for Sun to realize that they should build a low-cost, all-custom-chip device. They had the margin on the higher-end business to support such a development for 2-3 generations and to get most software ported to it. They also had the software skill to design the hardware/software in a way that would allow future upgrades.
They had the best processor on the market, yet they decided to sell Intel and Windows. I really don't understand what they were smoking.
The attack on SGI didn't only come from the lame Windows side, with its crappy software and ultra-lame 3D tools (compared to what SGI had), which people loved because the lameness was matched by an ultra-cheap price point.
The attack on SGI also came from open source and Linux: the very same cheap commodity hardware that ran the mediocre-but-ubiquitous commercial software could also run Linux.
On which OS are most (all?) AI models trained today? What OS powers 500 of the world's top 500 supercomputers?
Linux.
That's the tragedy of SGI: as if cheap commodity x86 hardware wasn't enough, they got then attacked by both Windows and Linux on that cheap hardware.
P.S: as a sidenote, my best friend (still best friend to this day)'s stepfather was the head of SGI Benelux (Belgium/The Netherlands/Luxembourg), so my friend had an SGI Indy (the "pizza box" one) at his home. Guess what we'd do every day after school?
ARM survived this long because it had a niche others couldn't break into (and still can't) as the highest performance per watt anywhere.
They also missed an (earlier) boat with the Commodore 900 workstation, which would run Mark Williams' Coherent OS (a Unix-like system).
Even for video rendering, if your box is twice as fast, it'll be outcompeted by machines that cost half as much or less. At times when my desktop PC was not fast enough, it was simpler to get another PC and run the time-consuming things on it while I did other things on the first.
Had one of the major PC vendors hired their designers and built just run-of-the-mill PCs, housing them in those amazing cases, I wonder how that would have worked out.
Of course in the 90s this would've been quite modern I imagine, considering that the competition was gray painted steel boxes and steel cabinets with a gray powder coat.
Intel is, at its root, basically PC hardware. Yes, it doesn't dominate the entire industry, but memory is pretty key to a PC and the OS that runs on it.
Fundamentally, it's another fab product, like chipsets.
He also lent his SGI Indigo2 to the LGR guy on YouTube, and he did a great video on it: https://youtu.be/ZDxLa6P6exc
Not looking it up right now but the original Q1 had a very low poly count.
ChuckMcM was at Sun at the time, and mentioned a while back he tried to get Sun to buy Commodore outright:
https://news.ycombinator.com/item?id=39585430 (his later replies in that sub-thread are also worth reading)
(With apologies for reviving 90s IRIX/Solaris snark in my earlier post. :-)
There are some vintage computer club talks where they dive into this.
RISC architectures live on today. Your house likely has dozens of MIPS chips in various components. You've got more ARM chips for sure but no shortage of other RISC chips in various components.
The red team designs a new streamlined look with less click baity posts and fewer ads. Users flock to it, abandoning the existing platform. The new platform isn't monetized as effectively as the old one so revenue drops by billions per year - maybe an order of magnitude more than the new product brings in. Shareholders demand an answer as to what management are doing about it.
There might be some golden path, but that typically relies on finding a new market that doesn't interfere with the existing business (e.g. AWS and Amazon). However, the far easier options are a) shut down the new service or b) increase its monetization, thereby starting the enshittification cycle.
With the exception of a few UNIX systems like Irix, Solaris with NeWS, NeXTSTEP, and Apollo, everything else tends to be the same deck of cards reshuffled.
We were a couple of physics grad students working on a side project in late 1993. My background was a semester course based on Foley & van Dam. Hardware gave us a 5-10 year lead over what we could have done with consumer tech.
There wasn't really a "rest of CG". Only the highest-end SGI machines at the time had hardware texture mapping - most did it in software (see https://en.wikipedia.org/wiki/Extreme_Graphics).
We aren't talking 2D organo-chem hexagons, but 3D spheres and cylinders. Back around 1995 I posted some benchmarks to Usenet about the different approaches I tried (including NURBS), but I can no longer find a copy of it.
The straight-forward way is to render the spheres as a bunch of triangles, so, what, 50 polygons per sphere? Times 100,000 spheres = 5 million polygons. That was large for the time, but doable. Plus, during movement we used a lower level of detail.
What was Quake's polygon count?
Oh, and we're displaying animated molecules, including interacting with a live physics simulation, so no pre-computed BSP either.
Rasterizing spheres quickly on a PC was also possible then - that was RasMol's forte - but it looked flat compared to having a couple of hardware-based point lights plus ambient lighting.
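A back-of-the-envelope check on that triangle arithmetic, with made-up tessellation parameters rather than anything from the actual project:

    /* A UV-sphere with `stacks` latitude bands and `slices` longitude segments
       uses `slices` triangles in each polar cap and 2*slices per middle band,
       i.e. 2 * slices * (stacks - 1) triangles total. */
    #include <stdio.h>

    static long sphere_tris(int stacks, int slices) {
        return (long)slices * 2 * (stacks - 1);  /* 2 caps + (stacks-2) quad bands */
    }

    int main(void) {
        long per_sphere = sphere_tris(5, 6);     /* coarse LOD: 48 triangles */
        long scene = per_sphere * 100000L;       /* ~100k atoms */
        printf("%ld tris/sphere, %ld tris/frame\n", per_sphere, scene);
        return 0;
    }

With those (hypothetical) numbers you land right around 50 triangles per sphere and roughly 5 million per frame, which is the figure above.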
Interestingly, AutoCAD (RIP Walker) tried to get into molecular modeling, but it didn't work out. https://www.fourmilab.ch/autofile/e5/chapter2_82.html
I don't know what Sun had planned for this tech.
An even more interesting approach for Sun would have been to cooperate with or acquire Acorn. The Acorn Archimedes was an almost perfect low-end workstation product. Its critical weakness was its lack of an OS and its total lack of applications.
Acorn spent an absolutely absurd amount of money trying to get an OS and applications onto the platform. They spent 3 years developing a new OS, and then realized that it was going nowhere. So they rushed out another new OS. And then they realized that nobody wanted to buy a machine with a compromise OS and no applications. So they had to put in a huge effort to try to fix that. The company simply couldn't sustain that kind of effort on the software side while at the same time building new processors and new machines. It's surprising what they achieved, but it wasn't a good strategy.
Had they just adopted SunOS (BSD) it would have been infinitely better for them. And for Sun to release new high-end and low-end RISC workstations at the same time would have been an absolute bomb in the market.
Even if you added all the bells and whistles to the system (Ethernet, SCSI, extra RAM), you could be very low priced and absolutely blow pretty much every other system out of the water.
They didn't really miss the boat on that. The C900 wasn't really an attractive machine, and they would have sold it at a pretty high price. At the same time you could buy a PC with Unix on it that was arguably better. It would have just been another market where they got clobbered by the PC. Not like Zilog was the right horse to bet on anyway.
>Personal
>Iris
Must have been the color of their lover's eyes.
Re Acorn though — As much better from a market perspective as buying Acorn and releasing RISC- and BSD-based low-end workstations might have been for Sun, I still prefer to imagine a world where the Amiga's unique hardware and software got to live on — perhaps with compatibility layers to run Sun software, but nevertheless preserving a UNIX-like but still non-UNIX OS lineage and non-generic-PC hardware lineage.
The iPhone was a high-cost competitor to the iPod. And Apple knew that multi-function devices usually offered poorer user experiences than more focused, (mostly) single-function devices like the iPod. But they still did the iPhone.
Apple abandoned both intel and Nvidia (not to mention AMD/ATI) which is interesting. Apple seems to be vertically integrated like IBM or SGI - but better at differentiation and at locking in new customers.
Apple seems particularly good at differentiated product design; their devices often just look better than the competition and seem to offer a better user experience.
(That being said, SGI's designs look pretty cool too.)
But it is not true that SGI failed to understand there was a point where desktop PCs would be good enough to replace dedicated workstations. They had built a $3500 PC graphics card (IrisVision) way early on, and did the Nintendo deal before PC 3D really became a thing, they partnered in 1991 with Compaq on an investment and a $50-million joint effort to develop a workstation (https://www.encyclopedia.com/books/politics-and-business-mag...), and they were themselves taking advantage of the ability to squeeze more and more 3D into a single chip; the tech trends were obvious.
SGI was a client of a tech research firm I joined at the time, and it was heartbreaking to see them lose and very hard to figure out what they could/should do. It wasn't my explicit role to solve their problem, but I spent a lot of time thinking about it.
You do capture some of the dynamics well but you don't capture the heart of it. The heart of it is point #1. The rest below are also inhibitors.
1) SGI had revenues of $2 billion/year and the 3D market revenue was, say, $50 million/year. (OK, maybe a bit more than that; Matrox, the 2D king, had what, $125 million in revenue?) How do you trade the former for the latter? And on top of that trade high margins for low margins? When 95% of your stakeholders don't care about the new (PC games) market?
2) Engineering pride/sophistication/control. The company started out focused on graphics but, being in Silicon Valley, had grown/acquired huge engineering strengths in other areas besides graphics - CPU design (MIPS), NUMA cache-coherent SMP hardware + SMP UNIX design, etc. - and that's before you get to the Cray acquisition and bits. They were the "Apple" of graphics workstation vendors, but there was no "iPhone" integrated vertical play for them downmarket (except maybe consoles, and they half-tried that with the N64; even Nvidia, while playing in consoles, has minimized that business due to its low margins and low opportunity for upside). It was hard technically to give up/deprioritize all those levels of engineering sophistication in favor of competing on graphics performance and price-performance when the PCI(/AGP) bus was fixed, the software API layer was fixed, the CPU layer was fixed, and you had to compete with 80+ fledgling companies to win in both performance and price/performance in low-margin PC 3D games graphics, which is just a much lower value-added play for engineering.
3) Compensation. Employees who knew how to make a 3D graphics chip had a bigger upside outside the company than inside. They left. The 3dfx founders left. The ArtX guys left. Later, other guys left for Nvidia.
4) Slow/different engineering cycle times/culture. SGI cycle times for new products were 3-4 years. Sure they'd do a speed bump at the half-way point. Some volume 2D chip companies would have tweaks to their designs coming out every 2 weeks. PC graphics vendors needed a new product every 6-12 months. Nvidia's most critical life-saving measure in their product history was to cut their cycle time radically by using simulation because there was no other option, and it left them with 2/3rds of their target blend modes not working. https://www.acquired.fm/episodes/nvidia-the-gpu-company-1993...
5) Pride around owning/controlling the 3D software layer. Having developed OpenGL, they didn't/wouldn't figure out how to let Microsoft control/change/partner with it for gaming markets at the API level. Yes they licensed OpenGL, and eventually they caved and cross-licensed patents, but Microsoft was never going to let them own/control the API nor was Microsoft ever going to fully support an open API. And there was no love lost between Silicon Valley engineers and Microsoft in the late 90s. Hard to partner.
6) Executive leadership (I don't know any of this firsthand). The founder who cared about graphics, Jim Clark, by some accounts saw the workstation/PC threat a ways away. When the critical timeframe came to deal with it, though (92-94), he also saw the web and how big that wave was; PC 3D graphics was a small wave in comparison, so he switched focus and left. Left behind was the CEO, an ex-HP executive who had for a decade grown SGI from $5M in revenue to billions by remaking SGI in the image of HP rather than focusing on graphics, graphics, graphics, and who was not equipped to bet the company on rebirthing/growing it out of a nascent PC 3D games market.
As an industry analyst serving SGI as a client (and its competitors) and seeing the forces facing them in the mid/late 90s, what was known at the time was:
- 3D games were going to fuel consumer usage, on both consoles and PCs, thanks to one-chip texture mapping (+ over time, geometry processing)
- Wintel controlled the PC space but did allow third-party graphics card manufacturers and was fine with that
- Linux was good enough, though for a transition period not nearly as good as conventional UNIX
- There was huge room for improvement in 3D graphics on the PC at first, but at some point you would get good-enough bang for your buck and then the market for your graphics processors would stagnate. Screen resolutions grow more slowly than Moore's law, and once you can do enough shading for every pixel on the screen 4x over and enough matrix operations for one polygon on every pixel of the screen, how much more compute do you really need?
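To put hypothetical round numbers on that last point (mine, not figures from the analysis at the time):

    /* Back-of-the-envelope fill-rate ceiling for the "good enough" argument,
       using made-up but era-plausible numbers. */
    #include <stdio.h>

    int main(void) {
        long width = 1600, height = 1200;   /* a high-end late-90s resolution */
        long overdraw = 4;                  /* "every pixel ... 4x over"      */
        long hz = 85;                       /* CRT refresh rate               */
        long long pixels_per_sec = (long long)width * height * overdraw * hz;
        printf("~%.2f Gpixel/s of shading saturates this display\n",
               pixels_per_sec / 1e9);
        return 0;
    }

That works out to roughly 0.65 Gpixel/s, i.e. a ceiling you could see the hardware reaching within a handful of product generations.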
But in 92-95 it was hard to advocate for SGI downsizing/betting-the-company based on this alone.
In mid-1993, Windows NT 3.1 marked "good enough" Windows to compete with UNIX (process isolation), and Windows NT 3.5 in Nov 1994 solidified that. In Nov 1995, with Pentium Pro SPECint figures coming out, it was clear RISC was going to die, but that was just cognitively hard to recognize.
SGI clearly saw the problem and tried to diversify out of it, buying Alias/Wavefront (1995) and going upmarket (buying Cray, whom they were cannibalizing), but neither of those "saved it".
I remember at some point charting out with a colleague how the industry consolidation might happen and thinking that they really should join up with Apple (both really rode the 'halo' effect of high-end sexy products to sell bread-and-butter products effectively, and both were very vertical-integration oriented); I don't know if that was ever considered. Apple was 100x weaker then and I don't recall if the actual market cap economics would have worked.
What wasn't fully recognized (at least by me or the others I read voraciously at the time), but is clearer in hindsight, was this: while SMP parallelism was part of the wave of the future (and required work on both the CPU and OS layers of the stack), and GPUs contained a mixture of ASIC-based parallelism for both shading and vertex operations, one could construct a general-purpose coprocessing engine that would have real market uses beyond graphics (first in HPC, then in AI) and a longer-term value proposition that would outlast a gamer-centric niche market and be a general compute engine.
The term GPU alone that Nvidia used early on in marketing now implies and delivers on that vision, but it didn't imply it (to my eyes) at the time; SGI and others had used the term GPU informally even before Nvidia marketing did, to refer to chips with geometry/matrix computations on them and Nvidia was entering that game (and of course talking up their book.) But the true vision of creating a general-purpose parallelism compute engine coprocessor and API along with it was really fleshed out at Stanford PhD work by Ian Buck a decade later in 2004 and he then graduated and took it to Nvidia and it became CUDA. https://graphics.stanford.edu/papers/brookgpu/brookgpu.pdf At least as far as I can tell.
It was always possible as a graphics chip company in the 1990s to see the next 1-2 generations of possible technical improvements, but it was very hard to see a multi-billion dollar business. Tens of millions absolutely. Hundreds of millions, sure. But billions? And this goes back to my point #1.
There was a very real possibility they would enter the 3D PC gaming fray, not be able to compete, and lose - and then all that Silicon Valley graphics marketing halo they benefited from so much would have faded. But it faded anyway.
I suppose they could have cross-licensed their texture mapping patents to any startup giving them 5-10% equity in return as capital, and then tried to buy them out as they grew. They tried fighting over that issue with Nvidia later. But they would have hemorrhaged engineers with that approach, and I'm less sure it would have worked.
In my view, they should have just bit the bullet, rolled the dice, and played the PC 3D graphics gaming game and stayed true to their name, "Silicon" "Graphics". Not "Silicon Software" (alias/wavefront) or even "Silicon parallelism" (Cray). It can be true that if you stay in a niche (3D graphics for the masses), you end up in a ditch. But they lacked the courage of their potential and went the wrong way. In hindsight, someone probably should have kicked out McCracken in 1992, not 1997 and they could have gotten a more visionary leader less tied to the HP executive way of looking at problems/opportunities. But I don't know how they could have transitioned to a leader with better vision or where they could have found one.
I'd be interested if there was ever an HBS case study on this based on internal SGI documents, or if others have pointers to internal SGI discussions of this dilemma. It still bothers me 30 years later, as you can see by the length of this post. Too bad I posted this on HN a day late.
https://anthonysmoak.com/2016/03/27/andy-grove-and-intels-mo...
Being into art at the time, I was fascinated with Silicon Graphics computers. If I could "wish" for an xmas present, it would have been (something like) an SGI Indigo2 system.
It wasn't long before a 486-PC entered our home. It was certainly a learning experience trying to get Windows 3.1 to work but it was DooM that completely changed my views on video games. I never expected graphics to make such a huge leap when I considered Mortal Kombat 2 to be peak graphics! How game graphics changed for the rest of the 90s is insane!!!
When the Nintendo 64 came about (originally Ultra 64 and Project Reality) I honestly had a childish attitude that NOTHING would touch it for a long time because it was SGI under the hood! In defence of my childish attitude, I was still a child.
The harsh reality is... by the time the Nintendo 64 was on the shelves I knew it was already "old tech", after watching (and playing) a demo of Tomb Raider 2 in a computer shop. It was likely a 90 MHz Pentium computer with Windows 95, maybe 500 MB of hard drive space, running on Voodoo graphics. Once you get over the awesome graphics you realise that these machines could be anywhere from £600-£1500.
My defensive side kicks in... but.. but.. SGI is still better, right? They cost so much more; they are not about games, they are for Hollywood movies! A year or two later, seeing what 3D animation software could do... I wouldn't be surprised if rendering speed on Windows 98 machines was competitive with SGI ones.
Move on to today. Our phones are more powerful than those really expensive SGI systems from 1992. It is crazy when you think about it.
When looking at 3D acceleration cards on PCs, I believe PowerVR (and 3dfx) were released in 1996. Beforehand they would have demonstrated their technology at conferences and shows, with demos and adverts, before release... this means SGI must have seen this coming as early as 1994.
In as little as 4 years... Graphics Cards were required for PC gamers!
What is hard to convey to people outside the 3D hardware space is that the chief problem once you have the 3d pipeline down is really a market-development problem.
How do you sell an ever increasing amount of coprocessor compute?
Because the moment you hit the inevitable S-curve flattening out of your 3D value proposition, your coprocessor gets integrated into the (Intel) CPU die, just like the intel 387sx/dx floating point unit or earlier generations of IO coprocessors. Hence a frantic strategic push always into raytracing, HPC, AI, etc.
In hindsight it looks visionary, but the paranoia is at least as much a driving factor.
It’s now, for now, incredibly lucrative and the mastery of hardware parallelism may last as a moat for Nvidia, but I can sympathize at SGI not wanting to squeeze themselves back into a coprocessor-only business model. It’s a scary prospect. We can see only with hindsight it could have worked. Both business and technical leadership had such huge success diversifying their graphics hardware expertise into full system skills that they couldn’t refocus. They would have had to die in order to live again. Or perhaps slightly less exaggerated, to survive they would have to forsake/kill off a huge part of what they had become.
The feature here is that AmigaOS will try and reuse the ExecBase structure if found.
That structure has a checksum, which is checked. If the check fails, a new one is made. This happens e.g. on power-on, or after running games that are not system friendly (i.e. most games).
But if the check passes, the structure has important information, such as a list of memory regions, the "cold/cool/warm" capture vectors - function addresses that get called, if non-zero, at different points of the boot process (unsurprisingly a virus favorite) - as well as a list of reset-resident modules, whose memory remains allocated and is thus protected.
A popular such device implements a reset-resident memory-backed block device, which the Amiga is able to boot from.
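For the curious, a rough sketch of that boot path in C. The field names echo a few of the documented ExecBase fields (ColdCapture/CoolCapture/WarmCapture, KickMemPtr/KickTagPtr), but the layout and checksum here are simplified stand-ins, not the real exec.library code:

    /* Illustrative only: a cut-down "ExecBase" and the reuse-or-rebuild
       decision made at reset. */
    #include <stdint.h>
    #include <stddef.h>

    struct FakeExecBase {
        void    *ColdCapture;   /* called very early in reset if non-zero     */
        void    *CoolCapture;   /* called a bit later                         */
        void    *WarmCapture;   /* called late in the reset sequence          */
        void    *KickMemPtr;    /* memory to preserve across reset            */
        void    *KickTagPtr;    /* list of reset-resident modules             */
        void    *MemList;       /* known memory regions                       */
        uint16_t ChkSum;        /* detects whether RAM survived the reset     */
    };

    /* Stand-in checksum: sum 16-bit words over the checked area; it should
       only match the stored value if memory contents were preserved. */
    static uint16_t chk(const uint16_t *p, size_t words) {
        uint16_t s = 0;
        while (words--) s += *p++;
        return s;
    }

    void on_reset(struct FakeExecBase *eb) {
        size_t words = offsetof(struct FakeExecBase, ChkSum) / 2;
        if (eb && chk((const uint16_t *)eb, words) == eb->ChkSum) {
            /* Warm reset: memory survived. Honor the capture vectors (the
               virus favorite mentioned above) and keep reset-resident modules
               alive, e.g. a RAM-backed block device the machine can boot from. */
            /* if (eb->ColdCapture) call it; preserve KickMemPtr/KickTagPtr; ... */
        } else {
            /* Cold boot or trashed RAM: rebuild ExecBase from scratch. */
        }
    }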