251 points slyall | 16 comments
1. teknover ◴[] No.42057746[source]
“Nvidia invented the GPU in 1999” is wrong on many fronts.

Arguably the November 1996 launch of the 3dfx Voodoo Graphics kickstarted GPU interest and OpenGL.

After reading that, it’s hard to take the author seriously on the rest of the claims.

replies(7): >>42057829 #>>42057999 #>>42058016 #>>42058036 #>>42058425 #>>42060437 #>>42062422 #
2. ahofmann ◴[] No.42057829[source]
Wow, that is harsh. The quoted claim is in the middle of a very long article. The author's background seems to be more on the scientific side than the technical side. So throw out everything because the author got one (not very important) date wrong?
replies(3): >>42058317 #>>42058394 #>>42058470 #
3. santoshalper ◴[] No.42057999[source]
Possibly technically correct, but utterly irrelevant. The 3dfx chips accelerated parts of the 3D graphics pipeline and were not general-purpose programmable computers the way a modern GPU is (and thus would be useless for deep learning or any other kind of AI).

If you are going to count 3dfx as a proper GPU and not just a rasterization and texture-mapping accelerator, then you might as well go back further and count things like the SGI RealityEngine. Either way, 3dfx wasn't really first to anything meaningful.
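
To make "general-purpose programmable" concrete, here is a minimal CUDA sketch of the kind of thing a modern GPU runs for deep learning: a fully-connected layer forward pass. (Illustrative only; the names and sizes are made up for this comment, and a real framework would call cuBLAS/cuDNN rather than a naive kernel like this.) Nothing remotely like it could be expressed on a fixed-function rasterizer like the Voodoo:

    // y = relu(W*x + b), one thread per output row (naive, for illustration only)
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void dense_relu(const float* W, const float* x, const float* b,
                               float* y, int rows, int cols) {
        int r = blockIdx.x * blockDim.x + threadIdx.x;
        if (r < rows) {
            float acc = b[r];
            for (int c = 0; c < cols; ++c)
                acc += W[r * cols + c] * x[c];   // arbitrary arithmetic over arbitrary memory
            y[r] = acc > 0.0f ? acc : 0.0f;      // ReLU
        }
    }

    int main() {
        const int rows = 1024, cols = 1024;
        float *W, *x, *b, *y;
        cudaMallocManaged(&W, rows * cols * sizeof(float));
        cudaMallocManaged(&x, cols * sizeof(float));
        cudaMallocManaged(&b, rows * sizeof(float));
        cudaMallocManaged(&y, rows * sizeof(float));
        // ... fill W, x, b with real data here ...
        dense_relu<<<(rows + 255) / 256, 256>>>(W, x, b, y, rows, cols);
        cudaDeviceSynchronize();
        printf("y[0] = %f\n", y[0]);
        cudaFree(W); cudaFree(x); cudaFree(b); cudaFree(y);
        return 0;
    }

The point is simply that each thread runs arbitrary code over arbitrary memory, which is what matrix-heavy workloads like neural networks need; a fixed rasterization pipeline can't do that.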

replies(1): >>42059309 #
4. rramadass ◴[] No.42058016[source]
After actually having read the article, I can say that your comment is unnecessarily negative and clueless.

The article is a very good historical one, showing how three important things came together to make the current progress possible, viz.:

1) Geoffrey Hinton's back-propagation algorithm for deep neural networks,

2) Nvidia's GPU hardware, used via CUDA, for AI/ML, and

3) Fei-Fei Li's huge ImageNet database to train the algorithm on the hardware. Her team actually used "Amazon Mechanical Turk" (AMT) to label the massive dataset of 14 million images.

Excerpts:

“Pre-ImageNet, people did not believe in data,” Li said in a September interview at the Computer History Museum. “Everyone was working on completely different paradigms in AI with a tiny bit of data.”

“That moment was pretty symbolic to the world of AI because three fundamental elements of modern AI converged for the first time,” Li said in a September interview at the Computer History Museum. “The first element was neural networks. The second element was big data, using ImageNet. And the third element was GPU computing.”

5. Someone ◴[] No.42058036[source]
I would not call it "invent", but it seems Nvidia defined the term GPU. See https://www.britannica.com/technology/graphics-processing-un... and https://en.wikipedia.org/wiki/GeForce_256#Architecture:

“GeForce 256 was marketed as "the world's first 'GPU', or Graphics Processing Unit", a term Nvidia defined at the time as "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second"”

They may have been the first to bring a product fitting that definition to market.

replies(1): >>42060526 #
6. ◴[] No.42058317[source]
7. ◴[] No.42058394[source]
8. KevinMS ◴[] No.42058425[source]
Can confirm. I was playing Unreal on my dual Voodoo2 SLI rig back in 1998.
9. RicoElectrico ◴[] No.42058470[source]
Revisionist marketing should not be given a free pass.
replies(1): >>42059634 #
10. FeepingCreature ◴[] No.42059309[source]
But the first NVidia GPUs didn't have general-purpose compute either. Google informs me that the first GPU with user-programmable shaders was the GeForce 3 in 2001.
11. twelve40 ◴[] No.42059634{3}[source]
Yet it's almost the norm these days. I'm sick of hearing that Steve Jobs invented smartphones when I was personally using a device with web and streaming music years before that.
replies(1): >>42060545 #
12. kragen ◴[] No.42060437[source]
Arguably the November 01981 launch of Silicon Graphics kickstarted GPU interest and OpenGL. You can read Jim Clark's 01982 paper about the Geometry Engine in https://web.archive.org/web/20170513193926/http://excelsior..... His first key point in the paper was that the chip had a "general instruction set", although what he meant by it was quite different from today's GPUs. IRIS GL started morphing into OpenGL in 01992, and certainly when I went to SIGGRAPH 93 it was full of hardware-accelerated 3-D drawn with OpenGL on Silicon Graphics hardware. But graphics coprocessors date back to the 60s; Evans & Sutherland was founded in 01968.

I mean, I certainly don't think NVIDIA invented the GPU—that's a clear error in an otherwise pretty decent article—but it was a pretty gradual process.

13. kragen ◴[] No.42060526[source]
That sounds like marketing wank, not a description of an invention.

I don't think you can get a speedup by running neural networks on the GeForce 256, and the features listed there aren't really relevant (or arguably even present) in today's GPUs. As I recall, people were trying to figure out how to use GPUs to get faster processing in their Beowulfs in the late 90s and early 21st century, but it wasn't until about 02005 that anyone could actually get a speedup. The PlayStation 3's "Cell" was a little more flexible.

14. kragen ◴[] No.42060545{4}[source]
You don't remember when Bill Gates and AOL invented the internet, Apple invented the GUI, and Tim Berners-Lee invented hypertext?
15. binarybits ◴[] No.42062422[source]
Defining who "really" invented something is often tricky. For example, I mentioned in the article that there is some dispute about who discovered backpropagation.

According to Wikipedia, Nvidia released its first product, the RV1, in November 1995, the same month 3dfx released its first Voodoo Graphics 3D chip. Is there reason to think the 3dfx card was more of a "true" GPU than the RV1? If not, I'd say Nvidia has as good a claim to inventing the GPU as 3dfx does.

replies(1): >>42064399 #
16. in3d ◴[] No.42064399[source]
NV1, not RV1.

3dfx Voodoo cards were initially more successful, but I don’t think anything not actually used for deep learning should count.