
73 points mrbluecoat | 2 comments
generuso ◴[] No.45779707[source]
The idea has always been appealing, but the implementation has always remained challenging.

For over a decade, "Mythic AI" made accelerator chips with analog multipliers based on research by Laura Fick and coworkers. They raised $165M and produced actual hardware, but at the end of 2022 they nearly went bankrupt, and very little has been heard from them since.

Much earlier, the legendary chip designers Federico Faggin and Carver Mead founded Synaptics with the idea of making neuromorphic chips that would be fast and power-efficient by harnessing analog computation. Carver Mead published a book on the subject in 1989, "Analog VLSI and Neural Systems", but making working chips turned out to be too hard, and Synaptics successfully pivoted to touchpads and later many other types of hardware.

Of course, the concept can be traced to something even older and still more legendary: Frank Rosenblatt's "Perceptron", the original machine learning system from the 1950s. It implemented the weights of the neural network as variable resistors that were adjusted by little motors during training. Multiplication was simply the input voltage times the conductance of the resistor, producing a current -- which is the same principle all the newer systems are also trying to use.
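A minimal numerical sketch of that principle (illustrative only, using NumPy rather than any particular chip's model): store each weight as a conductance, drive the rows with voltages, and the column currents are the dot products.

```python
import numpy as np

# Illustrative sketch of analog matrix-vector multiplication in a
# resistive crossbar. Each weight is a conductance G[i, j] (siemens);
# applying voltage V[i] to row i makes cell (i, j) pass a current
# I = V[i] * G[i, j] (Ohm's law), and each column wire sums its cells'
# currents (Kirchhoff's current law) -- one multiply-accumulate per cell.

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # conductances, 1 uS .. 100 uS
V = rng.uniform(0.0, 0.5, size=4)         # row input voltages (volts)

I_columns = V @ G  # summed column currents = the analog dot products

# The same result, spelled out cell by cell:
I_manual = np.zeros(3)
for j in range(3):
    for i in range(4):
        I_manual[j] += V[i] * G[i, j]

assert np.allclose(I_columns, I_manual)
```

Training Rosenblatt's machine amounted to nudging each G[i, j] (via the little motors); inference is just applying voltages and reading currents.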

replies(2): >>45783806 #>>45784218 #
1. smartbit ◴[] No.45784218[source]
The idea of analog neural networks is appealing. I bought Analog VLSI and Neural Systems in 1989 and still have it as a trophy on my bookshelves. My gut feeling says one day analog neural networks will be a thing, if only for the reason of considerably lower power consumption.

I’m not saying that life is analog: DNA is digital, two bits per base. IMHO life is a mix of analog and digital.

replies(1): >>45786849 #
2. pk-protect-ai ◴[] No.45786849[source]
It is very difficult to scale digital-analog hybrids because of the number of DAC and ADC components required.
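A back-of-the-envelope sketch of why the converters dominate (the tile size and one-converter-per-edge assumption are mine, not from any specific design): every analog crossbar tile needs DACs to drive its rows and ADCs to digitize its columns, so tiling a large weight matrix multiplies the converter count even though the multiply-accumulates themselves are free in the analog domain.

```python
import math

def converter_count(rows, cols, tile=256):
    """Rough DAC/ADC count to tile a rows x cols weight matrix with
    tile x tile crossbars, assuming one DAC per tile row input and
    one ADC per tile column output (a simplifying assumption)."""
    r_tiles = math.ceil(rows / tile)
    c_tiles = math.ceil(cols / tile)
    dacs = r_tiles * c_tiles * tile  # row drivers across all tiles
    adcs = r_tiles * c_tiles * tile  # column readouts across all tiles
    return dacs, adcs

# A single 4096 x 4096 layer needs 16 x 16 = 256 tiles:
dacs, adcs = converter_count(4096, 4096)
# 256 tiles * 256 converters per edge = 65536 DACs and 65536 ADCs
```

ADCs in particular are large and power-hungry relative to the analog array, which is why hybrid designs spend so much effort sharing or time-multiplexing them.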