
120 points LorenDB | 3 comments
pingou ◴[] No.44441760[source]
Telepathy is on its way. Next step, they skip the conversion of brain signals to words and just send the signals directly to another brain. But I think some conversion/translation would still be necessary.
replies(11): >>44441794 #>>44441799 #>>44441825 #>>44442005 #>>44442059 #>>44442154 #>>44442271 #>>44442579 #>>44442793 #>>44443580 #>>44443730 #
Cthulhu_ ◴[] No.44441799[source]
I think (futurology / science fiction) that they will make some kind of brain link, but there won't be any translation happening in between, just raw brain signals from one to the other, like an extra sensory input. There won't be any encoding or data that can be translated to speech or images, but the connected brains will be able to learn to comprehend and send signals to and from each other, and learn to communicate that way.
replies(5): >>44441927 #>>44441943 #>>44442006 #>>44445289 #>>44447353 #
1. voidUpdate ◴[] No.44442006[source]
I think the main problem with that is that different people think in different ways. I think in full sentences and 3D images, whereas other people might think without images at all. How do you translate that?
replies(2): >>44442101 #>>44443529 #
2. yieldcrv ◴[] No.44442101[source]
If statements

for how the brain chip chooses to function

3. suspended_state ◴[] No.44443529[source]
It is very likely that this device works by perceiving and interpreting brain waves. Actually, from the article:

> “We recorded neural activities from single neurons, which is the highest resolution of information we can get from our brain,” Wairagkar says. The signal registered by the electrodes was then sent to an AI algorithm called a neural decoder that deciphered those signals and extracted speech features such as pitch or voicing.
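For a rough sense of what a "neural decoder" does mechanically, here is a minimal sketch in Python: binned spike counts from single-neuron recordings go in, estimated speech features (pitch and voicing, as named in the quote) come out. This is a plain ridge regression fit on synthetic data, not the AI decoder the article describes; the bin size, neuron count, and model choice are all assumptions made for illustration.

    # Minimal neural-decoder sketch: binned spike counts -> speech features.
    # Illustrative only; the real system uses an AI decoder whose details
    # aren't given here. All shapes and parameters below are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    n_bins, n_neurons = 2000, 128   # hypothetical: time bins x recorded neurons
    spikes = rng.poisson(lam=2.0, size=(n_bins, n_neurons)).astype(float)

    # Synthetic targets standing in for per-bin pitch and voicing.
    true_w = rng.normal(size=(n_neurons, 2))
    targets = spikes @ true_w + rng.normal(scale=0.5, size=(n_bins, 2))

    # Ridge-regression decoder: closed-form weights W minimizing
    # ||spikes @ W - targets||^2 + alpha * ||W||^2.
    alpha = 1.0
    A = spikes.T @ spikes + alpha * np.eye(n_neurons)
    W = np.linalg.solve(A, spikes.T @ targets)

    decoded = spikes @ W            # decoded pitch/voicing per time bin
    corr = [np.corrcoef(decoded[:, i], targets[:, i])[0, 1] for i in range(2)]
    print(f"decoded-feature correlations (pitch, voicing): {corr[0]:.2f}, {corr[1]:.2f}")

A real decoder would swap the linear map for something like a recurrent or transformer model trained on attempted-speech recordings, but the shape of the problem is the same: neural activity over time in, speech features over time out.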