
120 points LorenDB | 4 comments
pingou ◴[] No.44441760[source]
Telepathy is on its way. The next step is to skip the conversion of brain signals to words and send the signals directly to another brain. But I think some conversion/translation would still be necessary.
replies(11): >>44441794 #>>44441799 #>>44441825 #>>44442005 #>>44442059 #>>44442154 #>>44442271 #>>44442579 #>>44442793 #>>44443580 #>>44443730 #
Cthulhu_ ◴[] No.44441799[source]
I think (futurology / science fiction) that they will make some kind of brain link, but there won't be any translation happening in between, just raw brain signals from one to the other, like an extra sensory input. There won't be any encoding or data that can be translated to speech or images; the connected brains will be able to learn to comprehend and send the signals to and from each other, and learn to communicate that way.
replies(5): >>44441927 #>>44441943 #>>44442006 #>>44445289 #>>44447353 #
1. hearsathought ◴[] No.44445289[source]
I still fail to see how that's possible, since it is assumed every brain "encodes" data uniquely. Communication between computers is possible because we have agreed-upon standards. If every computer encoded characters differently, no communication would be possible. Without agreed-upon ports, or an agreed-upon mechanism for negotiating ports, one computer could not communicate with another. So how can brain-to-brain communication work, given that encoding/communication "standards" are impossible because each brain is different?
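To make the character-encoding point concrete, here is a minimal Python sketch (purely illustrative; the two codecs just stand in for any pair of mismatched standards):

    # The same word serialized under two different agreed-upon standards
    # produces two different byte strings.
    word = "café"
    as_utf8 = word.encode("utf-8")      # b'caf\xc3\xa9'
    as_latin1 = word.encode("latin-1")  # b'caf\xe9'

    # A receiver that assumes the wrong standard garbles the text...
    print(as_utf8.decode("latin-1"))    # 'cafÃ©' (mojibake)

    # ...or cannot decode it at all.
    try:
        as_latin1.decode("utf-8")
    except UnicodeDecodeError as err:
        print("unreadable without the shared standard:", err)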

For example, I see a tree and my brain generates a unique signal/encoding/storage representing the tree. Another person sees the tree and generates a unique signal/encoding/storage representing the tree. How would my brain communicate "tree" to his brain since both our "trees" are unique to our brains?

My brain device reads my brain signal "1010101" for tree. My friend's device reads brain signal "1011101" for tree. How could we possibly map 1010101 to 1011101? Or is the assumption that human brains have identical signals/encodings for each thought?
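One way to picture the mapping problem: a 1010101-to-1011101 table could only come from a shared calibration session in which both devices know which stimulus is being observed, and that shared session is itself an agreed-upon standard. A toy Python sketch, with entirely made-up codes:

    # Hypothetical calibration: both devices record their private codes while
    # an agreed-upon stimulus is presented. The shared labels are the "standard".
    calibration = [
        ("tree",  "1010101", "1011101"),
        ("river", "0110010", "1100011"),
        ("dog",   "1110001", "0010110"),
    ]

    # Translation table for this particular pair of brains, built from the session.
    mine_to_theirs = {mine: theirs for _label, mine, theirs in calibration}

    # Later, my device reads "1010101" and ships my friend's equivalent code.
    print(mine_to_theirs["1010101"])  # '1011101'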

replies(1): >>44445685 #
2. goopypoop ◴[] No.44445685[source]
I already learned to interpret touch, taste, smell, vision, etc. when I was just a baby. How hard can a new one be?
replies(1): >>44455354 #
3. hearsathought ◴[] No.44455354[source]
Did you even read my comment? I'm not talking about your brain's ability. I'm talking about how a device can interpret your brain signals and transfer them to another brain to achieve actual communication when both brains essentially have their own internal "language". How your brain "stores" the idea of a tree is entirely different from how I "store" the idea of a tree: different neurons, different locations in the brain, and different signals. Do you know how computers, networks, and communications work? It's all bound by artificial standards we agreed upon a priori.

The only way I see is via a textual or auditory mechanism between people who speak the same language (standards agreed upon a priori). But that wouldn't be brain to brain. It would be brain to text/speech to eyes/ears to brain.
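In code, that indirect path amounts to serializing through the shared language and nothing else; the private code tables below are invented purely for illustration:

    # Each side keeps its private encoding; only the shared word (the standard
    # agreed upon a priori) ever crosses the gap.
    my_code_to_word = {"1010101": "tree"}     # my private encoding
    word_to_their_code = {"tree": "1011101"}  # my friend's private encoding

    def send(my_code: str) -> str:
        # Serialize my private code into the shared standard: a word.
        return my_code_to_word[my_code]

    def receive(word: str) -> str:
        # Deserialize the shared word into the receiver's private code.
        return word_to_their_code[word]

    print(receive(send("1010101")))  # '1011101', but only via the shared word "tree"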

replies(1): >>44456722 #
4. goopypoop ◴[] No.44456722{3}[source]
I'm saying it's clearly possible to develop from scratch the framework with which to learn to interpret each individual remote machine's raw data stream.

Your intermediate protocol woes are a red herring.
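For what it's worth, here is a hand-wavy Python sketch of that idea: the receiver never gets a spec for the remote stream, but if the stream co-occurs with its own experience it can fit a decoder online. The data and the linear "encoding" are synthetic assumptions, only there to show the learning loop:

    import numpy as np

    rng = np.random.default_rng(0)

    # The remote brain's unknown, fixed encoding of a 3-dimensional experience
    # into an 8-channel raw stream. The receiver never sees this matrix.
    hidden_encoding = rng.normal(size=(3, 8))

    weights = np.zeros((8, 3))  # the receiver's decoder, learned from scratch
    lr = 0.01

    for _ in range(5000):
        experience = rng.normal(size=3)            # something both ends are exposed to
        raw_stream = experience @ hidden_encoding  # what actually arrives, no protocol

        guess = raw_stream @ weights               # current interpretation
        error = experience - guess                 # compared against own experience
        weights += lr * np.outer(raw_stream, error)  # LMS-style online update

    # After enough shared experience, a fresh stream is interpreted reasonably well.
    test = rng.normal(size=3)
    recovered = (test @ hidden_encoding) @ weights
    print("max reconstruction error:", float(np.abs(test - recovered).max()))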