
YC: Requests for Startups

(www.ycombinator.com)
514 points sarimkx | 6 comments
1. wolframhempel ◴[] No.39384280[source]
Slight tangent - but I feel that our need for "explainable AI" might lead us to miss out on some fundamentally different ways of thinking and reasoning that AI could provide. Machines might reach conclusions in ways that simply can't be "dumbed down" to human reasoning. That doesn't mean these methods are esoteric or outside of logic; it just takes a few levels of derivation and statistics to make something really hard for us to comprehend.
replies(1): >>39384794 #
2. keiferski ◴[] No.39384794[source]
I call this "machine metaphysics" and I think it's an extremely interesting area of speculation. A whole lot of human metaphysics is based on our (very limited) sensory inputs – of which machines/AI have infinitely more.

As a quick example, it seems to me that an AI may not need the concept of a universal, in the philosophical sense, because it is capable of handling a near-infinite number of particulars.

replies(2): >>39385037 #>>39385357 #
3. practal ◴[] No.39385037[source]
There are a LOT of natural numbers, even for an AI.
replies(1): >>39385815 #
4. wolframhempel ◴[] No.39385357[source]
That's super interesting. Are there any books/articles/further reading on this you can recommend?
replies(1): >>39385665 #
5. keiferski ◴[] No.39385665{3}[source]
Unfortunately I haven't come across much writing of this sort, as most of the philosophical writing about AI centers on the reverse topic: how humans will be changed by machines.
6. keiferski ◴[] No.39385815{3}[source]
I had in mind more concrete aspects of "human" reality. A lot of the divisions we make between objects seem dependent on our language, which is dependent on our sensory abilities.

The concept of species is maybe a good example – it doesn't make a whole lot of sense if you think about it. It's merely a linguistic placeholder for a step in a process, because the idea of constant evolutionary change is too difficult to encapsulate in a single concept. An AI would have no issue with this, as it is capable of taking in and holding much more data and conceptualizing the process directly.