166 points levlaz | 6 comments
1. PeterStuer ◴[] No.41877171[source]
For nature we have many models: physics, chemistry, biology, and so on, depending on our needs. None of them is more wrong than the others; they simply operate at different scales and are useful in different contexts.

My gripe with theoretical computer science was that it felt like a Newtonian-physics-level model of digital processes, while something equivalent to biology-level models would be needed for, and better suited to, most "real-life computing".

replies(2): >>41877337 #>>41877613 #
2. psychoslave ◴[] No.41877337[source]
Well, that’s basically what we have with applications, isn’t it? It’s not like we need to think about every bit-trick we rely on when running a video call in parallel with a pair-programming session over whatever IDE of the day we might use.
replies(1): >>41877616 #
3. pmontra ◴[] No.41877613[source]
A biology-level model of computing would be a few billion very small CPU cores, each one doing its own thing, interacting with the others (actor model?) and yielding a collective result by averaging their states (by some definition of average); a toy sketch is below. It could be OK for some problems (simulations of physical systems?) but not so much for others (placing a window in the middle of the screen).

By the way, a lot of small CPU cores is what we use inside graphics cards. However, they are not actors; they are very deterministic. The Newtonian physics model.
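
To make the first paragraph concrete, here is a toy sketch in plain Python: many tiny stateful cells, each nudged by a few random peers, with the collective answer read off as a population average. All names (Cell, interact) are made up for illustration; this is not any real actor framework.

    import random

    class Cell:
        def __init__(self):
            self.state = random.random()

        def interact(self, population):
            # Drift toward the mean of a few random peers, plus a little noise:
            # local, partial and non-deterministic; no global coordination.
            peers = random.sample(population, k=3)
            target = sum(c.state for c in peers) / len(peers)
            self.state += 0.1 * (target - self.state) + random.gauss(0, 0.01)

    cells = [Cell() for _ in range(10_000)]
    for _ in range(100):
        # Only a random subset of cells acts on each tick; nothing is synchronized.
        for cell in random.sample(cells, k=1_000):
            cell.interact(cells)

    # The collective result is a statistic over the population,
    # not the state of any single cell.
    print(sum(c.state for c in cells) / len(cells))

Run it a few times and the result differs slightly on every run, which is part of why this style suits simulations better than placing a window in the middle of the screen.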

replies(2): >>41877638 #>>41878144 #
4. PeterStuer ◴[] No.41877616[source]
But how is that supported by TCS?
5. PeterStuer ◴[] No.41877638[source]
How about we start by introducing time, and interactions with things exterior to the modeled subsystem?
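
Concretely, even a tiny model like the following already steps outside the usual "function from input to output" picture: the subsystem is driven by an exterior signal that changes over time, so its behaviour depends on when interactions happen. Purely illustrative; every name here is an assumption, not standard machinery.

    import math

    def exterior(t):
        # Something outside the modeled subsystem that it neither controls
        # nor fully observes; here just a slow oscillation, chosen arbitrarily.
        return math.sin(t / 10.0)

    state = 0.0
    for t in range(200):
        # The subsystem's trajectory depends on the timing of its
        # interactions with the exterior, not only on an initial input.
        state = 0.9 * state + 0.1 * exterior(t)

    print(state)
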
6. GenericCanadian ◴[] No.41878144[source]
Sounds a lot like what Wolfram is working on with categorizing cellular automata. It strikes me that a lot of his work is very biological in its search for axioms through experimentation.