
230 points | mdp2021 | 1 comment
Crazyontap · No.41866060
When I was younger, I was fascinated by evolution, especially by how intricately everything just works. That same fascination also explains why many people believe in intelligent design.

However, after watching AI evolve so rapidly with just a few hundred GPUs, enough data, and enough power, I no longer wonder what a billion years of feedback loops and randomness can achieve.

replies(18): >>41866202 #>>41866478 #>>41866660 #>>41866806 #>>41866826 #>>41867595 #>>41867652 #>>41867789 #>>41867813 #>>41867833 #>>41867834 #>>41867913 #>>41868264 #>>41868344 #>>41868565 #>>41868579 #>>41869785 #>>41909242 #
kortilla · No.41866806
AI isn’t being trained on randomness, though. It’s trained on a corpus covering a large portion of all of humanity’s written communication. I don’t think it’s a good analogy for evolution.

A single training run will go through more iterations than all the bird generations that have ever existed.
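
Whether that holds depends on what counts as an "iteration" (examples seen vs. gradient updates). A rough back-of-the-envelope sketch, where every figure is a loose order-of-magnitude assumption rather than a measurement:

    # All figures are loose order-of-magnitude assumptions, not measurements.
    tokens_seen = 1e13       # assumed: ~10 trillion tokens in one large training run
    tokens_per_step = 4e6    # assumed: ~4 million tokens per gradient update
    gradient_steps = tokens_seen / tokens_per_step    # ~2.5 million updates

    bird_years = 1.5e8       # assumed: birds go back roughly 150 million years
    generation_years = 3     # assumed: a few years per generation, averaged
    bird_generations = bird_years / generation_years  # ~50 million generations

    print(f"tokens seen in one run : {tokens_seen:.1e}")
    print(f"gradient updates       : {gradient_steps:.1e}")
    print(f"bird generations (est.): {bird_generations:.1e}")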

replies(2): >>41867251 #>>41909280 #
1. fennecbutt · No.41909280
I would instead think of the LLM training corpus as equivalent to the physical laws that govern our reality.

LLM training is training an organism to "survive" in an environment consisting of languages/lexicons.

Not getting eaten by a tiger is equivalent to being able to produce a semantically and logically correct sentence.
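
A minimal toy sketch of that framing (purely illustrative, assuming a tiny PyTorch recurrent model rather than anyone's actual setup): the only "selection pressure" on the model is how well it predicts the next word of the text it lives in, and lower loss is higher "fitness".

    import torch
    import torch.nn as nn

    # The "environment": a tiny corpus of text the model has to "survive" in.
    text = "the cat sat on the mat . the dog sat on the rug ."
    vocab = sorted(set(text.split()))
    stoi = {w: i for i, w in enumerate(vocab)}
    ids = torch.tensor([stoi[w] for w in text.split()])

    class TinyLM(nn.Module):
        def __init__(self, vocab_size, dim=32):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, dim)
            self.rnn = nn.GRU(dim, dim, batch_first=True)
            self.out = nn.Linear(dim, vocab_size)

        def forward(self, x):
            h, _ = self.rnn(self.emb(x))
            return self.out(h)

    model = TinyLM(len(vocab))
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    # Predict each next word from the words before it.
    x, y = ids[:-1].unsqueeze(0), ids[1:].unsqueeze(0)
    for step in range(200):
        logits = model(x)
        loss = loss_fn(logits.reshape(-1, len(vocab)), y.reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()

    # The only penalty the model ever feels is failing to predict the text,
    # much as the only penalty an organism feels is failing to survive.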