I always start with God's design, assuming it is best: a diverse, mixed-signal brain architecture followed by a good upbringing. That means we'd need to train brain-like architectures the same way we train children. So, we'll need whatever data they needed, with multiple streams for different upbringings, too.
The data itself would be most of the senses collecting raw input about the world for most of the day over 18 years. That might require a camera on the kid's head, which I don't like. People letting a team record their lives seems more likely. Split the project up among many families running in parallel, 1-4 per grade/year. It would probably cost a few million dollars a year.
(Note: changes of parents might require an integration step during AI training, or showing the model different parents in its early years.)
The training system would rapidly scan this information in. It might not be faster than a human brain; if it is, we can create them quickly. That's only the passive-learning part, though.
Human training also involves asking lots of questions based on internal state, random exploration (especially play) with reinforcement, introspection/meditation, and so on: self-driven, generative activities whose outputs become new inputs to the brain. This training regimen will probably need periodic breaks from passive learning to ask questions or play, which requires human supervision.
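To make the alternation concrete, here is a toy sketch of what I mean by passive ingestion punctuated by self-driven breaks. Every name here (`passive_step`, `active_step`, `train`, the dict-as-model) is hypothetical, a minimal illustration of the loop structure rather than a real training system:

```python
import random

def passive_step(model, sample):
    """Passively ingest one recorded life-experience sample."""
    model["seen"].append(sample)

def active_step(model):
    """Self-driven break: generate a question about past experience,
    then feed that output back in as a new input (generative loop)."""
    if not model["seen"]:
        return None
    topic = random.choice(model["seen"])  # hypothetical question picker
    question = f"why {topic}?"
    model["seen"].append(question)        # the output becomes an input
    return question

def train(stream, active_every=3):
    """Alternate passive ingestion with periodic self-driven breaks.
    In the real scheme, each break would involve human supervision."""
    model = {"seen": []}
    for i, sample in enumerate(stream, 1):
        passive_step(model, sample)
        if i % active_every == 0:
            active_step(model)
    return model
```

Running `train(["a", "b", "c", "d", "e", "f"])` ingests six samples passively and takes two active breaks, so the model ends up with eight entries in its experience log.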
Enough of this will probably produce… disobedient, unpredictable children. ;) Eventually, we'll learn how to do AI parenting so that the offspring are well-behaved, effective servants. Those will be fine-tuned for practical applications. Later, many more will come online, trained on different streams of life experience, schooling methods, etc.
That was my theory. I still don't like recording people's lives to train AIs. I just thought it was the only way to build brain-like AIs, and that it was likely to happen anyway (see Twitch).
My LLM concept was to do the same thing with K-12 educational resources, stories, kids' games, etc. Parents could already tell us exactly what to use to gradually build the models up, since they did that for their kids year by year. Then several career tracks layering on different college textbooks and skill areas. I think it would be cheaper to train than GPT-4 with good performance.
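One way to read that LLM idea is as plain curriculum learning: order the training data the way parents layer it, grade by grade, then stack specialized career material on top. A minimal sketch, where the resource names and the `build_curriculum` helper are made up for illustration (grade 0 stands for kindergarten):

```python
def build_curriculum(grade_resources, career_track):
    """Order training data grade by grade (K-12), then append the
    specialized career-track material on top."""
    curriculum = []
    for grade in sorted(grade_resources):   # 0 (K), 1, 2, ... 12
        curriculum.extend(grade_resources[grade])
    curriculum.extend(career_track)         # e.g., college textbooks
    return curriculum

# Hypothetical resource lists standing in for real K-12 materials.
grades = {
    0: ["alphabet book", "counting game"],  # kindergarten
    1: ["phonics reader"],
    2: ["simple stories"],
}
plan = build_curriculum(grades, ["intro programming text"])
```

A real training run would feed `plan` to the model stage by stage, possibly with checkpoints between grades, which is roughly the "year by year" buildup the parents already know how to specify.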