interesting that Waymo could do uninterrupted trips back in 2013, wonder what took them so long to expand? regulation? tail end of driving optimization issues?
noticed one of the slides had a cross over 'AGI 2027'... ai-2027.com :)
but in hindsight it looks like this slowed them down quite a bit, despite being early to the space...
You need image processing just as much as you need scenario management, and they're orthogonal to each other, as one example.
If you want a general transport system... We do have that. It's called rail. (And can and has been automated.)
But what's driving a car? A generalist human brain that has been trained for ~30 hours to drive a car.
We have multiple parts of the brain that interact in vastly different ways! Your cerebellum won't be running the role of the pons.
Most parts of the brain cannot take over for others. Self-healing is the exception, not the rule. Yes, we have a degree of neuroplasticity, but there are many limits.
(Sidenote: Driver's license here is 240 hours.)
The current breed of autonomous driving systems has problems with exceptional situations - but based on everything I've read so far, those are exactly the kind that would benefit from a general system able to understand the situation it's in.
A big problem I am noticing is that IT culture over the last 70 years has existed in a state of "hardware gonna get faster soon". And over the last ten years we've had a "hardware can't get faster bc physics, sorry" problem.
The way we've been making software since the 90s and 00s just isn't going to happen anymore. We are used to throwing more abstraction layers (C -> C++ -> Java -> vibe coding, etc.) at the problem and waiting for the guys in the fab to hurry up and make their hardware faster so our new abstraction layers can work.
Well, you can fire the guys in the fab all you want, but no matter how much they yell at nature, it doesn't seem to care. They told us, the embedded C++ monkeys, to spread the message. Sorry, Moore's law is over, boys and girls. I think we all need to take a second to let that sink in and realize the significance of it.
[1] The "guys in the fab" are fictional characters and any similarity to the real world is a coincidence.
[2] No C++ monkeys were harmed in the making of this comment.
For example... The nerve cells in your gut may speak to the brain, and interact with it in complex ways we are only just beginning to understand, but they are separate systems that both exert control over the nervous system and other systems. [1]
General Intelligence, the psychological theory, and General Modelling, whilst sharing words, share little else.
AGI is defined in terms of "General Intelligence", a theory that general modelling is irrelevant to.