760 points | MindBreaker2605 | 1 comment

numpy-thagoras ◴[] No.45897574[source]
Good. The world model is absolutely the right play in my opinion.

AI agents like LLMs make great use of pre-computed information. Providing a comprehensive but efficient world model (one where more detail is available wherever the agent is paying more attention, given the task at hand) will definitely unlock new kinds of autonomous agents.
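(A minimal sketch of what such an attention-weighted world model might look like as an interface. Everything here is hypothetical — the WorldModel class, the query method, and the attention parameter are my own illustration, not anything published — it just shows the idea of surfacing more pre-computed detail where the agent is currently attending.)

```python
# Hypothetical sketch only: illustrates a world model that returns more
# detail for regions the agent is paying more attention to.
from dataclasses import dataclass


@dataclass
class Observation:
    region: str         # e.g. "kitchen"
    detail_level: int   # 1 = coarse summary, higher = finer detail
    facts: list[str]    # pre-computed facts about the region


class WorldModel:
    """Toy attention-weighted store of pre-computed world state."""

    def __init__(self, store: dict[str, list[str]]):
        self.store = store

    def query(self, region: str, attention: float) -> Observation:
        # More attention -> more of the pre-computed detail is surfaced.
        facts = self.store.get(region, [])
        level = max(1, int(attention * len(facts)))
        return Observation(region, level, facts[:level])


# An LLM-based agent would call query() with whatever region its current
# task makes salient and feed the returned facts into its context window.
wm = WorldModel({"kitchen": ["stove is on", "kettle is empty", "floor is wet"]})
print(wm.query("kitchen", attention=0.7))
```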

Swarms of these, acting in concert or with some hive mind, could be how we get to AGI.

I wish I could help, world models are something I am very passionate about.

replies(2): >>45897629 #>>45901238 #
sebmellen ◴[] No.45897629[source]
Can you explain this “world model” concept to me? How do you actually interface with a model like this?
replies(6): >>45898106 #>>45898143 #>>45899047 #>>45901655 #>>45902131 #>>45902333 #
natch ◴[] No.45898143[source]
He is one of those people who think that humans have a direct experience of reality, not one mediated by (as Alan Kay put it) three pounds of oatmeal. So he thinks a language model cannot be a world model, despite our own contact with reality being mediated through a myriad of filters and funhouse-mirror distortions. The images on our retinas arrive upside down and mirror-reversed before our nerves ever see them, for gawd's sake. He imagines none of that is the case, and that if only he can build computers more like us, they will be in direct contact with the world, and then he can (he thinks) make a model that is better at understanding the world.
replies(7): >>45898364 #>>45898490 #>>45898733 #>>45898924 #>>45899674 #>>45899676 #>>45904464 #
BoxOfRain ◴[] No.45898490[source]
Isn't this idea demonstrably false due to the existence of various sensory disorders too?

I have a disorder characterised by the brain failing to filter out its own sensory noise; my vision is full of analogue-TV-like distortion and other artefacts. Sometimes when it's bad I can watch my brain constructing an image in real time rather than the perception seeming to happen instantaneously, particularly when I'm out walking. A deer becomes a bundle of sticks becomes a muddy pile of rocks (what it actually is), for example, over the space of seconds. This to me is pretty strong evidence that we do not experience reality directly, and instead construct our perceptions predictively from whatever is to hand.

replies(2): >>45898676 #>>45902257 #
scoot ◴[] No.45898676{3}[source]
Pleased to meet someone else who suffers from "visual snow". I'm fortunate in that, like my tinnitus, I'm only acutely aware of it when I'm reminded of it or, less frequently, when it's more pronounced.

You're quite correct that our "reality" is in part constructed. The Flashed Face Distortion Effect [0][1] (wherein faces in the peripheral vision appear distorted due to the brain filling in the missing information with what was there previously) is just one example.

[0] https://en.wikipedia.org/wiki/Flashed_face_distortion_effect [1] https://www.nature.com/articles/s41598-018-37991-9

replies(2): >>45898782 #>>45899073 #
nervousvarun ◴[] No.45899073{4}[source]
Only tangentially related, but maybe interesting to someone here, so linking anyway: Bryan Kohberger is a visual snow sufferer. Reading about his background was my first exposure to this relatively underpublicized phenomenon.

https://en.wikipedia.org/wiki/2022_University_of_Idaho_murde...