Hacker News

b450 · today at 6:43 PM · 2 replies

Reminds me of the blog post about Waymo's "World Model". Training on real-world data results in a sufficiently rich model to start simulating novel scenarios that aren't in the training data (like the elephant wandering into the street), which in turn can feed back into training. One could imagine scientific inquiry working the same way.
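That train → simulate → retrain loop can be sketched in a few lines. This is a deliberately toy stand-in, not Waymo's actual system: all of the function names and the feature-set "model" below are hypothetical, standing in for a learned neural simulator.

```python
import random

# Toy sketch of the loop: train a "world model" on real data,
# sample novel scenarios from it, feed those back into training.
# Everything here is a hypothetical stand-in for a real learned simulator.

def train_world_model(scenarios):
    """Stand-in 'world model': the set of scenario features it has seen."""
    return {feature for s in scenarios for feature in s}

def simulate_novel_scenario(model, rng):
    """Recombine learned features into a scenario not in the training set."""
    features = sorted(model)
    return tuple(rng.sample(features, k=min(2, len(features))))

real_data = [("car", "pedestrian"), ("truck", "cyclist"), ("elephant", "zoo")]
model = train_world_model(real_data)
rng = random.Random(0)

# Closed loop: simulated scenarios (an "elephant in the street" the real
# data never contained) are appended to the next round's training set.
synthetic = [simulate_novel_scenario(model, rng) for _ in range(3)]
augmented = real_data + synthetic
model2 = train_world_model(augmented)
```

The point of the sketch is only the data flow: novel combinations come out of the model, then go back in as training examples.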

It strikes me that many of these complex systems have indeterminate boundaries, and a fair amount of distortion might be baked into the choice of training data. Poverty (to take an example from this post) probably has causes at economic, psychological, ecological, physiological, historical, and political levels of description (commenters please note I didn't think too hard about this list). What data we feed into our models, and how those data are understood as operationalizations of the qualitative phenomena we care about, might matter.


Replies

delichon · today at 6:58 PM

> like the elephant wandering into the street

Or a dinosaur that looks like it might:

https://x.com/phatman_19/status/2030728278437491102

gwerbin · today at 6:46 PM

This "world model" concept has been a big deal in AI research, particularly in work on LLMs.