Hacker News

jongjong · today at 3:32 AM · 0 replies

Never thought I'd be reading this on TechCrunch, but it fully resonates; it's an interesting article. I also understand why some people think we live in a simulation. It can be explained, to some extent, by the fact that we're glued to our phones and devices, and those devices choose what information we see.

We are only aware of what our devices show us, yet the vastness of the internet creates a false sense that we know everything. This dual reality (deep reality vs. the surface reality we see) creates the feeling of being in a simulation: a sense that there's another reality beyond the one presented to us. We implicitly trust the algorithms to do the curation for us, personalized to our tastes, but the algorithms are heavily biased towards popular content, ideas, and people. What we see is a tiny subset of reality that's highly manipulated and fake. The less critically minded you are, the smaller but more pleasant your world is (until you reach a certain point?).

We have hype leading adoption, which funds development capacity, which leads to slight improvements, which lead to consolidation of hype... Meanwhile, alternatives that are 10x better from the start, but lack the hype component altogether, appear not to exist at all. Value creators are often terrible at marketing. It's hard to sell to people who are inside the simulation when you're outside of it, because you don't speak the same language.

The contrast between form and substance has reached comically absurd levels, and sadly, the clear winner is form.

To really get the full picture, you almost have to already know all the key information. At best, AI/LLMs can confirm your existing knowledge with additional supporting data... But even that's under attack: there are narratives trying to discredit the objectivity of LLMs by claiming they're programmed to agree with you for engagement. That's a persuasive narrative, especially in the age of fake news, but I really hope we ignore it; we just have to observe that LLMs do, in fact, push back effectively when you're wrong. You can't make an LLM agree with you on facts that are wrong, no matter how many times or how many ways you repeat them. The only wiggle room is in terms of 'importance' or 'relevance', not facts.

Critical thinking (e.g. poking holes in otherwise perfectly satisfying explanations) is now more important than ever if you want to stay connected to reality, because there are incredibly powerful forces at work to make sure we stay on the surface layer.