Here's the actual statement from the European Commission: https://ec.europa.eu/commission/presscorner/detail/en/ip_26_...
It's important to note they aren't creating laws against infinite scrolling, but are ruling against addictive design and pointing to infinite scrolling as an example of it. The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes". They point to certain features they'd like the companies to change, but there is no specific ruling on what you can/can't do.
My initial reaction was that this was a terrible precedent, but after thinking on it more I asked myself, "well what specific laws would I write to combat addictive design?". Everything I thought of had an obvious workaround, and would equally have terrible consequences in situations where the pattern is actually quite valuable. E.g. if you disallow infinite scrolling, what page sizes are allowed? Can I just have a page of 10,000 elements that lazy load?
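To make that workaround concrete, here's a rough sketch (hypothetical browser-side TypeScript, with made-up element ids) of a "single page" of 10,000 items that only renders more as you approach the bottom - behaviourally identical to infinite scroll, which is why a narrow ban on one pattern is so easy to route around:

    // Hypothetical sketch: a "finite" page of 10,000 items that lazy-loads
    // in batches as the user nears the bottom. No "next page" is ever
    // requested, yet it feels exactly like infinite scrolling.
    const PAGE_TOTAL = 10_000;
    const BATCH = 50;
    let rendered = 0;

    // Assumed markup: <ul id="list"></ul> followed by <div id="sentinel"></div>
    const list = document.getElementById("list")!;
    const sentinel = document.getElementById("sentinel")!;

    function renderBatch(): void {
      const end = Math.min(rendered + BATCH, PAGE_TOTAL);
      for (let i = rendered; i < end; i++) {
        const li = document.createElement("li");
        li.textContent = `Item ${i + 1}`;
        list.appendChild(li);
      }
      rendered = end;
    }

    // Render the next batch whenever the sentinel scrolls into view.
    const observer = new IntersectionObserver((entries) => {
      if (entries.some((e) => e.isIntersecting) && rendered < PAGE_TOTAL) {
        renderBatch();
      }
    });
    observer.observe(sentinel);
    renderBatch(); // initial fill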
Regardless of your take on whether this is EU overreach, I'm glad they're not implementing strict laws around what you can/can't do - there are valuable situations for these UI patterns, even if in combination they can create addictive experiences. Still, I do think that overregulation here will lead to services being fractured. I was writing about this earlier this morning (https://news.ycombinator.com/item?id=47005367), but the regulated friction of major platforms (e.g. Discord w/ ID laws) is on a collision course with the ease of vibe coding up your own. When that happens, these commissions are going to need to think long and hard about whether having a few large companies to watch over is better than millions of small micro-niche ones.
>The wording here is fascinating, mainly because they're effectively acting as arbiters of "vibes"
This is not such an unusual thing in law, as much as we stem-brained people want legal systems to work like code. The most famous example is distinguishing art from pornography - "I know it when I see it" (https://en.wikipedia.org/wiki/I_know_it_when_I_see_it)
Life is complex and beautiful and trying to regulate every possible outcome beforehand just makes it boring and depressing.
I thought about it for only a few seconds, but here is one way to do it. Have users self-report an "addiction factor", then fine the company based on the aggregate score using a progressive scale.
There is obviously a lot of detail to work out here -- which specific question do you ask users, who administers the survey, what function do you use to scale the fines, etc. But this would force the companies to pay for the addiction externality without prescribing any specific feature changes they'd need to make.
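A toy sketch of what the "progressive scale" could look like (the brackets, rates, and the choice of mean score are all invented here purely for illustration): aggregate the self-reported scores and map the result onto escalating shares of revenue, like tax bands.

    // Toy illustration only - thresholds, rates, and "mean of self-reports"
    // are made-up assumptions, not anything actually proposed.
    interface Bracket { threshold: number; rate: number } // rate = share of annual revenue

    const BRACKETS: Bracket[] = [
      { threshold: 2.0, rate: 0.00 },      // low aggregate "addiction factor": no fine
      { threshold: 3.5, rate: 0.01 },
      { threshold: 4.5, rate: 0.03 },
      { threshold: Infinity, rate: 0.06 }, // highest band
    ];

    function fine(selfReports: number[], annualRevenue: number): number {
      const mean = selfReports.reduce((a, b) => a + b, 0) / selfReports.length;
      const bracket = BRACKETS.find((b) => mean < b.threshold)!;
      return bracket.rate * annualRevenue;
    }

    // e.g. fine([4, 5, 4, 3], 1e9) -> mean 4.0 falls in the 3% band -> 30M fine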
> I asked myself, "well what specific laws would I write to combat addictive design?".
Only allowing algorithmic feeds/recommendations on dedicated subpages to which the user has to navigate, and which aren't allowed to embed the content itself, would be an excellent start IMO.
Assuming it was "just" about banning infinite scrolling. Not saying it is a good idea, but right now I cannot think of a legitimate use case where you would need it, unless your goal is engagement.
> My initial reaction was that this was a terrible precedent
These laws are harsh... but, as much as I hate to say it, the impact social media has had on the world has been worse.
I wouldn't worry about that. You're ignoring politics, and what this actually is. If the EU had a real problem with addictive design and social media, the time to move against it was of course 10+ years ago. They do not intend, not even remotely, to sabotage the profit machines that those companies are; they just want political weapons against the companies. The intention here is not to cure addiction or destroy profits, the intention is to use economic power to achieve political ends. The EU is built on this, it just didn't use to involve this many private companies.
Like most famous EU laws, this is not a law for people. Like the banking regulations, the DMA, the GDPR, and the AI Act, this law cannot be used by individuals to assert their rights against companies, and certainly not against EU states, who have repeatedly shown willingness to use AI against individuals, including face recognition (which gets a lot of negative attention and strict rules in the AI Act, yet EU member states get to ignore both directly and get to allow companies to ignore the rules), and to violate the GDPR against their own citizens (e.g. using medical data in divorce cases, or even tax debt collection, and letting private companies ignore the rules for government purposes - e.g. hospitals can be forced to report whether you paid for treatment rather than paying alimony or your back taxes). The first application of the GDPR was to remove links about Barroso's personal history from Google.
These laws can only be used by the EU Commission against specific companies. Here's how the process works: someone "files a complaint", which is an email to the EU Commission (not a complaint in the legal sense - no involvement of prosecutors, judges, or any part of the justice system of any member state at all). Then an EU commissioner starts a negotiation process and rules on the case, usually imposing billions of euros in fines or providing publicly-backed loans (in the case of banks). The vast, vast, vast majority of these complaints are ignored or "settled in love" (a French legal term: the idea is that some commission bureaucrat contacts the company and "arranges things", never involving any kind of enforcement mechanism). Then they become chairman of Goldman Sachs (oops, that just happened once, giving Goldman Sachs its first communist chairman, yes really. In case you're wondering: Barroso), or join Uber's and Salesforce's executive teams, paid through Panama paper companies.
In other words: these laws are not at all about addictive design and saving you from it; they're about going after specific companies for political ends. Google, Facebook, Goldman Sachs, ...
Ironically the EU is doing exactly what Trump did with tariffs. It's just that Trump is using a sawed-off shotgun where the EU commission is using a scalpel.
>"well what specific laws would I write to combat addictive design?"
Hear me out: banning advertising on the Internet. It's the only way. It's the primordial domino tile. You knock that one over, every other tile follows suit. It's the mother of chain reactions. There would be no social media, no Internet as we know it. Imagine having TikTok, YouTube or X trying to survive on subscriptions alone in their current iterations. Impossible. They'd need to change their top priority from "maximizing engagement by fostering addictive behavior" to "offering a product with enough quality for someone to pay a fee in order to be able to use it".