Large portions of the tech sector thrive off the attention economy. If your goal as a product is to have someone spend hours a day, every day, engaged with your product, and you take a data-driven approach to maximizing time spent in the app, then you'll create something not dissimilar to addiction.
Will any outcome from this actually remedy the situation, even if harm is proven to the letter of the law?
Seems not so far back the Sacklers were proven(?) to have profited from and fueled the opioid crisis while colluding with the healthcare industry - and last I heard they were haggling over the fine to pay the state, while using various financial loopholes to hide their wealth behind bankruptcy and offshore instruments.
What then of the trillion-dollar companies that can drag out appeals for decades and obfuscate any and all recommendations that may be reached?
> "This case is as easy as A-B-C," Lanier said as he stacked children's toy blocks bearing the letters.
> He contended the A was for addicting, the B for brains and the C for children.
I gotta admit, I find it really trivial and silly that this is how court cases go, but I understand that juries are filled with all sorts of people, and lawyers I guess feel the need to really dumb things down? Or maybe it's the inner theater kid coming out?
The problem started when people, asked about privacy, replied with "I have nothing to hide."
That gives big tech the power to do whatever it wants, and once power is granted, it is hardly ever revoked.
"They don't only build apps; they build traps," Lanier said, saying Meta and YouTube pursued "addiction by design," making his arguments using props like a toy Ferrari and a mini slot machine.
These are opening remarks. Perhaps we should wait until they actually present evidence.
It's true but also (could be) innocent. In the sense that if you A/B test things and look for engagement, you will almost certainly end up with "addictive" systems.
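To make the mechanism concrete: here's a minimal, entirely hypothetical sketch of a greedy A/B optimizer that measures only engagement (say, minutes per session). The variant names and engagement numbers are invented; the point is just that an optimizer with no notion of wellbeing will converge on whichever design is most habit-forming, with nobody intending that outcome.

```python
import random

# Invented variants with invented mean "engagement" (minutes per session).
# The stickiest design is deliberately the most addictive one.
VARIANTS = {
    "chronological_feed": 10.0,
    "infinite_scroll": 14.0,
    "autoplay_plus_scroll": 18.0,
}

def run_ab_test(rounds=5000, seed=0):
    """Epsilon-greedy A/B test that optimizes purely for engagement."""
    rng = random.Random(seed)
    totals = {v: 0.0 for v in VARIANTS}
    counts = {v: 0 for v in VARIANTS}
    for _ in range(rounds):
        # Explore 10% of the time (and until every variant has data),
        # otherwise exploit the variant with the best average so far.
        if rng.random() < 0.1 or not all(counts.values()):
            v = rng.choice(list(VARIANTS))
        else:
            v = max(VARIANTS, key=lambda k: totals[k] / counts[k])
        # Noisy per-session engagement measurement for this variant.
        totals[v] += rng.gauss(VARIANTS[v], 3.0)
        counts[v] += 1
    return max(VARIANTS, key=lambda k: totals[k] / counts[k])

print(run_ab_test())
```

Nothing in that loop is malicious; it's the metric that does the work. Swap "minutes per session" for any wellbeing-aware objective and the same machinery would optimize for that instead.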
I think this may also be why there is so much sugar in American food. People buy more of the sweet stuff. So they keep making it sweeter.
I'm not sure who should be responsible. It kinda feels like a "tragedy of the commons" kind of situation.
Restaurants try to make food you will remember and want again. Authors try to write books you can't stop reading. It's silly to imagine that any type of media would do anything other than seek to gain your interest and attention. It's our job to have personal hygiene and to control our information diet. This postmodern social construction perspective that tries to blame everyone for our problems is a lame approach to the problem.
This is just one data point, but I once talked to a person who worked for YouTube. I asked him if he had children, he said yes. I asked him if he let them use YouTube Kids and he said "no way, that's completely banned at home". That told me everything that I needed to know.
Don't consume your own product.
We're going to see the same kind of thing we saw with the tobacco industry - CEOs claiming they had no idea the product was engineered to be addictive. I have even less faith that anyone will be held accountable this time, though.
A recommender engine that tries to capture and sustain attention in 1-2 second intervals, what else would you call it?
The traditional answer is "engagement," but there is a strong argument to be made that intentional engagement (engagement by conscious, willful choice) is not possible, repeatedly, for a vast smorgasbord of content spinning by at short intervals.
The interests of corporations, the interests of capital, are placed above the interests of society at large. Nothing will come of this trial.
This is weird because ... at what point does an addiction start?
People used to be addicted to watching TV, right? Well, nobody was made responsible for that. If it is addiction, and I am not necessarily saying this is not, then all websites would fall under the same category IF they are designed well enough to become addictive. Most games would fall under that category too. I don't think this is a good category at all. Both Meta and Google should pay simply for wasting our time here, but the "you designed your applications and websites in an addictive manner" ... that's just weird.
If engineering addiction for children is illegal, it should obviously be illegal to target adults too?
I wonder if there's something from addiction neuroscience that can be utilized as "evidence" in the legal sense; or whether it's used in the process of product development, e.g. how Google/Meta/Frito-Lay/P&G maximize dopamine hits in the ventral tegmental area, etc.
Do they expect social media sites to show fewer posts about things you are interested in, or fewer posts from your friends, to make you use them less? They provide good products which people like.
Ignoring what I think of the case itself, I hate how many headlines now are just the talking points for one side or the other in a dispute. At least try to pretend you’re being a neutral reporter rather than regurgitating what someone with an agenda says.
"I let an iPad raise my kid and now she sucks" is a wild lawsuit premise.
Is this a replay of the comics and video games are doing irreparable harm arguments from not too long ago?
I find myself in the uncomfortable position of sympathizing with both sides of the argument - a yes-but-no position.
There is no law that dictates these two things:
1) You can't stalk someone deliberately and persistently, using any means, or medium; even if you're a company, and even if you have good intentions.
2) You can't intentionally influence any number of people towards believing something false and that you know is against their interest.
These things need to be felony-level or higher crimes, where executives of companies must be prosecuted.
Not only that, certain crimes like these should be allowed to be prosecuted by citizens directly. Especially where bribery and threats by powerful individuals and organizations might compromise the interests of justice.
The outcome of this trial won't amount to anything other than fines. The problem is, this approach doesn't work. They'll just find different ways to skirt the law. Criminal consequences are the only real way to insist on justice.
Listen, it sounds a lot less evil when you label it "audience engagement" mmk?
People aren't customers anymore. They are the resource to be mined. Advertisers are the customers.
All of these things they're saying are unethical, but not illegal, right?
"But if we don't engineer addiction, China will beat us to it! It's a national interest!"
Is it really landmark tho? First it was nicotine and big tobacco, then the same addiction engineers designed ultra-processed foods. Kraft, Nabisco, etc. were all spun off by tobacco companies... normalizing food addiction in children. And now it's screens and social media. It's the same fundamental physiology, but it seems like society can't learn a lesson either.
I thought it was kind of pathetic how quickly they shoved ipads into schools with no real long term data, no research whatsoever. Just insane really. And now here we are yet again.
Curious how this is gonna turn out, but I'm not holding my breath.
I'd argue that we basically incentivise companies to cause harm whenever it is unregulated and profitable because the profits are never sufficiently seized and any prosecution is a token effort at best.
See leaded gas, nicotine, gambling, etc. for prominent examples.
I personally think prosecution should be much harsher in an ideal world; if a company knows that its products are harmful, it should be concerned with minimizing that harm instead of fearing missed profits while facing no legal consequences.
I am generally displeased with the way social media has evolved, but I'm not in favor of this lawsuit. It seems like a way to blame tech companies for Congress's failure to regulate businesses properly. None of the engineers involved thought of their work as a way to rot the minds of future generations. Their thought process was straightforward:
1. We sell ads to make money.
2. If we keep eyeballs on our apps longer than competing apps do, we can charge more for our ads and make more money.
3. Should we implement limits to kick kids off the app after they've been doomscrolling for an hour? Absolutely not; that would violate our duty to our shareholders. If parents complain, we'll say they should use the parental controls already present on their phones and routers. We can't choose to limit our income when parents don't use the tools they already have.
I'm sorry that social media has ruined so many kids' lives, but I don't think the responsibility lies with the tech companies in this case. It lies with the society that has stood by idly while kids endured cyber-bullying and committed suicide. This isn't something that happened recently; the USA has had plenty of time to respond as a society and has chosen not to. Want to sue someone? Sue Congress.
Google and Meta are rational actors in a broken system. If you want to change something, you should change the rules that they operate under and hold them accountable to those rules going forward. Australia (and Spain) is doing something about it: now that social media is banned for kids under 16 in those countries, if social media companies try anything sneaky to get around that, you actually have a much stronger case.
Now if there were evidence that they were intentionally trying to get kids bullied and have them commit suicide then by all means, fine them into oblivion. But I doubt there is such evidence.
Would it be ironic if Meta and Google were required to add "a landing page that displays information about responsible [social media use]" similar to their policies regarding gambling ads (example below)?
Any tech company funded primarily by ad money is dirty.
Just ban personalized advertising and be done with it, they will do any amount of harm so long as it gets a click.
I mean everyone knows this right? There are even leaked memos. They are public companies who need to grow revenue and they gain that revenue mostly through ads and attention.
It's a matter of time, maybe decades, until we treat this "engineered addiction" the way the tobacco trials treated nicotine.
Related:
Unsealed court documents show teen addiction was big tech's "top priority"
Anything where you scroll through posts or endlessly watch short videos is highly addictive.
If you think it's not addiction, just "similar to addiction," try blocking these sites in your browser/phone and see how long you last before feeling negative effects.
absolute hysteria from everyone on here
The jury is using words outside of their medical context, in situations that do not justify the term. In fact, most of society seems okay with this gross misuse of the term, applying it to things that don't actually manipulate incentive salience. We're going to end up with authoritarians in control of all 'screens' just because our schools have done such a bad job of explaining neuroscience. If you think handing the federal government control of all screens is a good idea in the USA, you really need to look around.
I am not saying that Facebook didn't try. I am just saying that only having access to screens, they would inevitably fail. Screens are very unlike addictive drugs and cannot directly alter neurochemistry (at least not any more than a sunset or any perception does). I strongly dislike the company and have personally never created a Facebook account nor used the website.
They're not afraid of the idea of programming people.
When I worked there every week there would be a different flyer on the inside of the bathroom stall door to try to get the word out about things that really mattered to the company.
One week the flyer was about how a feed video needed to hook the user in the first 0.2 seconds. The flyer promised that if this was done, the result would in essence have a scientifically measurable addictive effect, a brain-hack. The flyer was to try to make sure this message reached as many advertisers as possible.
It seemed to me quite clear at that moment that the users were prey. The company didn't even care what was being sold to their users with this brain-reprogramming-style tactic. Our goal was to sell the advertisers on the fact that we were scientifically sure we had the tools to reprogram our users' brains.