When I worked there, every week there would be a different flyer on the inside of the bathroom stall door to get the word out about things that really mattered to the company.
One week the flyer was about how a feed video needed to hook the user in the first 0.2 seconds. The flyer promised that if this was done, the result would in essence be a scientifically measurable addictive effect, a brain-hack. The point of the flyer was to make sure this message reached as many advertisers as possible.
It seemed quite clear to me at that moment that the users were prey. The company didn't even care what was being sold to its users with this brain-reprogramming-style tactic. Our goal was to sell the advertisers on the fact that we were scientifically sure we had the tools to reprogram our users' brains.
It seems not so far back that the Sacklers were proven(?) to have profited from and fueled the opioid crisis while colluding with the healthcare industry - and last I heard they were haggling over the fine to pay to the state, while using various financial loopholes to hide their wealth behind bankruptcy and offshore instruments.
What, then, of the trillion-dollar companies that can drag out appeals for decades and obfuscate any and all recommendations that may be reached?
> He contended the A was for addicting, the B for brains and the C for children.
I gotta admit, I find it really trivial and silly that this is how court cases go, but I understand that juries are filled with all sorts of people, and I guess lawyers feel the need to really dumb things down? Or maybe it's the inner theater kid coming out?
These are opening remarks. Perhaps we should wait until they actually present evidence.
I think this may also be why there is so much sugar in American food. People buy more of the sweet stuff. So they keep making it sweeter.
I'm not sure who should be responsible. It kinda feels like a "tragedy of the commons" kind of situation.
Don't consume your own product.
The traditional answer is "engagement," but there is a strong argument to be made that intentional engagement (engagement by conscious, willful choice) is not possible, repetitively, for a vast smorgasbord of content spinning by at short intervals.
I'm all for preventing addictive algorithms, but if you are letting your 6-year-old watch YouTube, you've got problems. This is just going to be used to push age restrictions.
People used to be addicted to watching TV, right? Well, nobody was made responsible for that. If it is addiction, and I am not necessarily saying this is not, then all websites would fall under the same category IF they are designed well enough to become addictive. Most games would fall under that category too. I don't think this is a good category at all. Both Meta and Google should pay simply for wasting our time here, but the "you designed your applications and websites in an addictive manner" ... that's just weird.
I find myself in the uncomfortable position of sympathizing with both sides of the argument - a yes-but-no position.
If you think it's not addiction and just "similar to addiction", just try blocking these sites in your browser/phone and see how long you last before feeling negative effects.
That gives big tech the power to do whatever it wants, and once power is granted, it hardly ever gets revoked.
Unsealed court documents show teen addiction was big tech's "top priority"
I'd argue that we basically incentivise companies to cause harm whenever doing so is unregulated and profitable, because the profits are never sufficiently seized and any prosecution is a token effort at best.
See leaded gas, nicotine, gambling, etc. for prominent examples.
I personally think prosecution should be much harsher in an ideal world; if a company knows that its products are harmful, it should be concerned with minimising that harm, rather than fearing missed profits while having no legal worries at all.
I thought it was kind of pathetic how quickly they shoved iPads into schools with no real long-term data, no research whatsoever. Just insane really. And now here we are yet again.
1. We sell ads to make money.
2. If we keep eyeballs on our apps more than competing apps, we can increase the price for our ads and make more money.
3. Should we implement limits to kick kids off the app after they've been doomscrolling for an hour? Absolutely not, that would violate our duty to our shareholders. If parents complain, we'll say they should implement the parental controls present on their phones and routers. We can't make choices to limit our income if parents don't use the tools they already have.
I'm sorry that social media has ruined so many kids' lives, but I don't think the responsibility lies with the tech companies in this case. It lies with the society that has stood by idly while kids endured cyber-bullying and committed suicide. This isn't something that happened recently - the USA has had plenty of time to respond as a society and has chosen not to. Want to sue someone? Sue Congress.
Google and Meta are rational actors in a broken system. If you want to change something, you should change the rules that they operate under and hold them accountable for those rules going forward. Australia (and Spain) is doing something about it - now that social media is banned for kids under 16 in those countries, if social media companies try to do anything sneaky to get around that, you actually have a much stronger case.
Now if there were evidence that they were intentionally trying to get kids bullied and have them commit suicide then by all means, fine them into oblivion. But I doubt there is such evidence.
I am not saying that Facebook didn't try. I am just saying that only having access to screens, they would inevitably fail. Screens are very unlike addictive drugs and cannot directly alter neurochemistry (at least not any more than a sunset or any perception does). I strongly dislike the company and have personally never created a Facebook account nor used the website.
1) You can't stalk someone deliberately and persistently, using any means or medium, even if you're a company and even if you have good intentions.
2) You can't intentionally influence any number of people towards believing something that is false and that you know is against their interest.
These things need to be felony-level or higher crimes, where executives of companies must be prosecuted.
Not only that, certain crimes like these should be allowed to be prosecuted by citizens directly. Especially where bribery and threats by powerful individuals and organizations might compromise the interests of justice.
The outcome of this trial won't amount to anything other than fines. The problem is, this approach doesn't work. They'll just find different ways to skirt the law. Criminal consequences are the only real way to insist on justice.