this post was submitted on 12 Oct 2024
For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses for American teenagers. This is despite its own research validating many child safety concerns.

The confidential material was part of a more than two-year investigation into TikTok by 14 attorneys general that led to state officials suing the company on Tuesday. The lawsuit alleges that TikTok was designed with the express intention of addicting young people to the app. The states argue the multi-billion-dollar company deceived the public about the risks.

In each of the separate lawsuits state regulators filed, dozens of internal communications, documents and research data were redacted (blacked out from public view) because authorities entered into confidentiality agreements with TikTok.

But in one of the lawsuits, filed by the Kentucky Attorney General's Office, the redactions were faulty. Kentucky Public Radio discovered this by copying and pasting excerpts of the redacted material, bringing to light some 30 pages of documents that had been kept secret.

[...]

TikTok’s own research states that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit.

In addition, the documents show that TikTok was aware that “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

TikTok: Time-limit tool aimed at ‘improving public trust,’ not limiting app use

The unredacted documents show that TikTok employees were aware that excessive time spent by teens on social media can harm their mental health. The academic consensus recommends one hour or less of social media use per day.

The app lets parents place time limits on their kids’ usage that range from 40 minutes to two hours per day. TikTok created a tool that set the default time prompt at 60 minutes per day.

[...]

[–] Steve@communick.news 19 points 2 months ago (2 children)

It's always surprising to me that people think these harms are limited to kids and teens. The same issues affect everyone of all ages. Even I've noticed my attention span has been affected.

[–] Alice@beehaw.org 6 points 2 months ago

I think it's more that kids are the ones expected to be protected by the law, whereas adults are allowed to knowingly engage in addictive behavior, like alcohol and cigarettes.

[–] technocrit@lemmy.dbzer0.com 1 points 2 months ago* (last edited 2 months ago) (4 children)

Cool anecdote. Let's have the state clamp down on free speech and ban popular apps because of it.

[–] averyminya@beehaw.org 15 points 2 months ago

Or we could, you know, force social media companies to not use psychologists to make their apps more addictive by design. Something called ethics.

It's extremely telling that you can look for a job as a psychologist for Meta and all the opportunities that are available are UX researchers.

[–] Steve@communick.news 4 points 2 months ago* (last edited 2 months ago)

You're conflating the free speech of individuals with the engagement-driven, black-box recommendation algorithms of corporations. It's a common mistake. I think most people make it.

A company can allow people to post things, and others to see them if they like, without algorithmically pushing content in endless scrolling interfaces.

For example, Lemmy and Mastodon: you only see what you choose to subscribe to. The sites don't choose to push any content into your feed because an algorithm thinks you'll like it.

There is a big difference between the two.
And removing the algorithms isn't a hindrance to free speech, only profits.

[–] Kissaki@beehaw.org 4 points 2 months ago

Can you explain what you mean by free speech?

Is the free choice of content selection algorithm free speech? Isn't the speech, the content, there either way, and could be selected through other alternative algorithms?

Is using deliberately engaging or addicting design free speech? Isn't the speech, the content, there either way?

[–] DdCno1@beehaw.org 3 points 2 months ago

Do you really think that free speech exists on an app controlled by the CCP?