The Whole Story with Anderson Cooper
Wired For Trouble. Aired 8-9p ET
Aired July 09, 2023 - 20:00 ET
THIS IS A RUSH TRANSCRIPT. THIS COPY MAY NOT BE IN ITS FINAL FORM AND MAY BE UPDATED.
[20:00:02]
JIM ACOSTA, CNN HOST: And it's going to make it that much harder for us to catch up with our kids.
AUDIE CORNISH, CNN CORRESPONDENT: Exactly.
ACOSTA: To figure out what they're doing.
CORNISH: But people really I hope they identify with this because every kid, every parent who has a phone will find something kind of familiar in tonight's show.
ACOSTA: All right. We'll be watching.
Audie Cornish, thank you very much.
Stay with us as Audie Cornish delves into all of this on a new episode of "THE WHOLE STORY WITH ANDERSON COOPER." That is next here on CNN.
Thanks for joining me this evening. I'm Jim Acosta. I'll see you next weekend. Good night.
ANDERSON COOPER, CNN HOST: Thanks for joining us for THE WHOLE STORY. I'm Anderson Cooper.
You may have heard the U.S. surgeon general recently issued a warning that social media carries what he called a profound risk of harm to the mental health and well-being of children and adolescents. It's something all of us who are parents are worried about. Because of a law passed in the mid-1990s designed to help the internet grow, social media companies are shielded from almost all responsibility for the content their users post, no matter how untrue or harmful it may be.
But now with billions of people online, the question is, have things spiraled out of control, especially when it comes to kids?
CNN's Audie Cornish spoke with three families who are taking on some of the biggest social media companies in an uphill legal battle in an effort to change the way the internet works.
We want to mention, some of the topics in the next hour include eating disorders and suicide, and can be disturbing to watch.
(BEGIN VIDEOTAPE)
UNIDENTIFIED FEMALE: Good morning.
UNIDENTIFIED FEMALE: Are you getting on, mom?
UNIDENTIFIED FEMALE: Yes, let's get on.
UNIDENTIFIED FEMALE: OK.
AUDIE CORNISH, CNN HOST, THE ASSIGNMENT WITH AUDIE CORNISH PODCAST (voice-over): A tour bus through Washington in spring is typically full of kids.
UNIDENTIFIED FEMALE: Look, look, look.
UNIDENTIFIED FEMALE: Oh, my.
CORNISH: This one is almost all parents. Almost.
CECE, HIGH SCHOOL SENIOR: I'm excited. We're actually doing something. It's going to be amazing. We're going to make change.
CORNISH: Eighteen-year-old CeCe is here from Kentucky with her mom Candace. And the trip she's documenting is likely just the beginning of a very long journey. That's because she and everyone on this bus have picked a fight with tech industry giants TikTok, Snapchat and the parent company of Facebook and Instagram, Meta.
UNIDENTIFIED FEMALE: We lost our son CJ to suicide.
CORNISH: They believe the apps that gave us community and connection during the pandemic are targeting the youngest users, feeding them harmful information and fueling a mental health crisis among American adolescents.
UNIDENTIFIED FEMALE: We lost our daughter England Roberts to suicide.
UNIDENTIFIED FEMALE: June of 2020 Alex lost his life to a fentanyl poisoning through a pill he purchased on Snapchat.
CORNISH: So they're fighting. This little band of Davids taking on economic Goliaths. Armed not with swords, but with stories.
There's 16-year-old Mark, who was not in D.C. for this trip. There's 11-year-old Selena, represented in Washington by her mom, Tammy. And there's CeCe, a high school senior.
(On-camera): I hear you're an artist?
CECE: Yes.
CORNISH: When did you start using social media?
CECE: I started using it around 11, 12. I've had so many art accounts, but I've had my personal accounts where I would post like pictures of me, pictures of me and my friends.
CORNISH: But are you old enough to have accounts at that age?
CECE: The actual age is 13. So I was technically too young. You can just lie about your age. You don't need any actual identity.
CORNISH (voice-over): CeCe had Instagram because she asked her mom, and Candace thought it was safe.
CANDACE WUEST, CECE'S MOM: Her middle school at the time literally announced things on Instagram. Of course I would think, oh, it's fine. If the school is using it as a way to communicate.
CORNISH: It also made it easier for them to stay connected following a divorce. On the weeks CeCe was with her father, her mom was just a DM away.
WUEST: If I didn't have the money to keep the service on, she could use the internet and get ahold of me through Instagram Messenger. I thought it was just the greatest thing ever.
CORNISH (on-camera): When does your social media use move from my art, communicating with my mom, to something else?
CECE: It was fifth grade. I was 160 pounds and I was 11, maybe.
WUEST: The doctor used the term obese to her.
CECE: And me and my dad started walking and I started trying to eat better. Then I was like, what if I could lose as much weight as possible. And so when getting on social media, I was like, OK, how can I do this?
WUEST: We would exchange recipes and whatnot on Instagram.
CORNISH: Did you just type recipes, skinny?
CECE: Yes. Recipes.
CORNISH: Like what did you do?
CECE: I typed like inspiration for losing weight. I looked up negative calorie recipes and it started snowballing into I was getting sick, but I was also looking up sickly things. Like it's kind of -- it got sick with me, if that makes sense.
[20:05:09]
CORNISH (voice-over): That snowball led to an avalanche of images of progressively thinner bodies, of progressively wider thigh gaps and more and more pronounced collar bones until her feed was no longer about losing a few pounds but hiding the pounds she had already lost on the way to an eating disorder.
CECE: I was comparing myself to these models, to these recipes, and I was saying, I'm not eating like that. I have to be eating like that. I'm fatter than this person.
CORNISH (on-camera): Do your parents know about any of this?
CECE: I think they were concerned about my weight as well. So they saw the healthy eating as like, OK, she's trying to take an initiative and trying to start losing weight.
CORNISH: But you're not saying, hey, look at this image of this emaciated person. This is what I want to look like. You're not showing them what you're looking at.
CECE: No.
CORNISH (voice-over): By 13 when CeCe began to resemble the so-called thinspo models she admired on Instagram, she was diagnosed with anorexia.
WUEST: When she was in the worst of it and seeing the worst of it, she was in a hospital with a feeding tube and a wheelchair, and my daughter was in that situation a good 15 times.
CORNISH: And her daughter effectively chronicled that struggle in sketch books.
CECE: This is a woman with her phone and this is being kind of like fed into her mind that -- it says lies on it. This is something I had done before any of the social media awareness.
UNIDENTIFIED FEMALE: Is it good?
CORNISH: Now 16-year-old Mark lives across the country from CeCe, outside Los Angeles. But in a way, they inhabited the same universe online.
UNIDENTIFIED FEMALE: What's the matter?
UNIDENTIFIED MALE: Nothing. Just having anxiety from eating.
UNIDENTIFIED FEMALE: But you have to eat, you know that? It's very important.
UNIDENTIFIED MALE: I know.
CORNISH: When we first met Mark and his mom, Toni Cabral, Toni was spending her days driving Mark to therapy and doctors and specialists.
UNIDENTIFIED FEMALE: We'll get through this, Mark. This is where I'm going to be parked on that side over there. OK?
UNIDENTIFIED MALE: OK, thank you.
UNIDENTIFIED FEMALE: You're welcome, love.
CORNISH: And while he was getting care for his eating disorder --
UNIDENTIFIED FEMALE: This is Toni Cabral. I'm calling for my son Mark. I want to see if you guys have any room available for treatment.
CORNISH: She desperately searched for residential treatment centers she felt he needed to get well.
UNIDENTIFIED FEMALE: I'm calling to see if you guys have an eating disorder treatment center?
CORNISH: It was rough going.
UNIDENTIFIED FEMALE: I did call all the referral numbers that you gave me and no luck. It's just very frustrating. And it's hard to stay positive because my son is going through things that are very difficult and I can't help him.
I think he wants to be like the kids online. He showed me and his father a picture of what he wants to look like.
CORNISH (on-camera): What did it look like?
UNIDENTIFIED FEMALE: Not good. To me, my son is -- he's perfect just the way he is.
CORNISH (voice-over): But it's not just the images and videos he scrolls through on Instagram and TikTok. He says he's bullied on the apps, interactions that are hard for him to shake off.
UNIDENTIFIED MALE: When I was 18, like they posted something like really bad about me on Instagram.
CORNISH (on-camera): This is some kids at school?
UNIDENTIFIED MALE: Like some kids at school, yes. You know "Jurassic Park," right? The movie title thing, and how it has a dinosaur on the front of course. They put my face on there and they were like "Jurassic Mark." And it was my friends. I don't know. I guess I just spiraled after that.
TAMMY RODRIGUEZ, SELENA'S MOM: Selena was just a wonderful kid. She just was sunshine. She was just wonderful.
CORNISH (voice-over): Tammy Rodriguez is mom to Selena and her older sister Destiny.
DESTINY, SELENA'S SISTER: She could light up a room as soon as she walked in. And you knew the second when she walked in, too. Because with that light came a lot of noise after.
CORNISH: It was her love of dancing that initially drew her to TikTok. And Tammy's love of her daughter's creativity that drove her to let her join.
RODRIGUEZ: When she first got the phone, she was 11, 10 or 11. She would do TikToks but she would just keep them in like the drafts or whatever that option is.
[20:10:02]
She had done Snapchat, you know, she would use the filters. We'd send each other funny pictures. We didn't post it at all. I watched what she had for apps. I could go in there any time, you know, and I had the parental where she had to ask my permission if she could download it on Apple.
CORNISH (voice-over): So you're using parental controls.
RODRIGUEZ: Right.
CORNISH: You're feeling like I'm an engaged parent.
RODRIGUEZ: Yes. Right.
CORNISH (voice-over): What Tammy didn't know at the time was that Selena had figured out how to block her mother from seeing her online life.
RODRIGUEZ: She had saved her fingerprint and I didn't know she had saved it in my phone. So like if I fall asleep or whatever, she would use her fingerprint to get in and change the setting.
DESTINY: Once the pandemic had started, she was posting more, she became more recluse. She was focused on how many likes she has. How many followers she has. How many followers she's losing. Who is messaging her.
CORNISH: During the pandemic when Selena's school and social life moved online, she was regularly messaging with people on these apps. Some she knew. Some she did not.
RODRIGUEZ: There were adults that would reach out, which I was not aware of until not too long ago. Men. They knew she was a minor.
CORNISH: Men who commented on her physical appearance and asked for nude pictures. For instance this user who identified himself as an adult male from India. There were dozens of messages like these in just one account and she had multiple accounts.
What Tammy could not see she could sense. Selena was experiencing more than preteen growing pains.
RODRIGUEZ: She was definitely just a different child. She became very dark. She started cutting her hair and having eating disorders.
CORNISH (on-camera): You've probably, I assume, tried to take her phone from her?
RODRIGUEZ: Yes. One time her phone had died actually in the van and I was driving, and I wouldn't give her mine, so she grabbed my arm here and this -- she squeezed it so hard while I was driving, this was all black and blue. She didn't care about how she was going to hurt us. She had to get it back from us. She had broke Destiny's nose. You know this is your child, but at the same time it's not your child. You know, like a Dr. Jekyll and Mr. Hyde type thing. And you never knew what was going to set it off.
CORNISH (voice-over): Up next.
WUEST: I said, Cecilia, can you believe this place is actually trying to sue these social media forums and saying it's their fault?
(COMMERCIAL BREAK)
CORNISH: South of Cincinnati, Ohio, in small town Kentucky, hairdresser Candace Wuest makes her living in the beauty business.
[20:15:02]
WUEST: This very small town, it's kind of like a box, you know.
CORNISH: But that did not prepare her for her daughter's battle with low self-esteem and disordered eating.
CECE: Hi.
WUEST: How are you?
CECE: Good. Just -- yes.
WUEST: This is her first year of not being hospitalized since seventh grade and she's a senior in high school.
Got a lot of homework?
CECE: Yes.
WUEST: There were plenty of times where we sat down at this table and she just couldn't even eat one bite. And she would get up and try to run away and I would have to go get her.
CORNISH (on-camera): What was the turning point of her recovery then?
WUEST: December of 2021 she had her most active attempt at suicide. She had gone to the basement and was going to try to hang herself. But she texted a few of her friends to say good-bye. Of course my phone blew up immediately and I called 911, and I started racing down the steps. She's sitting on a chair like this, texting her friends saying good-bye and has, like, a hose and a chair.
CORNISH (voice-over): CeCe was taken to the ER. And in the past she would be taken to a well-appointed short-term facility, but it was full. CeCe ended up at a local hospital where Candace says she wasn't treated the same way and she wasn't forced to eat.
WUEST: So she didn't eat and she stood up to go to the bathroom, passed out, hit her head.
CECE: And I just was like, you know, I just can't do this anymore. I think it was a big turning point for me, physically at least. I mean, mentally I'm still working on it. But physically, it was just a point where I was just like, I have to change.
CORNISH: What followed were daily family meals where, at first, Candace monitored her daughter's plates. And it was at one of those dinners where Candace mentioned an ad she had seen for a law firm that was helping families sue the social media companies.
UNIDENTIFIED FEMALE: The Social Media Victims Law Center has received national attention for its efforts to hold companies accountable for causing these harms.
CORNISH: She was stunned that there were families blaming social media for things like eating disorders instead of themselves.
WUEST: I'm looking at it thinking, wow, here we go again with this unaccountability that seems to be taking over the whole country.
CORNISH (on-camera): You're the parent that might be watching this, right, who would say this is BS.
WUEST: Right.
CORNISH: This parent should have taken away --
WUEST: I get that parent 100 percent. Here in Cincinnati, conservative Christian, that's the mentality. It's my fault. Of course.
CORNISH (voice-over): But that night at the kitchen table, she heard a different take from her daughter.
WUEST: I said, Cecilia, can you believe this? This place is actually trying to sue these social media forums and saying it's their fault. I'm like, isn't that ridiculous? She said, it's not ridiculous. So you think it's good that they're going to come after these companies and she said absolutely, I think it's about damn time. She flat-out told me that, yes, she was pushed -- like material that taught her how to hide the food, how to get rid of calories on a certain level. How to purge. How to purge quietly in a way that your parents won't know.
CORNISH: Did you not realize all those years when you were doing treatment where she was getting that information?
WUEST: No. No. And that makes me just plum stupid, I think. You know, because, honestly, Audie, I thought we had things in place through our government that would protect children on this level. I didn't think that was legal. I went online to do my own research to find out what was going on when Cecilia said it did affect her. And this is some of the things that I saw, images of thigh gaps and bridge gaps.
And then I get to one where somebody had wrote fat, had scraped with a knife, fat, across their stomach. Do you know what my daughter did when she got home from residential in North Carolina after being there for three months, eight hours away from me? That's what she did. She carved fat on her stomach. You know, at that minute, like my mama bear, I was just like, oh, my gosh, this is the most evil shit I've ever seen in my life.
CORNISH (voice-over): So Candace and CeCe decided to sue Instagram. Toni Cabral and her son Mark heard about the law firm on the radio.
UNIDENTIFIED FEMALE: I looked at it, I said, why, that's you. I said write the number down.
CORNISH: They're now suing Instagram and TikTok. And in Connecticut, Tammy Rodriguez came across the idea of suing the social media companies on social media.
RODRIGUEZ: It felt like they were just telling her a story.
[20:20:01]
CORNISH: And she is now suing Instagram, Snapchat and TikTok.
MATTHEW BERGMAN, ATTORNEY: We really are trying to rectify an injustice. We have moral force.
CORNISH: Attorney Matthew Bergman founded the firm that represents all three families.
BERGMAN: I think that's kind of the story of what we're embarking on.
CORNISH: He says he opened it to take on the social media companies after he heard Facebook whistleblower Frances Haugen testify on Capitol Hill.
FRANCES HAUGEN, FACEBOOK WHISTLEBLOWER: Facebook's internal research is aware that there are a variety of problems facing children on Instagram. They know that severe harm is happening to children.
CORNISH: She says the company allows the harmful content to continue because it values engagement time, which means money, over safety. And she gave Congress these documents that show that Meta knew social comparison, or determining one's worth by comparing oneself to others, is a problem on Instagram, and that the app makes one in five teens feel worse about themselves.
Haugen also testified that the company knows that benign searches connected to those issues can quickly lead to harmful content.
HAUGEN: Facebook knows that its amplification algorithms, things like engagement-based ranking on Instagram, can lead children all the way from just something innocent like healthy recipes to anorexia-promoting content over a very short period of time.
CORNISH: In a statement at the time, CEO Mark Zuckerberg said Haugen's testimony painted a false picture of the company and that the idea that the company would deliberately push upsetting content for profit is deeply illogical. He also wrote that it's very important to him that everything the company builds is safe and good for kids.
BERGMAN: We have about 1700 clients. They're children that took their life after being bullied online. We also have cases involving children that have developed eating disorders, children that have developed depression or children that had been sexually abused and exploited online.
CORNISH (on-camera): That's a lot of different things to kind of put under the umbrella of harm done by these companies. Can you talk about what you see as connecting all of these different issues?
BERGMAN: Yes. The algorithms are designed to maintain and enhance engagement. And so over time, they're showing kids not what they want to see, but what they can't look away from. So that's one of the aspects that is unified.
CORNISH (voice-over): Joe Toscano is a former Silicon Valley designer turned activist.
JOE TOSCANO, FORMER SILICON VALLEY DESIGNER: An algorithm at its face is just a string of math equations put together to take a piece of data, do certain things to it, and give us an output. You can call it instructions.
CORNISH: Instructions that use the information it gleans from all your prior clicks, likes and comments, to tailor your feed to your wants so you spend more time on the app.
TOSCANO: They are companies built around advertising dollars. How do I sell an advertisement for more money? You know? You got to have either more users or you got to have people looking at content longer, or you got to have both.
CORNISH (on-camera): But why does it have to be negative?
TOSCANO: Research shows fear, uncertainty, doubt is what sells the best. It's what gets people to click the most.
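Toscano's description of engagement-based ranking (score content by a user's past clicks, likes and comments, then surface whatever is predicted to hold their attention) can be sketched in a few lines of code. The sketch below is a minimal, hypothetical illustration; the weights, field names and example data are assumptions, not any platform's actual system.

```python
# Minimal sketch of engagement-based ranking as Toscano describes it:
# score each post by how heavily the user has already interacted with its topic,
# using only their past clicks, likes and comments. All weights are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

@dataclass
class UserHistory:
    clicks: dict    # topic -> number of past clicks
    likes: dict     # topic -> number of past likes
    comments: dict  # topic -> number of past comments

def engagement_score(post: Post, history: UserHistory) -> float:
    """Predicted engagement: a weighted sum of past interactions with this topic."""
    return (1.0 * history.clicks.get(post.topic, 0)
            + 2.0 * history.likes.get(post.topic, 0)
            + 3.0 * history.comments.get(post.topic, 0))

def rank_feed(posts: list, history: UserHistory) -> list:
    """Order the feed so the content the user engages with most comes first."""
    return sorted(posts, key=lambda p: engagement_score(p, history), reverse=True)

# Example: a user whose history is dominated by weight-loss content
history = UserHistory(
    clicks={"weight_loss": 40, "art": 5},
    likes={"weight_loss": 12, "art": 2},
    comments={"weight_loss": 3},
)
feed = rank_feed([Post("a", "art"), Post("b", "weight_loss"), Post("c", "news")], history)
print([p.post_id for p in feed])  # weight-loss content rises to the top
```

The point of the sketch is the feedback loop Haugen and Bergman describe: whatever a user has already engaged with most is exactly what the ranking keeps pushing back to the top of the feed.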
CORNISH (voice-over): Up next, a high court ruling that could upend the fight started by Tammy and Destiny, Candace and CeCe, and Toni and Mark.
(COMMERCIAL BREAK)
[20:28:16]
BERGMAN: There's the court. We're walking down -- let's see -- Second and East Capitol.
WUEST: Squirrel.
CORNISH: In February, the Supreme Court heard a case that would have a big impact on the fight being waged by people like CeCe, Candace and Tammy.
BERGMAN: Single file.
CORNISH: Against the big social media companies.
BERGMAN: Let's all of us line up.
CORNISH: Attorney Matt Bergman hired line holders to save coveted spots for public tickets so his clients could get inside to hear the arguments.
BERGMAN: Tammy has been a real hero.
CORNISH: At issue, a 27-year-old law called Section 230.
SEN. RICHARD BLUMENTHAL (D-CT): It dates from a time when tech or the internet was nascent, vulnerable to action that could block it, shut it down entirely.
CORNISH: Courts have held that Section 230 shields internet companies from most of the responsibility for what users post on their sites.
BLUMENTHAL: Section 230 is a virtually complete shield of legal immunity that says you can't take action against a big tech company under almost all circumstances.
CORNISH: The case was "Gonzalez v. Google." The family of a young woman killed in the 2015 terrorist attack in Paris alleges that Google, owner of YouTube, designed an algorithm that recommended ISIS videos to users effectively helping to spread their terror message.
JIM SCIUTTO, CNN ANCHOR: Today, the Supreme Court is hearing a pivotal case that could profoundly change the way you and I use the internet.
WOLF BLITZER, CNN ANCHOR: Fundamentally disrupt the internet.
UNIDENTIFIED FEMALE: Reshape the internet as we know it.
CORNISH: Essentially the court was being asked to decide if Section 230 should still shield tech companies from liability or if the algorithms created to promote and recommend third-party content to users are a kind of editorial decision, speech they can be held accountable for.
[20:30:14]
(On-camera): So Meta, which is the parent company of Facebook and Instagram, they filed a friend of the court brief in support of Google, and in it Meta's lawyers argue that, quote, "so-called targeted recommendations reflect nothing more than how online services organize and display content." They're arguing that these algorithms, that you say cause harm, that they should get rid of, they're actually necessary to organize and navigate the internet itself.
BERGMAN: There are good algorithms and there are bad algorithms. There are safe algorithms --
CORNISH: I think they argue that there's no such thing. Right? That the algorithm does its work which is organize information and return information that you've requested, right?
BERGMAN: You can organize information algorithmically and not subject children to materials they don't want to hear. They hide behind Section 230 to not make the kind of decisions that every other well-run mature company in America makes, which is, gee, we could make this product, it would be a little bit more dangerous, and we'd hurt some people and we're not going to do it because we might be sued.
CORNISH (voice-over): In February, the fate of 230 was still up to the Supreme Court which raised the question, what would happen to Bergman's cases if it remained the law of the land?
BERGMAN: Algorithmic recommendations are one of the things that make social media products unsafe for kids. But not the only thing.
CORNISH (on-camera): So your case doesn't rest on that alone?
BERGMAN: No, no. There was a reason why I brought Tammy and other families there. It is going to matter. But I'm not going to give up.
CORNISH (voice-over): As for Tammy's case, the defendants, Meta, Snap, TikTok and its parent company ByteDance, have filed to dismiss it, citing Section 230 of the Communications Decency Act. Their brief says it bars all of Tammy's claims, which are fundamentally based on third-party content.
For now, the case rolls on. It's been bundled together with several other similar lawsuits against the social media giants, many of them represented by Bergman, at least for the discovery phase of the case. Tammy's lawsuit alleges that by mid-2021, Selena was communicating with over 2500 different individuals, all but a handful of whom were complete strangers to her, and many of whom were adult users of defendants' products.
BERGMAN: In Selena's case, all we have is one of her Instagram accounts and we don't yet have her Snapchat and TikTok accounts. I shudder to think what we're going to see when we get the rest of it.
CORNISH: Tammy's lawsuit describes multiple incidents when Selena exchanged sexually explicit images with users. In June, the lawsuit describes three instances where Selena sent sexually explicit material to users with whom she was communicating. At the same time, according to the suit, users were threatening to expose sexually explicit images unless she sent more.
In an Instagram DM given to CNN by her attorney, she asks one user whether he wants to expose her. "Do you want to expose my nudes," the 11-year-old types. "Answer me." In another Instagram DM, she tells the user she wants to kill herself and has a lot of depression. The user replies, you ain't killing yourself. And then proceeds to tell her he has her mom's Insta and says, don't make me leak you.
(On-camera): You know, your complaint says that Selena was bullied. Who bullied her and how does that connect to the social media aspect of this?
BERGMAN: Well, because she could be direct messaged and she was direct messaged by predatory adults that contacted her, that harangued her. She was insecure and young, and they convinced her to send photos of herself naked to predatory adults, and there were no guardrails. Any stranger could get in touch with her.
CORNISH (voice-over): At the time, the default setting on Instagram for minor users was public. It's not anymore. By mid-May of 2021, apparently someone leaked nude photos of Selena, and the tween, who had already been receiving mental health treatment, spiraled.
RODRIGUEZ: Something happened. I don't really know what it was. Just one day, and she ended up saying to one of her friends that she was going to kill herself. So her friend turned and told her teacher that Selena had told her this.
CORNISH: The police were called. Selena was taken to the ER and spent a short time in the hospital, and she was clearly struggling outside of the apps.
(On-camera): Why do you think this falls on the companies?
BERGMAN: The products are explicitly designed to evade parental responsibility. They make it difficult by design and encourage kids to open multiple accounts and evade parental oversight.
[20:35:06]
CORNISH: You know --
BERGMAN: Yes.
CORNISH: I don't think it does, right? So let's just --
BERGMAN: OK.
CORNISH: No one is making you open more than one social media account. You may want another social media account.
BERGMAN: Right.
CORNISH: No one is making you download more than one app. No one is making anyone do these things. So what is your argument that the companies are at fault? Help us understand that.
BERGMAN: Well, the companies are at fault because we know that young people don't make good decisions.
CORNISH (voice-over): Bergman is saying this because he's not just counting on whistleblower documents and personal stories. He's also looking at a growing body of research about kids' brains. More on that next.
(COMMERCIAL BREAK)
[20:40:36]
CORNISH: It's been three months since we first met Mark and his mom Toni Cabral.
UNIDENTIFIED MALE: It's hard to want to get better sometimes.
CORNISH: Mark has completed a residential program his mom found for him and is home again. But he's still struggling with his eating disorder.
UNIDENTIFIED MALE: I just want to look a certain way. But I feel like that certain way is going to kill me in the end.
UNIDENTIFIED FEMALE: It will.
CORNISH: And with disconnecting from his phone.
(On-camera): At this point, do you still have a bunch of social media accounts?
UNIDENTIFIED MALE: Yes. I have to be honest. I do.
CORNISH: This is happening as your family is essentially suing the tech companies.
UNIDENTIFIED MALE: Yes. I try not to use them as much and I try to look at more recovery-focused stuff. But it's hard to live without it because it's so addicting being on it.
CORNISH (voice-over): The researchers don't go quite that far.
DR. MITCH PRINSTEIN, CHIEF SCIENCE OFFICER, AMERICAN PSYCHOLOGICAL ASSOCIATION: Scientists have been staying cautiously away from the word addiction and instead talking about problematic social media use, which is a way of describing the sense that kids are having a hard time logging off even when they want to.
CORNISH (on-camera): Why stay away from addiction then?
PRINSTEIN: Addiction by definition is referring to dependency on a behavior that is always going to be harmful and negative. But social media and the social interactions that it facilitates can sometimes be helpful.
CECE: And here's me when I was like probably 6. I was quite the diva.
CORNISH (voice-over): CeCe too has found it difficult to fully disconnect.
(On-camera): Do you still have social media accounts?
CECE: I don't have Instagram but I have Snapchat and Facebook. You almost feel like you have to have it in order to be included.
CORNISH (voice-over): Eight out of 10 American teens say social media makes them feel more connected to their friends' lives. About a third admit to spending too much time on the apps and more than half say it would be hard to give up.
The long-term effects of all this usage on teens may not be known for years. But scientists like Mitch Prinstein and Eva Telzer at the University of North Carolina Chapel Hill are trying to figure it out.
(On-camera): Can I play the game? I check my phone a lot.
(Voice-over): For three years, Professor Telzer and her team had about 175 children, starting from age 12, play this game once a year while inside an MRI machine.
DR. EVA TELZER, DIRECTOR, DEVELOPMENTAL SOCIAL NEUROSCIENCE, UNC CHAPEL HILL: We're taking pictures of their brain while they play this game.
CORNISH: The shape game is designed to mimic peer feedback on the social media apps. I'm clicking as shapes appear.
(On-camera): Hey, happy face.
(Voice-over): The faces I'm shown depend on how quickly I react.
TELZER: Peer faces are a very salient cue for children and adolescents, and smiling faces is a reward. Scowling faces is a punishment. We're looking at brain activation right before they get that feedback. So, like, as you anticipate maybe getting a smile from your peer or a scowl from your peer, what's happening in the brain?
CORNISH: What's happening, she discovered, is that the kids in her test group, the ones who check their social media habitually, 15 times per day or more, had more activity in the regions of their brains that process emotional reality.
TELZER: It's suggesting that this peer feedback is becoming more and more important to these adolescents. They're becoming hypersensitive to seeing and anticipating positive and negative feedback from their peers.
CORNISH: Let's underscore what she just said. The kids who check their phones frequently seemed to become more sensitive to what their peers were saying and doing.
TELZER: It's hard to say exactly what this might mean. The brain may be adapting and changing in a way that helps them to navigate their digital worlds. It could also be that these brain regions are becoming so hypersensitive to that peer feedback that it could potentially be related to downstream mental health, like social anxiety or depression, or even more addictive social media behaviors.
CORNISH (on-camera): And to be clear, you're saying could and may because you're not sure yet.
TELZER: We don't know.
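The shape game Telzer describes (respond to a shape, then receive a smiling or scowling face depending on how quickly you reacted) can be sketched as a simple simulation of the task logic. The threshold, shapes and simulated reaction times below are assumptions for illustration, not the study's actual design.

```python
# Hypothetical sketch of the peer-feedback "shape game" logic Telzer describes:
# the participant responds to a shape, and the speed of the response determines
# whether they see a smiling face (reward) or a scowling face (punishment).
# The 0.5-second cutoff is an assumed value for illustration only.

import random
import time

REACTION_THRESHOLD_S = 0.5  # assumed cutoff separating "fast" from "slow" responses

def run_trial() -> str:
    """Show a shape, time a (simulated) response, return the feedback face."""
    shape = random.choice(["circle", "square", "triangle"])
    print(f"Shape appears: {shape} -- respond!")
    start = time.monotonic()
    simulated_reaction = random.uniform(0.2, 0.9)  # stand-in for a real key press
    time.sleep(simulated_reaction)
    reaction_time = time.monotonic() - start
    # Fast responses earn the social "reward," slow ones the "punishment."
    feedback = "smiling face" if reaction_time <= REACTION_THRESHOLD_S else "scowling face"
    print(f"Reaction time {reaction_time:.2f}s -> {feedback}")
    return feedback

if __name__ == "__main__":
    results = [run_trial() for _ in range(5)]
    print("Feedback across trials:", results)
```

In the study itself, the measurement of interest is not the game outcome but the brain activity recorded in the moment of anticipation, right before the face appears.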
CORNISH (voice-over): Social media is relatively new and the studies of its effect on children are ongoing.
PRINSTEIN: It's sometimes really hard to live with the ambiguity of where we are scientifically.
[20:45:04]
We can't say that social media is all good and we can't say that it's all bad.
CORNISH: Despite the scientific uncertainty, there's an increasing willingness to blame big tech for social media addiction and for the dramatic rise in depression and suicide among teens that began before the pandemic and, according to the U.S. surgeon general, has gotten worse since then.
In addition to the families who are suing, there are now several school districts that have filed against big tech, in Seattle, in Northern California, and outside Philadelphia.
MICHAEL SMERCONISH, CNN ANCHOR: The Bucks County D.A. compared the effect of social media on kids to opioid manufacturers and distributors.
CORNISH: Snap, TikTok and Meta declined to comment directly on the litigation but all stressed commitments to safety and well-being. Many state legislatures have started to weigh in, too. Utah's new law mandates genuine age verification and parental consent for minors. It's supposed to take effect next year. But there are expected to be legal challenges to try to stop it.
The industry is already challenging California's age appropriate designed bill passed last year after impassioned testimony from an 18- year-old named Emi Kim, seen here in the black mass.
EMI KIM, DIRECTOR OF LEGISLATIVE EFFORTS, LOG OFF MOVEMENT: Young people have suffered greatly, emotionally and mentally. The bill itself is designed to make sure that the way that a child interacts with these apps, it is designed with the highest privacy and safety settings available, so things like 24-hour scroll would be disabled, things like letting an adult DM a minor if the minor does not follow the adult back. Turning off location.
CORNISH: Emi is now a college student and occasional lobbyist at the state capitol, unafraid to tell her own cyberbullying story.
KIM: I started searching up my classmates' profiles and scrolled down and noticed that every once in a while on multiple kids' profiles were pictures that kids had secretly taken of me. And so there were pictures calling me fat, calling me ugly, calling me stupid, saying that I was this, that and everything else.
CORNISH: She says she was cyberbullied three separate times. The last time as a freshman in high school.
KIM: I want to make sure you guys have this. It's your general fact sheet.
CORNISH: Emi and these other young women are now pushing for another California bill that would forbid social media platforms from knowingly designing addictive algorithms or facilitating gun and drug sales among other things.
KIM: I want to ask that you support this because this is the kind of protection that 13-year-old me needed.
CORNISH: Ahead --
CLARE DUFFY, CNN BUSINESS WRITER: I put your birthday in there.
CORNISH: Are the apps getting safer for children? We went online to check it out.
(COMMERCIAL BREAK)
[20:52:07]
CORNISH: On a beautiful morning in late May, two months after the Supreme Court first heard Gonzalez versus Google, we checked in again with Candace and CeCe.
(On-camera): Hey, we're back. We're back at the Capitol.
(Voice-over): Days earlier the high court avoided ruling on whether algorithms are protected by Section 230 and sent the case back to the lower court for reconsideration. The law remains unchanged.
(On-camera): If the highest court in the land kind of declines to get involved, it's a deterrent. But you're not ending your lawsuit.
WUEST: Absolutely not. We're just getting started.
CORNISH (voice-over): Despite her optimism, in late June, about a month after the Supreme Court had left Section 230 intact, the social media companies filed a motion to dismiss the family's case and scores of others, citing Section 230. But mother and daughter are pushing on and believe it's a fight that should be waged on two fronts, in the courts and in Congress.
CECE: I just think there should be more restrictions on social media and I think it should be a law.
CORNISH (on-camera): Is there any particular rule change you think the company should be forced to do?
WUEST: Stop pushing these algorithms on kids. That's messed up. You can't do that.
CORNISH (voice-over): Democratic Senator Richard Blumenthal of Connecticut agrees, and he has a bill.
BLUMENTHAL: The Kids Online Safety Act says you need to design your products so they don't cause these harms. Your product designs are addictive in their effects and you can stop it.
CORNISH: Blumenthal recently reintroduced the bipartisan legislation with Republican Marsha Blackburn of Tennessee after Congress failed to act on it last session.
BLUMENTHAL: There are a couple of very important elements. First of all, give kids more options to disconnect from the algorithm. Second, make the algorithms more transparent. Give parents more control and a reporting mechanism when they see harm.
CORNISH: The bill is one of several floating around Capitol Hill. Another batch of lawmakers want parents to have to give consent and social media companies to do age verification, but there's no industry standard technology for how to do it, and there are also major concerns about the privacy problems that could result from either of these bills.
(On-camera): Are you feeling like anything will actually pass?
KARA SWISHER, VETERAN TECH JOURNALIST: No.
CORNISH: OK, I didn't even get the question out.
SWISHER: Yes.
CORNISH: Definitely not. Why?
SWISHER: Because they're incompetent to the task. This is 25 years in. There's no legislation. Like -- and the one that's there helps them avoid liability in every sense.
CORNISH (voice-over): Until Congress can pass something, there will be a patchwork of reforms forced by individual states, along with changes the companies made following publicity on the issue.
DUFFY: One of the things that all of the companies, Snapchat, Instagram and TikTok, have done is introduced this sort of family center feature where parents can link their accounts to their teen's accounts and get a bit of supervision.
[20:55:05]
The challenge is that all of this relies on your child being honest with you that those are their only platforms.
CORNISH: Clare Duffy covers the social media companies for CNN.
DUFFY: All right. Let's do it.
CORNISH: A few months ago she created an account with a teen to report on what TikTok was serving to younger users.
DUFFY: Put your birthday in there.
CORNISH: We asked her to do it again for us on both TikTok and Instagram.
UNIDENTIFIED MALE: Start watching?
DUFFY: Yes. Let's do it.
CORNISH: After day one, the teen didn't have access to the account, and for five days Clare spent about 20 minutes on each app per day. Clare did notice some changes. Instagram seemed to be trying to lessen exposure to eating disorder content.
DUFFY: On Instagram, for example, when you search thinspo, which is sort of another one of these coded thin inspiration search terms, it doesn't pop up with any content. Instead you're getting a screen that says here are some resources that you can call if you're struggling. We don't get anything on the tags. And even if we type sort of a partial thing here, it doesn't give us any results. So that's a good sign.
CORNISH: But you can get the same thinspo content on Instagram by typing in other search terms.
DUFFY: You're seeing these pictures of people with really large thigh gaps.
CORNISH: TikTok also gave us warnings when we typed in some words associated with eating disorders.
DUFFY: On TikTok you're often getting that pop-up -- you're more than your weight, here are some resources you can go to -- and then you're still seeing all of the content underneath that.
CORNISH: In a statement, the company told CNN the safety of the TikTok community is of the utmost importance, and more than 40,000 global trust and safety professionals work diligently to protect our community.
UNIDENTIFIED FEMALE: Good morning. It's so nice to see so many people here today in support of suicide awareness and prevention.
CORNISH: But any changes are too late for Selena.
UNIDENTIFIED FEMALE: Good morning. How are you? What team are you registering for?
RODRIGUEZ: Steps for Selena.
UNIDENTIFIED FEMALE: Thank you.
RODRIGUEZ: All right. So mine is going to be white for loss of a child. So Wednesday, July 21st, I went in the kitchen to put something in the fridge and I can still picture that second I pulled the refrigerator door open and I bent down and I saw her legs in the living room, and I just -- I ran over and I turned her over and I started screaming. And they got her in the ambulance, and they brought her to the hospital and they had called the detective to tell us that they couldn't save her.
UNIDENTIFIED MALE: I'd like you to take a good look around you. Look at the sea of people. We would all agree that none of us chose this path, it chose us.
RODRIGUEZ: We hold the social media companies, Facebook, Snapchat and Instagram responsible for Selena's wrongful death. I believe that she would still be here had they not pulled her in.
UNIDENTIFIED FEMALE: We're going to leave the park this way, wrapping around the baseball field behind us.
CORNISH: The social media companies don't comment on ongoing litigation, but Snap tells CNN that, "Nothing is more important to us than the well-being of our community." Meta, the parent company of Facebook and Instagram says, "Our thoughts are with the families represented in these complaints. We want to reassure every parent that we have their interest at heart in the work we're doing to provide teens with safe, supportive experiences online."
WUEST: There you go.
CORNISH (on-camera): How long do you want to keep telling the story and pushing this issue?
CECE: As long as I can in order to make change.
WUEST: One, two, three. Oh, perfect.
SWISHER: I think of all the things that they're worried about, it's this kind of movement. Because look who went down in that way before. Opiates, cigarettes. Started with kids. Started with kids.
CECE: I just want to touch people's lives and kind of, if I can at least save one person that's all that matters. I don't care about the money. I don't care about any of that. I just want to get the word out there.
This is another one I did.
CORNISH (voice-over): She plans to use art to help her do that.
(On-camera): So it's a figure who is holding themselves. It looks feminine.
CECE: Yes. The colors of her skin I feel like kind of encapsulate her outward beauty and her inner beauty. This is beautiful mess because I feel like we all are just beautiful messes.
CORNISH: What I noticed about her is she's not a skeleton.
UNIDENTIFIED MALE: Yes. And I think there's some power to that.
CORNISH (voice-over): There is.
(END VIDEOTAPE)
COOPER: You heard tonight that California may introduce new legislation to further limit adolescents' exposure to potentially harmful social media. It's not alone. State legislatures across the U.S. from New Jersey to Minnesota to Texas also have bills under consideration.
Thanks for watching. I'll see you next Sunday.