[Senate Hearing 118-497]
[From the U.S. Government Publishing Office]
S. Hrg. 118-497
BIG TECH AND THE ONLINE CHILD SEXUAL
EXPLOITATION CRISIS
=======================================================================
HEARING
before the
COMMITTEE ON THE JUDICIARY
UNITED STATES SENATE
ONE HUNDRED EIGHTEENTH CONGRESS
SECOND SESSION
__________
JANUARY 31, 2024
__________
Serial No. J-118-53
__________
Printed for the use of the Committee on the Judiciary
www.judiciary.senate.gov
www.govinfo.gov
______
U.S. GOVERNMENT PUBLISHING OFFICE
57-444 WASHINGTON : 2025
COMMITTEE ON THE JUDICIARY
RICHARD J. DURBIN, Illinois, Chair
SHELDON WHITEHOUSE, Rhode Island
AMY KLOBUCHAR, Minnesota
CHRISTOPHER A. COONS, Delaware
RICHARD BLUMENTHAL, Connecticut
MAZIE K. HIRONO, Hawaii
CORY A. BOOKER, New Jersey
ALEX PADILLA, California
JON OSSOFF, Georgia
PETER WELCH, Vermont
LAPHONZA BUTLER, California

LINDSEY O. GRAHAM, South Carolina, Ranking Member
CHARLES E. GRASSLEY, Iowa
JOHN CORNYN, Texas
MICHAEL S. LEE, Utah
TED CRUZ, Texas
JOSH HAWLEY, Missouri
TOM COTTON, Arkansas
JOHN KENNEDY, Louisiana
THOM TILLIS, North Carolina
MARSHA BLACKBURN, Tennessee
Joseph Zogby, Majority Staff Director
Katherine Nikas, Minority Staff Director
CONTENTS
----------
OPENING STATEMENTS

Durbin, Hon. Richard J.
Graham, Hon. Lindsey O.

WITNESSES

Chew, Shou
    Prepared statement
    Responses to written questions
Citron, Jason
    Prepared statement
    Responses to written questions
Spiegel, Evan
    Prepared statement
    Responses to written questions
Yaccarino, Linda
    Prepared statement
    Responses to written questions
Zuckerberg, Mark
    Prepared statement
    Responses to written questions

APPENDIX

Items submitted for the record
BIG TECH AND THE ONLINE CHILD SEXUAL
EXPLOITATION CRISIS
----------
WEDNESDAY, JANUARY 31, 2024
United States Senate,
Committee on the Judiciary,
Washington, DC.
The Committee met, pursuant to notice, at 10 a.m., in Room
G50, Dirksen Senate Office Building, Hon. Richard J. Durbin,
Chair of the Committee, presiding.
Present: Senators Durbin [presiding], Whitehouse,
Klobuchar, Coons, Blumenthal, Hirono, Booker, Padilla, Ossoff,
Welch, Butler, Graham, Cornyn, Lee, Cruz, Hawley, Cotton,
Kennedy, Tillis, and Blackburn.
OPENING STATEMENT OF HON. RICHARD J. DURBIN,
A U.S. SENATOR FROM THE STATE OF ILLINOIS
Chair Durbin. This meeting of the Senate Judiciary
Committee will come to order. I thank all those in attendance.
I want to preface my remarks by saying that I've been in
Congress for a few years. Senator Graham has as well. If you do
not believe this is an idea whose time has come, take a look at
the turnout here.
Today, the Senate Judiciary Committee will continue its
work on an issue on the mind of most American families: how to
keep our kids safe from sexual exploitation and harm in the
internet age. Online child sexual exploitation includes the use
of online platforms to target and groom children, and the
production and endless distribution of child sexual abuse
material, CSAM, which can haunt victims for their entire lives,
and in some cases, take their lives.
Everyone here will agree this conduct is abhorrent. I'd
like to turn to a brief video to hear directly from the
victims, the survivors, about the impact these crimes have had
on them.
[Video is shown.]
Chair Durbin. Online child sexual exploitation is a crisis
in America. In 2013, the National Center for Missing and
Exploited Children, known as NCMEC, received approximately
1,380 cyber tips per day. By 2023, just 10 years later, the
number of cyber tips had risen to 100,000 reports a day. That's
100,000 daily reports of child sexual abuse material.
In recent years, we've also seen an explosion in so-called
financial sextortion, in which a predator uses a fake
social media account to trick a minor into sending explicit
photos or videos then threatens to release them unless the
victim sends money.
In 2021, NCMEC received a total of 139 reports of
sextortion. 2021. In 2023, through the end of October alone,
this number skyrocketed to more than 22,000. More than a dozen
children have died by suicide after becoming victims of this
crime. This disturbing growth in child sexual exploitation is
driven by one thing: changes in technology.
In 1996, the world's best-selling cell phone was the
Motorola StarTAC. While groundbreaking at the time, the
clamshell-style cell phone wasn't much different from a
traditional phone. It allowed users to make and receive calls,
and even receive text messages, but that was about it. Fast
forward to today, smartphones are in the pockets of seemingly
every man, woman, and teenager on the planet.
Like the StarTAC, today's smartphones allow users to make
and receive calls and texts, but they can also take photos and
videos, support live streaming, and offer countless apps. With
the touch of your finger, that smartphone that can entertain
and inform you can become a back alley where the lives of your
children are damaged and destroyed. These apps have changed the
ways we live, work, and play.
But as investigations have detailed, social media and
messaging apps have also given predators powerful new tools to
sexually exploit children. Your carefully crafted algorithms
can be a more powerful force on the lives of our children than
even the best-intentioned parent.
Discord has been used to groom, abduct, and abuse children.
Meta's Instagram helped connect and promote a network of
pedophiles. Snapchat's disappearing messages have been co-opted
by criminals who financially extort young victims. TikTok has
become a ``platform of choice for predators to access, engage,
and groom children for abuse.'' And the prevalence of CSAM on X
has grown as the company has gutted its trust and safety
workforce.
Today, we'll hear from the CEOs of those companies. They're
not only the tech companies that have contributed to this
crisis; they're responsible for many of the dangers our
children face online. Their design choices, their failures to
adequately invest in trust and safety, their constant pursuit
of engagement and profit over basic safety have all put our
kids and grandkids at risk. Coincidentally, several of these
companies implemented commonsense child safety improvements
within the last week, days before their CEOs would have to
justify their lack of action before this Committee.
But the tech industry alone is not to blame for the
situation we're in. Those of us in Congress need to look in the
mirror. In 1996, the same year the Motorola StarTAC was flying
off shelves, and years before social media went mainstream, we
passed Section 230 of the Communications Decency Act. This law
immunized the then fledgling internet platforms from liability
for user-generated content.
Interestingly, only one other industry in America has
immunity from civil liability. We'll leave that for another
day. For the past 30 years, Section 230 has remained largely
unchanged, allowing Big Tech to grow into the most profitable
industry in the history of capitalism without fear of liability
for unsafe practices. That has to change.
Over the past year, this Committee has unanimously reported
five bills that would finally hold tech companies accountable
for child sexual exploitation on their platforms. Unanimous.
Take a look at the composition and membership of the Senate
Judiciary Committee, and imagine, if you will, whether there's
anything we could agree on unanimously. These five bills were
the object of that agreement. One of these bills is my STOP
CSAM Act.
Critically, it would let victims sue online providers that
promote, or aid and abet online child sexual exploitation, or
that host or store CSAM.
This stand against online child sexual exploitation is
bipartisan and absolutely necessary. Let this hearing be a call
to action that we need to get kids online safety legislation to
the President's desk. I now turn to the Ranking Member, Senator
Graham.
STATEMENT OF HON. LINDSEY O. GRAHAM,
A U.S. SENATOR FROM THE STATE OF SOUTH CAROLINA
Senator Graham. Thank you, Mr. Chairman. The Republicans
will answer the call. All of us. Every one of us is ready to
work with you and our democratic colleagues on this Committee
to prove to the American people that while Washington is
certainly broken, there's a ray of hope, and it is here. It
lies with your children.
After years of working on this issue with you and others,
I've come to conclude the following: social media companies, as
they're currently designed and operate, are dangerous products.
They're destroying lives, threatening democracy itself. These
companies must be reined in, or the worst is yet to come.
Brandon Guffey is a Representative--Republican
Representative from South Carolina in the Rock Hill area. To
all the victims who came and showed us photos of your loved
ones, don't quit. It's working. You're making a difference.
Through you, we'll get to where we need to go so other people
won't have to show a photo of their family. The damage to your
family's been done. Hopefully, we can take your pain and turn
it into something positive so nobody else has to hold up a
sign.
Brandon's son got online with Instagram and was tricked
by a group in Nigeria that put up a young lady posing as his
girlfriend. And as things go at that stage in life, he gave her
some photos--compromising sexual photos--and it turned out that
she was part of an extortion group in Nigeria. They threatened
the young man: if you don't give us money, we're going to
expose these photos.
He gave them money, but it wasn't enough. They kept
threatening, and he killed himself. They threatened Mr. Guffey
and his son. These are bastards by any known definition. Mr.
Zuckerberg, you and the companies before us, I know you don't
mean it to be so, but you have blood on your hands. You have a
product----
[Applause.]
Senator Graham. You have a product that's killing people.
When we had cigarettes killing people, we did something about
it. Maybe not enough. If you're going to talk about guns, we
have the ATF. Nothing here. There's not a damn thing anybody
can do
about it. You can't be sued.
Now, Senators Blumenthal and Blackburn, who've been like the
dynamic duo here, have found emails from your company where
they warned you about this stuff, and you decided not to hire
45 people that could do a better job of policing this. So the
bottom line is you can't be sued. You should be, and these
emails would be great for punitive damages, but the courtroom's
closed to every American abused by the companies in front of
me. Of all the people in America we could give blanket
liability protection to, this would be the last group I would
pick.
[Applause.]
Senator Graham. It is now time to repeal Section 230. This
Committee is made up of ideologically the most different people
you could find. We've come together through your leadership,
Mr. Chairman, to pass five bills to deal with the problem of
exploitation of children. I'll talk about them in depth in a
little bit. The bottom line is, all these bills have met the
same fate. They go nowhere. They leave the Committee and they
die.
Now, there's another approach. What do you do with
dangerous products? You either allow lawsuits, you have
statutory protections to protect consumers, or you have a
commission of sorts to regulate the industry in question; to
take your license away if you have a license, to fine you.
None of that exists here. We live in America, in 2024,
where there is no regulatory body dealing with the most
profitable, biggest companies in the history of the world. They
can't be sued, and there's not one law on the book that's
meaningful protecting the American consumer. Other than that,
we're in a good spot.
So here's what I think's going to happen. I think after
this hearing today, we're going to put a lot of pressure on our
colleagues in Senate leadership, Republican and Democratic, to
let these bills get to the floor for a vote. And I'm going to go
down, starting in a couple of weeks, make unanimous consent
requests to do CSAM, do the EARN IT Act, do your bill, do all of
the bills, and you can be famous. Come and object. I'm going to
give you a chance to be famous.
Now, Elizabeth Warren and Lindsey Graham have almost
nothing in common. I promised her I would say that publicly.
[Laughter.]
The only thing worse than me doing a bill with Elizabeth
Warren is her doing a bill with me. We've sort of put that aside
because Elizabeth and I see an abuse here that needs to be
dealt with.
Senator Durbin and I have different political philosophies,
but I appreciate what you've done on this Committee. You have
been a great partner. To all of my Democratic colleagues, thank
you very, very much.
[Applause.]
Senator Graham. To my Republican colleagues, thank you all
very, very much. Save the applause for when we get a result.
This is all talk right now, but there will come a day if we
keep pressing to get the right answer for the American people.
What is that answer? Accountability.
Now, these products have an upside. You've enriched our
lives in many ways. Mr. Zuckerberg, you created a product I
use. The idea, I think, when you first came out with this, was
to be able to talk to your friends and your family, to have a
place where you could share the good things going on in your
life. And I use it. We all use it.
There's an upside to everything here, but the dark side
hasn't been dealt with. It's now time to deal with the dark
side because people have taken your idea and they have turned
it into a nightmare for the American people. They've turned it
into a nightmare for the world at large.
TikTok, we had a great discussion about how maybe Larry
Ellison through Oracle can protect American data from Chinese
communist influence. But TikTok, your representative in Israel
quit the company because TikTok is being used in a way to
basically destroy the Jewish state. This is not just about
individuals. I worry that in 2024 our democracy will be
attacked again through these platforms by foreign actors. We're
exposed, and AI is just starting.
So to my colleagues, we're here for a reason. This
Committee has a history of being tough, but also doing things
that need to be done. This Committee has risen to the occasion.
There's more that we can do, but to the Members of this
Committee, let's insist that our colleagues rise to the
occasion also. Let's make sure that in the 118th Congress, we
have votes that would fix this problem. All you can do is cast
your vote at the end of the day, but you can urge the system to
require others to cast their vote.
Mr. Chairman, I will continue to work with you and
everybody on this Committee to have a day of reckoning on the
floor of the U.S. Senate. Thank you.
Chair Durbin. Thank you, Senator Graham. Today, we welcome
five witnesses whom I'll introduce now. Jason Citron, the CEO
of Discord Incorporated. Mark Zuckerberg, the founder and CEO
of Meta. Evan Spiegel, the co-founder and CEO of Snap
Incorporated. Shou Chew, the CEO of TikTok, and Linda
Yaccarino, the CEO of X Corporation, formerly known as Twitter.
I will note for the record that Mr. Zuckerberg and Mr. Chew
are appearing voluntarily. I'm disappointed that our other
witnesses did not offer that same degree of cooperation. Mr.
Citron, Mr. Spiegel, and Ms. Yaccarino are here pursuant to
subpoenas, and Mr. Citron only accepted service of his
subpoena after U.S. Marshals were sent to Discord's
headquarters at taxpayers' expense. I hope this is not a sign
of your commitment or lack of commitment to addressing the
serious issue before us.
After I swear in the witnesses, each witness will have 5
minutes to make an opening statement. Then, Senators will ask
questions in an opening round of 7 minutes each. I expect to
take a short break at some point during questioning to allow
the witnesses to stretch their legs. If anyone is in need of a
break at any point, please let my staff know.
Before I turn to the witnesses, I'd also like to take a
moment to acknowledge that this hearing has gathered a lot of
attention, as we expected. We have a large audience today, the
largest I've seen in this room. I want to make clear, as
with other Judiciary Committee hearings, we ask people to
behave appropriately. I know there is high emotion in this
room, for justifiable reasons, but I ask you to please follow
the traditions of the Committee.
That means no standing, shouting, chanting, or applauding
witnesses. Disruptions will not be tolerated. Anyone who does
disrupt the hearing will be asked to leave. The witnesses are
here today to address a serious topic. We want to hear what
they have to say. I thank you for your cooperation. Could all
of the witnesses please stand to be sworn in?
[Witnesses are sworn in.]
Chair Durbin. Let the record reflect that all the witnesses
have answered in the affirmative. Mr. Citron, please proceed
with your opening statement.
STATEMENT OF MR. JASON CITRON, CO-FOUNDER
AND CHIEF EXECUTIVE OFFICER,
DISCORD INCORPORATED, SAN FRANCISCO, CALIFORNIA
Mr. Citron. Good morning.
Chair Durbin. Good morning.
Mr. Citron. My name is Jason Citron, and I am the co-
founder and CEO of Discord. We are an American company with
about 800 employees living and working in 33 States. Today,
Discord has grown to more than 150 million monthly active
users.
Discord is a communications platform where friends hang out
and talk online about shared interests from fantasy sports to
writing music to video games. I've been playing video games
since I was 5 years old, and as a kid, it's how I had fun and
found friendship. Many of my fondest memories are of playing
video games with friends. We built Discord so that anyone could
build friendships playing video games from Minecraft, to
Wordle, and everything in between. Games have always brought us
together, and Discord makes that happen today.
Discord is one of the many services that have
revolutionized how we communicate with each other in the
different moments of our lives; iMessage, Zoom, Gmail, and on
and on. They enrich our lives; create communities; accelerate
commerce, healthcare, and education.
Just like with all technology and tools, there are people
who exploit and abuse our platforms for immoral and illegal
purposes. All of us here on the panel today, and throughout the
tech industry, have a solemn and urgent responsibility to
ensure that everyone who uses our platforms is protected from
these criminals, both online and off.
Discord has a special responsibility to do that because a
lot of our users are young people. More than 60 percent of our
active users are between the ages of 13 and 24. It's why safety
is built into everything we do. It's essential to our mission
and our business, and most of all, this is deeply personal. I'm
a dad with two kids. I want Discord to be a product that they
use and love, and I want them to be safe on Discord. I want
them to be proud of me for helping to bring this product to the
world.
That's why I'm pleased to be here today to discuss the
important topic of the online safety of minors. My written
testimony provides a comprehensive overview of our safety
programs. Here are a few examples of how we protect and empower
young people.
First, we've put our money into safety. The tech sector has
a reputation of larger companies buying smaller ones to
increase user numbers and boost financial results. But the
largest acquisition we've ever made at Discord was a company
called Sentropy. It didn't help us expand our market share or
improve our bottom line. In fact, because it uses AI to help us
identify, ban, and report criminals and bad behavior, it has
actually lowered our user count by getting rid of bad actors.
Second, you've heard of end-to-end encryption that blocks
anyone, including the platform itself, from seeing users'
communications. It's a feature on dozens of platforms but not
on Discord. That's a choice we've made. We don't believe we can
fulfill our safety obligations if the text messages of teens
are fully encrypted because encryption would block our ability
to investigate a serious situation, and when appropriate,
report to law enforcement.
Third, we have a zero-tolerance policy on child sexual
abuse material or CSAM. We scan images uploaded to Discord to
detect and block the sharing of this abhorrent material. We've
also built an innovative tool, Teen Safety Assist, that blocks
explicit images and helps young people easily report unwelcome
conversations. We've also developed a new semantic hashing
technology for detecting novel forms of CSAM, called CLIP, and
we're sharing this technology with other platforms through the
Tech Coalition.
Finally, we recognize that improving online safety requires
all of us to work together. So we partner with nonprofits, law
enforcement, and our tech colleagues to stay ahead of the curve
in protecting young people online. We want to be the platform
that empowers our users to have better online experiences, to
build true connections, genuine friendships, and to have fun.
Senators, I sincerely hope today is the beginning of an
ongoing dialog that results in real improvements in online
safety. I look forward to your questions and to helping the
Committee learn more about Discord.
[The prepared statement of Mr. Citron appears as a
submission for the record.]
Chair Durbin. Thank you, Mr. Citron. Mr. Zuckerberg.
STATEMENT OF MR. MARK ZUCKERBERG,
FOUNDER AND CHIEF EXECUTIVE OFFICER,
META, MENLO PARK, CALIFORNIA
Mr. Zuckerberg. Chairman Durbin, Ranking Member Graham, and
Members of the Committee, every day, teens and young people do
amazing things on our services. These are apps to create new
things, express themselves, explore the world around them, and
feel more connected to the people they care about. Overall,
teens tell us that this is a positive part of their lives, but
some face challenges online, so we work hard to provide parents
and teens support and controls to reduce potential harms.
Being a parent is one of the hardest jobs in the world.
Technology gives us new ways to communicate with our kids and
feel connected to their lives, but it can also make parenting
more complicated, and it's important to me that our services
are positive for everyone who uses them. We are on the side of
parents everywhere working hard to raise their kids.
Over the last 8 years, we've built more than 30 different
tools, resources, and features so that parents can set time
limits for their teens using our apps, see who they're
following, or be notified if they report someone for bullying.
For teens, we've added
nudges to remind them when they've been using Instagram for a
while, or if it's getting late and they should go to sleep, as
well as ways to hide words or people without those people
finding out. We put special restrictions on teen accounts on
Instagram. By default, accounts for under 16s are set to
private, have the most restrictive content settings, and can't
be messaged by adults that they don't follow or people they
aren't connected to.
With so much of our lives spent on mobile devices and
social media, it's important to look into the effects on teen
mental health and well-being. I take this very seriously.
Mental health is a complex issue, and the existing body of
scientific work has not shown a causal link between using
social media and young people having worse mental health
outcomes.
A recent National Academies of Sciences report evaluated
over 300 studies and found that research, ``did not support the
conclusion that social media causes changes in adolescent
mental health at the population level.'' It also suggested that
social media can provide significant positive benefits when
young people use it to express themselves, explore and connect
with others. Still, we're going to continue to monitor the
research and use it to inform our roadmap.
Keeping young people safe online has been a challenge since
the internet began, and as criminals evolve their tactics, we
have to evolve our defenses too. We work closely with law
enforcement to find bad actors and help bring them to justice,
but the difficult reality is that no matter how much we invest
or how effective our tools are, there are always more. There's
always more to learn and more improvements to make, but we
remain ready to work with Members of this Committee, industry,
and parents to make the internet safer for everyone.
I'm proud of the work that our teams do to improve online
child safety on our services and across the entire internet. We
have around 40,000 people overall working on safety and
security, and we've invested more than $20 billion in this
since 2016, including around $5 billion in the last year alone.
We have many teams dedicated to child safety and teen well-
being, and we lead the industry in a lot of the areas that
we're discussing today.
We built technology to tackle the worst online risks and
share it to help our whole industry get better. Like Project
Lantern, which helps companies share data about people who
break child safety rules, and we're founding members of Take It
Down, a platform which helps young people to prevent their nude
images from being spread online.
We also go beyond legal requirements and use sophisticated
technology to proactively discover abusive material, and as a
result, we find and report more inappropriate content than
anyone else in the industry. As the National Center for Missing
and Exploited Children put it this week, ``Meta goes above and
beyond to make sure that there are no portions of their network
where this type of activity occurs.''
I hope we can have a substantive discussion today that
drives improvements across the industry, including legislation
that delivers what parents say they want--a clear system for
age verification, and control over what apps their kids are
using. Three out of four parents want app store age
verification, and four out of five want parental approval
whenever teens download apps. We support this. Parents should
have the final say on what apps are appropriate for their
children, and shouldn't have to upload their ID every time.
That's what app stores are for.
We also support setting industry standards on age-
appropriate content, and limiting signals for advertising to
teens to age and location, not behavior. At the end of
the day, we want everyone who uses our services to have safe
and positive experiences.
Before I wrap up, I want to recognize the families who are
here today who have lost a loved one, or lived through some
terrible things that no family should have to endure. These
issues are important for every parent and every platform. I'm
committed to continuing to work in these areas, and I hope we
can make progress today.
[The prepared statement of Mr. Zuckerberg appears as a
submission for the record.]
Chair Durbin. Thank you. Mr. Spiegel.
STATEMENT OF MR. EVAN SPIEGEL,
CO-FOUNDER AND CHIEF EXECUTIVE OFFICER,
SNAP INCORPORATED, SANTA MONICA, CALIFORNIA
Mr. Spiegel. Chairman Durbin, Ranking Member Graham, and
Members of the Committee, thank you for convening this hearing,
and for moving forward important legislation to protect
children online. I'm Evan Spiegel, the co-founder and CEO of
Snap. We created Snapchat, an online service that is used by
more than 800 million people worldwide to communicate with
their friends and family.
I know that many of you have been working to protect
children online since before Snapchat was created, and we are
grateful for your long-term dedication to this cause, and your
willingness to work together to help keep our community safe. I
want to acknowledge the survivors of online harms and the
families who are here today who have suffered the loss of a
loved one. Words cannot begin to express the profound sorrow I
feel that a service we designed to bring people happiness and
joy has been abused to cause harm.
I want to be clear that we understand our responsibility to
keep our community safe. I also want to recognize the many
families who have worked to raise awareness on these issues,
push for change, and collaborated with lawmakers on important
legislation like the Cooper Davis Act, which can help save
lives.
I started building Snapchat with my co-founder, Bobby
Murphy, when I was 20 years old. We designed Snapchat to solve
some of the problems that we experienced online when we were
teenagers. We didn't have an alternative to social media. That
meant pictures shared online were permanent, public, and
subject to popularity metrics. It didn't feel very good.
We built Snapchat differently because we wanted a new way
to communicate with our friends that was fast, fun, and
private. A picture is worth a thousand words, so people
communicate with images and videos on Snapchat. We don't have
public likes or comments when you share your story with
friends. Snapchat is private by default, meaning that people
need to opt in to add friends and choose who can contact them.
When we built Snapchat, we chose to have the images and videos
sent through our service delete by default.
Unlike prior generations who've enjoyed the privacy
afforded by phone calls which aren't recorded, our generation
has benefited from the ability to share moments through
Snapchat that may not be picture perfect, but instead convey
emotion without permanence. Even though Snapchat messages are
deleted by default, we let everyone know that images and videos
can be saved by the recipient.
When we take action on illegal or potentially harmful
content, we also retain the evidence for an extended period,
which allows us to support law enforcement and hold criminals
accountable. To help prevent the spread of harmful content on
Snapchat, we approve the content that is recommended on our
service using a combination of automated processes and human
review.
We apply our content rules consistently and fairly across
all accounts. We run samples of our enforcement actions through
quality assurance to verify that we're getting it right. We
also proactively scan for known child sexual abuse material,
drug-related content, and other types of harmful content,
remove that content, deactivate and device-block offending
accounts, preserve the evidence for law enforcement and report
certain content to the relevant authorities for further action.
Last year, we made 690,000 reports to the National Center
for Missing and Exploited Children leading to more than 1,000
arrests. We also removed 2.2 million pieces of drug-related
content, and blocked 705,000 associated accounts. Even with our
strict privacy settings, content moderation efforts, proactive
detection, and law enforcement collaboration, bad things can
still happen when people use online services. That's why we
believe that people under the age of 13 are not ready to
communicate on Snapchat.
We strongly encourage parents to use the device-level
parental controls on iPhone and Android. We use them in our own
household, and my wife approves every app that our 13-year-old
downloads. For parents who want more visibility and control, we
built Family Center on Snapchat where you can view who your
teen is talking to, review privacy settings, and set content
limits. We have worked for years with Members of the Committee
on legislation like the Kids Online Safety Act and the Cooper
Davis Act, which we are proud to support.
I want to encourage broader industry support for
legislation protecting children online. No legislation is
perfect, but some rules of the road are better than none. Much
of the work that we do to protect people that use our service
would not be possible without the support of our partners
across the industry, government, nonprofit organizations,
NGOs, and in particular, law enforcement and the first
responders who have committed their lives to helping keep
people safe.
I'm profoundly grateful for the extraordinary efforts
across our country and around the world to prevent criminals
from using online services to perpetrate their crimes. I feel
an overwhelming sense of gratitude for the opportunities that
this country has afforded me and my family. I feel a deep
obligation to give back and to make a positive difference, and
I'm grateful to be here today as part of this vitally important
democratic process.
Members of the Committee, I give you my commitment that
we'll be part of the solution for online safety. We'll be
honest about our shortcomings, and we'll work continuously to
improve. Thank you, and I look forward to answering your
questions.
[The prepared statement of Mr. Spiegel appears as a
submission for the record.]
Chair Durbin. Thank you, Mr. Spiegel. Mr. Chew.
STATEMENT OF MR. SHOU CHEW, CHIEF EXECUTIVE OFFICER,
TIKTOK INCORPORATED, SINGAPORE
Mr. Chew. Chair Durbin, Ranking Member Graham, and Members
of the Committee, I appreciate the opportunity to appear before
you today. My name is Shou Chew, and I'm the CEO of TikTok, an
online community of more than 1 billion people worldwide,
including well over 170 million Americans who use our app every
month to create, to share, and to discover.
Now, although the average age on TikTok in the U.S. is over
30, we recognize that special safeguards are required to
protect minors, and especially, when it comes to combating all
forms of CSAM. As a father of three young children myself, I
know that the issues that we're discussing today are horrific
and the nightmare of every parent. I am proud of our efforts to
address the threats to young people online from a commitment to
protecting them, to our industry leading policies, use of
innovative technology, and significant ongoing investments in
trust and safety to achieve this goal.
TikTok is vigilant about enforcing its 13-and-up age
policy, and offers an experience for teens that is much more
restrictive than you and I would have as adults. We make
careful product design choices to help make our app
inhospitable to those seeking to harm teens. Let me give you a
few examples of long-standing policies that are unique to TikTok.
We didn't do them last week.
First, direct messaging is not available to any users under
the age of 16. Second, accounts for people under 16 are
automatically set to private along with their content.
Furthermore, the content cannot be downloaded and will not be
recommended to people they do not know. Third, every teen under
18 has a screen time limit automatically set to 60 minutes.
And fourth, only people 18 and above are allowed to use our
livestream feature.
I'm proud to say that TikTok was among the first to empower
parents to supervise their teens on our app with our family
pairing tools. These include setting screen time limits and
filtering out content from teens' feeds, among other controls. We
made these choices after consulting with doctors and safety
experts who understand the unique stages of teenage development
to ensure that we have the appropriate safeguards to prevent
harm and minimize risk.
Now, safety is one of the core priorities that defines
TikTok under my leadership. We currently have more than 40,000
trust and safety professionals working to protect our community
globally, and we expect to invest more than $2 billion in trust
and safety efforts this year alone, with a significant part of
that in our U.S. operations. Our robust community guidelines
strictly prohibit content or behavior that puts teenagers at
risk of exploitation or other harm, and we vigorously enforce
them.
Our technology moderates all content uploaded to our app to
help quickly identify potential CSAM and other material that
breaks our rules. It automatically removes the content or
elevates it to our safety professionals for further review. We
also moderate direct messages for CSAM and related material,
and use third-party tools like PhotoDNA and Take It Down to
combat CSAM and prevent such content from being uploaded to our
platform.
We continually meet with parents, teachers, and teens. In
fact, I sat down with a group just a few days ago. We use their
insight to strengthen the protections on our platform, and we
also work with leading groups like the Technology Coalition.
The steps that we're taking to protect teens are a critical
part of our larger trust and safety work as we continue our
voluntary and unprecedented efforts to build a safe and secure
data environment for U.S. users, ensuring that our platform
remains free from outside manipulation and implementing
safeguards on our content recommendation and moderation tools.
Keeping teens safe online requires a collaborative effort
as well as collective action. We share the Committee's concern
and commitment to protect young people online, and we welcome
the opportunity to work with you on legislation to achieve this
goal. Our commitment is ongoing and unwavering because there is
no finish line when it comes to protecting teens.
Thank you for your time and consideration today. I'm happy
to answer your questions.
[The prepared statement of Mr. Chew appears as a submission
for the record.]
Chair Durbin. Thanks, Mr. Chew. Ms. Yaccarino.
STATEMENT OF MS. LINDA YACCARINO, CHIEF EXECUTIVE
OFFICER, X CORP., SAN FRANCISCO, CALIFORNIA
Ms. Yaccarino. Chairman Durbin, Ranking Member Graham, and
esteemed Members of the Committee, thank you for the
opportunity to discuss X's work to protect the safety of minors
online.
Today's hearing is titled a crisis, which calls for
immediate action. As a mother, this is personal, and I share
the sense of urgency. X is an entirely new company, an
indispensable platform for the world and for democracy. You
have my personal commitment that X will be active and a part of
this solution.
While I joined X only in June 2023, I bring a history of
working together with governments, advocates, and NGOs to
harness the power of media to protect people. Before I joined,
I was struck by the leadership steps this new company was
taking to protect children. X is not the platform of choice for
children and teens.
We do not have a line of business dedicated to children.
Children under the age of 13 are not allowed to open an
account. Less than 1 percent of the U.S. users on X are between
the ages of 13 and 17, and those users are automatically set to
a private default setting, and cannot accept a message from
anyone they do not approve.
In the last 14 months, X has made material changes to
protect minors. Our policy is clear, X has zero tolerance
toward any material that features or promotes child sexual
exploitation. My written testimony details X's extensive
policies on content and actions that are prohibited, including
grooming, blackmail, and identifying alleged victims of CSE.
We've also strengthened our enforcement with more tools and
technology to prevent those bad actors from distributing,
searching for, and engaging with CSE content. If CSE content is
posted on X, we remove it, and now we also remove any account
that engages with CSE content, whether it is real or computer
generated.
Last year, X suspended 12.4 million accounts for violating
our CSE policies. This is up from 2.3 million accounts that
were removed by Twitter in 2022. In 2023, 850,000 reports were
sent to NCMEC, including our first ever autogenerated report.
This is eight times more than was reported by Twitter in 2022.
We've changed our priorities. We've restructured our trust
and safety teams to remain strong and agile. We are building a
trust and safety center of excellence in Austin, Texas to bring
more agents in-house to accelerate our impact. We're applying
to the Technology Coalition's Project Lantern to make further
industry-wide progress and impact. We've also opened up our
algorithms for increased transparency. We want America to lead
in this solution.
X commends the Senate for passing the REPORT Act, and we
support the SHIELD Act. It is time for a Federal standard to
criminalize the sharing of nonconsensual intimate material. We
need to raise the standards across the entire internet
ecosystem, especially for those tech companies that are not
here today and not stepping up. X supports the STOP CSAM Act.
The Kids Online Safety Act should continue to progress, and we
will continue to engage with it to ensure it protects freedom
of speech.
There are two additional areas that require everyone's
attention. First, as the daughter of a police officer, I know
law enforcement must have the critical resources to bring these
bad offenders to justice. Second, with artificial intelligence,
offenders' tactics will continue to grow more sophisticated and
evolve.
Industry collaboration is imperative here.
X believes that the freedom of speech and platform safety
can and must coexist. We agree that now is the time to act with
urgency. Thank you. I look forward to answering your questions.
[The prepared statement of Ms. Yaccarino appears as a
submission for the record.]
Chair Durbin. Thank you very much, Ms. Yaccarino. Now we'll
go into rounds of questions. Seven minutes each for the Members
as well. I would like to make note of your testimony, Ms.
Yaccarino; I believe yours is the first social media company to
publicly endorse the STOP CSAM Act.
Ms. Yaccarino. It is our honor, Chairman.
Chair Durbin. That is progress, my friends. Thank you for
doing that. I'm still going to be asking some probing
questions, but let me get down to the bottom line here. I'm
going to focus on my legislation on CSAM. What it says is civil
liability if you intentionally or knowingly host or store child
sexual abuse materials or make child sex abuse materials
available. Second, intentionally or knowingly promote, or aid
and abet a violation of child sexual exploitation laws. Is
there anyone here who believes you should not be held civilly
liable for that type of conduct? Mr. Citron.
Mr. Citron. Good morning, Chair. You know, we very much
believe that this content is disgusting and that there are many
things about the STOP CSAM bill that I think are very
encouraging and we very much support adding more resources for
the CyberTipline and modernizing that along with giving more
resources to NCMEC. And I'd be very open to having
conversations with you and your team to talk through the
details of the bills and more.
Chair Durbin. I sure would like to do that because if you
intentionally or knowingly host or store CSAM, I think you
ought to at least be civilly liable. I can't imagine anyone who
would disagree with that.
Mr. Citron. Yes, it's disgusting content.
Chair Durbin. It certainly is. That's why we need you
supporting this legislation. Mr. Spiegel, I want to tell you, I
listened closely to your testimony here, and it's never been a
secret that Snapchat is used to send sexually explicit images.
In 2013, early in your company's history, you admitted this in
an interview. Do you remember that interview?
Mr. Spiegel. Senator, I don't recall this specific
interview.
Chair Durbin. You said that when you were first trying to
get people on the app, you would, ``go up to the people and be,
like, hey, you should try this application. You can send
disappearing photos. And they would say, oh, for sexting.'' Do
you remember that interview?
Mr. Spiegel. Senator, when we first created the
application, it was actually called Picaboo, and the idea was
around disappearing images. The feedback we received from
people using the app is that they were actually using it to
communicate. So we changed the name of the application to
Snapchat, and we found that people were using it to talk
visually.
Chair Durbin. As early as 2017, law enforcement identified
Snapchat as the pedophile's go-to sexual exploitation tool. The
case of a 12-year-old girl identified in court only as LW shows
the danger. Over 2½ years, a predator sexually groomed her,
sending her sexually explicit images and videos over Snapchat.
The man admitted that he only used Snapchat with LW and not
any other platforms because he, ``knew the chats would go
away.'' Did you and everyone else at Snap really fail to see
that the platform was the perfect tool for sexual predators?
Mr. Spiegel. Senator, that behavior is disgusting and
reprehensible. We provide in-app reporting tools so that people
who are being harassed or who, you know, have been sent
inappropriate sexual content can report it in the case of
harassment or sexual content. We typically respond to those
reports within 15 minutes so that we can provide help.
Chair Durbin. When LW, the victim, sued Snapchat, her case
was dismissed under Section 230 of the Communications Decency
Act. Do you have any doubt that had Snap faced the prospect of
civil liability for facilitating sexual exploitation, the
company would've implemented even better safeguards?
Mr. Spiegel. Senator, we already work extensively to
proactively detect this type of behavior. We make it very
difficult for predators to find teens on Snapchat. There are no
public friends lists, no public profile photos. When we
recommend friends for teens, we make sure that they have
several mutual friends in common before making that
recommendation. We believe those safeguards are important to
preventing predators from misusing our platform.
Chair Durbin. Mr. Citron, according to Discord's website,
it takes, ``a proactive and automated approach to safety only
on servers with more than 200 members. Smaller servers rely on
server owners and community moderators to define and enforce
behavior.''
So how do you defend an approach to safety that relies on
groups of fewer than 200 sexual predators to report themselves
for things like grooming, trading in CSAM, or sextortion?
Mr. Citron. Chair, our goal is to get all of that content
off of our platform, and ideally prevent it from showing up in
the first place, or from people engaging in these kinds of
horrific activities. We deploy a wide array of techniques that
work across every surface on Discord.
I mentioned we recently launched something called Teen
Safety Assist, which works everywhere, and it's on by default
for teen users. It kind of acts like a buddy that lets them
know if they're in a situation or talking with someone that may
be inappropriate, so they can report that to us and block that
user. So we----
Chair Durbin. Mr. Citron, if that were working, we wouldn't
be here today.
Mr. Citron. Chair, this is an ongoing challenge for all of
us. That is why we're here today. But we do have--15
percent of our company is focused on trust and safety, of which
this is one of our top issues. That's more people than we have
working on marketing and promoting the company. So we take
these issues very seriously, but we know it's an ongoing
challenge, and I look forward to working with you and
collaborating with our tech peers and the nonprofits to improve
our approach.
Chair Durbin. I certainly hope so. Mr. Chew, your
organization, your business, is one of the more popular ones among
children. Can you explain to us what you are doing
particularly, and whether you've seen any evidence of CSAM in
your business?
Mr. Chew. Yes, Senator. We have a strong commitment to
invest in trust and safety. And as I said in my opening
statement, I intend to invest more than $2 billion in trust and
safety this year alone. We have 40,000 safety professionals,
you know, working on this topic. We have built a specialized
child safety team to help us identify specialized issues,
horrific issues, like material like the ones you have
mentioned. If we identify any on our platform and we
proactively do detection, we will remove it, and we will report
them to NCMEC and other authorities.
Chair Durbin. Why is TikTok allowing children to be
exploited into performing commercialized sex acts?
Mr. Chew. Senator, I respectfully disagree with that
characterization. Our live streaming product is not for anyone
below the age of 18. We have taken action to identify anyone
who violates that, and we remove them from using that service.
Chair Durbin. At this point, I'm going to turn to my
Ranking Member, Senator Graham.
Senator Graham. Thank you, Mr. Chairman. Mr. Citron, you
said we need to start a discussion. To be honest with you,
we've been having this discussion for a very long time. We need
to get a result, not a discussion. Do you agree with that?
Mr. Citron. Ranking Member, I agree this is an issue that
we've also been very focused on since we started our company in
2015, but this is the first time we----
Senator Graham. Are you familiar with the EARN IT Act,
authored by myself and Senator Blumenthal?
Mr. Citron. A little bit. Yes.
Senator Graham. Okay. Do you support that?
Mr. Citron. We----
Senator Graham. Like, yes or no.
Mr. Citron. We're not prepared to support it today, but we
believe that Section----
Senator Graham. Do you support the CSAM Act?
Mr. Citron. The STOP CSAM Act, we are not prepared to
support it today either.
Senator Graham. Do you support the SHIELD Act?
Mr. Citron. We believe that the CyberTipline----
Senator Graham. Do you support it? Yes, or no?
Mr. Citron. We believe that the CyberTipline and NCMEC----
Senator Graham. I'll take that to be no. The Project Safe
Childhood Act. Do you support it?
Mr. Citron. We believe that----
Senator Graham. I'll take that to be no. The REPORT Act. Do
you support it?
Mr. Citron. Ranking Member Graham, we very much look
forward to having conversations with you and your team----
Senator Graham. We look forward to passing the bill that
will solve the problem. Do you support removing Section 230
liability protections for social media companies?
Mr. Citron. I believe that Section 230 needs to be updated.
It's a very old law.
Senator Graham. Do you support repealing it so people can
sue if they believe they're harmed?
Mr. Citron. I think that Section 230 as written, while it
has many downsides, has enabled innovation on the internet,
which I think has largely been----
Senator Graham. Thank you very much. So here you are. You
got--if you're waiting on these guys to solve the problem,
we're going to die waiting. Mr. Zuckerberg, I'll try to be
respectful here. The Representative from South Carolina, Mr.
Guffey's son, got caught up in a sex extortion ring in Nigeria
using Instagram. He was shaken down, paid money that wasn't
enough, and he killed himself using Instagram. What would you
like to say to him?
Mr. Zuckerberg. It's terrible. I mean, no one should have
to go through something like that.
Senator Graham. You think he should be allowed to sue you?
Mr. Zuckerberg. I think that they can sue us.
Senator Graham. Well, I think you should, and he can't. So
the bottom line here, folks, is that this Committee is done
with talking. We passed five bills unanimously that in their
different ways--and look at who did this. Senators Graham and
Blumenthal, Senators Durbin and Hawley, Senators Klobuchar and
Cornyn, Senators Cornyn and Klobuchar, and Senators Blackburn
and Ossoff. I mean, we've found common ground here that just is
astonishing. And we've had hearing after hearing, Mr. Chairman.
And the bottom line is, I've come to conclude, gentlemen, that
you're not going to support any of this. Linda, how do you say
your last name?
Ms. Yaccarino. Yaccarino.
Senator Graham. Do you support the EARN IT Act?
Ms. Yaccarino. We strongly support the collaboration to
raise industry----
Senator Graham. No, no----
Ms. Yaccarino [continuing]. Practices to prevent CSAM.
Senator Graham [continuing]. No, no. Do you support the
EARN IT Act? In English, do you support the EARN IT Act? Yes,
or no? We don't need double speak here.
Ms. Yaccarino. We look forward to supporting and continue
our conversations. As you can see----
Senator Graham. Okay. So I take that as no. The reason the
EARN IT Act's important is that you can actually lose your
liability protection when children are exploited and you didn't
use best business practices. See, the EARN IT Act
means you have to earn liability protection. You aren't given
it no matter what you do.
So to the Members of this Committee, it is now time to make
sure that the people who are holding up the signs can sue on
behalf of their loved ones. Nothing will change until the
courtroom door is open to victims of social media. $2 billion,
Mr. Chew, what percentage is that of what you made last year?
Mr. Chew. Senator, it's a significant and increasing
investment. As a private company, we're not----
Senator Graham. You pay taxes. I mean, $2 billion is what
percent of your revenue?
Mr. Chew [continuing]. Senator, we're not ready to share
our financials in public.
Senator Graham. Well, I just think $2 billion sounds like a
lot unless you make $100 billion. So the point is, you know, when
you tell us you're going to spend $2 billion, great, but how
much do you make? You know, it's all about eyeballs. Well, our
goal is to get eyeballs on you.
And it's not just about children, I mean, the damage being
done. Do you realize, Mr. Chew, that your TikTok representative
in Israel resigned yesterday?
Mr. Chew. Yes, I'm aware.
Senator Graham. Okay. And he said, ``I resigned from
TikTok. We're living at a time in which our existence as Jews
in Israel, and Israel is under attack and in danger. Multiple
screenshots taken from TikTok's internal employee chat platform
known as Lark, show how TikTok's trust and safety officers
celebrate the barbaric acts of Hamas and other Iranian-backed
terror groups, including Houthis in Yemen.''
Mr. Chew. Senator, I need to make it very clear that pro-
Hamas content and hate speech is not allowed at all----
Senator Graham. Why did----
Mr. Chew [continuing]. In our company.
Senator Graham [continuing]. He resign--why did he resign?
Why did he quit?
Mr. Chew. Senator, we also do not allow any people----
Senator Graham. Do you know why he quit?
Mr. Chew. We do not allow this. We will investigate such
claims----
Senator Graham. But my question is, he quit. I'm sure he
had a good job. He gave up a good job because he thinks your
platform is being used to help people who want to destroy the
Jewish state, and I'm not saying you want that. Mr. Zuckerberg,
I'm not saying you want, as an individual, any of the harms. I
am saying that the products you have created with all the
upside have a dark side.
Mr. Citron, I am tired of talking. I'm tired of having
discussions. We all know the answer here, and here's the
ultimate answer: stand behind your product. Go to the American
courtroom and defend your practices. Open up the courthouse
door. Until you do that, nothing will change.
Until these people can be sued for the damage they're
doing, it is all talk. I'm a Republican who believes in free
enterprise, but I also believe that every American who's been
wronged has to have somebody to go to, to complain. There's no
commission to go to that can punish you. There's not one law in
the book because you oppose everything we do and you can't be
sued. That has to stop, folks.
How do you expect the people in the audience to believe
that we're going to help their families if we don't have some
system or a combination of systems to hold these people
accountable? Because for all the upside, the dark side is too
great to live with. We do not need to live this way as
Americans.
Chair Durbin. Thank you, Senator Graham. Senator Klobuchar
is next. She's been quite a leader on the subject for quite a
long time on the SHIELD Act and with Senator Cornyn on the
revenge porn legislation. Senator Klobuchar.
Senator Klobuchar. Thank you very much, Chairman Durbin,
and thank you Ranking Member Graham for those words. I couldn't
agree more. For too long we have been seeing the social media
companies turn a blind eye when kids have joined these
platforms in record numbers.
They have used algorithms that push harmful content because
that content got popular. They provided a venue, maybe not
knowingly at first, but for dealers to sell deadly drugs like
fentanyl. The head of our own Drug Enforcement Administration
has said they basically have been captured by the cartels in
Mexico and in China.
So I strongly support, first of all, the STOP CSAM bill. I
agree with Senator Graham that nothing is going to change
unless we open up the courtroom doors. I think the time for all
of this immunity is done, because I think money talks even
stronger than we talk up here.
Two of the five bills, as noted, are my bills with Senator
Cornyn. One has actually passed through the Senate, but is
awaiting action in the House. But the other one is the SHIELD
Act, and I do appreciate those supportive of that bill.
This is about revenge porn. The FBI Director testified before
this Committee that there have been over 20 suicides of kids
attributed to online revenge porn in just the last year.
But for those parents out there and those families, this is
for them, about their own child, but it's also about making
sure this doesn't happen to other children. I know because I've
talked to these parents. Parents like Bridget Norring from
Hastings, Minnesota, who is out there today. Bridget lost her
teenage son after he took a fentanyl-laced pill that he
purchased on the internet. Amy Neville is also here. Her son,
Alexander, was
only 14 when he died after taking a pill he didn't know was
actually fentanyl.
We're starting a law enforcement campaign, ``One pill
kills,'' in Minnesota, going to the schools with the sheriffs
and law enforcement. But the way to stop it is, yes, at the
border and at the points of entry, but we know that some 30
percent of the people that are getting the fentanyl are getting
it off the platforms.
Meanwhile, social media platforms generated $11 billion in
revenue in 2022 from advertising directed at children and
teenagers, including nearly $2 billion in ad profits derived
from users age 12 and under. When a Boeing plane lost a door in
mid-flight several weeks ago, nobody questioned the decision to
ground a fleet of over 700 planes. So why aren't we taking the
same type of decisive action on the danger of these platforms
when we know these kids are dying?
We have bills----
[Applause.]
Senator Klobuchar [continuing]. That have passed through
this Committee, which is incredibly diverse when it comes to our
political views, and
they should go to the floor. We should do something finally
about liability, and then we should turn to some of the other
issues that a number of us have worked on when it comes to the
charges for app stores, and when it comes to some of the
monopoly behavior and the self-preferencing. But I'm going to
stick with this today.
Facts: one-third of fentanyl cases investigated over 5
months had direct ties to social media. That's from the DEA.
Facts: between 2012 and 2022, CyberTipline reports of online
child sexual exploitation increased from 415,000 to more than
32 million. And as I noted, at least 20 victims committed
suicide in sextortion cases.
So, I'm going to start with that with you, Mr. Citron. My
bill with Senator Cornyn, the SHIELD Act, includes a threat
provision that would help provide protection and accountability for
those that are threatened by these predators. Young kids get a
picture, send it in, think they got a new girlfriend, or a new
boyfriend, ruins their life or they think it's going to be
ruined, and they kill themselves. So could you tell me why
you're not supporting the SHIELD Act?
Mr. Citron. Senator, we think it's very important that
teens have a safe experience on our platforms. I think that the
portion of the bill to strengthen law enforcement's ability to
investigate crimes against children and hold bad actors
accountable is incredible.
Senator Klobuchar. So are you holding open that you may
support it?
Mr. Citron. We very much would like to have conversations
with you. We're open to discussing further, and we do welcome
legislation and regulation. You know, this is a very important
issue for our country, and you know, we've been prioritizing
safety for----
Senator Klobuchar. Okay, thank you.
Mr. Citron [continuing]. Teens----
Senator Klobuchar. I'm much more interested in if you
support it because there's been so much talk at these hearings,
and popcorn throwing, and the like, and I just want to get this
stuff done. I'm so tired of this. It's been 28 years, what,
since the internet--we haven't passed any of these bills
because everyone's double-talk, double-talk. It's time to
actually pass them. And the reason they haven't passed is
because of the power of your company. So let's be really,
really clear about that. So what you say matters. Your words
matter.
Mr. Chew, I'm a co-sponsor of Chair Durbin's STOP CSAM Act
of 2023, along with Senator Hawley, who's the lead Republican,
I believe, which, among other things, empowers victims by
making it easier for them to ask tech companies to remove the
material and related imagery from their platforms. Why would
you not support this bill?
Mr. Chew. Senator, we largely support it. I think the
spirit of it is very aligned with what we want to do. There are
questions about implementation that I think companies like us
and some other groups have, and we look forward to asking those
questions. And of course, if this legislation becomes law, we
will comply.
Senator Klobuchar. Mr. Spiegel, I know we talked ahead of
time. I do appreciate your company's support for the Cooper
Davis Act, which will finally--it's a bill with Senators Shaheen
and Marshall, which will allow law enforcement to do more when
it comes to fentanyl. I think you know what a problem this is.
Devin Norring, a teenager from Hastings--I mentioned his mom is
here--suffered dental pain and migraines. So he bought what he
thought was a Percocet over Snap, but instead he bought a
counterfeit drug laced with a lethal dose of fentanyl.
As his mom, who is here with us today said, ``All of the
hopes and dreams we as parents had for Devin were erased in the
blink of an eye, and no mom should have to bury their kid.''
Talk about why you support the Cooper Davis Act.
Mr. Spiegel. Senator, thank you. We strongly support the
Cooper Davis Act, and we believe it will help the DEA go after
the cartels and get more dealers off the streets to save more
lives.
Senator Klobuchar. Okay. Are there others that support that
bill on this? No? Okay. Last, Mr. Zuckerberg. In 2021, The Wall
Street Journal reported on internal Meta research documents
asking, ``Why do we care about tweens?'' These were internal
documents. I'm quoting the documents. And answering its own
question by citing Meta internal emails, ``They are a valuable
but untapped audience.''
At a commerce hearing, I'm also on that Committee, I asked
Meta's head of global safety why children age 10 to 12 are so
valuable to Meta. She responded, ``We do not knowingly attempt
to recruit people who aren't old enough to use our apps.''
Well, when the 42 State attorneys general, Democrat and
Republican, brought their case, they said this statement was
inaccurate.
A few examples. In 2021, she--Ms. Davis--received an email
from Instagram's research director saying that Instagram is
investing in experiences targeting young ages, roughly 10 to
12. In a February 2021 instant message, one of your employees
wrote that Meta is working to recruit Gen Alpha before they
reach teenage years. A 2018 email that circulated inside Meta
says that you were briefed that children under 13 will be
critical for increasing the rate of acquisition when users turn
13.
Explain how that squares with what I heard in that testimony
at the commerce hearing, that they weren't being targeted. And I just
ask, again, as the other witnesses were asked, why your company
does not support the STOP CSAM Act or the SHIELD Act?
Mr. Zuckerberg. Sure, Senator, I'm happy to talk to--to
both of those. We had discussions internally about whether we
should build a kids' version of Instagram, like the kids'
version----
Senator Klobuchar. I remember that.
Mr. Zuckerberg [continuing]. Of YouTube and other services.
We haven't actually moved forward with that, and we currently
have no plans to do so. So I can't speak directly to the exact
emails that you cited, but it sounds to me like they were
deliberations around a project that people internally thought
was important and we didn't end up moving forward with.
Senator Klobuchar. Okay. And the bills.
Mr. Zuckerberg. Yes.
Senator Klobuchar. What are you going to say about the two
bills?
Mr. Zuckerberg. Sure. So overall, I mean, my position on
the bills is I agree with the goal of all of them. Most things
within them I agree with. There are specific things that I
would probably do differently. We also have our
own legislative proposal for what we think would be most
effective in terms of helping the internet and the various
companies give parents control over the experience. So I'm
happy to go into the detail on any one of them, but ultimately,
I mean, I think that this is----
Senator Klobuchar. Well, I think these parents will tell
you that this stuff hasn't worked, to just give parents
control. They don't know what to do. It's very, very hard, and
that's why we are coming up with other solutions that we think
are much more helpful to law enforcement, but also this idea of
finally getting something going on liability. Because I just
believe with all the resources you have, that you actually
would be able to do more than you're doing, or these parents
wouldn't be sitting behind you right now in this Senate hearing
room.
Chair Durbin. Thank you----
Mr. Zuckerberg. Senator Klobuchar----
Chair Durbin [continuing]. Senator Klobuchar.
Mr. Zuckerberg [continuing]. Could I speak to that, or do
you want me to come back later?
Chair Durbin. Please, go ahead.
Mr. Zuckerberg. I don't think that parents should have to
upload an ID or prove that they're the parent of a child in
every single app that their children use. I think the right
place to do this and a place where it'd be actually very easy
for it to work is within the app stores themselves, where my
understanding is Apple and Google, already--or at least Apple,
already requires parental consent when a child makes a payment
within an app. So it should be pretty trivial to pass a law that
requires them to make it so that parents have control anytime a
child downloads an app, and offer consent for that.
And the research that we've done shows that the vast
majority of parents want that, and I think that that's the type
of legislation, in addition to some of the other ideas that
you-all have, that would make this a lot easier for parents.
Senator Klobuchar. Yes. Just to be clear, I remember one
mom telling me, with all these things she could maybe do but
can't figure out, it's like a faucet overflowing in a sink,
and she's out there with a mop while her kids are getting
addicted to more and more different apps, and being exposed to
material. We've got to make this simpler for parents so they
can protect their kids, and I just don't think this is going to
be the way to do it.
I think the answer is what Senator Graham has been talking
about, which is opening up the halls of the courtroom. So that
puts it on you guys to protect these parents, and protect these
kids, and then also to pass some of these laws which makes it
easier for law enforcement.
Chair Durbin. Thank you, Senator Klobuchar. We're going to
try to stick to the 7-minute rule. Didn't work very well, but
we're going to--I'll try to give additional time on the other
side as well. Senator Cornyn.
Senator Cornyn. There's no question that your platforms are
very popular, but we know that while here in the United States,
we have an open society and a free exchange of information,
that there are authoritarian governments, there are criminals
who will use your platforms for the sale of drugs, for sex, for
extortion, and the like.
And Mr. Chew, I think your company is unique among the ones
represented here today because of its ownership by ByteDance, a
Chinese company. And I know there have been some steps that
you've taken to wall off the data collected here in the United
States, but the fact of the matter is that under Chinese law
and Chinese National Intelligence Law, all information
accumulated by companies in the People's Republic of China is
required to be shared with the Chinese Intelligence Services.
ByteDance's initial release of TikTok, I understand, was in
2016. These efforts that you made with Oracle under the so-
called Project Texas to wall off the U.S. data began in 2021,
and apparently, allegedly, it was fully walled off in March 2023. What
happened to all of the data that TikTok collected before that?
Mr. Chew. Senator, thank you.
Senator Cornyn. From American users.
Mr. Chew. I understand. TikTok is owned by ByteDance, which
is majority owned by global investors, and we have three
Americans on the board out of five. You are right in pointing
out that over the last 3 years, we have spent billions of
dollars building out Project Texas, which is a plan that is
unprecedented in our industry. It walls off, firewalls, and
protects U.S. data from the rest of our staff. We also have
this----
Senator Cornyn. And I'm asking about all of the data that
you collected prior to that event.
Mr. Chew. Yes, Senator. We have started a data deletion
plan. I talked about this a year ago. We have finished the
first phase of data deletion through our data centers outside
of the Oracle Cloud Infrastructure. And we're beginning phase
two, where we will not only delete from the data centers, we
will hire a third party to verify that work. And then we will
go into, you know, for example, employees' working laptops to
delete that as well.
Senator Cornyn. Was all of the data collected by TikTok
prior to Project Texas shared with the Chinese government
pursuant to the national intelligence laws of that country?
Mr. Chew. Senator, we have not been asked for any data by
the Chinese government, and we have never provided it.
Senator Cornyn. Your company is unique, again, among the
ones represented here today because you're currently undergoing
review by the Committee on Foreign Investment in the United
States. Is that correct?
Mr. Chew. Senator, yes, there are ongoing discussions, and
a lot of our Project Texas work is informed by the discussions
with many agencies under the CFIUS umbrella.
Senator Cornyn. Well, CFIUS is designed specifically to
review foreign investments in the United States for national
security risks. Correct?
Mr. Chew. Yes, I believe so.
Senator Cornyn. And your company is currently being
reviewed by this Interagency Committee at the Treasury
Department for potential national security risks?
Mr. Chew. Senator, this review is on the acquisition of
Musical.ly, which was done many years ago.
Senator Cornyn. I mean, is this a casual conversation, or
are you actually providing information to the Treasury
Department about how your platform operates for evaluating a
potential national security risk?
Mr. Chew. Senator, it's been many years across two
administrations, and a lot of discussions around what our plans
are, how our systems work. We have a lot of robust discussions
about a lot of detail.
Senator Cornyn. Sixty-three percent of teens, I understand,
use TikTok. Does that sound about right?
Mr. Chew. Senator, I cannot verify that. We know we are
popular amongst many age groups. The average age in the U.S.
today for our user base is over 30, but we are aware we are
popular.
Senator Cornyn. And you reside in Singapore with your
family. Correct?
Mr. Chew. Yes. I reside in Singapore, and I work here in
the United States as well.
Senator Cornyn. And do your children have access to TikTok
in Singapore?
Mr. Chew. Senator, if they lived in the United States, I
would give them access to our under 13 experience. My children
are below the age of 13.
Senator Cornyn. My question is, in Singapore, do they have
access to TikTok, or is that restricted by domestic law?
Mr. Chew. We do not have an under 13 experience in
Singapore. We have that in the United States because we were
deemed a mixed audience app, and we created the under 13
experience in response to that.
Senator Cornyn. A Wall Street Journal article published
yesterday directly contradicts what your company has stated
publicly. According to the Journal, employees under Project
Texas say that U.S. user data, including user emails, birth
dates, and IP addresses, continues to be shared with ByteDance
staff--again, owned by a Chinese company. Do you dispute that?
Mr. Chew. Yes, Senator. There are many things in that
article that are inaccurate. What it gets right is that this
is a voluntary project that we built. We spent billions of
dollars. There are thousands of employees involved, and it's
very difficult because it's unprecedented.
Senator Cornyn. Why is it important that the data collected
from U.S. users be stored in the United States?
Mr. Chew. Senator, this was a project we built in response
to some of the concerns that were raised by Members of this
Committee and others.
Senator Cornyn. And that was because of concerns that the
data that was stored in China could be accessed by the Chinese
Communist Party pursuant to the National Intelligence Law.
Correct?
Mr. Chew. Senator, we are not the only company that does
business--you know, that has Chinese employees. For example,
we're not even the only company in this room that hires Chinese
nationals, but in order to address some of these concerns, we
have moved the data into the Oracle Cloud Infrastructure.
We built a 2,000-person team to oversee the management of
that data based here. We firewalled it off from the rest of the
organization, and then we opened it up to third parties like
Oracle, and we will onboard others to provide third-party
validation. This is unprecedented access. I think we are unique
in taking even more steps to protect user data in the United
States.
Senator Cornyn. Well, you've disputed The Wall Street
Journal story published yesterday. Are you going to conduct any
sort of investigation to see whether there's any truth to the
allegations made in the article, or are you just going to
dismiss them outright?
Mr. Chew. Oh, we're not going to dismiss them. So we have
ongoing security inspections, not only by our own personnel,
but also by third parties to ensure that the system is rigorous
and robust. No system that any one of us can build is perfect,
but what we need to do is to make sure that we are always
improving it and testing it against bad people who may try to
bypass it. And if anyone breaks our policies within our
organization, we will take disciplinary action against them.
Chair Durbin. Thanks, Senator Cornyn. Senator Coons.
Senator Coons. Thank you, Chairman Durbin. First, I'd like
to start by thanking all the families that are here today. All
the parents who are here because of a child they have lost. All
the families that are here because you want us to see you and
to know your concern. You have contacted each of us in our
offices expressing your grief, your loss, your passion, and
your concern. And the audience that is watching can't see this--
they can see you, the witnesses from the companies--but this
room is packed as far as the eye can see.
And when this hearing began, many of you picked up and held
pictures of your beloved and lost children. I benefit from and
participate in social media, as do many Members of the
Committee, and our Nation, and our world. There are now a
majority of people on earth participating in and in many ways
benefiting from one of the platforms you have launched, or you
lead, or you represent.
And we have to recognize there are some real positives to
social media. It has transformed modern life, but it has also
had huge impacts on families, on children, on nations. And
there's a whole series of bills championed by Members of this
Committee that tries to deal with the trafficking in illicit
drugs, the trafficking in illicit child sexual material, the
things that are facilitated on your platforms that may lead to
self-harm or suicide.
So we've heard from several of the leaders on this
Committee--the Chair, and Ranking, and very talented and
experienced Senators. The frame that we are looking at, this is
consumer protection. When there is some new technology, we put
in place regulations to make sure that it is not overly
harmful. As my friend Senator Klobuchar pointed out, one door
flew off of one plane, no one was hurt, and yet the entire
Boeing fleet of that type of plane was grounded, and a Federal
fit-for-purpose agency did an immediate safety review.
I'm going to point not to the other pieces of legislation
that I think are urgent that we take up and pass, but to the
core question of transparency. If you are a company
manufacturing a product that is allegedly addictive and
harmful, one of the first things we look to is safety
information. We try to give our constituents, our consumers,
warnings--labels that help them understand what the consequences
of this product are and how to use it safely or not.
As you've heard, pointedly, from some of my colleagues, if
you sell an addictive, defective, harmful product in this
country in violation of regulations and warnings, you get sued.
And what is distinct about platforms as an industry is that most
of the families who are here are here because there were not
sufficient warnings, and they cannot effectively sue you.
So let me dig in for a moment, if I can, because each of
your companies voluntarily discloses information about the
content, and the safety investments you make, and the actions
you take.
There was a question pressed, I think it was by Senator
Graham earlier about TikTok. I believe, Mr. Chew, you said you
invest $2 billion in safety. My background memo said your
global revenue is $85 billion. Mr. Zuckerberg, my background
memo says, you're investing $5 billion in safety in Meta, and
your annual revenue is on the order of $116 billion.
So what matters? You can hear some expressions from the
parents in the audience. What matters is the relative numbers
and the absolute numbers. You are data folks. If there's
anybody in this world who understands data, it's you guys. So I
want to walk through whether or not these voluntary measures of
disclosure of content and harm are sufficient, because I would
argue we're here because they're not. Without better
information, how can policymakers know whether the protections
you've
testified about, the new initiatives, the starting programs,
the monitoring, and the takedowns are actually working? How can
we understand meaningfully how big these problems are without
measuring and reporting data?
Mr. Zuckerberg, your testimony referenced a National
Academy of Sciences study that said, at the population level,
there is no proof of harm to mental health. Well, it may
not be at the population level, but I'm looking at a room full
of hundreds of parents who have lost children. And our
challenge is to take the data and to make good decisions about
protecting families and children from harm.
So let me ask about what your companies do or don't report,
and I'm going to particularly focus on your content policies
around self-harm and suicide. And I'm just going to ask a
series of yes or no questions. And what I'm getting at is do
you disclose enough.
Mr. Zuckerberg, from your policies prohibiting content
about suicide or self-harm, do you report an estimate of the
total amount of content, not a percentage of the overall, not a
prevalence number, but the total amount of content on your
platform that violates this policy? And do you report the total
number of views that self-harm or suicide-promoting content
that violates this policy gets on your platform?
Mr. Zuckerberg. Yes. Senator, we pioneered quarterly
reporting on our community standards enforcement across all
these different categories of harmful content. We focus on
prevalence, which you mentioned because what we're focused on
is what percent of the content that we take down----
Senator Coons. So Mr. Zuckerberg, I'm going to interrupt
you.
Mr. Zuckerberg [continuing]. Where our systems proactively
identify----
Senator Coons. You're very talented. I have very little
time left. I'm trying to get an answer to a question, not as a
percentage of the total, because remember it's a huge number.
So the percentage is small. But do you report the actual amount
of content and the number of views self-harm content received?
Mr. Zuckerberg. No. I believe we focus on prevalence.
Senator Coons. Correct. You don't. Ms. Yaccarino, yes or
no. Do you report it or you don't?
Ms. Yaccarino. Senator, as a reminder, we have less than 1
percent of our users that are between the ages of 13 and 17.
Senator Coons. Do you report the absolute number----
Ms. Yaccarino. We report the number of----
Senator Coons [continuing]. Of how many images and how
often do you----
Ms. Yaccarino [continuing]. Posts and accounts that we've
taken down. In 2023----
Senator Coons. Yes.
Ms. Yaccarino [continuing]. We've taken almost a million
posts down in regard to mental health and self-harm.
Senator Coons. Mr. Chew, do you disclose the number of
appearances of these types of content and how many are viewed
before they're taken down?
Mr. Chew. Senator, we disclose the number we take down
based on each category of violation and how many of those were
taken down proactively before they were reported.
Senator Coons. Mr. Spiegel.
Mr. Spiegel. Yes, Senator, we do disclose.
Senator Coons. Mr. Citron.
Mr. Citron. Yes, we do.
Senator Coons. So, I've got three more questions I'd love
to walk through if I had unlimited time. I will submit them for
the record.
The larger point is that platforms need to hand over more
information about how the algorithms work, what the content does,
and what the consequences are. Not at the aggregate, not at the
population level, but the actual numbers of cases so we can
understand the content.
In closing, Mr. Chairman, I have a bipartisan bill, the
Platform Accountability and Transparency Act, co-sponsored by
Senators Cornyn, Klobuchar, Blumenthal on this Committee, and
Senator Cassidy and others. It's in front of the Commerce
Committee, not this Committee. But it would set reasonable
standards for disclosure and transparency to make sure that
we're doing our jobs based on data.
Yes, there's a lot of emotion in this field,
understandably, but if we're going to legislate responsibly
about the management of the content on your platforms, we need
to have better data. Is there any one of you willing to say now
that you support this bill? Mr. Chairman, let the record
reflect a yawning silence from the leaders of the social media
platforms. Thank you.
Chair Durbin. Thanks, Senator Coons. We're on one of two,
the first of two roll calls, and so please understand if some
of the Members leave and come back. It's no disrespect, they're
doing their job. Senator Lee.
Senator Lee. Thank you, Mr. Chairman. Tragically, survivors
of sexual abuse are often repeatedly victimized and
revictimized over, and over, and over again by having
nonconsensual images of themselves on social media platforms.
There's an NCMEC study that pointed out there was one instance
of CSAM that reappeared more than 490,000 times after it had
been reported--after it had been reported.
So we need tools in order to deal with this. We need,
frankly, laws in order to mandate standards so that this
doesn't happen; so that we have a systematic way of getting rid
of this stuff, because there is literally no plausible
justification, no way of defending this.
One tool, one that I think would be particularly effective
is a bill that I'll be introducing later today, and I invite
all my Committee Members to join me. It's called the PROTECT
Act. The PROTECT Act would, in pertinent part, require websites
to verify age and verify that they've received consent of any
and all individuals appearing on their site in pornographic
images. And it also requires platforms to have meaningful
processes for an individual seeking to have images of him or
herself removed in a timely manner.
Ms. Yaccarino, based on your understanding of existing law,
what might it take for a person to have those images removed,
say from X?
Ms. Yaccarino. Senator Lee, thank you. It sounds like what
you are going to introduce into law, in terms of ecosystem-wide
age verification and user consent, is exactly in line with part
of the philosophy of why we're supporting the SHIELD Act, and no
one should have to endure nonconsensual images being shared online.
Senator Lee. Yes. And without that, without laws in place--
and it's fantastic anytime a company, as you've described with
yours, wants to take those steps. It's very helpful. It can
take a lot longer than it should, and sometimes it does, to the
point where somebody had images shared 490,000 times after they
were reported to the authorities. And that's deeply concerning.
But yes, the PROTECT Act would work in tandem with--it's a good
complement to the SHIELD Act.
Mr. Zuckerberg, let's turn to you next. As you know, I feel
strongly about privacy, and believe that one of the best
protections for an individual's privacy online involves end-to-
end encryption. We also know that a great deal of grooming and
sharing of CSAM occurs on end-to-end encrypted
systems. Tell me, does Meta allow juvenile accounts on its
platforms to use encrypted messaging services within those
apps?
Mr. Zuckerberg. Sorry, Senator, what do you mean juvenile?
Senator Lee. Underage. People under 18.
Mr. Zuckerberg. Under 18. We allow people under the age of
18 to use WhatsApp, and we do allow that to be encrypted. Yes.
Senator Lee. Do you have a bottom-level age at which
they're not allowed to use it?
Mr. Zuckerberg. Yes. I don't think we allow people under
the age of 13.
Senator Lee. Okay. What about you, Mr. Citron. Discord, do
you allow kids to have accounts to access encrypted messaging?
Mr. Citron. Discord is not allowed to be used by children
under the age of 13, and we do not use end-to-end encryption
for text messages. You know, we believe that it's very
important to be able to respond to lawful requests from law
enforcement, and we're also working on proactively building
technology.
We're working with a nonprofit called Thorn to build a
grooming classifier so that our Teen Safety Assist feature can
actually identify these conversations, if they might be
happening, so we can intervene and give those teens tools to
get out of that situation, or potentially even report those
conversations and those people to law enforcement.
Senator Lee. And then encryption, as much as it can prove
useful elsewhere, it can be harmful, especially if you are on a
site where, you know, children are being groomed and exploited.
If you allow children onto an end-to-end encryption-enabled
app, that can prove problematic.
Now, let's go back to you for a moment, Mr. Zuckerberg.
Instagram recently announced that it's going to restrict all
teenagers from access to eating disorder material, suicidal
ideation-themed material, self-harm content, and that's
fantastic. That's great. What's odd, what I'm trying to
understand is why it is that Instagram is restricting access
to sexually explicit content only for teens ages 13 to 15. Why
not restrict it for 16- and 17-year-olds as well?
Mr. Zuckerberg. Senator, my understanding is that we don't
allow sexually explicit content on the service for people of
any age.
Senator Lee. How is that going?
[Laughter.]
Mr. Zuckerberg. You know, our prevalence metrics suggest
that, I think, it's 99 percent or so of the content that we
remove, we're able to identify automatically using AI systems.
So I think that our efforts in this, while they're not perfect,
I think are industry-leading.
The other thing that you asked about was self-harm content,
which is what we recently restricted, and we made that shift of
the--I think the state of the science is shifting a bit.
Previously, we believed that when people were thinking about
self-harm, it was important for them to be able to express that
and get support.
And now more of the thinking in the field is that it's just
better to not show that content at all, which is why we
recently moved to restrict that from showing up for those teens
at all.
Senator Lee. Okay. Is there a way for parents to make a
request on what their kid can see or not see on your sites?
Mr. Zuckerberg. There are a lot of parental controls. I'm
not sure if there--I don't think that we currently have a
control around topics, but we do allow parents to control the
time that the children are on the site. And also, a lot of it
is based on kind of monitoring and understanding what the
teen's experience is--what they're interacting with.
Senator Lee. Mr. Citron, Discord allows pornography on its
site. Now, reportedly, 17 percent of minors who use Discord
have had online sexual interactions on your platform. 17
percent. And 10 percent have those interactions with someone
that the minor believed to be an adult. Do you restrict minors
from accessing Discord servers that host pornographic material
on them?
Mr. Citron. Senator, yes, we do restrict minors from
accessing content that is marked for adults, and Discord also
does not recommend content to people. Discord is a chat app, we
do not have a feed or an algorithm that boosts content. So, we
allow adults to share content with other adults in adult-
labeled spaces, and we do not allow teens to access that
content.
Senator Lee. Okay. I see my time's expired. Thank you.
Senator Whitehouse [presiding]. Welcome, everyone. We are
here in this hearing because, as a collective, your platforms
really suck at policing themselves. We hear about it here in
Congress with fentanyl and other drug dealing facilitated
across platforms. We see it and hear about it here in Congress
with harassment and bullying that takes place across your
platforms. We see it and hear about it here in Congress with
respect to child pornography, sex exploitation, and blackmail,
and we are sick of it.
It seems to me that there is a problem with accountability
because these conditions continue to persist. In my view,
Section 230, which provides immunity from lawsuit, is a very
significant part of that problem. If you look at where bullies
have been brought to heel recently, whether it's Dominion
finally getting justice against Fox News after a long campaign
to try to discredit the election equipment manufacturer. Or
whether it's the moms and dads of the Sandy Hook victims
finally getting justice against InfoWars and Alex Jones's
campaign of trying to get people to believe that the massacre
of their children was a fake put on by them; or even now more recently,
with a writer getting a very significant judgment against
Donald Trump. After years of bullying and defamation, an honest
courtroom has proven to be the place where these things get
sorted out.
And I'll just describe one case, if I may. It's called Doe
v. Twitter. The plaintiff in that case was blackmailed in 2017
for sexually explicit photos and videos of himself, then aged
13 to 14. A compilation video of multiple CSAM videos surfaced
on Twitter in 2019. A concerned citizen reported that video on
December 25, 2019, Christmas Day. Twitter took no action. The
plaintiff, then a minor in high school in 2019, became aware of
this video from his classmates in January 2020. You're a high
school kid, and suddenly there's that. That's a day that's hard
to recover from.
Ultimately, he became suicidal. He and his parents
contacted law enforcement and Twitter to have these videos
removed on January 21, and again on January 22, 2020, and
Twitter ultimately took down the video on January 30, 2020,
once Federal law enforcement got involved.
That's a pretty foul set of facts. And when the family sued
Twitter for all those months of refusing to take down the
explicit video of this child, Twitter invoked Section 230, and
the district court ruled that the claim was barred.
There is nothing about that set of facts that tells me that
Section 230 performed any public service in that regard. I
would like to see very substantial adjustments to Section 230
so that the honest courtroom, which brought relief and justice
to E. Jean Carroll after months of defamation, which brought
silence, peace, and justice to the parents of the Sandy Hook
children after months of defamation and bullying by InfoWars
and Alex Jones, and which brought significant justice and an
end to the campaign of defamation by Fox News against a little
company that was busy just making election machines, can be
open to these families as well.
So, my time is running out, I'll turn to--I guess Senator
Cruz is next, but I would like to have each of your companies
put in writing what exemptions from the protection of Section
230 you would be willing to accept, bearing in mind the fact
situation in Doe v. Twitter, bearing in mind the enormous harm
that was done to that young person and that family by the
nonresponsiveness of this enormous platform over months, and
months, and months, and months.
Again, think of what it's like to be a high school kid, and
have that stuff up in the public domain, and have the company
that is holding it out there in the public domain react so
disinterestedly. Okay? Will you put that down in writing for
me? One, two, three, four, five yeses. Done.
Senator Whitehouse. Senator Cruz.
Senator Cruz. Thank you, Mr. Chairman. Social media is a
very powerful tool, but we're here because every parent I know,
and I think every parent in America is terrified about the
garbage that is directed at our kids. I have two teenagers at
home, and the phones they have are portals to predators, to
viciousness, to bullying, to self-harm, and each of your
companies could do a lot more to prevent it.
Mr. Zuckerberg, in June 2023, The Wall Street Journal
reported that Instagram's recommendation systems were actively
connecting pedophiles to accounts that were advertising the
sale of child sexual abuse material. In many cases, those
accounts appear to be run by underage children themselves,
often using code words and emojis to advertise illicit
material. In other cases, the accounts included indicia that
the victim was being sex trafficked.
Now, I know that Instagram has a team that works to prevent
the abuse and exploitation of children online, but what was
particularly concerning about The Wall Street Journal exposé
was the degree to which Instagram's own algorithm was promoting
the discoverability of victims for pedophiles seeking child
abuse material.
In other words, this material wasn't just living on the
dark corners of Instagram. Instagram was helping pedophiles
find it by promoting graphic hashtags, including #pedowhore and
#preteensex, to potential buyers. Instagram also displayed the
following warning screen to individuals who were searching for
child abuse material, ``These results may contain images of
child sexual abuse.'' And then you gave users two choices,
``Get resources or see results anyway.'' Mr. Zuckerberg, what
the hell were you thinking?
Mr. Zuckerberg. All right. Senator, the basic science
behind that is that when people are searching for something
that is problematic, it's often helpful, rather than just
blocking it, to direct them toward something that could be
helpful for getting them to get help. But we also----
Senator Cruz. I understand, ``get resources.'' In what sane
universe is there a link for see results anyway?
Mr. Zuckerberg. Well, because we might be wrong. We try to
trigger this warning, or we try to--when we think that there's
any chance that the results might be----
Senator Cruz. Okay. You might be wrong. Let me ask you, how
many times was this warning screen displayed?
Mr. Zuckerberg. I don't know, but the----
Senator Cruz. You don't know. Why don't you know?
Mr. Zuckerberg. I don't know the answer to that off the top
of my head, but----
Senator Cruz. You know what, Mr. Zuckerberg, it's
interesting you say you don't know it off the top of your head
because I asked it in June 2023 in an oversight letter, and
your company refused to answer. Will you commit right now to,
within 5 days, answering this question for this Committee?
Mr. Zuckerberg. We'll follow up on that.
Senator Cruz. Is that a yes? Not a we'll follow up. I know
how lawyers write statements saying we're not going to answer.
Will you tell us how many times this warning screen was
displayed? Yes or no?
Mr. Zuckerberg. Senator, I'll personally look into it. I'm
not sure if we have----
Senator Cruz. Okay. So you're refusing to answer that. Let
me ask you this, how many times did an Instagram user who got
this warning that you're seeing images of child sexual abuse,
how many times did that user click on, ``see results anyway?''
I want to see that.
Mr. Zuckerberg. Senator, I'm not sure if we stored that,
but I'll personally look into this, and we'll follow up after--
--
Senator Cruz. And what follow-up did Instagram do when you
have a potential pedophile clicking on, ``I'd like to see child
porn.'' What did you do next when that happened?
Mr. Zuckerberg. Senator, I think that an important piece of
context here is that any content that we think is child sexual
abuse----
Senator Cruz. Mr. Zuckerberg, that's called a question.
What did you do next when someone clicked, ``You may be getting
child sexual abuse images,'' and they click, ``see results
anyway?'' What was your next step? You said you might be wrong.
Did anyone examine was it in fact child sexual abuse material?
Did anyone report that user? Did anyone go and try to protect
that child? What did you do next?
Mr. Zuckerberg. Senator, we take down anything that we
think is sexual abuse material on the service, and we do----
Senator Cruz. Did anyone verify whether it was in fact
child sexual abuse material?
Mr. Zuckerberg. Senator, I don't know if every single
search result we're following up on, but in----
Senator Cruz. Did you report the people who wanted it?
Mr. Zuckerberg. Senator, do you want me to answer your
question?
Senator Cruz. Yes. I want you to answer the question I'm
asking. Did you report----
Mr. Zuckerberg. Give me some time to speak then.
Senator Cruz [continuing]. The people who click, ``see
results anyway?''
Mr. Zuckerberg. That's probably one of the factors that we
use in reporting. And in general, we've reported more people
and done more reports like this to NCMEC, the National Center
for Missing & Exploited Children, than any other company in
the industry. We proactively go out of our way across our
services to do this, and have made it--I think, it's more than
26 million reports, which is more than the whole rest of the
industry combined. So I think the allegation----
Senator Cruz. So Mr. Zuckerberg----
Mr. Zuckerberg [continuing]. That we don't take this
seriously----
Senator Cruz [continuing]. Your company and every social
media company needs to do much more to protect children. All
right. Mr. Chew, in the next couple of minutes I have, I want
to turn to you. Are you familiar with China's 2017 National
Intelligence Law, which states, ``All organizations and
citizens shall support, assist, and cooperate with national
intelligence efforts in accordance with the law, and shall
protect national intelligence work secrets they are aware of?''
Mr. Chew. Yes. I'm familiar with this.
Senator Cruz. TikTok is owned by ByteDance. Is ByteDance
subject to the law?
Mr. Chew. For the Chinese businesses that ByteDance owns,
yes, it will be subject to this, but TikTok is not available in
Mainland China. And Senator, as we talked about in your office,
we built Project Texas to put this out of reach.
Senator Cruz. So, ByteDance is subject to the law. Now,
under this law, which says, ``shall protect national
intelligence work secrets they're aware of,'' it compels people
subject to the law to lie to protect those secrets. Is that
correct?
Mr. Chew. I cannot comment on that. What I said, again, is
that we have----
Senator Cruz. Because you have to protect those secrets.
Mr. Chew. No, Senator, TikTok is not available in Mainland
China. We have moved the data into American cloud
infrastructure----
Senator Cruz. But TikTok is controlled by ByteDance, which
is subject to this law. Now, you said earlier, and I wrote this
down, ``We have not been asked for any data by the Chinese
government, and we have never provided it.'' I'm going to tell
you, and I told you this when you and I met last week in my
office, I do not believe you, and I'll tell you, the American
people don't either.
If you look at what is on TikTok in China, you are
promoting to kids science and math videos, educational videos,
and you limit the amount of time kids can be on TikTok. In the
United States, you are promoting to kids self-harm videos and
anti-Israel propaganda. Why is there such a dramatic
difference?
Mr. Chew. Senator, that is just not accurate. There is a
lot of----
Senator Cruz. There's not a difference between what kids
see in China and what kids see here?
Mr. Chew. Senator, TikTok is not available in China. It's a
separate experience there. But what I'm saying is----
Senator Cruz. But you have a company that is essentially
the same except it promotes beneficial materials instead of
harmful materials.
Mr. Chew. That is not true. We have a lot of science and
math content here on TikTok. There's so much of it----
Senator Cruz. All right. Let me point to this, Mr. Chew.
There was a report recently that compared hashtags on Instagram
to hashtags on TikTok, and what trended, and the differences
were striking. So for something like #TaylorSwift or #Trump,
researchers found roughly two Instagram posts for every one on
TikTok. That's not a dramatic difference.
That difference jumps to 8-to-1 for the #Uyghur, and it
jumps to 30-to-1 for the #Tibet, and it jumps to 57-to-1 for
#TiananmenSquare, and it jumps to 174-to-1 for the
#HongKongProtest. Why
is it that on Instagram people can put up a #HongKongProtest
174 times compared to TikTok? What censorship is TikTok doing
at the request of the Chinese government?
Mr. Chew. None. Senator----
Senator Cruz. Can you explain that differential?
Mr. Chew. That analysis is flawed and has been debunked by
external sources like the Cato Institute. Fundamentally,
a few things happen here. Not all videos carry hashtags. That's
the first thing. The second thing is that you cannot
selectively choose a few words within a certain time period----
Senator Cruz. Why the difference between Taylor Swift and
Tiananmen Square? What happened at Tiananmen Square?
Mr. Chew. Senator, there was a massive protest during that
time. But what I'm trying to say is our users can freely come
and post this content----
Senator Cruz. Why would there be no difference on Taylor
Swift or a minimal difference, and a massive difference on
Tiananmen Square, Hong Kong?
Chair Durbin [presiding]. Senator, could you wrap up,
please?
Mr. Chew. Senator, our algorithm does not suppress any
content simply based on----
Senator Cruz. Could you answer that question? Why is there
a difference?
Mr. Chew. Like I said, I think this analysis is flawed.
You're selectively choosing some words over some periods. We
haven't been around this----
Senator Cruz. There is an obvious----
Mr. Chew [continuing]. And other apps----
Senator Cruz [continuing]. Difference. 174-to-1 for Hong
Kong compared to Taylor Swift is dramatic.
Chair Durbin. Senator Blumenthal.
Senator Blumenthal. Mr. Zuckerberg, you know who Antigone
Davis is, correct?
Mr. Zuckerberg. Yes.
Senator Blumenthal. She's one of your top leaders. In
September 2021, she was global head of safety, correct?
Mr. Zuckerberg. Yes.
Senator Blumenthal. And you know that she came before a
Subcommittee, the Commerce Committee that I chaired at the
time, Subcommittee on Consumer Protection, correct?
Mr. Zuckerberg. Yes.
Senator Blumenthal. And she was testifying on behalf of
Facebook, right?
Mr. Zuckerberg. Meta, but, yes.
Senator Blumenthal. It was then Facebook, but Meta now. And
she told us, and I'm quoting, ``Facebook is committed to
building better products for young people and to doing
everything we can to protect their privacy, safety, and well-
being on our platforms.''
And she also said kids' safety is an area where, ``we are
investing heavily.'' We now know that statement was untrue. We
know it from an internal email that we have received. It's an
email written by Nick Clegg. You know who he is, correct?
Mr. Zuckerberg. Yes.
Senator Blumenthal. He was Meta's president of global
affairs, and he wrote a memo to you which you received,
correct? It was written to you.
Mr. Zuckerberg. Senator, I can't see the email, but sure,
I'll assume that you got it. Correct.
Senator Blumenthal. And he summarized Facebook's problems.
He said, ``We are not on track to succeed for our core well-
being topics: problematic use, bullying and harassment,
connections, and SSI,'' meaning suicidal self-injury. He said
also in another memo, ``We need to do more, and we are being
held back by a lack of investment.'' This memo has the date of
August 28, just weeks before that testimony from Antigone
Davis. Correct?
Mr. Zuckerberg. Sorry, Senator, I'm not sure what the date
the testimony was.
Senator Blumenthal. Well, those are the dates on the
emails. Nick Clegg was asking you, pleading with you for
resources to back up the narrative to fulfill the commitments.
In effect, Antigone Davis was making promises that Nick Clegg
was trying to fulfill, and you rejected that request for 45 to
84 engineers to do well-being or safety.
We know that you rejected it from another memo, from Nick
Clegg's assistant, Tim Colburne, who said Nick did email Mark,
referring to that earlier email, to emphasize his support for
the package, but it lost out to various other pressures and
priorities.
We've done a calculation that those, potentially, 84
engineers would've cost Meta about $50 million in a quarter
when it earned $9.2 billion. And yet it failed to make that
commitment in real terms, and you rejected that request because
of other pressures and priorities. That is an example from your
own internal document of failing to act. And it is the reason
why we can no longer trust Meta, and, frankly, any of the other
social media companies, to, in effect, grade their own homework.
The public, and particularly the parents in this room, know
that we can no longer rely on social media to provide the kind
of safeguards that children and parents deserve. And that is
the reason why passing the Kids Online Safety Act is so
critically important.
Mr. Zuckerberg, do you believe that you have a
constitutional right to lie to Congress?
Mr. Zuckerberg. Senator, no, but I mean you----
Senator Blumenthal. Well, let me just clarify for you.
Mr. Zuckerberg [continuing]. Quoted a bunch of words, and
I'd like the opportunity to respond to----
Senator Blumenthal. Let me just clarify for you. In a
lawsuit brought by hundreds of parents, some in this very room,
alleging that you made false and misleading statements
concerning the safety of your platform for children, you argued,
in not just one pleading but two, in December and then in
January, that you have a constitutional right to lie to
Congress. Do you disavow that filing in court?
Mr. Zuckerberg. Senator, I don't know what filing you're
talking about, but I testified----
Senator Blumenthal. It's a filing from----
Mr. Zuckerberg [continuing]. Honestly and truthfully, and I
would like the opportunity to respond to the previous things
that you showed as well.
Senator Blumenthal. Well, I have a few more questions, and
let me ask others who are here because I think it's important
to put you on record. Who will support the Kids Online Safety
Act? Yes or no. Mr. Citron?
Mr. Citron. There are parts of the Act that we think are
great and----
Senator Blumenthal. No. It's a yes or no question. I'm
going to be running out of time. So I'm assuming the answer is
no if you can't answer yes.
Mr. Citron. We very much think that the National----
Senator Blumenthal. That's a no.
Mr. Citron [continuing]. Privacy Standard would be great.
Senator Blumenthal. Mr. Spiegel.
Mr. Spiegel. Senator, we strongly support the Kids Online
Safety Act, and we've already implemented many of its core
provisions.
Senator Blumenthal. Thank you. I welcome that support along
with Microsoft's support. Mr. Chew.
Mr. Chew. Senator, with some changes we can support it.
Senator Blumenthal. Now in its present form, do you support
it? Yes, or no?
Mr. Chew. We are aware that some groups have raised some
concerns. It's important to understand how----
Senator Blumenthal. I'll take that as a no. Ms. Yaccarino.
Ms. Yaccarino. Senator, we support KOSA, and will continue
to make sure that it accelerates, and make sure it continues to
offer a community for teens that are seeking that voice.
Senator Blumenthal. Mr. Zuckerberg.
Mr. Zuckerberg. Senator, we support the age-appropriate
content standards, but would have some suggestions----
Senator Blumenthal. Yes or no, Mr. Zuckerberg.
Mr. Zuckerberg [continuing]. On how to implement it.
Senator Blumenthal. Do you support the Kids Online Safety
Act?
Mr. Zuckerberg. Senator, I think these are nuanced----
Senator Blumenthal. You're in public, and I'm just asking
whether you'll support it or not.
Mr. Zuckerberg. These are nuanced things. I think that the
basic spirit is right. I think the basic ideas in it are right,
and there are some ideas that I would debate how to best----
Senator Blumenthal. Unfortunately, I don't think we can
count on social media, as a group, or Big Tech, to support this
measure. And in the past, we know it's been opposed by armies
of lawyers and lobbyists. We're prepared for this fight.
But I am very, very glad that we have parents here because
tomorrow we're going to have an advocacy day, and the folks who
really count, the people in this room who support this measure,
are going to be going to their representatives and their
Senators, and their voices and faces are going to make a
difference.
Senator Schumer has committed that he will work with me to
bring this bill to a vote, and then we will have real
protection for children and parents online. Thank you, Mr.
Chairman.
Chair Durbin. Thank you, Senator Blumenthal. We have a vote
on. Has Senator Cotton--have you voted? And Senator Hawley, you
haven't voted yet? You're next. And I don't know how long the
vote will be open, but I'll turn it over to you.
Senator Hawley. Thank you, Mr. Chairman. Mr. Zuckerberg,
let me start with you. Did I hear you say in your opening
statement that there's no link between mental health and social
media use?
Mr. Zuckerberg. Senator, what I said is, I think it's
important to look at the science. I know people widely talk
about this as if that is something that's already been proven,
and I think that the bulk of the scientific evidence does not
support that.
Senator Hawley. Well, really, let me just remind you of
some of the science from your own company. Instagram studied
the effect of your platform on teenagers. Let me just read you
some quotes from The Wall Street Journal's report on this:
``Company researchers found that Instagram is harmful for a
sizable percentage of teenagers, most notably teenage girls.''
Here's a quote from your own study. ``We make body image
issues worse for 1-in-3 teen girls.'' Here's another quote.
``Teens blamed Instagram--'' this is your study, ``for
increases in the rate of anxiety and depression. This reaction
was unprompted and consistent across all groups.'' That's your
study.
Mr. Zuckerberg. Senator, we try to understand the feedback
and how people feel about the services. We can improve----
Senator Hawley. Wait a minute, your own study says that you
make life worse for one in three teenage girls. You increase--
--
Mr. Zuckerberg. No, Senator, that's not what it says.
Senator Hawley [continuing]. Anxiety and depression. That's
what it says, and you're here testifying to us in public that
there's no link. You've been doing this for years. For years
you've been coming in public and testifying under oath that
there's absolutely no link, your product is wonderful, the
science is nascent, full speed ahead, while internally, you
know full well your product is a disaster for teenagers.
Mr. Zuckerberg. Senator, that's not true.
Senator Hawley. And you keep right on doing what you're
doing, right?
[Applause.]
Mr. Zuckerberg. That's not true. That's not true.
Senator Hawley. Let me show you some other facts----
Mr. Zuckerberg. We can show you data if you want----
Senator Hawley [continuing]. I know that you're familiar
with--wait a minute, wait a minute. That's not a question.
That's not a question. Those are facts, Mr. Zuckerberg. That's
not a question.
Mr. Zuckerberg. Those aren't facts.
Senator Hawley. Let me show you some more facts. Here's
some information from a whistleblower who came before the
Senate, testified under oath in public. He worked for you. He
was a senior executive. Here's what he found when he studied
your products.
So for example, this is girls between the ages of 13 and 15
years old. Thirty-seven percent of them reported that they had
been exposed to nudity on the platform, unwanted, in the last 7
days. Twenty-four percent said that they had experienced
unwanted sexual advances. They'd been propositioned in the last
7 days. Seventeen percent said they had encountered self-harm
content pushed at them in the last 7 days.
Now, I know you're familiar with these stats because he
sent you an email where he laid it all out. I mean, we've got
a copy of it right here. My question is, who did you fire for
this? Who got fired because of that?
Mr. Zuckerberg. Senator, we study all this because it's
important and we want to improve our services.
Senator Hawley. Well, you just told me a second ago you
studied it, that there was no linkage. Who did you fire?
Mr. Zuckerberg. Senator, I said you mischaracterized----
Senator Hawley. Thirty-seven percent of teenage girls
between 13 and 15 were exposed to unwanted nudity in a week on
Instagram. You knew about it. Who did you fire?
Mr. Zuckerberg. Senator, this is why we're building all----
Senator Hawley. Who did you fire?
Mr. Zuckerberg. Senator, I don't think that that's----
Senator Hawley. Who did you fire?
Mr. Zuckerberg. I'm not going to answer that.
Senator Hawley. It's because you didn't fire anybody,
right? You didn't----
Mr. Zuckerberg. Senator, I don't think----
Senator Hawley [continuing]. Take any significant action.
Mr. Zuckerberg [continuing]. It's not appropriate to talk
about, like, H.R. decisions----
Senator Hawley. It's not appropriate? Do you know who's
sitting behind you? You've got families from across the Nation
whose children are either severely harmed, or gone, and you
don't think it's appropriate to talk about steps that you took,
the fact that you didn't fire a single person? Let me ask you
this. Have you compensated any of the victims?
Mr. Zuckerberg. Sorry?
Senator Hawley. Have you compensated any of the victims?
These girls, have you compensated them?
Mr. Zuckerberg. I don't believe so.
Senator Hawley. Why not? Don't you think they deserve some
compensation for what your platform has done? Help with
counseling services, help with dealing with the issues that
your services caused?
Mr. Zuckerberg. Our job is to make sure that we build tools
to help keep people safe.
Senator Hawley. Are you going to compensate them?
Mr. Zuckerberg. Senator, our job, and what we take
seriously is making sure that we build industry-leading tools
to find harmful content----
Senator Hawley. To make money.
Mr. Zuckerberg [continuing]. And take it off the services--
--
Senator Hawley. To make money.
Mr. Zuckerberg [continuing]. And to build tools that
empower parents.
Senator Hawley. So you didn't take any action. You----
Mr. Zuckerberg. That's not true, Senator.
Senator Hawley [continuing]. Didn't fire anybody. You
haven't compensated a single victim. Let me ask you this.
There's families of victims here today. Have you apologized to
the victims?
Mr. Zuckerberg. I----
Senator Hawley. Would you like to do so now?
Mr. Zuckerberg. Well----
Senator Hawley. They're here. You're on national
television. Would you like now to apologize to the victims who
have been harmed by your product? Show him the pictures.
[Applause.]
Senator Hawley. Would you like to apologize for what you've
done to these good people?
[Addressing audience.]
Mr. Zuckerberg. I'm sorry for everything that you've all
gone through. It's terrible. No one should have to go through
the things that your families have suffered. And this is why we
invest so much, and are going to continue doing industry-
leading efforts to make sure that no one has to go through the
types of things that your families have had to suffer.
Senator Hawley. You know, why, Mr. Zuckerberg, why should
your company not be sued for this? Why is it that you can
claim--you hide behind a liability shield? You can't be held
accountable. Shouldn't you be held accountable personally? Will
you take personal responsibility?
Mr. Zuckerberg. Senator, I think I've already answered
this. I mean, these issues----
Senator Hawley. We'll try this again. Will you take
personal responsibility?
Mr. Zuckerberg. Senator, I view my job and the job of our
company as building the best tools that we can to keep our
community safe----
Senator Hawley. Well, you're failing at that.
Mr. Zuckerberg. Well, Senator, we're doing an industry-
leading effort. We build AI tools that----
Senator Hawley. Oh, nonsense. Your product is killing
people. Will you personally commit to compensating the victims?
You're a billionaire. Will you commit to compensating the
victims? Will you set up a compensation fund----
Mr. Zuckerberg. Senator----
Senator Hawley [continuing]. With your money?
Mr. Zuckerberg [continuing]. I think these are
complicated----
Senator Hawley. With your money.
Mr. Zuckerberg. Senator, these are complicated issues----
Senator Hawley. No, that's not a complicated question.
That's a yes, or no. Will you set up a victim's compensation
fund with your money, the money you made on these families
sitting behind you? Yes, or no?
Mr. Zuckerberg. Senator, I don't think that that's--my
job----
Senator Hawley. Sounds like a no.
Mr. Zuckerberg [continuing]. Is to make sure we make good
tools. My job is to make sure----
Senator Hawley. Sounds like a no. Your job is to be
responsible for what your company has done. You've made
billions of dollars on the people sitting behind you here.
You've done nothing to help them. You've done nothing to
compensate them. You've done nothing to put it right. You could
do so here today and you should. You should, Mr. Zuckerberg.
Before my time expires, Mr. Chew, let me just ask you. Your
platform, why should your platform not be banned in the United
States of America? You are owned by a Chinese communist company
or a company based in China. The editor-in-chief of your parent
company is a Communist Party Secretary. Your company has been
surveilling Americans for years.
According to leaked audio from more than 80 internal TikTok
meetings, China-based employees of your company have repeatedly
accessed nonpublic data of United States citizens. Your company
has tracked journalists, improperly gaining access to their IP
addresses and user data in an attempt to identify whether they're
writing negative stories about you. Why should--your platform
is basically an espionage arm for the Chinese Communist Party.
Why should you not be banned in the United States of America?
Mr. Chew. Senator, I disagree with your characterization.
Much of what you have said, we have explained in a lot of
detail. TikTok is used by 170 million Americans.
Senator Hawley. I know, but every single one of those
Americans is in danger from the fact that you track their
keystrokes, you track their app usage, you track their location
data, and we know that all of that information can be accessed
by Chinese employees who are subject to the dictates of the
Chinese Communist Party.
Mr. Chew. That is not----
Senator Hawley. Why should you not be banned in this
country?
Mr. Chew. Senator, that is not accurate. A lot of what you
described, we don't collect.
Senator Hawley. It is 100 percent accurate. Do you deny
that, repeatedly, Americans' data has been accessed by
ByteDance employees in China?
Mr. Chew. We built a project that cost us billions of
dollars to stop that, and we have made a lot of progress----
Senator Hawley. And it hasn't been stopped. According to
The Wall Street Journal report from just yesterday, even now,
``ByteDance workers, without going through official channels,
have access to the private information of American citizen''--
I'm quoting from the article--``private information of American
citizens, including their birthday, their IP address, and
more.'' That's now.
Mr. Chew. Senator, as we know, the media doesn't always get
it right. What we have, what we have----
Senator Hawley. But the Chinese Communist Party does?
Mr. Chew. I'm not saying that. What I'm saying is that we
have been--we have spent billions of dollars to build this
project. It's rigorous, it's robust, it's unprecedented, and
I'm proud of the work that the 2,000 employees are doing to
protect the data of American users.
Senator Hawley. But it's not protected. That's the problem,
Mr. Chew. It's not protected at all. It's subject to Communist
Chinese Party inspection and review, your app, unlike anybody
else sitting here, and heaven knows I've got problems with
everybody here. But your app, unlike any of those, is subject
to the control and inspection of a foreign hostile government
that is actively trying to track the information and whereabouts
of every American they can get their hands on. Your app ought
to be banned in the United States of America for the security
of this country.
[Applause.]
Senator Hawley. Thank you, Mr. Chairman.
Chair Durbin. Senator Hirono.
Senator Hirono. Thank you, Mr. Chairman. As we've heard,
children face all sorts of dangers when they use social media
from mental health harms to sexual exploitation, even
trafficking. Sex trafficking is a serious problem in my home
State of Hawaii, especially for native Hawaiian victims. That
social media platforms are being used to facilitate this
trafficking, as well as the creation and distribution of CSAM,
is deeply concerning. But it's happening.
For example, several years ago, a military police officer
stationed in Hawaii was sentenced to 15 years in prison for
producing CSAM. As part of his online exploitation of a minor
female, he began communicating with this 12-year-old girl
through Instagram. He then used Snapchat to send her sexually
explicit photos and to solicit such photos from her. He later
used these photos to blackmail her.
And just last month the FBI arrested a neo-Nazi cult leader
in Hawaii who lured victims to his Discord server. He used that
server to share images of extremely disturbing child sexual
abuse material interspersed with neo-Nazi imagery. Members of
his child exploitation and hate group are also present on
Instagram, Snapchat, X, and TikTok, all of which they used to
recruit potential members and victims.
In many cases, including the ones I just mentioned, your
companies played a role in helping law enforcement investigate
these offenders. But by the time of the investigation, so much
damage had already been done.
This hearing is about how to keep children safe online, and
we've listened to all of your testimony about seemingly
impressive safeguards for young users. You try to limit the
time that they
spend, you require parental consent, you have all of these
tools. Yet, trafficking and exploitation of minors online and
on your platforms continues to be rampant.
Nearly all of your companies make your money through
advertising, specifically by selling the attention of your
users. Your product is your users. As one product designer
wrote in an email, ``Young ones are the best ones. You
want to bring people to your service young and early.'' In
other words, hook them early.
Research published last month by the Harvard School of Public
Health estimates that Snap makes an astounding 41 percent of
its revenues by advertising to users under 18. With TikTok, it's
35 percent. Seven of the 10 largest Discord servers attracting
many paying users are for games used primarily by teens, by
children.
All this is to say that social media companies, yours, and
others, make money by attracting kids to your platforms. But
ensuring safety doesn't make money. It costs money. If you are
going to continue to attract kids to your platforms, you have
an obligation to ensure they're safe on the platforms because
the current situation is untenable. That is why we're having
this hearing. But to ensure safety for our children, that costs
money. Your companies cannot continue to profit off young users
only to look the other way when those users, our children, are
harmed online.
We've had a lot of comments about Section 230 protections,
and I think we are definitely heading in that direction. And
some of the five bills that we have already passed out of this
Committee talk about limiting the liability protections for
you.
Senator Hirono. This is for Mr. Zuckerberg. Last November,
the Privacy and Technology Subcommittee heard testimony from
Arturo Bejar. In response to one of my questions about how to
ensure that social media companies focus more on child safety,
Mr. Bejar said, and I am paraphrasing a little bit, that
what will change their behavior is the moment that Mark
Zuckerberg declares earnings, and these earnings have to be
declared to the SEC.
So he has to say, last quarter we made $34 billion, and the
next thing he has to say is how many teens experienced unwanted
sexual advances on his platform. Mr. Zuckerberg, will you
commit to reporting measurable child safety data on your
quarterly earnings reports and calls?
Mr. Zuckerberg. Senator, it's a good question. We actually
already have a quarterly report that we issue and do a call to
answer questions about how we're enforcing our community
standards. That includes not just the child safety issues and
metrics----
Senator Hirono. So is that a yes?
Mr. Zuckerberg. We have a separate call that we do this on,
but we've led the industry----
Senator Hirono. I think that, you know, you have to report
your earnings to the SEC. Will you report to them this kind of
data--and by numbers, by the way, because as Senator Coons said
and others have said percentages don't really tell the full
story. Will you report to the SEC the number of teens--and
sometimes you don't even know whether they're teens or not,
because they just claim to be adults.
Will you report the number of underage children on your
platforms who experience unwanted CSAM and other kinds of
messaging that harm them? Will you commit to citing those
numbers to the SEC when you make your quarterly report?
Mr. Zuckerberg. Well, Senator, I'm not sure it would make
as much sense to include in the SEC filing, but we file it
publicly so that way everyone can see this. And I'd be happy to
follow up and talk about what specific metrics. I think the
specific things are some of the ones that you just mentioned
around underage people on our services. We don't allow
people under the age of 13 on our service. So if we find anyone
who's under the age of 13, we remove them from our service.
Now, I'm not saying that people don't lie and that there
aren't----
Senator Hirono. Yes, apparently, there are.
Mr. Zuckerberg [continuing]. Anyone who's under the age of
13 who's using it, but I'm not going to be able to--we're not
going to be able to count how many people there are because,
fundamentally, if we identify that someone is underage, we've
removed them from the service.
Senator Hirono. I think that's really important that we get
actual numbers because these are real human beings. That's why
all these parents and others are here. Because each time that a
young person is exposed to this kind of unwanted material and
they get hooked, it is a danger to that individual. So, I'm
hoping that you are saying that you do report this kind of
information to, if not the SEC, that it is made public. I think
I'm hearing that, yes, you do.
Mr. Zuckerberg. Yes, Senator. I think we report more
publicly on our enforcement than any other company in the
industry, and we're very supportive of transparency measures.
Senator Hirono. I'm running out of time, Mr. Zuckerberg,
so I will follow up with what exactly it is that you do
report.
Senator Hirono. Again, for you: Meta automatically places
young people's accounts--and you testified to this--on the most
restrictive privacy and content sensitivity settings, and yet
teens are able to opt out of these safeguards. Isn't that
right?
Mr. Zuckerberg. Yes.
Senator Hirono. It's not mandatory that they remain on
these settings. They can opt out.
Mr. Zuckerberg. Senator, yes, we default teens into a
private account. So they have a private and restricted
experience, but some teens want to be creators, and want to
have content that they share more broadly. And I don't think
that that's something that should just blanketly be banned.
Senator Hirono. Why not? I think it should be mandatory
that they're not--that they remain on the more restrictive
settings.
Mr. Zuckerberg. Senator, I think there's----
Senator Hirono. They have to start somewhere.
Mr. Zuckerberg. I mean, a lot of teens create amazing
things, and I think with the right supervision, and parenting,
and controls, I think that that's like--I don't think that
that's the type of thing that you want to just not allow anyone
to be able to do. I think you want to make it so that----
Senator Hirono. My time is up, but I have to say that there
is an argument that you-all make for every single thing that we
are proposing. And I share the concern about the blanket
limitation on liability that we provide all of you.
And I think that that has to change, and that is on us, on
Congress, to make that change. Thank you, Mr. Chairman.
Chair Durbin. Thank you, Senator Hirono. Senator Cotton.
Senator Cotton. Mr. Chew, let's cut straight to the chase.
Is TikTok under the influence of the Chinese Communist Party?
Mr. Chew. No, Senator. We are a private business.
Senator Cotton. Okay. So you can say that your parent,
ByteDance, is subject to the 2017 National Security Law, which
requires Chinese companies to turn over information to the
Chinese government and conceal it from the rest of the world.
You concede that, correct?
Mr. Chew. Senator, the Chinese business----
Senator Cotton. There's no question, you conceded it
earlier.
Mr. Chew. Any global business that does business in China
has to follow the local laws.
Senator Cotton. Okay. Isn't it the case that ByteDance also
has an internal Chinese Communist Party committee?
Mr. Chew. Like I said, all businesses that operate in China
have to follow the local law.
Senator Cotton. So your parent company is subject to the
National Security Law that requires it to answer to the party. It
has its own internal Chinese Communist Party committee. You
answer to that parent company, but you expect us to believe
that you're not under the influence of the Chinese Communist
Party?
Mr. Chew. I understand this concern, Senator, which is why
we built Project Texas.
Senator Cotton. It was a yes or no question. Okay. But you
used to work for ByteDance, didn't you? You were the CFO for
ByteDance?
Mr. Chew. That is correct, Senator.
Senator Cotton. In April 2021, while you were the CFO, the
Chinese Communist Party's China Internet Investment Fund
purchased a 1 percent stake in ByteDance's main Chinese
subsidiary, the ByteDance Technology Company. In return for
that so-called 1 percent golden share, the party took one of
three board seats at that subsidiary company. That's correct,
isn't it?
Mr. Chew. It's for the Chinese business.
Senator Cotton. Is that correct?
Mr. Chew. It is for the Chinese business.
Senator Cotton. Yes. That deal was finalized on April 30,
2021. Isn't it true that you were appointed the CEO of TikTok
on the very next day, on May 1, 2021?
Mr. Chew. Well, it is a coincidence.
Senator Cotton. It's a coincidence----
Mr. Chew. Yes.
Senator Cotton [continuing]. That you were the CFO----
Mr. Chew. Senator, that----
Senator Cotton [continuing]. And then the Chinese Communist
Party took its golden share in its board seat, and the very
next day you were appointed the CEO of TikTok. That's a hell of
a coincidence.
Mr. Chew. It really is, Senator.
Senator Cotton. Yes, it is. Okay. And before ByteDance, you
were at a Chinese company called Xiaomi. Is that correct?
Mr. Chew. Yes. I used to work around the world.
Senator Cotton. Where did you live when you worked at
Xiaomi?
Mr. Chew. I lived in China. There were many expats.
Senator Cotton. Where exactly?
Mr. Chew. In Beijing, in China.
Senator Cotton. How many years did you live in Beijing?
Mr. Chew. Senator, I worked there for about 5 years.
Senator Cotton. So you lived there for 5 years?
Mr. Chew. Yes.
Senator Cotton. Is it the case that Xiaomi was sanctioned
by the U.S. Government in 2021 for being a Communist Chinese
military company?
Mr. Chew. I'm here to talk about TikTok. I think--I think
they then had a lawsuit and it was overturned. I can't remember
the details.
Senator Cotton. No, no----
Mr. Chew. It's another company.
Senator Cotton [continuing]. It's the Biden administration
that reversed those sanctions just like--by the way, they
reversed the terrorist designation on the Houthis in Yemen.
How's that working out for them? But it was sanctioned as a
Chinese Communist military company. So you said today, as you
often say, that you live in Singapore. Of what nation are you a
citizen?
Mr. Chew. Singapore.
Senator Cotton. Are you a citizen of any other nation?
Mr. Chew. No, Senator.
Senator Cotton. Have you ever applied for Chinese
citizenship?
Mr. Chew. Senator, I served my Nation in Singapore. No, I
did not.
Senator Cotton. Do you have a Singaporean passport?
Mr. Chew. Yes. And I served in the military for 2-1/2 years in
Singapore.
Senator Cotton. Do you have any other passports from----
Mr. Chew. No, Senator.
Senator Cotton [continuing]. Any other nations? Your wife
is an American citizen, your children are American citizens?
Mr. Chew. That's correct.
Senator Cotton. Have you ever applied for American
citizenship?
Mr. Chew. No, not yet.
Senator Cotton. Okay. Have you ever been a member of the
Chinese Communist Party?
Mr. Chew. Senator, I'm Singaporean. No.
Senator Cotton. Have you ever been associated or affiliated
with the Chinese Communist Party?
Mr. Chew. No, Senator. Again, I'm Singaporean.
Senator Cotton. Let me ask you something, hopefully a
simple question. You said earlier in response to a question
that what happened at Tiananmen Square in June 1989 was a
massive protest. Anything else happen in Tiananmen Square?
Mr. Chew. Yes, I think it's well documented. It was a
massacre there. Yes.
Senator Cotton. There was an indiscriminate slaughter of
hundreds or thousands of Chinese citizens. Do you agree with
the Trump administration and the Biden administration, that the
Chinese government is committing genocide against the Uyghur
people?
Mr. Chew. Senator, I've said this before. I think it's
really important that anyone who cares about this topic or any
topic can freely express themselves on TikTok.
Senator Cotton. It's a very simple question that unites
both parties in our country and governments around the world.
Is the Chinese government committing genocide against the
Uyghur people?
Mr. Chew. Senator, anyone, including, you know, you, can
come onto TikTok----
Senator Cotton. Yes or no----
Mr. Chew [continuing]. And talk about this topic----
Senator Cotton [continuing]. I'm asking you.
Mr. Chew [continuing]. Or any topic you want.
Senator Cotton. You're a worldly, cosmopolitan, well-
educated man who's expressed many opinions on many topics. Is
the Chinese government committing genocide against the Uyghur
people?
Mr. Chew. Actually, Senator, I talk mainly about my
company----
Senator Cotton. Yes, or no?
Mr. Chew [continuing]. And I'm here to talk about what
TikTok does.
Senator Cotton. Yes, or no?
Mr. Chew. We allow----
Senator Cotton. You're here to give testimony--to give
testimony that's truthful, and honest, and complete. Let me ask
you this. Joe Biden last year said that Xi Jinping was a
dictator. Do you agree with Joe Biden? Is Xi Jinping a
dictator?
Mr. Chew. Senator, I'm not going to comment on any world
leaders.
Senator Cotton. Why won't you answer these very simple
questions?
Mr. Chew. Senator, it's not appropriate for me as a
businessman to comment on the world leaders.
Senator Cotton. Are you scared that you'll lose your job if
you say anything negative about the Chinese Communist Party?
Mr. Chew. I disagree with that. You'll find content that is
critical of China on our platform.
Senator Cotton. The next time you go on--are you scared
that you'll be arrested and disappear the next time you go to
Mainland China?
Mr. Chew. Senator, you will find content that's critical of
China and any other country freely on TikTok.
Senator Cotton. Okay. Let's turn to what TikTok, a tool of
the Chinese Communist Party, is doing to America's youth. Does
the name Mason Edens ring a bell?
Mr. Chew. Senator, you may have to give me more specifics,
if you don't mind.
Senator Cotton. Yes. He was a 16-year-old Arkansan. After a
breakup in 2022, he went on your platform and searched for
things like inspirational quotes and positive affirmations.
Instead, he was served up numerous videos glamorizing suicide
until he killed himself by gun. What about the name Chase
Nasca? Does that ring a bell?
Mr. Chew. Would you mind giving me more details, please?
Senator Cotton. He was a 16-year-old who saw more than
1,000 videos on your platform about violence and suicide until
he took his own life by stepping in front of a train. Are you
aware that his parents, Dean and Michelle, are suing TikTok and
ByteDance for pushing their son to take his own life?
Mr. Chew. Yes, I'm aware of that.
Senator Cotton. Okay. Finally, Mr. Chew, has the Federal
Trade Commission sued TikTok during the Biden administration?
Mr. Chew. Senator, I cannot talk about whether there's any
ongoing----
Senator Cotton. Are you currently being sued by the Federal
Trade Commission?
Mr. Chew. Senator, I cannot talk about any potential
lawsuits, whether they happen----
Senator Cotton. I didn't say potential--actual. Are you
being sued by the Federal Trade Commission?
Mr. Chew. Senator, I think I've given you my answer. I
cannot talk about----
Senator Cotton. The answer's no. Ms. Yaccarino's company is
being sued, I believe. Mr. Zuckerberg's company is being sued,
I believe. Yet, TikTok, the agent of the Chinese Communist
Party is not being sued by the Biden administration. Are you
familiar with the name Cristina Caffarra?
Mr. Chew. You may have to give me more details.
Senator Cotton. Cristina Caffarra was a paid advisor to
ByteDance, your Communist-influenced parent company. She was
then hired by the Biden FTC to advise on how to sue Mr.
Zuckerberg's company.
Mr. Chew. Senator, ByteDance is a global company and not a
Chinese Communist company. It's owned by global investors.
Senator Cotton. Public reports indicate that your lobbyists
visited the White House more than 40 times in 2022. How many
times did your company's lobbyists visit the White House last
year?
Mr. Chew. I don't know that, Senator.
Senator Cotton. Are you aware that the Biden campaign and
the Democratic National Committee are on your platform, that
they have TikTok accounts?
Mr. Chew. Senator, we encourage people to come on----
Senator Cotton. Which, by the way----
Mr. Chew [continuing]. To create content.
Senator Cotton [continuing]. They won't let their staffers
use their personal phones. They give them separate phones that
they only use TikTok on.
Mr. Chew. We encourage everyone to join, including
yourself, Senator.
Senator Cotton. So all these companies are being sued by
the FTC. You're not. The FTC has a former paid advisor of your
parent talking about how they can sue Mr. Zuckerberg's company.
Joe Biden's reelection campaign, the Democratic National
Committee is on your platform. Let me ask you, have you or
anyone else at TikTok communicated with or coordinated with the
Biden administration, the Biden campaign, or the Democratic
National Committee to influence the flow of information on your
platform?
Mr. Chew. We work with anyone, any creators who want to use
our platform. It's all the same process that we have----
Senator Cotton. Okay. So what we have here, we have a
company that's a tool of the Chinese Communist Party that is
poisoning the minds of America's children, in some cases,
driving them to suicide. And that at best, the Biden
administration is taking a pass on, at worst, may be in
collaboration with. Thank you, Mr. Chew.
Chair Durbin. Thank you, Senator Cotton. So we're going to
take a break now. We're on the second roll call. Members can
take advantage of it if they wish. The break will last about 10
minutes. Please do your best to return.
[Whereupon the hearing was recessed and reconvened.]
Chair Durbin. The Senate Judiciary Committee will resume.
We have nine Senators who have not asked questions yet, in 7-
minute rounds, and we'll turn first to Senator Padilla.
Senator Padilla. Thank you, Mr. Chair. Colleagues, as we
reconvene, I'm proud once again to share that I am one of the
few Senators with younger children. And I lead with that
because as we are having this conversation today, it's not lost
on me that between my children, who are all now in a teen and
preteen category, and their friends, I see this issue very up
close and personal.
And in that spirit, I want to take a second to just
acknowledge and thank all the parents who are in the audience
today, many of whom have shared their stories with our offices.
And I credit them for finding strength through their suffering,
through their struggle, and channeling that into the advocacy
that is making a difference. I thank all of you.
Now, I appreciate, again, personally, the challenges that
parents, and caretakers, school personnel, and others face in
helping our young people navigate this world of social media
and technology in general. Now, the services our children are
growing up with provide them unrivaled access to information. I
mean, this is beyond what previous generations have
experienced, and that includes learning opportunities,
socialization, and much, much more.
But we also clearly have a lot of work to do to better
protect our children from the predators and predatory behavior
that these technologies have enabled. And yes, Mr. Zuckerberg,
that includes exacerbating the mental health crisis in America.
Nearly all teens we know have access to smartphones and the
internet and use the internet daily. And while guardians do
have primary responsibility for caring for our children, the
old adage says, ``it takes a village,'' and so society as a
whole, including leaders in the tech industry, must prioritize
the health and safety of our children.
Now, I'll dive into my questions and be specific,
platform by platform, witness by witness on the topic of some
of the parental tools you have each made reference to.
Mr. Citron, how many minors are on Discord, and how many of
them have caretakers that have adopted your Family Center tool?
And if you don't have the numbers, just say that quickly and
provide that to our office.
Mr. Citron. We can follow up with you on that.
Senator Padilla. How have you ensured that young people and
their guardians are aware of the tools that you offer?
Mr. Citron. We make it very clear to teens on our platform
what tools are available----
Senator Padilla. That sounds very vague.
Mr. Citron [continuing]. And our Teen Safety Assist is
enabled by default----
Senator Padilla. What specifically do you do? What may be
clear to you is not clear to the general public. So what do you
do, in your opinion, to make it very clear?
Mr. Citron. So our Teen Safety Assist, which is a feature
that helps teens keep themselves safe in addition to blocking
and blurring images that may be sent to them, that is on by
default for teen accounts, and it cannot be turned off. We
market to our teen users directly on our platform. When we
launched our Family Center, we created a promotional video and
put it directly in our product. So when every teen opened the
app--in fact, every user opened the app--they got an alert
like, Hey, Discord has this. We want you to use it.
Senator Padilla. Thank you. Look forward to the data that
we're requesting.
Mr. Zuckerberg, across all of Meta services from Instagram,
Facebook, Messenger, and Horizon, how many minors use your
applications? And of those minors, how many have a caretaker
that has adopted the parental supervision tools that you offer?
Mr. Zuckerberg. I can follow up with the specific stats on
that, Senator.
Senator Padilla. Okay. It would be very helpful not just
for us to know, but for you to know as a leader of your
company. Same question, how are you ensuring that young people
and their guardians are aware of the tools that you offer?
Mr. Zuckerberg. We run pretty extensive ad campaigns both
on our platforms and outside. We work with creators and
organizations like Girl Scouts to make sure that there's broad
awareness of the tools.
Senator Padilla. Okay. Mr. Spiegel, how many minors use
Snapchat, and of those minors, how many have caretakers that
are registered with your Family Center?
Mr. Spiegel. I believe in the United States, there are
approximately 20 million teenage users of Snapchat. I believe
approximately 200,000 parents use Family Center, and about
400,000 teens have linked their account to their parents using
Family Center.
Senator Padilla. So 200,000 and 400,000. Sounds like a big
number, but small as a percentage of the minors using Snapchat.
What are you doing to ensure that young people and their
guardians are aware of the tools you offer?
Mr. Spiegel. Senator, we create a banner for Family Center
on users' profiles so that accounts we believe may be of an age
that they could be parents can see the entry point into Family
Center easily.
Senator Padilla. Okay. Mr. Chew, how many minors are on
TikTok, and how many of them have a caregiver that uses your
family tools?
Mr. Chew. Senator, I need to get back to you on the
specific numbers. But we were one of the first platforms to
give what we call family pairing to parents. You go to
settings, you turn on the QR code--your teenager's QR code, and
yours--you scan it. And what it allows you to do is you can set
screen time limits, you can filter out some keywords, you can
turn on a more restricted mode. And we are always talking to
parents. I met, you know, a group of parents, and teenagers,
and high school teachers last week to talk about what more we
can provide in the family pairing mode.
Senator Padilla. Ms. Yaccarino, how many minors use X, and
are you planning to implement safety measures or guidance for
caretakers like your peer companies have?
Ms. Yaccarino. Thank you, Senator. Less than 1 percent of
all U.S. users are between the ages of 13 and 17.
Senator Padilla. Less than 1 percent of how many?
Ms. Yaccarino. Of 90 million U.S. users.
Senator Padilla. Okay. So still hundreds of thousands
continue?
Ms. Yaccarino. Yes, yes, and every single one is very
important. Being a 14-month-old company, we have reprioritized
child protection and safety measures, and we have just begun to
talk about and discuss how we can enhance those with parental
controls.
Senator Padilla. Let me continue with the follow-up
question for Mr. Citron. In addition to keeping parents
informed about the nature of various internet services, there's
a lot more we obviously need to do.
For today's purposes, while many companies offer a broad
range of quote unquote user empowerment tools, it's helpful to
understand whether young people even find these tools helpful.
So I appreciate you sharing your Teen Safety Assist, and the
tools, and how you're advertising it, but have you conducted
any assessments of how these features are impacting minors' use
of your platform?
Mr. Citron. Our intention is to give teens tools,
capabilities, that they can use to keep themselves safe, and
also, so our teams can help keep teens safe. We recently
launched Teen Safety Assist last year, and I do not have a
study off the top of my head, but we'd be happy to follow up
with you on that.
Senator Padilla. Okay. My time is up. I'll have follow-up
questions for each of you, either in the second round or
through statements for the record on a similar assessment of
the tools that you've proposed.
Senator Padilla. Thank you, Mr. Chair.
Chair Durbin. Thank you, Senator Padilla. Senator Kennedy.
Senator Kennedy. Thank you all for being here. Mr. Spiegel,
I see you hiding down there. What does yadda yadda yadda mean?
Mr. Spiegel. I'm not familiar with the term, Senator.
Senator Kennedy. Very uncool. Can we agree that what you
do, not what you say, what you do is what you believe and
everything else is just cottage cheese?
Mr. Spiegel. Yes, Senator.
Senator Kennedy. Do you agree with that? Speak up. Don't be
shy. I've listened to you today. I've heard a lot of yadda
yadda-ying, and I've heard you talk about the reforms you've
made, and I appreciate them. And I've heard you talk about the
reforms you're going to make, but I don't think you're going to
solve the problem. I think Congress is going to have to help
you. I think the reforms you're talking about, to some extent,
are going to be like putting paint on rotten wood.
And I'm not sure you're going to support this legislation.
I'm not. The fact is that you and some of your internet
colleagues who are not here, are no longer--you're not
companies, you're countries. You're very, very powerful, and
you and some of your colleagues who are not here have blocked
everything we have tried to do in terms of reasonable
regulation. Everything from privacy to child exploitation.
And in fact, we have a new definition of recession. A
recession is when--we know we're in a recession when Google has
to lay off 25 Members of Congress. That's what we're down to.
We're also down to this fact: that your platforms are hurting
children. I'm not saying they're not doing some good things,
but they're hurting children.
And I know how to count votes, and if this bill comes to
the floor of the U.S. Senate, it will pass. What we're going to
have to do--and I say this with all the respect that I can
muster--is convince my good friend, Senator Schumer, to go on
Amazon, buy a spine online, and bring this bill to the Senate
floor, and the House will then pass it. Now, that's one
person's opinion. I may be wrong, but I doubt it.
Mr. Zuckerberg, let me ask you a couple of questions. Might
wax a little philosophical here. I have to hand it to you. You
have convinced over 2 billion people to give up all of their
personal information, every bit of it, in exchange for getting
to see what their high school friends had for dinner Saturday
night. That's pretty much your business model, isn't it?
Mr. Zuckerberg. It's not how I would characterize it. And
we give people the ability to connect with the people they care
about, and to engage with the topics that they care about.
Senator Kennedy. And you take this information, this
abundance of personal information, and then you develop
algorithms to punch people's hot buttons, and steer to them
information that punches their hot buttons again, and again,
and again to keep them coming back and to keep them staying
longer. And as a result, your users see only one side of an
issue. And so, to some extent, your platform has become a
killing field for the truth, hasn't it?
Mr. Zuckerberg. I mean, Senator, I disagree with that
characterization. You know, we build ranking and
recommendations because people have a lot of friends and a lot
of interests, and they want to make sure that they see the
content that's relevant to them. We're trying to make a product
that's useful to people, and make our services as helpful as
possible for people to connect with the people they care about
and the interests they care about.
Senator Kennedy. But you don't show them both sides. You
don't give them balanced information. You just keep punching
their hot buttons, punching their hot buttons. You don't show
them balanced information so people can discern the truth for
themselves, and you rev them up so much that so often your
platform and others become just cesspools of snark where
nobody learns anything, don't they?
Mr. Zuckerberg. Well, Senator, I disagree with that. I
think people can engage in the things that they're interested
in and learn quite a bit about those. We have done a handful of
different experiments and things in the past around news and
trying to show content on, you know, a diverse set of
perspectives. I think that there's more that needs to be
explored there, but I don't think that we can solve that by
ourselves. One of the things that I saw----
Senator Kennedy. Do you think--I'm sorry to cut you off,
Mr. President, but I'm going to run out of time. Do you think
your users really understand what they're giving to you, all
their personal information, and how you process it, and how you
monetize it? Do you think people really understand?
Mr. Zuckerberg. Senator, I think people understand the
basic terms. I mean, I think that there's--I actually think
that a lot of people overestimate the amount of information we
have----
Senator Kennedy. Let me put it another way. It's been a
couple of years since we talked about this. Does your user
agreement still suck?
[Laughter.]
Mr. Zuckerberg. I'm not sure how to answer that, Senator. I
think there's----
Senator Kennedy. Can you still hide a dead body in all that
legalese where nobody can find it?
Mr. Zuckerberg. Senator, I'm not quite sure what you're
referring to, but I think people get the basic deal of using
these services. It's a free service. You're using it to connect
with the people you care about. If you share something with
people, other people will be able to see your information. It's
inherently--and if you're putting something out there to be
shared publicly or with a private set of people, it's--you
know, you're inherently putting it out there. So I think people
get that basic part of how this works.
Senator Kennedy. But Mr. Zuckerberg, you're in the
foothills of creepy. You track people who aren't even Facebook
users. You track your own people, your own users who are your
product, even when they're not on Facebook.
I'm going to land this plane pretty quickly, Mr. Chairman.
I mean, it's creepy, and I understand you make a lot of money
doing it, but I just wonder if our technology is greater than
our humanity. I mean, let me ask you this final question.
Instagram is harmful to young people, isn't it?
Mr. Zuckerberg. Senator, I disagree with that. That's not
what the research shows on balance. That doesn't mean that
individual people don't have issues, and that there aren't
things that we need to do to help provide the right tools for
people. But across all of the research that we've done
internally, I mean, this--you know, the survey that the Senator
previously cited, you know, there are 12 or 15 different
categories of harm that we asked teens if they felt that
Instagram made it worse or better. And across all of them,
except for the one that Senator Hawley cited, more people said
that using Instagram----
Senator Kennedy. I've got to land this plane, Mr.
Zuckerberg.
Mr. Zuckerberg [continuing]. Contributed to issues that
they faced, either positive or----
Senator Kennedy. We just have to agree to disagree. If you
believe that Instagram--I'm not saying it's intentional, but if
you agree that Instagram--if you think that Instagram is not
hurting millions of our young people, particularly young teens,
particularly young women, you shouldn't be driving it. It is.
Thanks.
Chair Durbin. Senator Butler.
Senator Butler. Thank you, Mr. Chair. And thank you to our
panelists who've come to have an important conversation with
us. Most importantly, I want to appreciate the families who
have shown up to continue to be remarkable champions of your
children and your loved ones, for being here, and in particular
two California families that I was able to just talk to on the
break: the families of Sammy Chapman from Los Angeles and
Daniel Puerta from Santa Clarita. They are here today and are
doing some incredible work to not just protect the memory and
legacy of their boys, but the work that they're doing is going
to protect my 9-year-old. And that is indeed why we are here.
There are a couple questions that I want to ask some
individuals. Let me start with a question for each of you. Mr.
Citron, have you ever sat with a family and talked about their
experience and what they need from your product? Yes, or no?
Mr. Citron. Yes. I have spoken with parents about how we
can build tools to help them.
Senator Butler. Mr. Spiegel, have you sat with families and
young people to talk about your products and what they need
from your product?
Mr. Spiegel. Yes, Senator.
Senator Butler. Mr. Chew?
Mr. Chew. Yes. I just did it 2 weeks ago. Like, for
example----
Senator Butler. I don't want to know what you did for the
hearing prep, Mr. Chew. I just wanted to know if----
Mr. Chew. No, it's an example.
Senator Butler [continuing]. You did anything----
Mr. Chew. Senator, it's an example.
Senator Butler [continuing]. In terms of designing the
product that you are creating. Mr. Zuckerberg, have you sat
with parents and young people to talk about how you design
product for your consumers?
Mr. Zuckerberg. Yes. Over the years, I've had a lot of
conversations with parents----
Senator Butler. You know, that's interesting, Mr.
Zuckerberg, because we talked about this last night, and you
gave me a very different answer. I asked you this very
question.
Mr. Zuckerberg. Well, I told you that I wasn't--that I
didn't know what specific processes our company had for----
Senator Butler. No, Mr. Zuckerberg, you said to me that you
had not.
Mr. Zuckerberg. I must have misspoke.
Senator Butler. I want to give you the room to misspeak,
Mr. Zuckerberg, but I asked you this very question. I asked all
of you this question and you told me a very different answer
when we spoke, but I won't belabor it.
A number of you have talked about the--I'm sorry. Ms.
Yaccarino of X, have you talked to parents directly, young
people, about designing your product?
Ms. Yaccarino. As a new leader of X, the answer is yes.
I've spoken to them about the behavioral patterns because less
than 1 percent of our users are in that age group, but yes, I
have spoken to them.
Senator Butler. Thank you, ma'am. Mr. Spiegel, there are a
number of parents whose children have been able to access
illegal drugs on your platform. What do you say to those
parents?
Mr. Spiegel. Well, Senator, we are devastated that we
cannot----
Senator Butler. To the parents. What do you say to those
parents, Mr. Spiegel?
Mr. Spiegel. I'm so sorry that we have not been able to
prevent these tragedies. We work very hard to block all search
terms related to drugs from our platform. We proactively look
for and detect drug-related content. We remove it from our
platform, preserve it as evidence, and then we refer it to law
enforcement for action.
We've worked together with nonprofits and with families on
education campaigns because the scale of the fentanyl epidemic
is extraordinary. Over 100,000 people lost their lives last
year, and we believe people need to know that one pill can
kill. That campaign was viewed more than 260 million times on
Snapchat. We also launched----
Senator Butler. Mr. Spiegel, there are two fathers in this
room who lost their sons. They were 16 years old. Their
children were able to get those pills from Snapchat. I know
that there are statistics, and I know that there are good
efforts. None of those efforts are keeping our kids from
getting access to those drugs on your platform.
Now, as a California company, all of you, I've talked with
you about what it means to be a good neighbor, and what
California families and American families should be expecting
from you. You owe them more than just a set of statistics. And
I look forward to you showing up on all pieces of this
legislation--all of you, showing up on all pieces of
legislation to keep our children safe.
Mr. Zuckerberg, I want to come back to you. I talked with
you about being a parent to a young child who doesn't have a
phone, you know, is not on social media at all. And one of the
things that I am deeply concerned with as a parent to a young
black girl, is the utilization of filters on your platform that
would suggest to young girls utilizing your platform that they
are not good enough as they are.
I want to ask more specifically and refer to some
unredacted court documents that revealed that your own
researchers concluded that these face filters that mimic
plastic surgery negatively impact youth mental health and
well-being. Why should we believe--why should we believe
that because--that you are going to do more to protect young
women and young girls when it is that you give them the tools
to affirm the self-hate that is spewed across your platforms?
Why is it that we should believe that you are committed to
doing anything more to keep our children safe?
Mr. Zuckerberg. Sorry, there's a lot to unpack there.
Senator Butler. There is a lot.
Mr. Zuckerberg. We give people tools to express themselves
in different ways, and people use face filters and different
tools to make media, and photos, and videos that are fun or
interesting across a lot of the different products that are----
Senator Butler. Plastic surgery filters are good tools to
express creativity?
Mr. Zuckerberg. Senator, I'm not speaking to that
specifically.
Senator Butler. Skin lightening tools are tools to express
creativity?
Mr. Zuckerberg. Senator, I'm not----
Senator Butler [continuing]. This is the direct thing that
I'm asking about.
Mr. Zuckerberg. I'm not defending any specific one of
those. I think that the ability to kind of filter and edit
images is generally a useful tool for expression. For that
specifically, I'm not familiar with the study that you're
referring to, but we did make it so that we're not recommending
this type of content to teens.
Senator Butler. I made no reference to a study, but to court
documents that revealed your knowledge of the impact of these
types of filters on young people generally, young girls in
particular, and----
Mr. Zuckerberg. Senator, I disagree with that
characterization. I think that----
Senator Butler. With court documents?
Mr. Zuckerberg [continuing]. There have been hypotheses--I
haven't seen any documents that say----
Senator Butler. Okay. Mr. Zuckerberg, my time is up. I hope
that you hear what is being offered to you, and are prepared to
step up and do better. I know this Senate Committee is going to
do our work to hold you to greater account. Thank you, Mr.
Chair.
Chair Durbin. Senator Tillis.
Senator Tillis. Thank you, Mr. Chair. Thank you all for
being here. I don't feel like I'm going to have an opportunity
to ask a lot of questions, so I'm going to reserve the right to
submit some for the record.
Senator Tillis. We've had hearings like this before. I've
been in the Senate for 9 years. I've heard hearings like this
before. I've heard horrible stories about people who have died,
committed suicide, been embarrassed. Every year we have an
annual flogging, every year. And what materially has occurred
over the last 9 years? Do any of you-all--just a yes or no
question. Do any of you-all participate in an industry
consortium trying to make this fundamentally safe across
platforms? Yes, or no, Mr. Zuckerberg?
Mr. Zuckerberg. Yes.
Ms. Yaccarino. There's a variety of organizations that we
work----
Senator Tillis. Do you participate in them?
Ms. Yaccarino. Which organization, Senator?
Senator Tillis. I should say, does anyone here not
participate in an industry--I actually think it would be
immoral for you all to consider it a strategic advantage to
keep safe--or, to keep private something that would secure all
these platforms to avoid this sort of problem. Do you-all agree
with that? That anybody that would be saying, you want ours
because ours is the safest, and these haven't figured out the
secret sauce--that you as an industry realize this is an
existential threat to you-all if we don't get it right. Right?
I mean, you've got to secure your platforms. You got to
deal with this. Do you not have an inherent mandate to do this?
Because it would seem to me if you don't, you're going to cease
to exist. I mean, we could regulate you out of business if we
wanted to.
And the reason I'm saying, it may sound like criticism,
it's not a criticism. I think we have to understand that there
should be an inherent motivation for you to get this right. Our
Congress will make a decision that could potentially put you
out of business.
Here's the reason I have a concern with that though. I just
went on the internet while I was listening intently to all the
other Members speaking, and I found a dozen different platforms
outside the United States. Ten of which are in China, two of
which are in Russia. Their daily average subscriber or active
membership numbers are in the billions. Well, people say you
can't get on China's version of TikTok. It took me one quick
search on
my favorite search engine to find out exactly how I could get
an account on this platform today.
And so the other thing that we have to keep in mind, I come
from technology. I could figure out, ladies and gentlemen, I
could figure out how to influence your kid without them ever
being on a social media platform. I can randomly send texts and
get a bite, and then find out an email address and get
compromising information.
It is horrible to hear some of these stories. And I have
shared the--and I've had these stories occur in my hometown
down in North Carolina. But if we only come here and make a
point today, and don't start focusing on making a difference,
which requires people to stop shouting, and start listening,
and start passing language here, the bad actors are just going
to be off our shores.
I have another question for you all. How many people,
roughly--if you don't know the exact numbers, okay. Roughly,
how many people do you have looking 24 hours a day at these
horrible images and filtering them out? Just go real quick
with an answer down the line.
Mr. Zuckerberg. It's most of the roughly 40,000 people who
work on safety.
Senator Tillis. And again.
Ms. Yaccarino. We have 2,300 people all over the world.
Senator Tillis. Okay.
Mr. Chew. We have 40,000 trust and safety professionals
around the world.
Mr. Spiegel. We have approximately 2,000 people dedicated
to trust and safety and content moderation.
Mr. Citron. Our platform is much, much smaller than these
folks. We have hundreds of people looking at the content, and
15 percent of our workforce is focused on it.
Senator Tillis. I've already mentioned, these people have a
horrible job. Many of them experience--they have to get
counseling for all the things they see. We have evil people out
there, and we're not going to fix this by shouting past or
talking past each other. We're going to fix this by every one
of you-all being at the table, and hopefully coming closer to
what I heard one person say, supporting a lot of the good
bills, like one that I hope Senator Blackburn mentions when she
gets a chance to talk.
But guys, if you're not at the table and securing these
platforms, you're going to be on it. And the reason why I'm not
okay with that is that if we ultimately destroy your ability to
create value and drive you out of business, the evil people
will find another way to get to these children.
And I do have to admit--I don't think my mom's watching
this one--but there is good. We can't look past good that is
occurring. My mom lives in Nashville, Tennessee, and I talked
to her yesterday, and we talked about a Facebook post that she
made a couple of days ago. We don't let her talk to
anybody else. That connects my 92-year-old mother with her
grandchildren and great-grandchildren. That lets a kid who may
feel awkward in school get into a group of people and relate
to people. Let's not throw out the good because we haven't all
together focused on rooting out the bad.
Now, I guarantee you, I could go through some of your
governance documents and find a reason to flog every single one
of you because you didn't place the emphasis on it that I think
you should. But at the end of the day, I find it hard to
believe that any of you people started this business, some of
you in your college dorm rooms, for the purposes of creating
the evil that is being perpetrated on your platforms.
But I hope that every single waking hour, you are doing
everything you can to reduce it. You're not going to be able to
eliminate it. And I hope that there are some enterprising young
tech people out there today that are going to go to parents and
say, ladies and gentlemen, your children have a deadly weapon.
They have a potentially deadly weapon, whether it's a phone or
a tablet. You have to secure it. You can't assume that they're
going to be honest and say that they're 16 when they're 12.
We all have to recognize that we have a role to play, and
you guys are at the tip of the spear. So I hope that we can get
to a point where we are moving these bills. If you've got a
problem with them, state your problem. Let's fix it.
No is not an answer. And know that I want the United States to
be the beacon for innovation, to be the beacon for safety, and
to prevent people from using other options that have existed
since the internet has existed to exploit people, and count me
in as somebody that will try and help out. Thank you, Mr.
Chair.
Chair Durbin. Thank you, Senator Tillis. Next is Senator
Ossoff.
Senator Ossoff. Thank you, Mr. Chairman. And thank you to
our witnesses today. Mr. Zuckerberg, I want to begin by just
asking a simple question, which is, do you want kids to use
your platform more or less?
Mr. Zuckerberg. Well, we don't want people under the age of
13 using our----
Senator Ossoff. Do you want teenagers 13 and up to use your
platform more or less?
Mr. Zuckerberg. Well, we would like to build a product that
is useful and that people want to use more.
Senator Ossoff. My time is going to be limited. So do you
want them to use it more or less? Teenagers, 13 to 17 years
old, do you want them using Meta products more or less?
Mr. Zuckerberg. I'd like them to be useful enough that they
want to use them more.
Senator Ossoff. You want them to use it more. I think
herein we have one of the fundamental challenges. In fact, you
have a fiduciary obligation, do you not, to try to get kids to
use your platform more?
Mr. Zuckerberg. It depends on how you define that. We
obviously are a business, but----
Senator Ossoff. I'm sorry, Mr. Zuckerberg, it's self-
evident that you have a fiduciary obligation to get your users,
including users under 18, to use and engage with your platform
more rather than less. Correct?
Mr. Zuckerberg. Over the long term. But in the near term,
we often take a lot of steps, including a change we made to
show fewer videos on the platform. That reduced the amount of
time spent by more than 50 million hours.
Senator Ossoff. Okay. But if your shareholders asked you,
``Mark''--I wouldn't call you that, Mr. Zuckerberg, but your
shareholders might be on a first-name basis with you--``Mark,
are you trying to get kids to use Meta products more or less?''
You'd say more, right?
Mr. Zuckerberg. Well, I would say that over the long term,
we're trying to create the most value----
Senator Ossoff. Yes. So the 10-K you filed with the SEC--
there are a few quotes I want to note here. And this is a
filing that you signed, correct?
Mr. Zuckerberg. Yes.
Senator Ossoff. Yes. ``Our financial performance has been
and will continue to be significantly determined by our success
in adding, retaining, and engaging active users.'' Here's
another quote. ``If our users decrease their level of
engagement with our products, our revenue, financial results,
and business may be significantly harmed.''
Here's another quote. ``We believe that some users,
particularly younger users, are aware of and actively engaging
with other products and services similar to, or as a substitute
for, ours.'' It continues, ``In the event that users
increasingly engage with other products and services, we may
experience a decline in use and engagement in key demographics
or more broadly, in which case our business would likely be
harmed.''
You have an obligation as the chief executive to encourage
your team to get kids to use your platform more.
Mr. Zuckerberg. Senator, I think this is----
Senator Ossoff. Is that not self-evident? You have a
fiduciary obligation to your shareholders to get kids to use
your platform more.
Mr. Zuckerberg. I think that the thing that's not intuitive
is the direction is to make the products more useful so that
way people want to use them more. We don't give the teams
running the Instagram feed or the Facebook feed a goal to
increase the amount of time that people spend.
Senator Ossoff. Yes. But you don't dispute--and your 10-K
makes it clear--that you want your users engaging more and
using the platform more. And I think this gets to the root of
the challenge, because it's the overwhelming view of the
public, certainly in my home State of Georgia, and we've had
some discussions about the underlying science, that this
platform is harmful for children. I mean, you are familiar--
and not just your platform, by the way, social media in
general--with the 2023 report from the Surgeon General about
the impact of social media on kids' mental health, which cited
evidence that kids who spend more than 3 hours a day on social
media have double the risk of poor mental health outcomes,
including depression and anxiety. Are you familiar with that
Surgeon General report and the underlying study?
Mr. Zuckerberg. I read the report. Yes.
Senator Ossoff. Do you dispute it?
Mr. Zuckerberg. No, but I think it's important to
characterize it correctly. I think what he was flagging in the
report is that there seems to be a correlation, and obviously
the mental health issue is very important. So it's something
that needs to be studied further.
Senator Ossoff. Yes, everyone knows there's a correlation.
Everyone knows that kids who spend a lot of time, too much time
on your platforms are at risk. And it's not just the mental
health issues--let me ask you another question. Is your
platform safe for kids?
Mr. Zuckerberg. I believe it is, but there's a----
Senator Ossoff. Hold on a second. Let me ask you----
Mr. Zuckerberg [continuing]. Difference between correlation
and causation.
Senator Ossoff [continuing]. Because we're not going to be
able to get anywhere. We want to work in a productive, open,
honest, and collaborative way with the private sector to pass
legislation that will protect Americans, that will protect
American children above all, and that will allow businesses to
thrive in this country. If we don't start with an open, honest,
candid, realistic assessment of the issues, we can't do that.
The first point is you want kids to use the platform more.
In fact, you have an obligation to. But if you're not willing
to acknowledge that it's a dangerous place for children--the
internet is a dangerous place for children, not just your
platform, isn't it? Isn't the internet a dangerous place for
children?
Mr. Zuckerberg. I think it can be. Yes. There are both
great things that people can do, and there are harms that we
need to work to mitigate--yes.
Senator Ossoff. It's a dangerous place for children. There
are families here who have lost their children. There are
families across the country whose children have engaged in
self-harm, who have experienced low self-esteem, who have been
sold deadly pills on the internet. The internet's a dangerous
place for children, and your platforms are dangerous places for
children. Do you agree?
Mr. Zuckerberg. I think that there are harms that we need
to work to mitigate. I mean, I'm not going to----
Senator Ossoff. Why not? Why not just acknowledge it? Why
do we have to use the very careful, coded language?
Mr. Zuckerberg. Well, I just disagree with the
characterization----
Senator Ossoff. Which characterization? That the internet's
a dangerous place for children?
Mr. Zuckerberg. I think you're trying to characterize our
products as inherently dangerous, and I think that's----
Senator Ossoff. Inherent or not, your products are places
where children can experience harm. They can experience harm to
their mental health. They can be sold drugs. They can be preyed
upon by predators. They're dangerous places, and yet you have
an obligation to promote the use of these platforms by
children.
And look, all I'm trying to suggest to you, Mr. Zuckerberg--
and my time is running short--is that in order for us to
succeed, you and your colleagues here and we have to acknowledge
these basic truths. We have to be able to come before the
American people, the American public, the people in my State of
Georgia, and acknowledge the internet is dangerous, including
your platforms. There are predators lurking. There are drugs
being sold. There are harms to mental health that are taking a
huge toll on kids' quality of life.
And yet you have this incentive, not just you, Mr.
Zuckerberg, all of you have an incentive to boost, maximize
use, utilization, and engagement. And that is where public
policy has to step in to make sure that these platforms are
safe for kids so kids are not dying, so kids are not
overdosing, so kids are not cutting themselves or killing
themselves because they're spending all day scrolling instead
of playing outside. And I appreciate all of you for your
testimony. We will continue to engage as we develop this
legislation. Thank you.
Chair Durbin. Senator from Tennessee.
Senator Blackburn. Thank you, Mr. Chairman. Thank you to
each of you for coming. And I know some of you had to be
subpoenaed to get here, but we do appreciate that you-all are
here.
Mr. Chew, I want to come to you first. We've heard that
you're looking at putting a headquarters in Nashville, and
likewise in Silicon Valley and Seattle. And what you're going
to find probably is that the welcome mat is not going to be
rolled out for you in Nashville like it would be in California.
There are a lot of people in Tennessee that are very concerned
about the way TikTok is basically building dossiers on our
kids, the way they are building those on their virtual you. And
also, that that information is held in China, in Beijing, as
you responded to Senator Blumenthal and me last year in
reference to that question.
And we also know that a major music label yesterday said
they were pulling all of their content off your site because of
your issues on payment, on artificial intelligence, and because
of the negative impact on our kids' mental health. So we will
see how that progresses.
Mr. Zuckerberg, I want to come to you. Senator Blumenthal
and I, of course, have just had some internal documents and
emails come our way. One of the things that really concerned me
is that you referred to your young users in terms of their
lifetime value being roughly $270 per teenager. And each of you
should be looking at these kids. The T-shirts they're wearing
today say, ``I'm worth more than $270.'' We've got some
standing up in those T-shirts.
[Applause.]
Senator Blackburn. And some of the children from our State--
some of the children, the parents that we have worked with--
just to think, whether it is Becca Schmidt, David Malloch,
Sarah Flatt, or Lee Schopt, would you say that life is only
worth $270? What could possibly lead you--I mean, I listened to
that. I know you're a dad; I'm a mom, I'm a grandmom. And how
could you possibly even have that thought? It's astounding to
me.
And I think this is one of the reasons that States, 42
States, are now suing you because of features that they
consider to be addictive, that you are pushing forward. And in
the emails that we've got from 2021, that go from August to
November, there is the Staff Plan that is being discussed. And
Antigone Davis, Nick Clegg, Sheryl Sandberg, Chris Cox, Alex
Schultz, Adam Mosseri, are all on this chain of emails on the
well-being plan. And then we get to one, ``Nick did email Mark
for--to emphasize his support for the package, but it sounds
like it lost out to various other pressures and priorities.''
See, this is what bothers us. Children are not your
priority. Children are your product. You see children as a way
to make money. And on protecting children in this virtual
space, you made a conscious decision against it, even though
Nick Clegg and others were going through the process of saying,
this is what we should do.
These documents are really illuminating. And it just shows me
that growing this business, expanding your revenue, what you
were going to put on those quarterly filings, that was the
priority. The children were not. It's very clear.
I want to talk with you about the pedophile ring because
that came up earlier and The Wall Street Journal reported on
that. And one of the things that we found out was after that
became evident, then you didn't take that content down. And it
was content that showed that teens were for sale and were
offering themselves to older men.
And you didn't take it down because it didn't violate your
community standards. Do you know how often a child is bought or
sold for sex in this country? Every 2 minutes. Every 2 minutes
a child is bought or sold for sex. That's not my stat. That is
a stat from the TBI, the Tennessee Bureau of Investigation.
Now finally this content was taken down after a
congressional staffer went to Meta's global head of safety. So
would you please explain to me and to all these parents why
explicit predatory content does not violate your platform's
terms of service or your community standards?
Mr. Zuckerberg. Sure, Senator, let me try to address all of
the things that you just said. It does violate our standards.
We work very hard to take it down.
Senator Blackburn. Didn't take it down.
Mr. Zuckerberg. Well, we've reported, I think it's more
than 26 million examples of this kind of content.
Senator Blackburn. Didn't take it down until a
congressional staffer brought it up.
Mr. Zuckerberg. It may be that in this case we made a
mistake and missed something. But we have----
Senator Blackburn. I think you make a lot of mistakes----
Mr. Zuckerberg [continuing]. Leading teams that identify
more than----
Senator Blackburn [continuing]. So let's move on. I want to
talk with you about your Instagram creators program, and about
the push we found out about through these documents, a push you
are making because you want to bring kids in early. You see
these younger tweenagers as ``valuable, but an untapped
audience,'' quoting from the emails, which suggest teens are
actually household influencers who can bring their younger
siblings onto your platform, onto Instagram.
Now, how can you ensure that Instagram creators--your
product, your program--does not facilitate illegal activities
when you fail to remove content pertaining to the sale of
minors? And it's happening once every 2 minutes in this
country.
Mr. Zuckerberg. Senator, our tools for identifying that
kind of content are industry-leading. That doesn't mean we're
perfect. There are definitely issues that we have, but we
continue to invest----
Senator Blackburn. Mr. Zuckerberg, yes, there is a lot
that is slipping through. It appears that you're trying to be
the premier sex trafficking site.
Mr. Zuckerberg. Of course not, Senator.
Senator Blackburn [continuing]. In this country.
Mr. Zuckerberg. Senator, that's ridiculous.
Senator Blackburn. It's not ridiculous. You want to turn
around and tell the people that----
Mr. Zuckerberg. We don't want this content on our
platforms, and we----
Senator Blackburn. Why don't you take it down?
Mr. Zuckerberg. We do take it down.
Senator Blackburn. We're here discussing----
Mr. Zuckerberg. We do more work----
Senator Blackburn. We need you to work with us----
Mr. Zuckerberg [continuing]. To take it down than----
Senator Blackburn. No, you are not. You are not. And the
problem is, we've been working on this--Senator Welch is over
there. We've been working on this stuff for a decade. You have
an army of lawyers and lobbyists that have fought us on this
every step of the way. You work with NetChoice, the Cato
Institute, the Taxpayers Protection Alliance, and the Chamber
of Progress to actually fight our bipartisan legislation to keep
kids safe online. So, are you going to stop funding these
groups? Are you going to stop lobbying against this, and come
to the table and work with us? Yes, or no?
[Applause.]
Mr. Zuckerberg. Senator, we have a----
Senator Blackburn. Yes or no?
Mr. Zuckerberg. Of course, we'll work with you on the
legislation. I mean, it's----
Senator Blackburn. Okay. The door is open. We've got all
these bills, you need to come to the table. Each and every one
of you need to come to the table, and you need to work with us.
Kids are dying.
[Applause.]
Chair Durbin. Senator Welch.
Senator Welch. I want to thank my colleague, Senator
Blackburn, for her decade of work on this. I actually have some
optimism. There is a consensus today that didn't exist, say, 10
years ago, that there is a profound threat to children, to
mental health, to safety. There's not a dispute; that was in
debate before. That's a starting point.
Secondly, we're identifying concrete things that can be
done in four different areas. One is industry standards, two is
legislation, three is the courts, and four is a proposal that
Senator Bennet, Senator Graham, Senator Warren, and I have to
establish an agency, a Governmental agency whose responsibility
would be to engage in this on a systematic, regular basis, with
proper resources.
And I just want to go through those. I appreciate the
industry-standard decisions and steps that you've taken in your
companies, but it's not enough. And that's what I think you're
hearing from my colleagues. For instance, where there are
layoffs is in the trust and safety programs. That's alarming,
because it looks like there is a reduction in emphasis on
protection. You just added, Ms. Yaccarino, 100 employees in
Texas in this category. And how many did you have before?
Ms. Yaccarino. The company is just coming through a
significant restructuring. So we've increased the number of
trust and safety employees and agents all over the world by at
least 10 percent so far in the last 14 months, and we'll
continue to do so specifically in Austin, Texas.
Senator Welch. All right. Mr. Zuckerberg, my understanding
is there have been layoffs in that area as well. There are
added jobs at Twitter, but at Meta, have there been reductions
in that area?
Mr. Zuckerberg. There have been across the board, not
really focused on that area. I think our investment is
relatively consistent over the last couple of years. We
invested almost $5 billion in this work last year, and I think
this year, we'll be on the same order of magnitude.
Senator Welch. All right. And another question that's come
up is when, to the horror of a user of any of your platforms,
somebody has an image on there that's very compromising, often
of a sexual nature. Is there any reason in the world why a
person who wants that taken down can't have a very simple,
same-day response to have it taken down? I'll start with
Twitter.
Ms. Yaccarino. I'm sorry, Senator. I was taking notes.
Could you repeat the question?
Senator Welch. Well, there are a lot of examples of a young
person finding out about an image that is of them, and really
compromises them, and actually can create suicidal thoughts.
And they want to call up, or they want to send an email, and
say, take it down. I mean, why is it not possible for that to
be responded to immediately?
Ms. Yaccarino. Well, we all strive to take down any type of
violative content or disturbing content immediately. At X, we
have increased our capabilities with a two-step reporting
process.
Senator Welch. Shouldn't it just be standard? If I'm a
parent or I'm a kid and I want this down, shouldn't there be
methods in place where it comes down? You can see what the
image is.
Ms. Yaccarino. And an ecosystem-wide standard would improve
and actually enhance the experience for users at all our
platforms.
Mr. Zuckerberg. There actually is an organization that I
think a number of the companies up here are a part of called
Take It Down. It's some technology that we and a few others
built, but basically----
Senator Welch. So you-all are in favor of that because----
Mr. Zuckerberg. Oh yes, this already exists.
Senator Welch [continuing]. Then it's going to give some
peace of mind to people. All right? It really, really matters.
I don't have that much time. So we've talked about the
legislation, and Senator Whitehouse had asked you to get back
with your position on Section 230, which I'll go to in a
minute. But I would welcome each of you responding as to your
company's position on the bills that are under consideration in
this hearing. All right? I'm just asking you to do that.
Third, the courts. This big question of Section 230. And
today, I'm pretty inspired by the presence of the parents who
have turned their extraordinary grief into action and hope that
other parents may not have to suffer what for them is a
devastating--for everyone, a devastating loss.
Senator Whitehouse asked you all to get back very
concretely about Section 230 and your position on that. But
it's an astonishing benefit that your industry has that no
other industry has. You just don't have to worry about being
held accountable in court if you're negligent. So you've got
some explaining to do, and I'm just reinforcing Senator
Whitehouse's request that you get back specifically about that.
And then finally, I want to ask about this notion. It's
this idea of a Federal agency that's resourced and whose job is
to deal with public interest matters that are really affected
by Big Tech. It's extraordinary what has happened in
our economy with technology and your companies represent
innovation and success.
But just as when the railroads were ascendant and were in
charge, ripping off farmers because of practices they were able
to get away with; just as when Wall Street was flying high but
there were no blue sky laws regulating it, we now
have a whole new world in the economy. And Mr. Zuckerberg, I
remember you testifying in the Energy and Commerce Committee,
and I asked you your position on the concept of a Federal
regulatory agency. My recollection is that you were positive
about that. Is that still the case?
Mr. Zuckerberg. I think it could be a reasonable solution.
There are obviously pros and cons to doing that versus through
the normal--the current structure of having different
regulatory agencies focused on specific issues. But a lot of
these things trade off against each other--like one of the
topics that we talked about today is encryption, and that's
obviously really important for privacy and security.
Senator Welch. Can we just go down the line? I'm at the
end, but thank you. Ms. Yaccarino.
Ms. Yaccarino. Senator, I think the industry initiative to
keep those conversations going would be something X would be
very, very proactive about. If you think about our support of
the REPORT Act, the SHIELD Act, the STOP CSAM Act, and the
Project Safe Childhood Act, I think our intentions to
participate and lead here are clear.
Senator Welch. Mr. Chew.
Mr. Chew. Senator, we support national privacy legislation,
for example. So that sounds like a good idea. We just need to
understand what it means.
Senator Welch. All right. Mr. Spiegel.
Mr. Spiegel. Senator, we'll continue to work with your
team, and we'd certainly be open to exploring the right
regulatory body for big technology.
Senator Welch. But the idea of a regulatory body is
something that you can see has merit?
Mr. Spiegel. Yes, Senator.
Senator Welch. And Mr. Citron.
Mr. Citron. Yes. We're very open to working with you, and
our peers, and anybody, on helping make the internet a safer
place. You know, I think you mentioned this is not a one
platform problem, right? So we do look to collaborate with
other companies, and with nonprofits, and the Government.
Senator Welch. Okay. I thank you all. Mr. Chairman, I yield
back.
Chair Durbin. Thank you, Senator Welch. Well, we're going
to conclude this hearing, and thank you all for coming today.
You probably have your scorecard out there. You've met at
least 20 Members of this Committee and have your own
impressions of their questioning, their approach, and the like.
But the one thing I want to make clear, as Chairman of this
Committee for the last 3 years, is that this was an
extraordinary vote on an extraordinary issue. A year ago, we
passed five bills unanimously in this Committee. You heard all
the Senators; every spot on the political spectrum was covered.
Every single Senator voted in favor of the five pieces of
legislation we've discussed today. That ought to send everyone
who follows Capitol Hill and Washington a pretty stark message.
We get it, and we live it as parents and grandparents. We
know what our daughters, and sons, and others are going
through. They cannot cope. They cannot handle this issue on
their own. They're counting on us as much as they're counting
on the industry to do the responsible thing.
And some of you will leave with impressions of our
witnesses and the companies they represent--that's your right
as an American citizen--but you ought to also leave with the
determination to keep the spotlight on us to do something. Not
just to hold a hearing, bring out a good, strong crowd of
supporters for change, but to get something done. No excuses,
no excuses. We've got to bring this to a vote.
What I found in my time in the House and in the Senate is
that that's the moment of reckoning. Speeches, press releases,
and the like notwithstanding, the moment of reckoning is when
we call a vote on these measures. It's time
to do that. I don't believe there's ever been a moment in
America's wonderful history when a business or industry has
stepped up and said, ``Regulate us. Put some legal limits on
us.''
Businesses exist by and large to be profitable. And I think
that we have got to get behind that and ask: profitability at
what cost? Senator Kennedy, our Republican colleague, asked,
``Is our technology greater than our humanity?'' I think that
is a fundamental question. What I would add to it is: are our
politics greater than our technology? We're going to find out.
I want to thank a few people before we close up here. I've got
several staffers who've worked so hard on this. Alexandra
Gelber. Thank you very much, Alexandra. Jeff Hanson, Scott
Robinson.
[Applause.]
Chair Durbin. Last point I'll make, Mr. Zuckerberg, is just
a little advice to you. I think your opening statement on
mental health needs to be explained because I don't think it
makes any sense.
There isn't a parent in this room who's had a child that's
gone through an emotional experience like this that wouldn't
tell you and me, ``They changed right in front of my eyes. They
changed. They holed themselves up in their room. They no longer
reached out to their friends. They lost all interest in
school.''
These are mental health consequences that I think come with
the abuse of this right to have access to this type of
technology. So I will just--I see my colleague is--do you want
to say a word?
Senator Graham. I think it was a good hearing. I hope
something positive comes from it. Thank you all for coming.
Chair Durbin. The hearing record is going to remain open
for a week for statements, and questions may be submitted by
Senators by 5 p.m. on Wednesday. Once again, thanks to the
witnesses for coming. The hearing stands adjourned.
[Whereupon, at 1:49 p.m., the hearing was adjourned.]
[Additional material submitted for the record follows.]
A P P E N D I X
Submitted by Chair Durbin:
ADL--Fighting Hate for Good...................................... 741
AEHT--Alliance to End Human Trafficking.......................... 744
CAIDP--Center for AI and Digital Policy.......................... 769
CDT--Center for Democracy & Technology........................... 776
End OSEAC--Coalition to Protect Kids Online...................... 780
End OSEAC--Coalition to Protect Kids Online--
Testimony by Christine Almadjian................................ 784
Global Survivor Network--GSN..................................... 788
Home Office--Homeland Security................................... 791
Letter to Congress-Abigail (Survivor)............................ 792
Letter to Congress-D.J. (Survivor)............................... 793
Letter to Congress-Elle (Survivor)............................... 796
Letter to Congress-Leah (Survivor)............................... 799
Letter to Congress-Lexie (Survivor).............................. 800
Letter to Congress-Millie (Survivor)............................. 802
Letter to Congress-James (Parent)................................ 803
Letter to Congress-Julia (Parent)................................ 804
Letter to Congress-Zack (Loving Brother)......................... 805
Letter to Congress (Redact)...................................... 807
ParentsTogether--Hart Research Associates--Polling Data Research. 811
Submission for the Record--Sydnie Collins........................ 846
Submission for the Record--Alix Fraser........................... 849
Submission for the Record--Arielle Geismar....................... 851
Submission for the Record--Trisha Prabhu......................... 853
Submission for the Record--Mary Rodee............................ 855
Submission for the Record--Uldouz Wallace........................ 857
Survivor Parents................................................. 858
UK Approach on E2EE.............................................. 862
UK Online Safety Act............................................. 864
Dangerous by Design--Council for Responsible Social Media--CRSM
https://www.govinfo.gov/content/pkg/CHRG-118shrg57444/CHRG-118shrg57444-add1.PDF
Submitted by Ranking Member Graham:
UK Approach on E2EE.............................................. 862
UK Online Safety Act............................................. 864
Submitted by Senator Klobuchar:
Survivor Parents................................................. 858
Submitted by Senator Blumenthal:
Count on Mothers--KOSA--Report Findings.......................... 874
Docket 518--Zuckerberg--Motion to Dismiss........................ 884
Docket 518-1--Zuckerberg--Motion to Dismiss--Appendix A.......... 904
Docket 538--Zuckerberg--Opposition--Motion to Dismiss............ 906
Docket 555--Zuckerberg--Reply in support of--Motion to Dismiss... 932
Survivor Parents................................................. 858
Submitted by Senator Blackburn:
Social Media Victims Law Center--SMVLC--Advertising Directed to
Underage Kids................................................... 953
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
[all]