[Senate Hearing 117-843]
[From the U.S. Government Publishing Office]




                                                        S. Hrg. 117-843

                   PROTECTING KIDS ONLINE: SNAPCHAT, 
                          TIKTOK, AND YOUTUBE

=======================================================================

                                HEARING

                               before the

                   SUBCOMMITTEE ON CONSUMER PROTECTION,
                    PRODUCT SAFETY, AND DATA SECURITY

                                 of the

                         COMMITTEE ON COMMERCE,
                      SCIENCE, AND TRANSPORTATION
                          UNITED STATES SENATE

                    ONE HUNDRED SEVENTEENTH CONGRESS

                             FIRST SESSION

                               __________


                            OCTOBER 26, 2021

                               __________

    Printed for the use of the Committee on Commerce, Science, and 
                             Transportation






                 [GRAPHIC NOT AVAILABLE IN TIFF FORMAT]






                Available online: http://www.govinfo.gov



                               ______
                                 

                 U.S. GOVERNMENT PUBLISHING OFFICE

54-901 PDF                WASHINGTON : 2024











       SENATE COMMITTEE ON COMMERCE, SCIENCE, AND TRANSPORTATION

                    ONE HUNDRED SEVENTEENTH CONGRESS

                             FIRST SESSION

                   MARIA CANTWELL, Washington, Chair

AMY KLOBUCHAR, Minnesota             ROGER WICKER, Mississippi, Ranking
RICHARD BLUMENTHAL, Connecticut      JOHN THUNE, South Dakota
BRIAN SCHATZ, Hawaii                 ROY BLUNT, Missouri
EDWARD MARKEY, Massachusetts         TED CRUZ, Texas
GARY PETERS, Michigan                DEB FISCHER, Nebraska
TAMMY BALDWIN, Wisconsin             JERRY MORAN, Kansas
TAMMY DUCKWORTH, Illinois            DAN SULLIVAN, Alaska
JON TESTER, Montana                  MARSHA BLACKBURN, Tennessee
KYRSTEN SINEMA, Arizona              TODD YOUNG, Indiana
JACKY ROSEN, Nevada                  MIKE LEE, Utah
BEN RAY LUJAN, New Mexico            RON JOHNSON, Wisconsin
JOHN HICKENLOOPER, Colorado          SHELLEY MOORE CAPITO, West 
RAPHAEL WARNOCK, Georgia                 Virginia
                                     RICK SCOTT, Florida
                                     CYNTHIA LUMMIS, Wyoming

                 Melissa Porter, Deputy Staff Director
       George Greenwell, Policy Coordinator and Security Manager
                 John Keast, Republican Staff Director
            Crystal Tully, Republican Deputy Staff Director
                      Steven Wall, General Counsel

                                 ------                                

         SUBCOMMITTEE ON CONSUMER PROTECTION, PRODUCT SAFETY, 
                           AND DATA SECURITY

RICHARD BLUMENTHAL, Connecticut,     MARSHA BLACKBURN, Tennessee, 
    Chair                                Ranking
AMY KLOBUCHAR, Minnesota             JOHN THUNE, South Dakota
BRIAN SCHATZ, Hawaii                 ROY BLUNT, Missouri
EDWARD MARKEY, Massachusetts         JERRY MORAN, Kansas
TAMMY BALDWIN, Wisconsin             MIKE LEE, Utah
BEN RAY LUJAN, New Mexico            TODD YOUNG, Indiana







                            C O N T E N T S

                              ----------                              
                                                                   Page
Hearing held on October 26, 2021.................................     1
Statement of Senator Blumenthal..................................     1
Statement of Senator Blackburn...................................     4
Statement of Senator Klobuchar...................................    27
Statement of Senator Markey......................................    31
Statement of Senator Thune.......................................    35
Statement of Senator Baldwin.....................................    37
Statement of Senator Cantwell....................................    39
Statement of Senator Lee.........................................    41
Statement of Senator Lujan.......................................    44
Statement of Senator Cruz........................................    48
Statement of Senator Moran.......................................    51
Statement of Senator Sullivan....................................    55
Statement of Senator Lummis......................................    57

                               Witnesses

Jennifer Stout, Vice President of Global Public Policy, Snap Inc.     5
    Prepared statement...........................................     7
Michael Beckerman, Vice President and Head of Public Policy, 
  Americas, TikTok...............................................    10
    Prepared statement...........................................    11
Leslie Miller, Vice President, Government Affairs and Public 
  Policy, YouTube................................................    17
    Prepared statement...........................................    18

                                Appendix

Response to written questions submitted to Jennifer Stout by:
    Hon. Richard Blumenthal......................................    75
    Hon. Amy Klobuchar...........................................    77
Response to written questions submitted to Michael Beckerman by:
    Hon. Richard Blumenthal......................................   212
    Hon. Amy Klobuchar...........................................   215
    Hon. Marsha Blackburn........................................   218
    Hon. Rick Scott..............................................   219
    Hon. Cynthia Lummis..........................................   220
    Hon. John Thune..............................................   221
    Hon. Ted Cruz................................................   221
    Hon. Mike Lee................................................   225
Response to written questions submitted to Leslie Miller by:
    Hon. Richard Blumenthal......................................   225
    Hon. Amy Klobuchar...........................................   231
    Hon. Marsha Blackburn........................................   235
    Hon. Rick Scott..............................................   235
    Hon. Cynthia Lummis..........................................   236
    Hon. John Thune..............................................   237
    Hon. Mike Lee................................................   239







 
                   PROTECTING KIDS ONLINE: SNAPCHAT, 
                          TIKTOK, AND YOUTUBE

                              ----------                              


                       TUESDAY, OCTOBER 26, 2021

                               U.S. Senate,
      Subcommittee on Consumer Protection, Product 
                         Safety, and Data Security,
        Committee on Commerce, Science, and Transportation,
                                                    Washington, DC.
    The Committee met, pursuant to notice, at 10:01 a.m., in 
room SR-253, Russell Senate Office Building, Hon. Richard 
Blumenthal, Chairman of the Subommittee, presiding.
    Present: Senators Blumenthal [presiding], Cantwell, 
Klobuchar, Markey, Baldwin, Lujan, Blackburn, Thune, Moran, 
Lee, Cruz, Sullivan, and Lummis.

         OPENING STATEMENT OF HON. RICHARD BLUMENTHAL, 
                 U.S. SENATOR FROM CONNECTICUT

    Senator Blumenthal. Welcome to this hearing on protecting 
kids on social media. I want to thank the Ranking Member, 
Senator Blackburn, who has been a very, very close partner in 
this work, as well as Chairman Cantwell and Ranking Member 
Wicker for their support. We are joined by Senator Thune and 
Senator Amy Klobuchar and Senator Ed Markey. They have all been 
extraordinary leaders in this effort. And today, I should note, 
is the first time that TikTok and Snap have appeared before 
Congress. I appreciate you and YouTube for your testimony this 
morning. Means a lot.
    Our hearing with the Facebook whistleblower, Frances 
Haugen, was a searing indictment, along with her documents of a 
powerful, gigantic corporation that puts profits ahead of 
people, especially our children. There has been a definite and 
deafening drumbeat of continuing disclosures about Facebook. 
They have deepened America's concerns and outrage and have led 
to increasing calls for accountability, and there will be 
accountability.
    This time is different. Accountability to parents and the 
public, accountability to Congress, accountability to investors 
and shareholders, and accountability to the Securities and 
Exchange Commission and other Federal agencies, because there 
is ample credible evidence to start an investigation there. But 
today we are concerned about continuing to educate the American 
public and ourselves about how we can face this crisis. What we 
learned from Ms. Haugen's disclosures and reporting since then 
is absolutely repugnant and abhorrent about Instagram's 
algorithms creating a perfect storm, in the words of one of 
Facebook's own researchers.
    As that person said, it exacerbates downward spirals, 
harmful to teens, it fuels hate and violence, it prioritizes 
profits over the people that it hurts. In effect, the 
algorithms push emotional and provocative content, toxic 
content that amplifies depression, anger, hate, anxiety because 
those emotions attract and hook kids and others to their 
platforms. In effect, the more content and more extreme 
versions of it are pushed to children who expressed an interest 
in online bullying, eating disorders, self-harm, even suicide. 
And that is why we now have a drumbeat of demands for 
accountability, along with the drumbeats of disclosure. But we 
are hearing the same stories and reports of the same harms 
about the tech platforms that are represented here today.
    I have heard from countless parents and medical 
professionals in Connecticut and elsewhere around the country 
about the same phenomenon on Snapchat, YouTube, and TikTok. In 
effect, that business model is the same, more eyeballs means 
more dollars. Everything that you do is to add users, 
especially kids, and keep them on your apps for longer. I 
understand from your testimony that your defense is, we are not 
Facebook, we are different, and we are different from each 
other. Being different from Facebook is not a defense. That bar 
is in the gutter. It is not a defense to say that you are 
different. What we want is not a race to the bottom, but really 
a race to the top. So we will want to know from you what 
specific steps you are taking that protect children, even if it 
means foregoing profits. We are going to want to know what 
research you have done.
    Similar to the studies and data that have been disclosed 
about Facebook. And we are going to want to know whether you 
will support real reforms, not just the tweaks and the minor 
changes that you suggested and recounted in your testimony. The 
picture that we have seen from an endless stream of videos that 
is automatically selected by sophisticated algorithms shows 
that you too drive and strive to find something that teens will 
like and then drive more of it to them.
    If you learn that a teen feels insecure about his or her 
body, it is a recipe for disaster because you start out on 
dieting tips, but the algorithm will raise the temperature, it 
will flood that individual with more and more extreme messages, 
and after a while, all of the videos are about eating 
disorders. It is the same rabbit hole, driving kids down those 
rabbit holes created by algorithms that leads to dark places 
and encourages more destructive hate and violence.
    We have done some listening. I heard from Nora in Westport, 
who allowed her 11-year-old daughter, Avery, on TikTok because 
she thought it was just girls dancing. Avery wanted to exercise 
more with the shutdown of school and sports so like most people 
she went online. Nora told me about the rabbit hole that TikTok 
and YouTube's algorithms pulled her daughter into. She began to 
see ever more extreme videos about weight loss. Avery started 
exercising compulsively and ate only one meal a day. Her body 
weight dropped dangerously low, and she was diagnosed with 
anorexia. Avery is now luckily in treatment, but the financial 
cost of care is an extreme burden, and her education has 
suffered. We heard from parents who had the same experience.
    So we, not only listened, but we checked ourselves. On 
YouTube, my office created an account as a teenager. Like 
Avery, we watched a few videos about extreme dieting and eating 
disorders. They were easy to find. YouTube's recommendation 
algorithm began to promote extreme dieting and eating disorder 
videos each time we opened the app. These were often videos 
about teens starving themselves, as you can see from this 
poster. The red is eating disorder related content, the green 
is all the other videos. One is before and the other is after 
the algorithm kicked in. You can see the difference. We also 
received these recommendations each time we watched other 
videos. It is mostly eating disorder content. There was no way 
out of this rabbit hole.
    Another parent in Connecticut wrote to me about how their 
son was fed a constant stream of videos on TikTok related to 
disordered eating and calorie counting after looking up 
athletic training. As scientific research has shown, eating 
disorders and body comparison also significantly affect young 
men on social media. Young men often feel compelled to bulk up 
to look a certain way. Again, I heard about this pressure all 
too often. We created an account on TikTok. Troublingly, it was 
easy searching TikTok, to go from men's fitness to steroid use. 
It took us only 1 minute, 1 minute to find TikTok accounts 
openly promoting and selling illegal steroids.
    We all know the dangers of steroids, and they are illegal. 
All of this research and facts and disclosures send a message 
to America's parents, you cannot trust big tech with your kids. 
Parents of America cannot trust these apps with their children. 
And big tech cannot say to parents, you must be the 
gatekeepers, you must be the social media copilots, you must be 
the app police because parents should not have to bear that 
burden alone. We need stronger rules to protect children 
online, real transparency, real accountability. I want a market 
where the competition is to protect children, not to exploit 
them, not a race to the bottom, but a competition for the top.
    We have said that this moment is for big tech, a big 
tobacco moment. And I think there is a lot of truth to that 
contention because it is a moment of reckoning. The fact is 
that like big tobacco, big tech has lured teens despite knowing 
its products can be harmful. It has sought to associate itself 
with celebrities, fashion, and beauty, everything that appeals 
to a young audience. And like big tobacco, Facebook hid from 
parents and the public the substantial evidence that Instagram 
could have a negative effect on teen health. But the products 
are different. Big tech is not irredeemably bad like big 
tobacco. Big tobacco and the tobacco products, when used by the 
customer as the way the manufacturer intended, actually kill 
the customer.
    As Ms. Haugen said, our goal is not to burn Facebook to the 
ground. It is to bring out the best to improve it, to impose 
accountability. As she said, ``we can have social media we 
enjoy that connects us without tearing apart our democracy, 
putting our children in danger, and sowing ethnic violence 
around the world. We can do better.'' And I agree. Thank you, 
and we will turn now to the Ranking Member.

              STATEMENT OF HON. MARSHA BLACKBURN 
                  U.S. SENATOR FROM TENNESSEE

    Senator Blackburn. Thank you, Mr. Chairman. And thank you 
to our witnesses for being here today. We do appreciate this. 
And Mr. Chairman, I thank your staff for the good work that 
they have done working to facilitate these hearings on holding 
big tech accountable. We appreciate that. And today's 
conversation is something that is much needed, and it is long 
overdue.
    For too long, we have allowed platforms to promote and 
glorify dangerous content for its kid and teen users. In the 
weeks leading up to this hearing, I have heard from parents, 
from teachers, from mental health professionals who are all 
wondering the same thing, how long are we going to let this 
continue? And what will it take for platforms to finally crack 
down on the viral challenges, the illicit drugs, the eating 
disorder content, and the child sexual abuse material? We find 
this on your platforms, and teachers and parents and mental 
health physicians cannot figure out why you allow this to 
happen.
    It seems like every day that I hear stories about kids and 
teens who are suffering after interacting with TikTok, YouTube, 
and Snapchat. Kids as young as nine have died doing viral 
challenges on TikTok. And we have seen teen girls lured into 
inappropriate sexual relationships with predators on Snapchat. 
You are parents. How can you allow this? How can you allow 
this? I have learned about kids and teens who commit suicide 
due to bullying that they have suffered on these sites and the 
platforms' refusal to work with law enforcement and families to 
stop the harassment when asked. If it were your child, what 
would you do to protect your child? Does it matter to you?
    My staff has viewed abusive content featuring minors and 
videos of people slitting their wrists on YouTube. It is there. 
Yet all the while, kids and teens are flocking to these sites 
in increasing numbers in the platforms love it as they know 
that youth are a captive audience, one which will continue 
consuming the content that is fed to them through these 
algorithms, even if it puts them in danger. They are curious. 
They get pulled down the rabbit hole. They continue to watch. 
And these platforms are getting more and more data about our 
children. But do we know what they are doing with that data?
    In the case of Facebook, we learned that they are using it 
to sell their products to younger and younger children, those 
who cannot legally use their services. But these platforms, you 
all know you have children on these platforms that are too 
young to be on these platforms and you allow it to continue 
because it is money--it is money in the bank. It is money in 
your paycheck. And obviously money trumps everything. And with 
some of our witnesses today, we have real reason to question 
how they are collecting and using the data that they get from 
American children and teens.
    I have made no secret about my concerns that TikTok, which 
is owned by Beijing owned ByteDance, is paving the way for the 
Chinese Government to gain unfettered access to our children 
and teens. And TikTok, despite vague assurances that say, and I 
am quoting, ``store data outside of China'' has not alleviated 
my concerns in the slightest. In fact, earlier this year, they 
changed their privacy policy to allow themselves to collect 
even more data on Americans. Now they want your face prints and 
voice prints in addition to your geolocation data and your 
keystroke patterns and rhythms.
    They also collect audio that comes from devices connected 
to your smartphone, like smart speakers. They collect face and 
body attributes from videos, as well as the objects and the 
scenery that appear in those videos. This makes sense to some 
degree to create videos. But given China's propensity to 
surveil its own citizens, why should we trust that TikTok 
through ByteDance isn't doing the same to us?
    Most Americans have no idea this is happening. And while 
they hope on the face of things to reassure us by creating U.S. 
based offices for their high priced lobbyists and marketing 
personnel, it just is not enough. We see through it all, as we 
will get into today. TikTok's own privacy policies give them an 
out to share data with ByteDance and the Chinese Communist 
Party. Yet most Americans have absolutely no idea if the 
Chinese Communist Party is getting their information.
    The time has come where we must focus on what Congress can 
do to secure American consumers' personal data. As a mother and 
a grandmother, I know this is doubly important when it comes to 
our children. As this hearing will show, we can't afford to 
wait on that. So thank you to the witnesses. We look forward to 
your cooperation. Thank you, Mr. Chairman.
    Senator Blumenthal. Thanks, Senator Blackburn. We have with 
us this morning Ms. Jennifer Stout, Vice President of Global 
Public Policy at Snapchat. Ms. Stout is the Vice President of 
Global Policy, and prior to Snapchat, she spent most of her 
career in Government working for then Senator Joe Biden and at 
the United States Department of State. Mr. Michael Beckerman is 
Vice President and Head of Public Policy at America's TikTok.
    He joined TikTok in February 2020 and leads their 
Government relations. He was previously the Founding President 
and CEO of the Internet Association. We are joined by Ms. 
Leslie Miller, remotely, Vice President, Government Affairs and 
Public Policy at YouTube. Ms. Miller leads YouTube's public 
policy team. Previously, she served as Acting Head of Global 
Policy for Google. Why don't we begin with your testimony, Ms. 
Stout? Thank you.

 STATEMENT OF JENNIFER STOUT, VICE PRESIDENT OF GLOBAL PUBLIC 
                       POLICY, SNAP INC.

    Ms. Stout. Thank you, Mr. Chairman. Chairman Blumenthal, 
Ranking Member Blackburn, and members of the Subcommittee, 
thank you so much for the opportunity to be here today. My name 
is Jennifer Stout, and I am the Vice President of Global Public 
Policy at Snap. I have been in this role for nearly 5 years 
after spending almost two decades in public service, more than 
half in Congress.
    I have tremendous respect for this institution and the work 
that you are doing to ensure that young people are having safe 
and healthy online experiences. To understand Snap's approach 
to protecting young people on our platform, it is helpful to 
start at the beginning. Snapchat's founders were part of the 
first generation to grow up with social media. Like many of 
their peers, they saw that while social media was capable of 
making a positive impact, it also had certain features that 
troubled them.
    These platforms encouraged people to broadcast their 
thoughts permanently. Young people were constantly measuring 
themselves by likes, by comments, trying to present a perfect 
version of themselves because of social pressures and judgment. 
Social media also evolved to feature an endless feed of 
unvetted content, exposing individuals to a flood of viral, 
misleading, and harmful information.
    Snapchat is different. Snapchat was built as an antidote to 
social media. From the start, there were three key ways we 
prioritized privacy and safety. First, we decided to have 
Snapchat opened to a camera instead of a feed of content. This 
created a blank canvas for friends to visually communicate with 
each other in a way that was more immersive than text. Second, 
we embraced strong privacy principles and the idea of 
ephemerality, making images delete by default. Social media may 
have normalized having a permanent record of conversations 
online, but in real life, friends don't break out their tape 
recorder to document every conversation.
    Third, we focused on connecting people who are already 
friends in real life by requiring that both Snapchatters opt in 
to being friends in order to communicate because in real life, 
friendships are mutual. We have worked hard to keep evolving 
responsibly. Understanding the potential negative effects of 
social media, we made proactive choices to ensure that all of 
our future products reflected those early values. We were 
influenced by existing regulatory frameworks that govern 
broadcast and telecommunications when developing the parts of 
our app where users had the potential to reach a large 
audience.
    Discover, which is our closed content platform and features 
content from professional media publishers and verified users, 
and Spotlight, where creators can submit creative and 
entertaining videos to share, are both structured in a way that 
does not allow unvetted content to get reach. Our design 
protects our audience and makes us different. When it comes to 
young people, we have made intentional choices to apply 
additional protections to keep them safe. We have adopted 
responsible design principles and rigorous processes that 
consider the privacy and safety of new features right from the 
beginning. We take into account the unique sensitivities of 
young people. We intentionally make it harder for strangers to 
find minors by not allowing public profiles for users under 18.
    We have long deployed age dating tools to prevent minors 
from viewing age regulated content and ads. We make no effort 
and have no plans to market to young children. Individuals 
under the age of 13 are not permitted to create Snapchat 
accounts, and if we find them, we remove them. Additionally, we 
are developing new tools that will give parents more oversight 
over how their teens are using Snapchat. Protecting the well-
being of our community is something we approach with both 
humility and determination. Over 500 million people around the 
world use Snapchat and 95 percent of our community says that 
Snapchat makes them feel happy because it connects them to 
their friends.
    We have a moral responsibility to take into account the 
best interests of our users in everything we do, and we 
understand that there is more work to be done. As we look to 
the future, we believe that regulation is necessary. But given 
the different speeds at which technology develops and which the 
rate at which regulation can be implemented, regulation alone 
can't get the job done. Technology companies must take 
responsibility to protect the communities they serve. If they 
don't, Government must act to hold them accountable.
    We fully support the Subcommittee's approach and efforts to 
investigate these issues, and we welcome a collaborative 
approach to problem-solving that keeps our young people safe 
online. Thank you again for the opportunity to appear today, 
and I look forward to answering your questions.
    [The prepared statement of Ms. Stout follows:]

                 Prepared Statement of Jennifer Stout, 
            Vice President of Global Public Policy, Snap Inc
Introduction
    Chairman Blumenthal, Ranking Member Blackburn, and members of the 
Subcommittee, thank you for the opportunity to appear before you today. 
My name is Jennifer Stout and I serve as the Vice President of Global 
Public Policy at Snap Inc., the parent company of Snapchat. It's an 
honor and privilege to be back in the Senate 23 years after first 
getting my start in public service as a Senate staffer, this time in a 
much different capacity--to speak about Snap's approach to privacy and 
safety, especially as it relates to our youngest community members. I 
have been in this role for nearly five years, after spending almost two 
decades in public service, more than half of which was spent in 
Congress. I have tremendous respect for this institution and the work 
you and your staff are doing to make sure that tech platforms ensure 
that our youth are having safe and healthy online experiences.
    To understand Snap's approach to protecting young people on our 
platform, it's helpful to start at the beginning. Snapchat's founders 
were part of the first generation to grow up with social media. Like 
many of their peers, they saw that while social media was capable of 
making a positive impact, it also had certain features that negatively 
impacted their friendships. These platforms encouraged people to 
publicly broadcast their thoughts and feelings, permanently. Our 
founders saw how people were constantly measuring themselves against 
others through ``likes'' and comments, trying to present a version of 
themselves through perfectly curated images, and carefully scripting 
their content because of social pressure. Social media also evolved to 
feature an endless feed of unvetted content, exposing people to a flood 
of viral, misleading, and harmful content.
    Snapchat was built as an antidote to social media. In fact, we 
describe ourselves as a camera company. Snapchat's architecture was 
intentionally designed to empower people to express a full range of 
experiences and emotions with their real friends, not just the pretty 
and perfect moments. In the formative years of our company, there were 
three major ways our team pioneered new inventions to prioritize online 
privacy and safety.
    First, we decided to have Snapchat open to a camera instead of a 
feed of content. This created a blank canvas for friends to visually 
communicate with each other in a way that is more immersive and 
creative than sending text messages.
    Second, we embraced strong privacy principles, data minimization, 
and the idea of ephemerality, making images delete-by-default. This 
allowed people to genuinely express themselves in the same way they 
would if they were just hanging out at a park with their friends. 
Social media may have normalized having a permanent record of 
conversations online, but in real life, friends don't break out their 
tape recorder to document every single conversation for public 
consumption or permanent retention.
    Third, we focused on connecting people who were already friends in 
real life by requiring that, by default, both Snapchatters opt-in to 
being friends in order to communicate. Because in real life, 
friendships are mutual. It's not one person following the other, or 
random strangers entering our lives without permission or invitation.
A Responsible Evolution
    Since those early days, we have worked to continue evolving 
responsibly. Understanding the potential negative effects of social 
media, we made proactive choices to ensure that all of our future 
products reflected those early values.
    We didn't need to reinvent the wheel to do that. Our team was able 
to learn from history when confronting the challenges posed by new 
technology. As Snapchat evolved over time, we were influenced by 
existing regulatory frameworks that govern broadcast and 
telecommunications when developing the parts of our app where users 
could share content that has the potential to reach a large audience. 
For instance, when you talk to your friends on the phone, you have a 
high expectation of privacy, whereas if you are a public broadcaster 
with the potential to influence the minds and opinions of many, you are 
subject to different standards and regulatory requirements.
    That dichotomy helped us to develop rules for the more public 
portions of Snapchat that are inspired by broadcast regulations. These 
rules protect our audience and differentiate us from other platforms. 
For example, Discover, our closed content platform where Snapchatters 
get their news and entertainment, exclusively features content from 
either professional media publishers who partner with us, or from 
artists, creators, and athletes who we choose to work with. All of 
these content providers have to abide by our Community Guidelines, 
which apply to all of the content on our platform. But Discover 
publisher partners also must abide by our Publisher Guidelines, which 
include requiring that content is fact-checked or accurate and age-
gated when appropriate. And for individual creators featured in 
Discover, our human moderation teams review their Stories before we 
allow them to be promoted on the platform. While we use algorithms to 
feature content based on individual interests, they are applied to a 
limited and vetted pool of content, which is a different approach from 
other platforms.
    On Spotlight, where creators can submit creative and entertaining 
videos to share with the broader Snapchat community, all content is 
first reviewed automatically by artificial intelligence before gaining 
any distribution, and then human-reviewed and moderated before it can 
be viewed by more than 25 people. This is done to ensure that we reduce 
the risk of spreading misinformation, hate speech, or other potentially 
harmful content.
    We don't always get it right the first time, which is why we 
redesign parts of Snapchat when they aren't living up to our values. 
That's what happened in 2017 when we discovered that one of our 
products, Stories, was making Snapchatters feel like they had to 
compete with celebrities and influencers for attention because content 
from celebrities and friends were combined in the same user interface. 
As a result of that observation, we decided to separate ``social'' 
content created by friends from ``media'' content created by 
celebrities to help reduce social comparison on our platform. This 
redesign negatively impacted our user growth in the short-term, but it 
was the right thing to do for our community.
Protecting Young People on Snapchat
    Our mission--to empower people to express themselves, live in the 
moment, learn about the world, and have fun together--informed 
Snapchat's fundamental architecture. Adhering to this mission has 
enabled us to create a platform that reflects human nature and fosters 
real friendships. It continues to influence our design processes and 
principles, our policies and practices, and the resources and tools we 
provide to our community. And it undergirds our constant efforts to 
improve how we address the inherent risks and challenges associated 
with serving a large online community.
    A huge part of living up to our mission has been building and 
maintaining trust with our community and partners, as well as parents, 
lawmakers, and safety experts. Those relationships have been built 
through the deliberate, consistent decisions we have made to put 
privacy and safety at the heart of our product design process.
    For example, we have adopted responsible design principles that 
consider the privacy and safety of new products and features right from 
the beginning of the development process. And we've made those 
principles come to life through rigorous processes. Every new feature 
in Snapchat goes through a defined privacy and safety review, conducted 
by teams that span Snap--including designers, data scientists, 
engineers, product managers, product counsel, policy leads, and privacy 
engineers--long before it sees the light of day.
    While more than 80 percent of our community in the United States is 
18 or older, we have spent a tremendous amount of time and resources to 
protect teenagers. We've made thoughtful and intentional choices to 
apply additional privacy and safety policies and design principles to 
help keep teenagers safe. That includes:

   Taking into account the unique sensitivities and 
        considerations of minors when we design products. That's why we 
        intentionally make it harder for strangers to find minors by 
        banning public profiles for people under 18 and are rolling out 
        a feature to limit the discoverability of minors in Quick Add 
        (friend suggestions). And why we have long deployed age-gating 
        tools to prevent minors from viewing age-regulated content and 
        ads.

   Empowering Snapchatters by providing consistent and easy-to-
        use controls like turning location sharing off by default and 
        offering streamlined in-app reporting for users to report 
        concerning content or behaviors to our Trust and Safety teams. 
        Once reported, most content is actioned in under 2 hours to 
        minimize the potential for harm.

   Working to develop tools that will give parents more 
        oversight without sacrificing privacy--including plans to 
        provide parents the ability to view their teen's friends, 
        manage their privacy and location settings, and see who they're 
        talking to.

   Investing in educational programs and initiatives that 
        support the safety and mental health of our community--like 
        Friend Check Up and Here for You. Friend Check Up prompts 
        Snapchatters to review who they are friends with and make sure 
        the list is made up of people they know and still want to be 
        connected with. Here for You provides support to users who may 
        be experiencing mental health or emotional crises by providing 
        tools and resources from experts.

   Preventing underage use. We make no effort--and have no 
        plans--to market to children, and individuals under the age of 
        13 are not permitted to create Snapchat accounts. When 
        registering for an account, individuals are required to provide 
        their date of birth, and the registration process fails if a 
        user inputs an age under the age of 13. We have also 
        implemented a new safeguard that prevents Snapchat users 
        between 13-17 with existing accounts from updating their 
        birthday to an age of 18 or above. Specifically, if a minor 
        attempts to change their year of birth to an age over 18, we 
        will prevent the change as a way to ensure that users are not 
        accessing age-inappropriate content within Snapchat.
Conclusion and Looking Ahead
    We're always striving for new ways to keep our community safe, and 
we have more work left to do. We know that online safety is a shared 
responsibility, spanning a host of sectors and actors. We are committed 
to doing our part in concert with safety partners including our Safety 
Advisory Board, technology industry peers, government, and civil 
society. From technology-focused and awareness-raising initiatives, to 
research and best practice sharing, we are actively engaged with 
organizations dedicated to protecting minors online. We also know that 
there are many complex problems and technical challenges across our 
industry, including age verification of minors, and we remain committed 
to working with partners and policymakers to identify robust industry-
wide solutions.
    Protecting the wellbeing of Snapchatters is something we approach 
with both humility and steadfast determination. Over 500 million people 
around the world use Snapchat every month and while 95 percent of our 
users say Snapchat makes them feel happy, we have a moral 
responsibility to take into account their best interests in everything 
we do. That's especially true as we innovate with augmented reality--
which has the potential to positively contribute to the way we work, 
shop, learn, and communicate. We will apply those same founding values 
and principles as we continue to experiment with new technologies like 
the next generation of augmented reality.
    As we look to the future, computing and technology will become 
increasingly integrated into our daily lives. We believe that 
regulation is necessary but given the speed at which technology 
develops and the rate at which regulation can be implemented, 
regulation alone can't get the job done. Technology companies must take 
responsibility and actively protect the communities they serve.
    If they don't, the government must act swiftly to hold them 
accountable. We fully support the Subcommittee's efforts to investigate 
these issues and welcome a collaborative approach to problem solving 
that keeps our society safe.
    Thank you again for the opportunity to appear before you today and 
discuss these critical issues. I look forward to answering your 
questions.

    Senator Blumenthal. Thanks, Ms. Stout. Mr. Beckerman.

  STATEMENT OF MICHAEL BECKERMAN, VICE PRESIDENT AND HEAD OF 
                PUBLIC POLICY, AMERICAS, TIKTOK

    Mr. Beckerman. Chairman Blumenthal, Ranking Member 
Blackburn, and members of the Subcommittee, my name is Michael 
Beckerman, and I am the Vice President of Public Policy for the 
Americas at TikTok. I am also the father of two young 
daughters. I am passionate about ensuring that our children 
stay safe online. I joined TikTok after nearly a decade 
representing the Internet industry at large because I saw an 
opportunity to help TikTok responsibly grow from a young 
startup to a trusted entertainment platform.
    TikTok is not a social network based on followers or social 
graph. It is not an app that people check to see what their 
friends are doing. You watch TikToks, you create on TikTok. The 
passion and creativity and diversity of our community has 
fueled new cultural trends, chart topping artists, and 
businesses across the country. It has been a bright spot for 
American families who create videos together, and I have heard 
from countless friends and family and even members of the 
Senate and your staff about how joyful and fun and entertaining 
and authentic TikTok content truly is.
    And I am proud of the hard work that our safety teams do 
every single day to safeguard our community, and that our 
leadership makes safety and wellness a priority, particularly 
to protect teens on the platform. Being open and humble is 
important to the way we operate at TikTok. In the context of 
the hearing today, that means we seek out feedback from experts 
and stakeholders to constantly improve. When we find areas or 
find flaws where we can do better, we hold ourselves 
accountable and we find solutions.
    Turning a blind eye to areas where we can improve is not 
part of our company's DNA. But most importantly, we strive to 
do the right thing, protecting people on the platform. When it 
comes to protecting minors, we work to create age appropriate 
experiences for teens throughout their development. We have 
proactively built privacy and safety protections with this in 
mind. For example, people under 16 have their accounts set to 
private automatically. They can't host live streams, and they 
can't send direct messages on our platform, and we don't allow 
anyone to send off platform videos, images, or links via direct 
messaging.
    These are perhaps underappreciated product choices that go 
a long way to protect teens, and we made these decisions which 
are counter to industry norms or our own short term growth 
interests because we are committed to do what is right and 
building for the long term. We support parents in their 
important role to protect teens. That's why we have built 
parental controls called family pairing that empower a parent 
to link their TikTok account in a simple way from their own 
device of the parent to their teenager's account to enable a 
range of privacy and safety controls.
    And I encourage all the parents that are listening to the 
hearing today to take an active role in your teen's phone and 
app use. And if they are on TikTok, please check out family 
pairing. Visit our youth portal, read through our Guardians 
guide that is on our safety center. Our tools for parents are 
industry leading, innovative, but we are always looking to add 
and improve. It is important to know that our work is not done 
in a vacuum.
    It is critical for platforms, experts, and Governments to 
collaborate on solutions that protect the safety and well-being 
of teens. That is why we partnered with Common Sense Networks, 
who helps us ensure that we provide age appropriate content and 
our under-13 app experience. We also work closely with the 
National Center for Missing and Exploited Children, the 
National Parent Teacher Association, the Digital Wellness Lab 
at Boston Children's Hospital, and our U.S. Content Advisory 
Council.
    TikTok has made tremendous strides to promote the safety 
and well-being of teens, but we also acknowledge, and we are 
transparent about the room that we have to grow and improve. 
For example, we are investing in new ways for our community to 
enjoy content based on age appropriateness or family comfort. 
And we are developing more features that empower people to 
shape and customize their experience in the app. But there is 
no finish line when it comes to protecting children and teens.
    The challenges are complex, but we are determined to work 
hard and keep the platform safe and create age appropriate 
experiences. We do know trust must be earned, and we are 
seeking to earn trust through a higher level of action, 
transparency, and accountability, as well as a humility to 
learn and improve. Thank you for your leadership on these 
important issues. I look forward to answering the questions of 
the Committee.
    [The prepared statement of Mr. Beckerman follows:]

  Prepared Statement of Michael Beckerman, Vice President and Head of 
                    Public Policy, Americas, TikTok
    Chairman Blumenthal, Ranking Member Blackburn, and distinguished 
members of the Subcommittee, thank you for the opportunity to appear 
before you today to discuss how industry is working to provide a safe 
and secure online experience for younger users.
    My name is Michael Beckerman, and I am the Vice President and Head 
of Public Policy for the Americas at TikTok. Prior to joining TikTok, I 
was the founding President and CEO of the Internet Association, which 
represents leading global Internet companies on public policy issues. I 
also served for twelve years as a Congressional staffer, including as 
the Deputy Staff Director of the House Committee on Energy and 
Commerce.
    TikTok is a global entertainment platform where people create and 
watch short-form videos. Our mission is to inspire creativity and bring 
joy, and that mission is the foundation for our privacy and safety 
policies that aim to protect and promote the well-being of minors on 
the app. The TikTok community has ushered in a new era in user-
generated content by fostering authenticity. One of the unique things 
about TikTok, is how our powerful yet easy-to-use tools democratize 
video creation, enabling everyday people to express themselves 
creatively and find their community on the platform. This approach has 
resulted in more authentic content and has helped launch new cultural 
trends from feta pasta to the resurgence of Fleetwood Mac's ``Dreams.'' 
It has allowed small businesses to find their voice, and to expand 
their reach and customer base, and it has been a bright spot for 
American families during the Covid-19 pandemic. Parents are sharing 
their journeys through fatherhood and motherhood, building an inclusive 
community for families of all backgrounds. In a study conducted by 
Nielson, TikTok was the only app where a top reason for usage was, ``to 
lift my spirits.''
    Since its launch a few years ago, TikTok has experienced 
exponential growth, and billions of videos are being created each 
month. We continue to build out and strengthen our teams and technology 
in support of our commitment to maintain a safe and supportive 
environment for our community.
    We empower people who use the app with a robust set of controls to 
customize their individual experience, and we are constantly working to 
build new features that support a positive and safe environment. For 
example, we give people the ability to restrict who can send them 
direct messages (with direct messages turned off entirely for people 
under 16 years old), to filter out comments containing keywords they 
select or that otherwise may be inappropriate, to delete comments in 
bulk, to easily report activity that violates our policies, and to 
shape the type of content that they will see. We have also introduced 
well-being resources to support people who want to share their well-
being journey in our community. Developed with input from experts, 
these resources--accessible in our app and on our Safety Center--
provide information to support people impacted by eating disorders or 
struggling with thoughts of suicide or self-harm. For example, if 
someone searches for key terms related to suicide, self-harm, or eating 
disorders, we direct them to expert information and support, including 
the Crisis Text Line, National Suicide Prevention Lifeline, and the 
National Eating Disorders Helpline.
    We recognize that childhood and adolescence are transformative 
phases of life. In the United States, we have two separate experiences: 
TikTok for Younger Users for children under 13 years old, and TikTok 
for those at least 13 years of age. We have proactively implemented 
privacy and safety protections to promote the well-being of children 
and teenagers, and we continue to work on changes to support age-
appropriate experiences on our platform. We understand and respect 
adolescents' desire and need for greater autonomy, but we also 
appreciate that they are still learning and growing, which is why our 
product is designed to be sensitive to the differing developmental 
stages of childhood, adolescence, and adulthood.
    Our goal of providing an age-appropriate experience for our younger 
users starts with a neutral, industry-standard age screening, which 
requires individuals creating an account to provide their birthdate. 
However, our commitment to enforcing our minimum age requirement does 
not end with age screening. We take additional actions to identify and 
remove suspected underage account holders from the TikTok platform. For 
example, we train our safety moderation teams to be alert to explicit 
admissions that an account may belong to someone under the age of 13. 
We also use other information, such as keywords or user reports, to 
help identify potential underage accounts. If, after reviewing all 
relevant information, our safety team determines that an account may 
belong to an underage person, we will suspend and remove the account. 
As part of our commitment to transparency and accountability, since the 
first quarter of 2021, we have provided reports about our removal of 
suspected underage accounts, which totaled more than 11 million 
removals between April and June 2021. We are proud to be an industry 
leader in making these disclosures.
TikTok for Younger Users
    In the United States, if an individual registers for TikTok as 
under the age of 13, they are directed to TikTok for Younger Users, a 
curated viewing experience with stringent safeguards and privacy 
protections designed specifically for people under 13. In TikTok for 
Younger Users, youth can experience and make fun, creative, and even 
educational videos on their device. However, they cannot post videos on 
the platform, comment on others' videos, message with others, or 
maintain a profile or followers. No advertisements are shown in the 
under 13 experience.
    Earlier this year, we announced a partnership with Common Sense 
Networks as part of our ongoing efforts to improve upon the TikTok for 
Younger Users experience. Through our partnership, Common Sense 
Networks is providing additional guidance on the appropriateness of 
content for children under 13, thus strengthening our work to create an 
entertaining yet safe viewing environment for our younger audience.
Our Approach to Teen Safety and Privacy
    Teens who are at least 13 years old can use the main TikTok 
platform. Today's youth are growing up in a media world and to help 
them safely manage their digital spaces, we provide them with age-
appropriate privacy and safety settings and controls. The privacy and 
safety settings TikTok has developed, reflect careful consideration of 
not only the differences between children and teenagers, but also 
within the 13-18 teenage group.
    Our Minor Safety team holds a high bar of rigor for developing 
policy. We are staffed with experts from the fields of adolescent 
development, prevention science, and child protection. Our policies are 
informed by peer-reviewed academic literature and ongoing consultation 
with external scholars. TikTok works with leading youth safety and 
well-being experts, as well as adolescent psychologists to inform our 
approach, including the Family Online Safety Institute, National 
Parent-Teacher Association, Common Sense Networks, and the Digital 
Wellness Lab at Boston Children's Hospital. We seek out feedback, 
research, and best practices from such experts and organizations, and 
we use this information to help us design TikTok in way that considers 
and supports the unique needs of teens. Based on input from these 
experts and published research in this space, we have adopted a 
nuanced, age-appropriate approach that distinguishes between early 
teens (age 13-15) and late teens (16-17), which is reflected in our 
teen-related privacy and safety settings as discussed below.
Family Pairing and Parental Oversight
    TikTok is committed to providing parents and guardians visibility 
into, and control over, how their teenagers use the app. Parents and 
guardians are critical partners in ensuring the safety of teens. They 
cannot do it alone and neither can we. We are continuously looking for 
ways to involve parents and guardians in their teen's experience on our 
platform. To that end, in 2020, TikTok unveiled our Family Pairing 
features, which empower parents or guardians to customize privacy and 
safety settings for their teenage users, which we continue to improve 
in consultation with youth and family safety experts. We are pleased to 
see some of the other platforms exploring similar features to give more 
meaningful choice and tools to parents.
    Family Pairing allows a parent or guardian to link their account 
(from the parent's device) with their teenager's account and enable 
privacy and safety controls. Through Family Pairing, parents can choose 
from the following settings:

   Screen Time Management: Parents can decide how long their 
        teen can spend on TikTok each day. Parents can do this directly 
        from their own accounts on their personal device.

   Restricted Mode: Parents can help limit the appearance of 
        content that may not be appropriate for all audiences.

   Search: Parents can decide whether their teen can search for 
        content, accounts, hashtags, or sounds.

   Discoverability: Parents can decide whether their teen's 
        account is private (in which case the teen decides who can see 
        their content) or public (in which case anyone can view their 
        content). While under 16 accounts are private by default, this 
        feature adds a layer of parental oversight.

   Suggest Account to Others: Parents can decide whether their 
        teen's account can be recommended to others.

   Direct Messages: Direct message is not available for teens 
        under age 16. For teens aged 17-18, parents may restrict who 
        can send messages to their teen's account or turn off direct 
        messaging completely.

   Liked Videos: Parents can decide whether anyone can view the 
        videos that their teen likes.

   Comments: Parents can decide whether anyone can comment on 
        their teen's videos.

    Even when Family Pairing is not enabled, teens themselves can 
always take advantage of these tools by selecting them individually 
through TikTok's Digital Wellbeing setting. While parents and their 
teens should remain in control of deciding what is right for their 
family, our hope is that our Family Pairing features will encourage 
families to have important conversations about digital safety and well-
being.
    In addition to Family Pairing, we recently launched an updated 
Guardian's Guide to provide an overview of TikTok and the variety of 
tools and controls we have built into the product to keep our community 
safe. The guide also provides general information on common Internet 
safety concerns for families.
    We also developed a guide for the National PTA to provide families 
and educators with a comprehensive overview of the TikTok app and the 
safety tools and resources available to create age-appropriate 
experiences on the platform. The guide was sent to dozens of schools 
around the country and was published in a U.S. Department of Education 
newsletter.
Privacy and Safety by Default
    The privacy and safety of our teenage users is our priority. To 
that end, we have implemented numerous privacy settings and controls 
that reflect our commitment to provide our teenage users with a 
positive, safe, and age-appropriate experience, including the 
following:

   Private Accounts: The default setting for all TikTok 
        accounts of teens under 16 is set to ``private.'' That means 
        that only the accounts that the teen approves can follow them 
        and watch their videos.

   Suggest Your Account to Others: The ``suggest your account 
        to others'' feature is disabled by default for teens under 16. 
        This means the teen's account will not be suggested to other 
        users unless the teen changes the setting to ``on.''

   Direct Messages: Direct messaging is automatically disabled 
        for users under 16. For older teenage accounts (16-17), their 
        direct messaging setting is set to ``No One'' by default. We do 
        not allow anyone, regardless of age, to send off-platform 
        videos, images, or links via direct messaging. We made this 
        decision to further protect teens from exploitation, bullying 
        or other unwanted communications, as studies have shown that 
        this type of content is spread via private messaging.

   Livestreaming: Livestream hosting is disabled for teens 
        under 16.

   Comments: Teens under 16 can allow Friends (followers who 
        the teen follows back) or ``No One'' to comment on their 
        videos; the Everyone comment function is not available for 
        under 16 teens.

   Video Downloads: The ability to download videos created by 
        teens under 16 is disabled. For teens between 16-17, the 
        default setting in this feature is set to Off. If teens aged 
        16-17 choose to turn the download feature on, they will receive 
        a pop-up asking them to confirm their choice that others can 
        download their videos.

   Duet/Stitch: The ``Duet'' function allows users to react, 
        contribute to, or sing along with another user's video. Users 
        can also ``Stitch,'' which allows a user to clip a portion of 
        another user's video to include as part of their own video. 
        Duet and Stitch are disabled for teens under 16, whereas the 
        default setting for teens 16-17 is set to Friends.

   Gifting: Users must be 18 or over to buy, send, or receive 
        virtual gifts during Livestreams.

    Building on these age-appropriate protective measures, and in 
keeping with developments in the global privacy landscape, we recently 
have introduced new settings:

   We have added a pop-up that appears when teens under 16 are 
        ready to publish their first video, asking them to choose who 
        can watch the video. They will not be able to publish their 
        video until they make a selection.

   We have provided additional context to help teens aged 16-17 
        understand how downloads work, so they can choose the option 
        that is most comfortable for them. If they opt to turn the 
        feature on, they will now receive a pop-up asking them to 
        confirm that choice before others can download their videos.

   When someone aged 16-17 joins TikTok, their Direct Messaging 
        setting will be set to ``No One'' by default. To message 
        others, they will need to switch to a different sharing option. 
        Existing accounts that have never used direct messaging before 
        will receive a prompt asking them to review and confirm their 
        privacy settings the next time they use this feature.

   Accounts for age 13-15 teens will not receive push 
        notifications from 9 p.m., and accounts for age 16-17 teens 
        will have push notifications disabled starting at 10 p.m.
Preventing Bullying and Harassment
    TikTok stands against bullying and harassment in any form. We 
continue to look for ways to promote open and respectful discussion 
while prohibiting inappropriate and harmful comments on our platform. 
We launched #CreateKindness, a global campaign and a creative video 
series aimed at raising awareness about online bullying and encouraging 
people to choose to be kind toward one another. The campaign has 
already received more than one billion views.
    Bullying and harassment are violations of our Community Guidelines, 
and when we see such content and behavior, we take action. To further 
help foster kindness in our community, we have rolled out the following 
features this year:

   A prompt asking people to reconsider posting unkind or 
        inappropriate comments.

   Two features that give users more control over the comments 
        on their videos:

     With our new Filter All Comments feature, users can 
            decide which comments will appear on their videos. When 
            enabled, comments are not displayed unless the creator 
            approves them using the comment management tool. This 
            feature builds on our existing collection of comment 
            controls that allow people to automatically filter spam, 
            offensive comments, and specific keywords.

     We have also introduced the ability to delete multiple 
            comments at once or report them for potentially violating 
            our Community Guidelines. Accounts that post bullying or 
            other negative comments can now be blocked in bulk, up to 
            100 at a time.

    In addition, we work to educate teens and families about bullying 
prevention. We recently launched a bullying prevention guide on our 
Safety Center to help teens, parents and guardians, and educators learn 
about the issue and settings that can help prevent bullying on our 
platform.
Combatting Child Exploitation
    TikTok is firmly committed to protecting the safety of our 
community, and we take a zero-tolerance approach to all forms of child 
sexual exploitation and abuse (CSEA), including child sexual abuse 
material (CSAM). We continually monitor, update, and reinforce our 
efforts to stop the creation and sharing of CSAM. TikTok has a 
dedicated investigations group within our Trust & Safety team focused 
on detecting emerging trends and patterns regarding child exploitation 
and abuse as part of our efforts to keep our platform safe. TikTok 
adheres to the Voluntary Principles to Counter Online Child Sexual 
Exploitation and Abuse, which provide a framework that can be 
consistently applied across sectors and services to respond to changing 
societal and offending behaviors and reduce risks for users.
    TikTok's efforts to thwart CSAM and child exploitative content is 
premised on the three P's: Policies, Product, and Partners.

   Policies: The threat posed by CSAM and child exploitative 
        content remains a global challenge that requires collaboration 
        with academic experts, child safety groups, industry 
        organizations, and governments. TikTok's global Trust & Safety 
        teams are comprised of experienced professionals whose 
        backgrounds span product, policy, compliance, child safety, 
        law, and privacy. Trust & Safety leaders collaborate closely 
        with regional regulators, policymakers, governments, and law 
        enforcement agencies to promote the highest possible standard 
        of user safety. Our Los Angeles-based Trust & Safety team is 
        responsible for developing and enforcing moderation policies 
        and strategies in the United States.

    TikTok's Community Guidelines prohibit content that depicts or 
        disseminates child abuse, child nudity, or the sexual 
        exploitation of children. When we become aware of such content, 
        we take immediate action to remove the violative content, 
        terminate the pertinent accounts, and, where appropriate, 
        report cases of CSAM to the National Center for Missing and 
        Exploited Children (NCMEC) and other law enforcement agencies. 
        In 2020, TikTok made 22,692 reports regarding CSAM and child 
        exploitative content to NCMEC's CyberTipline.

    TikTok's multifaceted strategy to combat CSAM and child 
        exploitative content employs both human-centered moderators as 
        discussed below, as well as machine-based moderation tools such 
        as photo-identification technology. Additionally, we filter 
        red-flag language and share information with NCMEC about 
        situations that may indicate grooming behavior, according to 
        their policies and industry norms.

   Product: TikTok does not allow off-platform videos, images, 
        or links to be sent through direct messages or comments to 
        videos. This was a deliberate decision we made, supported by 
        relevant studies, and helps to close off a potential vector for 
        the propagation and dissemination of CSAM and child 
        exploitative content. We provide in-app reporting mechanisms to 
        ensure users can report content that violates our Community 
        Guidelines, including content that violates our minor safety 
        rules. Finally, for our under 13 users, we have implemented 
        additional restrictions and safeguards in TikTok for Younger 
        Users to help guard against the risks of CSAM and child 
        exploitative content.

   Partners: TikTok leverages the expertise of child safety 
        organizations, including the Family Online Safety Institute, 
        ConnectSafely, National PTA, Cyberbullying Research Center, 
        Boston Children's Hospital's Digital Wellness Lab, Internet 
        Watch Foundation, NCMEC to continually assess and enhance our 
        minor safety policies and product features. We also recently 
        joined the Technology Coalition, a global alliance of leading 
        Internet companies, to work collaboratively toward solutions 
        that protect children from online sexual exploitation.
Content Moderation and Dangerous Challenges
    Content moderation policy and implementation for the United States 
is led by our U.S. Trust and Safety team in Los Angeles, with the goal 
of developing equitable policies that are transparent and can be 
consistently enforced.
    We solicit a diverse range of feedback from external experts in 
digital safety and human rights, including members of our Content 
Advisory Council. We also work with accredited and independent fact-
checking organizations that help us assess the veracity of content, 
such as Lead Stories or Science Feedback, so that we can take the 
appropriate action in line with our Community Guidelines. We greatly 
value our collaboration with independent researchers on industry-wide 
challenges, which helps strengthen how we enforce our policies to keep 
our platform safe and welcoming.
    Our Community Guidelines are designed to foster a safe environment 
that nourishes creativity, and we rely on our community members to 
responsibly honor these parameters. We strictly prohibit dangerous acts 
and challenges on our platform and work to aggressively remove videos 
and hashtags that promote such behavior. TikTok has even stricter 
policies when it comes to inappropriate content featuring minors. Our 
policies prohibit any content that depicts or promotes activities that 
may jeopardize youth well-being, including physical challenges, dares, 
or stunts.
    While we do not condone any harmful, dangerous, or criminal 
behavior, we understand that some individuals may nevertheless decide 
to post such content. We aggressively look to remove such content and 
related hashtags as soon as possible. To educate our community about 
the dangers of such behavior, TikTok has partnered with creators to 
raise awareness against these dangerous challenges.
    Unfortunately, ``dangerous challenges'' are often sensationalized 
in the traditional media and can go ``viral'' in news accounts or from 
tweets by public officials, even when they are not actually appearing 
on the app. With regards to reports of scheduled challenges on TikTok, 
as a prominent disinformation researcher who focuses on TikTok recently 
pointed out: ``When I looked into this, I could not find a single 
TikTok actually endorsing this behavior. All evidence indicates this is 
a hoax turned into reality by local news and school districts reacting 
to completely unconfirmed rumors.'' A number of news reports have 
surfaced that many of the alleged challenges are in fact hoaxes that 
originated on another platform and were never on TikTok at all.
    We recently saw content related to ``devious licks'' gain traction 
on TikTok and other platforms. Our moderation teams worked swiftly to 
remove this content and redirect hashtags and search results to our 
Community Guidelines to discourage such behavior. We issued specialized 
guidance to our teams on this violative content and proactively 
detected and removed content, including videos, hashtags, and audio 
associated with the trend. Additionally, our teams are continuously 
staying alert to emerging violative content as well as new variations 
of spellings for hashtags relating to challenges. To help reach schools 
and parents, we actively worked with National PTA throughout this cycle 
to explain our diligent actions and reach families with messages that 
support ongoing, open conversations about digital safety and etiquette 
with their teens to ensure they are using social media productively and 
responsibly. As part of our PTA Connected initiative, National PTA has 
teamed up with TikTok to develop a guide that provides parents and 
guardians with information about TikTok, digital safety, and how they 
can help ensure their teens are using technology productively and 
responsibly.
Transparency into Enforcement of Minor Safety Rules
    TikTok aims to be transparent and accountable when it comes to 
minor safety. We publish a quarterly Transparency Report that discloses 
the metrics related to violative content that we remove pursuant to our 
Community Guidelines. This includes data about content that violates 
our minor safety rules, as well as how quickly we detect and remove 
such content. In the second quarter of 2021, for example, 41.3 percent 
of content removed violated our minor safety policies. Of those videos, 
97.6 percent of videos were removed before they were reported to us, 
and 95.4 percent of videos were removed within 24 hours of being 
posted. This continued improvement in proactive detection is the result 
of our efforts to improve our policies and procedures that work to 
identify and flag violative content early on, before it is even viewed.
Conclusion
    There is no finish line when it comes to protecting children and 
teens. The challenges are complex and constantly evolving, but with 
humility and determination, we are working hard to build upon our work 
to keep our platform safe for everyone, especially children and teens. 
It truly does take a village to keep minors safe online, and we will 
continue to work with teenagers, parents, child safety experts and 
organizations, policymakers, and other interested stakeholders to 
continue to improve and do better. Thank you for your time and 
consideration.

    Senator Blumenthal. Thanks, Mr. Beckerman. Ms. Miller, I 
hope you are with us. Please proceed.

STATEMENT OF LESLIE MILLER, VICE PRESIDENT, GOVERNMENT AFFAIRS 
                   AND PUBLIC POLICY, YOUTUBE

    Ms. Miller. Sorry, I was--I think I am having a bit of 
technical difficulty. Can you hear me OK?
    Senator Blumenthal. We can hear you and now we can see you. 
Thanks.
    Ms. Miller. OK, wonderful. Thank you. Chairman Blumenthal, 
Ranking Member Blackburn, and distinguished members of the 
Subcommittee, thank you for the opportunity to appear before 
you today. My name is Leslie Miller, and I am the Vice 
President of Public Policy at YouTube. As young people spend 
more time online and given their changing needs as they grow 
up, it is crucial to put in place protections that allow them 
age appropriate access to information.
    We do this by investing in the partnerships, technologies, 
and policies that create safer environments that allow children 
to express their imagination and curiosity and empower families 
to create the right experiences for their children. Our 
internal teams include experts who come from child development, 
child psychology, and children's media backgrounds. They work 
closely with the product teams to ensure that product design 
reflects an understanding of children's unique needs and 
abilities, and how they evolve over time.
    The advice from trusted experts informs our ongoing 
improvements for YouTube Kids and our child safety policies. 
Our child safety policy--our child safety specific policies, 
which I describe in greater detail in my submitted testimony, 
prohibit content that exploits or endangers minors on YouTube. 
Using a combination of machine learning and human reviewers 
across the globe, we commit significant time and resources to 
removing this harmful content as quickly as possible.
    Between April and June of this year, we removed nearly 1.8 
million videos for violations of our child safety policies, of 
which about 85 percent were removed before they had 10 views. 
We are constantly working to improve our safeguards. We have 
also invested significant resources to empower parents with 
greater control over how their children view content on 
YouTube.
    In 2015, we created YouTube Kids as a way for kids to more 
safely pursue their curiosity and explore their interests while 
providing parents more tools to control and customize the 
experience for their families. Videos on YouTube Kids include 
popular children's videos, diverse new content, and content 
from trusted partners. After we launched YouTube Kids, we heard 
from parents that tweens have different needs, which weren't 
being fully met by our products.
    That is why we have worked with parents and experts across 
the globe in areas related to child safety, child development, 
and digital literacy to develop a solution for these parents, 
which we called supervised experiences. We launched this 
earlier this year on the main YouTube platform.
    Parents now have the option of choosing between three 
different content choices, content generally suitable for 
viewers aged 9 plus, or 13 plus, and then most of YouTube. The 
most of YouTube option excludes all age restriction--age 
restricted content on the main platform. We want to give 
parents the controls that allow them to make the right choices 
for their children. On YouTube Kids and even on YouTube for all 
underage users, auto play is off by default. In the coming 
months, we will be launching additional parental controls in 
the YouTube Kids app, including the ability for a parent to 
choose a locked default auto play setting.
    Our take a break reminders and bedtime settings are also on 
by default in these experiences. YouTube treats personal 
information from anyone watching children's content on the 
platform as coming from a child, regardless of the age of the 
user. This means that on videos classified as made for kids, we 
limit data collection and use, and as a result, we restrict or 
disable some product features.
    For example, we do not serve personalized ads on this 
content on our main YouTube platform, and we do not support 
features such as comments or live chat. And to be clear, we 
have never allowed personalized advertising on YouTube Kids or 
YouTube supervised experiences.
    There is no issue more important than the safety and well-
being of our kids online, and we are committed to working 
closely with you to address these challenges. I again would 
like to thank you for the opportunity to appear today and look 
forward to answering your questions.
    [The prepared statement of Ms. Miller follows:]

Prepared Statement of Leslie Miller, Vice President, Government Affairs 
                        & Public Policy, YouTube
Introduction
    Chairman Blumenthal, Ranking Member Blackburn, and distinguished 
members of the subcommittee: thank you for the opportunity to appear 
before you today. My name is Leslie Miller, and I am the Vice President 
of Public Affairs and Public Policy at YouTube.
    In my role, I lead a diverse and global team that advises the 
company on public policy issues around online, user-generated content, 
supporting YouTube's mission to ``give everyone a voice and show them 
the world.'' Central to our work is our commitment to responsibility--
and nowhere is this more important than when it comes to protecting 
kids and teens.
    As young people spend more time online, and given their changing 
needs as they grow up, it's crucial to put in place protections that 
also allow them age-appropriate access to information. We do this by 
investing in the partnerships, technologies, and policies that create 
safer environments that allow children to express their imagination and 
curiosity and empower families to create the right experiences for 
their children. At the same time, we recognize the potential risks that 
children may face online, and have invested extensively in industry-
leading machine learning technologies that identify potential harms 
quickly and at scale.
    Our internal teams include experts who come from child development, 
child psychology and children's media backgrounds. They work closely 
with the product teams to ensure that product design reflects an 
understanding of children's unique needs and abilities and how they 
evolve over time. We also work extensively with external experts in 
online safety, content quality, mental health, trauma, digital literacy 
and child development; this collaboration is essential to ensure we 
have the best information and evidence available to address new and 
emerging challenges kids may face online. The advice from trusted 
experts informs our on-going improvements to YouTube Kids and our child 
safety policies, and was instrumental in the recent creation of YouTube 
Supervised Experiences, our new option for parents of teens and tweens 
on our main YouTube platform. We are grateful for how these 
collaborations have informed how we can build and refine our products 
and policies to keep kids safe on YouTube.
    In my testimony today, I will (1) describe our responsibility 
framework and how it applies to child safety, (2) provide an overview 
of YouTube Kids and the YouTube Supervised Experiences, (3) describe 
our child safety policies, and (4) explain how we encourage the 
development of healthy technology use.
                    YOUTUBE RESPONSIBILITY FRAMEWORK
    Responsibility is our number one priority at YouTube, and nowhere 
is that more important than in our work around kids. Some speculate 
that we hesitate to address problematic content or ignore the well-
being of youth online because it benefits our business; this is simply 
not true. Ensuring the safety of children is not only the right thing 
to do, it also helps us to earn the trust of parents, who see that we 
are building a safe environment for kids and families. It helps us 
reassure our advertising partners that their brands are safe on 
YouTube; responsibility is good for business. We have made significant 
investments over the past few years in policies, technology, and teams 
that help provide kids and families with the best protections possible. 
Our approach towards responsibility involves 4 ``Rs'' of responsibility 
described in detail below.
Remove
    YouTube uses a combination of machine learning and human review to 
enforce our policies and we regularly report on the content removed for 
violating our policies in our quarterly Community Guidelines 
Enforcement Report.
    We have clear policies, that I will describe in great detail in a 
later section, that prohibit content that exploits or endangers minors 
on YouTube and we have committed significant time and resources toward 
removing violative content as quickly as possible. We use a combination 
of machines and humans to enforce our policies. Machine learning is 
well-suited to detect patterns, which helps us to find content similar 
to other content we've already removed, even before it's ever viewed. 
Expert human review allows us to appreciate context and nuance when 
enforcing our policies. Between April and June (Q2) of this year we 
removed 1,874,729 videos for violations of our child safety policies, 
of which approximately 85 percent were removed before they had 10 
views.
    We have also seen a significant decline in a metric known as the 
Violative View Rate (VVR). This metric is an estimate of the proportion 
of video views that violate our Community Guidelines, to include our 
child safety policies but excluding spam, in a given quarter. Our data 
science teams have spent more than two years refining this metric, 
which we consider to be an important indicator in measuring the 
effectiveness of our efforts to fight and reduce abuse on YouTube. In 
Q2 of 2021, YouTube's VVR was 0.19-0.21 percent, meaning that out of 
every 10,000 views on YouTube, 19-21 are views of content that violates 
any of our Community Guidelines. This reduction is due in large part to 
our investments in machine learning to identify potentially violative 
content at scale. We expect to continue to reduce this metric over 
time.
Raise Authoritative Content & Reduce Visibility on Borderline Content
    Over the last few years, we've worked with child development 
specialists to provide guidance to YouTube creators on how to create 
high-quality content for children. As a result of extensive 
consultations on children's media, digital learning, and the study of 
good citizenship, we established a set of quality principles to help 
guide our kids and family creator ecosystem. These principles include 
content that demonstrates or encourages respect and healthy habits, 
such as being a good friend, content that is thought-provoking or 
imaginative, such as arts and crafts activities, and content that 
celebrates and encourages diversity and inclusion.
    These principles help determine what content is included in YouTube 
Kids. We also use these principles to determine which high-quality 
content we raise up in our recommendations on YouTube. This means that 
when you're watching content for kids and families on YouTube, we aim 
to recommend videos that are age-appropriate, educational, and inspire 
creativity and imagination.
    At the same time, we use these principles to reduce kids content 
that is low-quality (but doesn't violate our Community Guidelines) in 
our recommendations on YouTube, and remove channels from YouTube Kids. 
Examples of this content include videos that are heavily commercial or 
promotional, or encourage negative behaviors or attitudes. Our work 
here is ongoing and we regularly reevaluate and update our systems and 
policies.
Reward Trusted Creators
    We set a higher bar for what channels can make money on YouTube. In 
addition to our Community Guidelines, creators also need to follow our 
monetization policies to join the YouTube Partner Program (YPP), which 
allows creators to monetize their videos. Every channel applying to YPP 
undergoes review by a trained rater to make sure it meets our policies 
and we continually keep these guidelines current. We also regularly 
review and remove channels that don't comply with our policies.
    Earlier this month, we shared additional monetization policies--
which align with the quality principles discussed above--for channels 
that primarily create kids and family content on YouTube. Going 
forward, these principles will have not only an impact on 
recommendations and inclusion in YouTube Kids, but also on 
monetization.
    Channels that primarily target young audiences need to deliver 
high-quality content and comply with kids-specific monetization 
policies. For example, channels that have predominantly low-quality 
kids content may be suspended from YPP. If an individual video violates 
these quality principles, it may see limited or no ads.
             YOUTUBE KIDS & YOUTUBE SUPERVISED EXPERIENCES
    In 2015, we created YouTube Kids as a way for kids to more safely 
explore their interests and curiosity while providing parents more 
tools to control and customize the experience for their families. 
Videos on YouTube Kids include popular children's videos, diverse new 
content, and content from trusted partners. Our approach to determining 
eligibility in YouTube Kids is to first identify a very small subset of 
channels that have a low likelihood of uploading inappropriate content 
from the full YouTube corpus of content that is available. A channel 
must meet our high-confidence criteria, such as the general absence of 
unsuitable themes (e.g., shocking or scary themes), spammy behavior, 
and unsafe or unhealthy content (e.g., glamorizing negative habits or 
unseemly behavior). SciShow Kids or Art for Kids Hub are good examples 
of channels that fit these criteria. This approach, which is intended 
to ensure high-quality and appropriate content is available for 
viewing, significantly narrows the universe of content that then must 
pass further machine-learning and human review based on the content 
safety policy guidelines.
    Because each child is unique, we also give parents the ability to 
fully customize what their kids watch. For example, parents can block a 
video or channel, can handpick which videos to make available for their 
kids, and also have an option to only allow content selected by trusted 
partners like PBS Kids and UNICEF.
    Our engineering and human reviewer teams continuously work together 
to ensure that these filters are working as intended, and that content 
is appropriate for children. Machine learning systems work to 
proactively detect violations, while human reviewers around the world 
quickly remove violations detected by the system or flagged by users. 
We are constantly working to improve our safeguards and offer more 
features to help parents create the right experience for their 
families.
    After we launched YouTube Kids, we heard from parents of older 
children that tweens and teens have different needs, which weren't 
being fully met by our products. That is why we've worked with parents 
and experts across the globe in areas related to child safety, child 
development, and digital literacy to develop a solution for parents of 
tweens and teens, which we call Supervised Experience.
    Supervised Experience launched in March 2021 on the main YouTube 
platform. Parents now have the option of choosing between three 
different content choices, content generally suitable for viewers aged 
9+ or 13+ and then the `Most of YouTube' option. `Most of YouTube' 
excludes all age-restricted content (18+) on the main platform. To be 
clear, kids under 13 who are not in a Supervised Experience are not 
allowed on YouTube. We took action on more than 7M accounts the first 3 
quarters of 2021 when we learned they may belong to a user under the 
age of 13--3M of those in Q3 alone as we have ramped up our automated 
removal efforts.
    When a parent opens a supervised Google account, their child's 
experience feels much like regular YouTube, but certain features are 
disabled for younger audiences. For example, we don't serve 
personalized ads. We also have clear guidelines that prohibit certain 
types of advertising on Supervised Experiences, which include ads 
related to weight loss and diets or ads for dating sites. YouTube 
supervised experiences also have disabled in-app purchases, as well as 
features such as uploading videos or livestreams and reading or writing 
comments.
    To help parents understand more about the YouTube Supervised 
Experience, we developed this guide, building on Google's successful Be 
Internet Awesome digital literacy resources, and in partnership with 
the National PTA, Parent Zone UK and other leading experts. We will 
continue to partner with these and other groups to provide easy to use 
resources specifically for parents to help them keep their kids safe 
online.
                         CHILD SAFETY POLICIES
    YouTube has long had policies that prohibit content that endangers 
the emotional and physical well-being of minors. We remove content that 
could cause minor participants or viewers emotional distress, content 
showing a minor participating in dangerous activities or encouraging 
minors to do dangerous activities, and content that involves 
cyberbullying or harassment involving minors. We also remove sexually 
explicit content featuring minors and content that sexually exploits 
minors, and report this content to the National Center for Missing and 
Exploited Children. In the first half of this year YouTube made over 
120,000 such reports.
    YouTube also has policies prohibiting content that promotes suicide 
or self-harm. We remove content promoting or glorifying suicide, 
content providing instructions on how to self-harm or die by suicide 
and content containing graphic images of self-harm posted to shock or 
disgust viewers. The Stanford Internet Observatory \1\ recently 
highlighted how we handle searches related to self-harm and suicidal 
ideation, writing that they were ``impressed that YouTube's Community 
Guidelines on suicide and self-injury provide resources, including 
hotlines and websites, for those having thoughts of suicide or self-
harm, for 27 countries.''
---------------------------------------------------------------------------
    \1\  https://cyber.fsi.stanford.edu/io/self-harm-policies-report
---------------------------------------------------------------------------
    Sometimes content doesn't violate our policies, but it may not be 
appropriate for viewers under 18. In these cases, we may place an age-
restriction on the video. Age-restricted videos are not viewable to 
users who are under 18 years of age or signed out.
    In addition to these specially designed policies, YouTube treats 
personal information from anyone watching children's content on the 
platform as coming from a child, regardless of the age of the user. 
This means that on videos classified as ``made for kids'', we limit 
data collection and use, and as a result, we restrict or disable some 
product features. For example, we do not serve personalized ads on this 
content on our main YouTube platform and we do not support features 
such as comments, live chat, notification bell, stories, and save to 
playlist. To be clear, we have never allowed personalized advertising 
on YouTube Kids or YouTube's Supervised Experience.
    Our efforts to protect children go beyond our platform, and include 
our efforts to enable others in industry to better protect children as 
well. In 2015, we introduced CSAI Match--a proprietary technology 
developed by our teams that detects and removes child sexual abuse 
imagery--or CSAI--online. CSAI content is given a unique digital 
fingerprint, which is then used to detect duplicate copies across our 
platforms, and matched against a list of known fingerprints. While 
existing technologies like PhotoDNA, which focused on still images, 
CSAI Match was and is industry leading in its focus on identifying CSAI 
in video. CSAI Match is also resistant to manipulation, so it 
dramatically increases the number of violative videos that can be 
detected. With hundreds of hours of content uploaded every minute, CSAI 
Match helps us to quickly identify and remove known CSAI at scale.
    We share CSAI Match technology with the industry, free of charge, 
and have partnered with companies and NGOs, including some on this 
panel today, as well as Adobe, Reddit, and Tumblr, allowing them and 
others to prevent the distribution of these videos on their platforms 
as well. Once CSAI Match is integrated, it can help our industry 
partners quickly identify harmful content, minimizing the number of 
people exposed to it and better safeguarding the privacy of victims of 
CSAI. We're committed to expanding our partnerships to help prevent the 
spread of this abusive material, both on and off our platform, on a 
global scale.
    We recognize that protecting children from abuse is a mission that 
no one company, industry or part of society, can accomplish alone. We 
were proud to have been part of the group of companies who contributed 
our expertise to the Five Country Ministerial Forum in developing the 
Voluntary Principles to Counter Online Child Sexual Abuse and 
Exploitation. These principles provide an ambitious roadmap for 
companies of all sizes and stages of development to identify, prevent 
and report CSAI.
    We also partner with others through the Technology Coalition--an 
organization made up of 24 members representing different parts of the 
industry. Through the Technology Coalition we work with stakeholders 
around the world to help advance cutting-edge research, technology 
innovation, and sharing of knowledge and best practice on how to 
prevent, detect and report child sexual abuse and exploitation. The 
Tech Coalition recently announced the funding of five global research 
projects to help increase our understanding and awareness of how to use 
artificial intelligence to combat child sexual exploitation at scale.
                           DIGITAL WELLBEING
    The relationship between technology use and physical and mental 
wellbeing is complex, especially for children and young people. Recent 
studies have highlighted that digital media use can help teens 
communicate with peers and family, seek helpful resources if they are 
experiencing distress, and find opportunities for learning and 
entertainment that can help combat isolation.\2\ \3\ \4\ Original 
content like the Workout Badges helps encourage playful movement even 
when kids are stuck at home. These are opportunities that have become 
even more important to young people and families during COVID-19, when 
organizations as diverse as ministries of education to local churches 
have used YouTube to help families stay connected.
---------------------------------------------------------------------------
    \2\  https://www.pewresearch.org/internet/2018/05/31/teens-social-
media-technology-2018/
    \3\ https://www.commonsensemedia.org/research/coping-with-covid19-
how-young-people-use-digital-media-to-manage-their-mental-health
    \4\ https://www.commonsensemedia.org/sites/default/files/uploads/
pdfs/tweens-teens-tech-and-mental-health-full-report-final-for-web1.pdf
---------------------------------------------------------------------------
    Fostering a sense of wellbeing on and offline is incredibly 
important to us. We do this by highlighting content that helps kids 
learn, play and do, encouraging them to be mindful of the time they 
spend on screens. As described earlier, we prioritize content that 
helps kids stay safe and healthy, and understand the world around them. 
For example we have featured content in YouTube Kids that help kids 
learn about how viruses are spread, or videos that prepare them to 
return to school.
    We also create tools, built into our systems, that help kids and 
their parents set healthy limits. For teens and for those in our 
YouTube Supervised Experience and in YouTube Kids, autoplay is now off 
by default. We've also added an autoplay option on YouTube Kids and 
turned autoplay off by default in the app. In the coming months, we'll 
be launching additional parental controls in the YouTube Kids app, 
including the ability for a parent to choose a ``locked'' default 
autoplay setting. Our ``Take a Break'' reminders and bedtime settings 
are also on by default in these experiences. In addition, we have 
worked with creators to develop a series of PSAs to help children 
reflect on the time they spend online, and build empathy for others.
                               CONCLUSION
    There is no issue more important than the safety and wellbeing of 
our kids online, and we are committed to working closely with you to 
address these challenges. I again would like to thank you for the 
opportunity to appear today and look forward to answering your 
questions.

    Senator Blumenthal. Thanks, Ms. Miller. We are going to do 
5 minute rounds. We have votes at 11 o'clock, we have three 
votes, but I think we will be able to juggle the questioning 
and the testimony, and if necessary, we will take a short 
recess. Let me begin, as you know, by now, in August, Senator 
Blackburn and I wrote to Facebook asking whether they had done 
any research, whether they had any facts that showed harm to 
children.
    In effect, they denied it, they dodged the question. They 
disclaimed that Instagram is harmful to teens. Let me ask you, 
and I think it is pretty much a yes or no question, the same 
question that we asked Facebook, have any of your companies 
conducted research on whether your apps can have a negative 
effect on children's or teen's mental health or well-being, and 
whether your apps promote addiction like these? Have you done 
that research? Ms. Stout?
    Ms. Stout. Senator, we have conducted research. Much of our 
research is focused on our products and how we can improve our 
products and services to really meet the needs of our users and 
our community. And as I mentioned in my opening testimony, some 
of the research that we did shows that 95 percent of users say 
that Snapchat makes them happy.
    Senator Blumenthal. Will you make that research available 
to the Subcommittee?
    Ms. Stout. Yes, Senator, we would.
    Senator Blumenthal. Mr. Beckerman.
    Mr. Beckerman. Senator, thank you for the question. We 
believe that research should be done in a transparent way, and 
for us, we partner with external experts and stakeholders to 
get their feedback. And we think that is something that 
everybody can work together on. And additionally, we have 
actually supported passage of the Camera Act, which would have 
additional funding at NIH, and we would love to see this done 
in an external way, in a transparent way.
    Senator Blumenthal. Ms. Miller.
    Ms. Miller. Yes, Senator, we work with experts to leverage 
their insights and research to make sure that our product and 
policy decisions are up to date based on their insights.
    Senator Blumenthal. Now, I have asked whether the research 
has been done that could show negative effects or addictive 
like impacts. You have all indicated that you have done that 
research, and Ms. Stout has indicated that her company will 
make it available. I am assuming Mr. Beckerman, you--your 
company will. You are nodding. And Ms. Miller?
    Ms. Miller. Yes, Senator, we have published some research 
and we would make additional available.
    Senator Blumenthal. Let me add--ask now about the black box 
algorithms. As you know, these algorithms exist, they function 
to drive sometimes toxic content at kids. More of it and more 
extreme versions of it. The consequences are potentially 
catastrophic, but the companies are in effect, grading their 
own homework. They are evaluating their own effects on kids 
when it comes to addiction and harms.
    Let me ask you a similar question, do you provide external 
independent researchers with access to your algorithms, data 
sets, and data privacy practices? In other words, if an 
academic researcher comes to you on child psychology and wants 
to determine whether one of your products causes teen mental 
health issues or addiction, could they get access to raw data 
from you without interference? Ms. Stout?
    Ms. Stout. So, Senator, I think it is important to remember 
that on Snapchat, algorithms work very differently. Very little 
of our content is sorted algorithmically----
    Senator Blumenthal. Well, I am going to apologize because I 
am going to interrupt just to say, the question is about access 
to independent researchers on those algorithms that you do use. 
And there is no question that you have algorithms, correct?
    Ms. Stout. Correct, Senator. We do have algorithms, but 
they just operate very differently. To your question on whether 
we have had any requests for outside researchers or mental 
health specialists to come and access that, to my knowledge, we 
have not, Senator.
    Senator Blumenthal. But you would provide access to them?
    Ms. Stout. Yes, I think it is important to understand that 
algorithms for us just operate differently. So it is--to 
compare them against different platforms just would mean very 
different things.
    Senator Blumenthal. Well, that is one of the facts that an 
independent or external researcher would verify, whether----
    Ms. Stout. Indeed.
    Senator Blumenthal.--they are different and how they are 
different if they are. Mr. Beckerman.
    Mr. Beckerman. Yes Senator, we believe that transparency 
for the algorithm is incredibly important. We were one of the 
first companies to publish publicly a deep dove in how our 
algorithm works. We have also open transparency and 
accountability centers where you invite outside experts and we 
invite you, Senator, and your staff to come in and see exactly 
how the algorithm works.
    Additionally, it is important to give additional choice to 
people. And in your feed, for example, on TikTok, if you are 
not interested, you can indicate that. And we are adding 
additional tools to give choice and transparency to individuals 
as they are using TikTok.
    Senator Blumenthal. So external access, OK. Ms. Miller.
    Ms. Miller. Senator, we are very transparent as it relates 
to the way in which our machine learning works. For example, we 
have a quarterly transparency report that summarizes the videos 
and channels that we removed based on violating our community 
guidelines. Earlier this year, we rolled out an additional 
statistic called the violative view rate----
    Senator Blumenthal. Again, I really apologize for 
interrupting. The question is whether you provide external 
independent researchers with access to your algorithms and data 
sets and data privacy protection--you allow external 
researchers----
    Ms. Miller. We regularly--I am sorry, Senator.
    Senator Blumenthal. You provide that access?
    Ms. Miller. We regularly partner with experts, for example, 
in child development, mental health to work with----
    Senator Blumenthal. They are experts chosen by you. If 
somebody independent came to you and wanted that access, yes or 
no, would you permit it?
    Ms. Miller. Senator, it would depend on the details, but we 
are always looking to partner with experts in these important 
fields.
    Senator Blumenthal. Well, I am going to cite the difference 
between your response and Mr. Beckerman's and Ms. Stout's, 
which indicates certainly a strong hesitancy, if not 
resistance, to providing access. Let me ask you, Ms. Miller, I 
think one of the issues here really relates to the claim that 
these sites are transparent and truthful, which is belied by 
our actual experience and the fact that they favor regulation.
    In the case of Facebook, they have mounted armies of 
lawyers, paid millions of dollars to fight regulation, whether 
it is the EARN IT Act and other responsible reforms to Section 
230 or privacy legislation or requirements to be more 
transparent on algorithms. According to the details made public 
last week, in a multi-state antitrust case, Google has, 
``sought a coordinated effort to forestall and diminish child 
privacy protections in proposed regulations by the FTC and 
legislation.''
    That filing described attempts to encourage Facebook and 
Microsoft to fight privacy rules and back down on advocacy for 
legislation, in particular meeting where that exchange 
occurred. This disclosure made news, but everybody in D.C. 
really knew what was true. What was new is that Google's 
hypocrisy was finally called out. The fact is that Google and 
YouTube have been fighting against privacy behind the scenes 
for years. It is hidden in plain sight. It is an open secret.
    You have been lobbying the FTC to weaken our existing 
privacy rule. You spent vast sums of money fighting 
California's privacy rules. I want to ask you, Ms. Miller, what 
work has YouTube done to lobby against Congress strengthening 
online protections for children? And is that report and that 
claim by the multi-state plaintiffs accurate?
    Ms. Miller. Senator, I understand that the material that 
you are referencing was regarding our point of view on e-
privacy legislation in Europe. Our CEO, Sundar Pichai, has 
regularly called for comprehensive privacy legislation in the 
U.S.
    And on behalf of YouTube, I am not aware of any efforts 
other than to be involved in conversations in a multi-
stakeholder way as it relates to any legislation or bills that 
are being introduced regarding the oversight or regulation of 
companies such as YouTube.
    Senator Blumenthal. So you are saying that these reports 
about political pressure and lobbying against children's 
privacy and safeguards are just totally false?
    Ms. Miller. I think that we work with lawmakers such as 
yourself regularly to have conversations to share what we are 
doing on the platform, the updated protections we are putting 
in place, but also to hear your concerns and to work with you 
as you contemplate new regulations.
    Senator Blumenthal. Will you commit that you will support 
privacy legislation, as has been proposed?
    Ms. Miller. Senator, I am not deeply involved in the 
details of any specific privacy legislation, but I commit that 
we will work with you and partner with you on Federal privacy 
legislation.
    Senator Blumenthal. Would you support a ban on targeted 
advertising to children and young teens?
    Ms. Miller. Senator, at YouTube, we prohibit personalized 
advertising in YouTube Kids, as well as in supervised 
experiences.
    Senator Blumenthal. Would you support a ban in law?
    Ms. Miller. We have not waited for laws in order to put 
those types of protections in place.
    Senator Blumenthal. Well, I hope that you will and that you 
will be perhaps more forthcoming in the next round of 
questioning. I will turn to the Ranking Member.
    Senator Blackburn. And thank you, Mr. Chairman. Mr. 
Beckerman, I want to come to you first. In the past, TikTok has 
said that it has never, nor would it ever share and provide 
user data to the Chinese Government, even if asked. Yet your 
privacy policy says you can disclose data collected to respond 
to Government inquiries.
    It also says you share data you collect with your parent 
companies and affiliates, and that you transmit user 
information to servers and data centers overseas. And earlier 
this year, the Chinese Communist Party acquired an ownership 
stake and a seat on the board of ByteDance. So does TikTok 
share user data with its parent company, ByteDance?
    Mr. Beckerman. Thank you, Senator. This is an important 
question, and I am glad you are asking. TikTok does not----
    Senator Blackburn. Quickly.
    Mr. Beckerman. Excuse me?
    Senator Blackburn. Quickly, please.
    Mr. Beckerman. Yes, Senator, we do not share information 
with the Chinese Government, and I would like to point you to a 
Citizen Lab report, which is one of the most well-respected 
global National Security experts, where they said our research 
shows that there is no overt data transmission to the Chinese 
Government and our testing TikTok did not contact any servers 
within China.
    And then the report goes on to state, Senator, that TikTok 
does not pose a threat to National Security, and I am happy to 
submit that report for the record.
    Senator Blackburn. Let me ask you--please submit the report 
for the record. Do any ByteDance employees have access to 
TikTok user data or any role in creating their algorithm?
    Mr. Beckerman. Senator, U.S. user data is stored in the 
United States, our backups are in Singapore, and we have a 
world renowned U.S. based security team that handles access to 
the data.
    Senator Blackburn. I understand that you say you store it 
in Singapore. Tell me about programmers, product developers, 
and the data teams. Are they housed in China?
    Mr. Beckerman. Senator, like many technology companies, we 
have engineers in the United States and throughout--and around 
the world.
    Senator Blackburn. And so they have access to algorithms 
and data?
    Mr. Beckerman. Senator, we have engineers in the United 
States, and we also have engineers----
    Senator Blackburn. That answer is yes. What about Douyin? 
ByteDance says they are fully separate, but do Douyin employees 
have any access to TikTok user data or input into the 
algorithm?
    Mr. Beckerman. Senator, that is a completely different app 
from TikTok and----
    Senator Blackburn. No, it is a related company, you might 
want to check that. If the Chinese Communist Party ask you for 
U.S. user data, what is to stop you from providing it since 
they have a seat on the board of ByteDance and they have a 
financial stake in the company?
    Mr. Beckerman. Senator, that is not accurate. One, they do 
not have a stake in TikTok at all.
    Senator Blackburn. Oh, yes they do. It happened in August.
    Mr. Beckerman. Senator, that is not accurate.
    Senator Blackburn. That is ByteDance and let's--we can 
clarify that for the record. But the record is that the Chinese 
Communist Party acquired a stake in ByteDance in August, and 
they now have a seat on the board. So let's talk about TikTok's 
privacy policy. It says you collect and keep a wide variety of 
information, including biometric data such as face prints, 
voice prints, geolocation information, browsing and search 
history, not just on TikTok, but on other apps, and keystroke 
patterns and rhythms. Why do you need all of this personal 
data, especially on our children, which seems to be more than 
any other platform collects?
    Mr. Beckerman. Senator, many outside researchers and 
experts to look at this have pointed out that TikTok actually 
collects less data than many of our peers. And on the keystroke 
issue----
    Senator Blackburn. Outside researchers that you are paying 
for?
    Mr. Beckerman. No, Senator.
    Senator Blackburn. You would submit that to independent 
outside researchers? Because what we are seeing with all of 
this biometric data and the keystroke patterns, that you are 
exceeding that. So what do you do with this? Are you creating a 
virtual you of the children that are on your site?
    Mr. Beckerman. Senator, I don't know what you mean by 
virtual you.
    Senator Blackburn. Well, a virtual you is you in your 
presence online. It is like a virtual dossier. I am sure you 
understand that term. And what do you need with all of this 
information?
    Do you track children's viewing patterns? Are you building 
a replication of where they go, their search history, their 
voice, their biometrics? And why does TikTok and ByteDance and 
Douyin need that information on our children?
    Mr. Beckerman. Senator, TikTok is an entertainment platform 
where people watch and enjoy and create short form videos. It 
is about uplifting and entertaining content. People love it. 
And I disagree with the characterization of the way----
    Senator Blackburn. That is it from the positive. But there 
is also a negative, and the negative is that you are building a 
profile, a virtual you of our children because of the data that 
you are collecting. You mentioned the family, parent provision 
that you have. So when you have a parent that goes on that, are 
they opening their data to TikTok, and is TikTok following them 
or following and capturing their search history?
    See Mr. Beckerman, when you capture all of this data and 
you hold all of this data, then you are invading the property--
the private--the privacy of individuals that are on your site. 
And that applies to you, to Ms. Stout, to Ms. Miller. Because 
you are--you say because you are using the platform, we can do 
this.
    But in essence, what you are doing is making our children 
and their data--you are making that the product, because you 
turn around and you sell it and then basically it becomes 
weaponized against their users. Mr. Chairman, I am over time. I 
have several questions for Ms. Stout and Ms. Miller. And we 
will do those in the second round.
    Senator Blumenthal. We will have a second round. Senator 
Klobuchar.

               STATEMENT OF HON. AMY KLOBUCHAR, 
                  U.S. SENATOR FROM MINNESOTA

    Senator Klobuchar. Thank you to both of you. Reports 
indicate that nearly half of kids 9 to 12 and a third of kids 
aged 7 to 9 use social media platforms like Facebook, 
Instagram, Snap, TikTok, YouTube. I don't think parents are 
going to stand by while our kids and our democracy become 
collateral damage to a profit game. And I heard last night Mark 
Zuckerberg's words to his earnings and his earnings report.
    And while he may be out there acting as a victim at his $29 
billion quarter earnings report meeting, the true victims, the 
true victims, the mom in Duluth who can't get her kid off 
Facebook to do her homework, the dad mourning losing a child to 
a Snap speed filter that measured him, echoing the kid at going 
123 miles per hour trying to beat the filter, or a child 
exposed to content glorifying eating disorders on TikTok. So I 
have had a case right in my state, two cases actually of young 
people who got drugs through Snap, and I wanted to first start 
out with that, with you, Ms. Stout.
    This is a story--there are two kids, Devin Norring and Ryan 
McPherson. Devin's story, he was suffering from dental pain at 
the beginning of the pandemic. He couldn't go in to see a 
doctor. He had been given a Percocet before, and a classmate 
said he had a Percocet. Well, what this young man did not know 
is that this Percocet was laced with fentanyl, and he died just 
like that. As his mom said in a letter to me, all of the hopes 
and dreams we as parents had for Devin were erased in the blink 
of an eye.
    A group of parents, including Devin's mother, Bridgett, 
demanded answers and accountability from Snap on this issue in 
a letter to you in September. Ms. Stout, I want to know what 
the answers are. Will you commit to providing more information 
about the automated tools Snap uses to proactively search for 
illegal drug related content as the parents ask?
    Ms. Stout. Senator, I very much appreciate you raising this 
issue because it has been a devastating crisis that has been 
afflicting our young people. I want to make clear, we are 
absolutely determined to remove all drug dealers from Snapchat, 
and we have been very public about our efforts in this space. 
First of all, we have stepped up our operational efforts, and 
my heart goes out to the family of Devin Norring.
    I have met with Bridget Norring. I met with her back in 
April. I heard from her and other families to understand what 
is happening to them, their experience, and also what is 
happening on Snapchat. We have stepped up and we have deployed 
proactive detection measures to get ahead of what the drug 
dealers are doing. They are constantly evading our tactics, not 
just on Snapchat, but on every platform.
    We have also stepped up our work with law enforcement. Just 
last week, we had a law enforcement summit where we gathered 
over 2,000 members of law enforcement across the country so 
that we can understand what they are dealing with and find out 
best practices on how we can get them the information they need 
to help their investigation.
    And finally, Senator, this is so important, but we have 
deployed in education and awareness campaign because what is 
happening on our platforms all across social media and 
technology platforms is that young people who are suffering 
from mental health and stress, induced by the pandemic and 
other issues, they are reaching for substances, oftentimes 
pills and opioids, but these substances are laced with 
fentanyl, enough fentanyl to kill them----
    Senator Klobuchar. But here is my problem, is that if a kid 
had just walked into, say, a pharmacy, you wouldn't be able to 
buy that or get that. But in this case, they can get on your 
platform and just find a way to buy it. And that is the 
problem.
    And I guess I want to know, are you going to get your 
drugs--I appreciate everything you said. I appreciate you 
meeting with the mom. Are you going to get drugs off Snapchat 
when you have all these kids, half the kids in America read--
looking at these platforms?
    Ms. Stout. I assure you, this is such a top priority for 
our entire company. And Senator, it is not just happening on 
our platform, it is happening on others. So therefore, we need 
to work collectively with other platforms, the other companies 
that are here today----
    Senator Klobuchar. That is good.
    Ms. Stout.--to work together.
    Senator Klobuchar. OK, thank you. I think there is other 
ways to do this too, as creating liability when this happens so 
maybe that will make you work even faster, so we don't lose 
another kid. Mr. Beckerman, recent investigation by The Wall 
Street Journal found that TikTok's algorithm can push young 
users into content glorifying eating disorders, drug violence. 
Have you stopped that?
    Mr. Beckerman. Yes, Senator. I don't agree with the way The 
Wall Street Journal went about that, but with that said, we 
have made a number of improvements to the way people can have 
control over the algorithm and have age appropriate content on 
TikTok.
    Senator Klobuchar. OK. And what are those changes like? Are 
they still getting--they are completely protected now from this 
content?
    Mr. Beckerman. Senator, the content related to drugs, as 
you are as you are pointing out, violates our community 
guidelines. As it relates to minors safety, over 97 percent of 
violative content is removed proactively. Of course, we want to 
get to 100 percent, and it is something that we are constantly 
working on.
    Senator Klobuchar. Did you--are you aware of any research 
studies your company has conducted about how your apps may push 
content promoting eating disorders to teens?
    Mr. Beckerman. No, Senator.
    Senator Klobuchar. OK. Did you ask for any turtle studies 
on eating disorders before testifying?
    Mr. Beckerman. Not that I am aware of, but we do work with 
outside experts to understand all of these issues. I think it 
should be done in a transparent way. As I mentioned earlier, I 
would love to see the Camera Act passed so we can have 
additional research in the public domain that all of us can 
learn from and improve.
    Senator Klobuchar. OK, I will save my questions for Ms. 
Miller for the next round. Thank you.
    Senator Blumenthal. Thanks, Senator Klobuchar. And again, I 
would remind everyone that you have committed to provide the 
research that you just referred to in your responses to Senator 
Klobuchar. All of you have committed to provide that research, 
and we will look forward to receiving it within days or weeks, 
not months. And I particularly appreciate Senator Klobuchar's 
reference to creating liability as a strong incentive, which 
would involve reform of Section 230. I understand----
    Senator Klobuchar. And Mr. Chair, if I could put this 
letter in from the parents into the record.
    Senator Blumenthal. Without objection.
    [The information referred to follows:]

                        7 Sep 2021 | Amy Neville

                        A LETTER TO EVAN AT SNAP
Summary: Our open letter to Evan Spiegel of SnapChat
PRESS RELEASE
    Aliso Viejo, CA (September 7, 2021)--A group of seven parents whose 
children died from counterfeit pills sold by drug dealers on Snapchat 
publicly demanded that the social media giant measure and report their 
response to law enforcement and the removal of drug dealers from the 
platform.
    In the letter signed by eight families, parents specifically wanted 
Snapchat to measure and report to the public its response time to valid 
law enforcement requests. ``Nobody is asking them to fulfill requests 
that don't go through proper legal channels,'' said Amy Neville, mother 
of murdered 14-year-old Alex Neville, ``but days can mean more deaths 
if drug dealers who have deadly fake pills aren't identified when law 
enforcement requests it. We need to know that they are processing these 
with the urgency that lives are at stake.''
    Amy Neville and other families participated in a meeting where 
Snapchat stated that they weren't adequately staffed to respond to law 
enforcement requests quickly.
    The letter also demands that Snapchat measure their program to find 
and remove drug dealers from the platform and refer these individuals 
to law enforcement proactively. ``If Snapchat discovers evidence that 
someone is dealing drugs on its platform, it must bring law enforcement 
in.'' said Jaime Puerta, father of murdered sixteen-year-old Daniel 
Puerta. ``If they just remove their account quietly and they keep 
dealing and someone dies, Snapchat is morally responsible for that 
death.''
    The letter asks that Snapchat convene a nine-member oversight 
committee of outside experts in law enforcement, public health and 
safety, and parents whose children have been harmed to review policies, 
procedures, and enforcement of Snapchat's efforts to clean up its 
platform.
                                 ______
                                 
                 OUR LETTER TO EVAN SPIEGEL AT SNAP INC
Evan Spiegel, CEO
Michael O'Sullivan, General Counsel
2772 Donald Douglas Loop N
Santa Monica, CA 90405

September 7, 2021

    We, the undersigned, lost our children to opioids sold by dealers 
on Snapchat--a platform that caters to users too young to understand 
the opioid crisis or the risks of counterfeit pills. All social media 
platforms have a moral responsibility to the young people that use them 
and we believe they must take that responsibility seriously.
    We ask that you take the following steps to publicly address the 
danger of criminal behavior on your platform.
Responsiveness to law enforcement
    We write today because some of the investigations into our own 
children's deaths have encountered resistance to law enforcement 
requests for information because, as Snapchat representatives told us, 
they are not adequately resourced to be responsive enough.
Proactive measures to stop drug trafficking
    All social media platforms should have proactive programs that 
include automated software to search for evidence of drug dealing, 
human review, and referral to law enforcement when such activity is 
found.
    We have heard rumors that Snapchat may use such technology, but we 
seek a formal declaration of its use and commitment to law enforcement 
referrals. Without the power of law enforcement referrals, criminals 
can simply continue their activities with new accounts on the service.
Screening of business profiles
    Social media companies must carefully scrutinize accounts that 
advertise and perform other business functions on their platforms to 
avoid a repeat of the problems Google experienced in 2011. Snapchat 
should ensure that only NABP-approved pharmacies are permitted to 
promote pharmaceutical products to U.S. residents. Violators should be 
referred to the FDA's Office of Criminal Investigation.
External transparency committee to evaluate Snapchat's efforts
    Snapchat drug dealers are preying on young people across the 
country, and public sources have already linked these sellers to deaths 
in fifteen states. Snapchat needs oversight and transparency to 
reassure parents that it is accountable to protect our children on the 
platform.
    We call on Snapchat to establish a transparency committee of nine 
individuals, three drawn from Federal and state narcotics law 
enforcement, three from parents who have lost children to Snapchat drug 
dealers, and three with an expertise in the area of public health. The 
committee would receive redacted quarterly reports that show:

   The date Snapchat received each law enforcement request.

   The date Shapchat fulfilled that request. (If multiple 
        fulfilments were required, list the date of each of those and 
        the date of the final fulfillment.)

   The reason any specific request was not fulfilled.

   The number of criminal referrals sent to law enforcement and 
        the agencies the referrals were sent to.

   The number of business accounts terminated or suspended for 
        drug-trafficking criminal behavior.

    It is assumed that information about each law enforcement request 
will be sufficiently redacted to protect the integrity of active 
investigations.
    Our families have suffered a great deal of pain as a result of drug 
dealers on Snapchat, and our children have died. Snapchat's verbal 
commitments, genuine good will, and public service messaging efforts 
are welcome, but the potential for more harm screams for transparency 
and oversight. We call upon Snapchat to demonstrate that they are doing 
the work to protect the young users of their platform.

Amy Neville
Mother of Alexander Neville, 14
Murdered by a fentanyl-based fake oxycodone sold by Snapchat drug 
dealer, 2020

Jaime Puerta
Father of Daniel Joseph Puerta-Johnson, 16
Murdered by counterfeit blue M30 oxycodone pill sold by Snapchat drug 
dealer, 2020

Maria Ortega
Mother of Adrian De Jesus, 19
Murdered by a fentanyl-based counterfeit oxycodone sold by Snapchat 
drug dealer in 2020

Christine Capelouto
Mother of Alexandra Capelouto, 20
Murdered by counterfeit blue M30 oxycodone pill sold by Snapchat drug 
dealer, 2019
Poisoned and Killed with a fentanyl based counterfeit ecstasy pill sold 
by a drug dealer on Snapchat, 2020.
        VOID is a non-profit organization located in California

    Senator Klobuchar. Thank you.
    Senator Blumenthal. We have been joined by Senator Cantwell 
remotely.
    The Chair. Mr. Chairman, I would defer to my colleagues, 
Senator Markey and Senator Baldwin.
    Senator Blumenthal. Thanks very much. Senator Markey.

               STATEMENT OF HON. EDWARD MARKEY, 
                U.S. SENATOR FROM MASSACHUSETTS

    Senator Markey. Thank you. Thank you, Mr. Chairman, very 
much. The problem is clear, big tech preys on children and 
teens to make more money. Now is the time for the legislative 
solutions to these problems, and that starts with privacy. I 
have introduced bipartisan legislation to give children and 
teens a privacy bill of rights for the 21st century. Today, a 
13 year old girl on these apps has no privacy rights.
    She has no ability to say no, no, you can't gobble up data 
about me. No, you can't use that data to power algorithms that 
push toxic content toward me. No, you can't profile me to 
manipulate me and keep me glued to your apps. No. You have no 
rights. 13 year old girl in the United States of America in the 
year 2021.
    My bipartisan Children and Teens Online Privacy Protection 
Act gives 13, 14, and 15 year olds that right to say no. To 
each witness, do you support my legislation to update the 
Children's Online Privacy Protection Act to give that 13, 14, 
and 15 year old control of their data? Ms. Stout.
    Ms. Stout. Senator Markey, I just want to say that we 
absolutely support a Federal privacy proposal, and we have 
worked very hard with members of this body to try to----
    Senator Markey. Do you support my child protection--my teen 
protection law? Do you support it?
    Ms. Stout. I think--so, Senator, we agree that there should 
be additional protections put against young people to protect 
them further from----
    Senator Markey. Right, so you have had a chance to look at 
the child online privacy protection update that I have 
introduced. It has been out there for years. Do you support it 
or not?
    Ms. Stout. I think, Senator, we would love to talk to you a 
bit more about some of the issues----
    Senator Markey. No, I want to talk--this is just what 
drives us crazy. We want to talk. We want to talk. We want to 
talk. This bill has been out there for years, and you still 
don't have a view on it. Do you support it or not?
    Ms. Stout. I think there are things that we would like to 
work with you on, Senator.
    Senator Markey. Listen, this this is just the all game--Mr. 
Beckerman, do you support the Child Online Privacy Protection 
Act being updated the way my legislation does?
    Mr. Beckerman. Senator, I want to thank you for your 
leadership on this issue. It has been very important. Yes, we 
agree that COPPA needs to be updated, particularly as it 
relates to the way age verification happens across the 
internet. It is an area that I think has been not given as much 
attention as it deserves. We do agree that COPPA does need to 
be updated.
    Senator Markey. Do you support my legislation to update it? 
You have had plenty of time to look at it.
    Mr. Beckerman. I--we like your approach. However, I think a 
piece that should be included is a better way to verify age 
across the internet, across apps, rather than the system that 
is in place now. And I think with that improvement, it would be 
something that we would be happy to support.
    Senator Markey. OK. Ms. Miller.
    Ms. Miller. Senator, we also support the goals of updated 
comprehensive privacy legislation. On your specific bill, I 
know we have had conversations with your staff in a 
constructive manner and I would welcome continuing to do that.
    Senator Markey. Yes, but it is going to happen soon because 
this is a crisis that--thank you, Senator Blumenthal. Thank 
you. Congresswoman Blackburn, this has just surfaced in a way 
that is clear that we don't have any more time. We have to get 
this finished. Your platforms are full of young teens.
    Among young teens, 49 percent say they are on TikTok, 52 
percent say they are on Snapchat, 81 percent say they are on 
YouTube. And those users, those 13 year olds, they deserve the 
right to say, no, you can't track me. You all--you agree with 
that, Ms. Stout? You can't track me. Do you agree with that?
    Ms. Stout. Yes, I agree with that.
    Senator Markey. Do you agree with that, Mr. Beckerman?
    Mr. Beckerman. Yes, Senator.
    Senator Markey. Do you agree with that, Ms. Miller?
    Ms. Miller. Yes, Senator, we have tools for all of our 
users to handle and control and make choices as it relates to 
the information that is gathered.
    Senator Markey. The bill also would ban targeted ads to 
children, apps that should never be allowed to track a 10 year 
old's browsing history and bombard him with ads based on that 
data. Ms. Miller, you said that YouTube Kids platform prohibits 
targeted ads to children. Do you agree that Congress must ban 
targeted ads to children?
    Ms. Miller. Senator, I defer to you and your peers in terms 
of what you would want to move forward, but again, we have not 
waited for laws like this----
    Senator Markey. No, I am saying to you, would you support 
to make sure there is a uniform banning of this practice? If 
you have already adopted it as a company, would you support 
that being the standard that we have for all platforms across 
the country?
    Ms. Miller. As you describe it, it is consistent with the 
approach we have already taken.
    Senator Markey. And you would support it, is that what you 
are saying?
    Ms. Miller. Senator, again, we are already doing this and 
would support it consistent----
    Senator Markey. No, I am asking you. Would you--no, we are 
trying to draft the law here. Would you support that provision 
being in a law, to prohibit it?
    Ms. Miller. Senator, yes, as we already prohibit targeted 
advertising----
    Senator Markey. OK, that is all we are trying to do, just 
get to yes so that we can legislate it--so that we can 
legislate. Mr. Beckerman, Ms. Stout, same question, should 
kids--ban targeted ads to kids?
    Ms. Stout. Senator, we offer those tools already where kids 
can opt out and not have targeted ads served to them.
    Senator Markey. Would you support that as a national law, 
that we ban it?
    Ms. Stout. Senator, an example of that has been the age 
appropriate design code, which we adhere to. And I can tell 
you, Senator, we are looking at exactly that model to apply it 
elsewhere----
    Senator Markey. Do you support it as a law that this body 
passes this year just to prohibit it? If you say it is wrong, 
should we prohibit it?
    Ms. Stout. We offer those, and we agree with the approach, 
so----
    Senator Markey. So yes, you do support it? Yes or no?
    Ms. Stout. I mean, we agree with your approach. So we are 
applying it to more----
    Senator Markey. I appreciate what you are doing. Right, we 
are now trying to say, if you support it then we just wouldn't 
want to prohibit anyone else from doing it. Yes?
    Ms. Stout. I think we are very close, Senator. Yes.
    Senator Markey. Very close. OK, Mr. Beckerman.
    Mr. Beckerman. Yes, Senator. And I would also say that we 
should go beyond that. And part of the approach that we have 
taken is certain categories of ads shouldn't be shown to 
teenagers and young adults at all, and I think that should be 
part of the approach as well.
    Senator Markey. OK, so we also need to go beyond privacy 
and tackle to design features that harm young people, take 
``like'' buttons. As Senator Blumenthal and I have a bill, the 
Kids Act, which would ban these and other features that 
quantify popularity.
    The research is clear, these features turned apps into 
virtual popularity contests and are linked to feelings of 
rejection, low self-worth, and depression. Even YouTube Kids 
has acknowledged this problem and does not have like buttons. 
To each witness, should Congress ban features that quantify 
popularity for kids? Yes or no?
    Ms. Stout. Senator, as I mentioned in my opening statement, 
we don't have those metrics. We have never had a like button or 
comments because we don't think it should be a popularity 
contest, but we support that.
    Senator Markey. So you would support that? OK. Mr. 
Beckerman.
    Mr. Beckerman. Senator, I think this one is one that we 
should--it is a little bit more complex. I would be happy to 
have a conversation, but we have implemented much of the age 
appropriate design code here in the United States and would 
encourage similar measures.
    Senator Markey. So, you--I don't know that there was any 
answer in that. Would--you said it is complicated. Do you 
support banning it or not?
    Mr. Beckerman. On banning likes?
    Senator Markey. Likes, yes.
    Mr. Beckerman. I mean--I think if you want to set it by age 
that is something that we could look at.
    Senator Markey. Ms. Miller.
    Ms. Miller. Senator, as you noted, we already prohibit 
things on YouTube Kids such as being able to comment, and we 
would support working with you in regulation in this area.
    Senator Markey. OK, well, you would support working with 
us, but would you support banning likes?
    Ms. Miller. Yes, Senator. Again, we already do not allow 
for this on the YouTube--YouTube Kids platform.
    Senator Markey. And again, the American Academy of 
Pediatrics just declared a national state of emergency for 
children and teen mental health. We need to outlaw the online 
features that exacerbate this crisis. The question that, you 
know, we have to answer ultimately is, you know, whether or 
not, for example, we are going to ban auto play for kids.
    With this feature, when one video ends, another quickly 
begins. Kids stay glued to their phones, so apps collect more 
data and make more money. Today, 82 percent of parents are 
worried about their kids screen. Time to each of you today, do 
you agree that Congress should ban auto play for kids? Yes or 
no? Ms. Miller, we will start with you this time.
    Ms. Miller. Senator, each of the items that you are 
outlining, we already prohibit. We do--we have auto default--
excuse me, the default set to auto play off on YouTube Kids, as 
well as for supervised experiences.
    Senator Markey. OK, so you would support that being 
legislated?
    Ms. Miller. Yes, sir.
    Senator Markey. OK. Mr. Beckerman.
    Mr. Beckerman. First Senator, we have take a break videos 
and time management tools. But as it relates to auto play, on 
TikTok, you have to proactively switch to the next video.
    Senator Markey. So would you support that legislation 
passing then which would ban auto play?
    Mr. Beckerman. We would be happy to look at it and talk to 
you about it.
    Senator Markey. You don't do it.
    Mr. Beckerman. Again, I think it is important, as we look 
at age appropriate features for teens, it is something that we 
build in TikTok proactively. But again, as we look at 
legislation, I do think a first step is something about--around 
age verification across apps.
    Senator Markey. Again, this is the historic problem. Yes, 
Ms. Stout, would you support it?
    Ms. Stout. Senator Markey, I don't believe we have auto 
play on Snapchat, so I would defer and say that that is 
something we need to look at more closely. And I am not 
familiar with that piece of the proposal in your legislation.
    Senator Markey. OK, great. We have a lot of work to do. And 
we have to telescope the timeframe, I think, Mr. Chairman, to 
get these people to finally----
    Senator Blumenthal. Telescoping the time-frame sounds good.
    Senator Markey.--finally start acting--thank you, Mr. 
Chairman.
    Senator Blumenthal. Thank you. Thanks, Senator Markey. 
Thanks for all your good work. Senator Baldwin--or actually, I 
see Senator Cantwell is here. Oh, sorry, Senator. OK.

                 STATEMENT OF HON. JOHN THUNE, 
                 U.S. SENATOR FROM SOUTH DAKOTA

    Senator Thune. Thank you, Mr. Chairman. We all know social 
media offers a lot of benefits and opportunities, but like has 
been expressed this morning, I have concerns about the lack of 
transparency online and limited accountability of big tech 
companies.
    And one of the major problems with social media that has 
been increasingly concerning is social media platforms' use of 
algorithms to shape and manipulate user experience resulting in 
individuals being trapped in what we call the filter bubble. 
The filter bubble can be particularly troubling for younger 
users.
    For example, a recent Wall Street Journal article described 
in detail how TikTok's algorithm serves up sex and drug videos 
to minors. I have a bill, the Filter Bubble Transparency Act, 
and another bill called the Pact Act that would make 
significant strides in addressing the lack of transparency 
online.
    And importantly, the Filter Bubble Transparency Act would 
give consumers the option to engage with Internet platforms 
without being manipulated by opaque algorithms. So let me just 
ask you, do you believe consumers should be able to use social 
media platforms without being manipulated by algorithms 
designed to keep them engaged on the platform? Mr. Beckerman 
let's start with you, and then we will go with Ms. Miller and 
Ms. Stout.
    Mr. Beckerman. Yes, Senator, we agree that there needs to 
be transparency in the way algorithms work and additional 
choice for individuals as they are using them.
    Senator Thune. Ms. Miller.
    Ms. Miller. Senator, we do provide transparency in the way 
that our systems and practices work.
    Senator Thune. Ms. Stout.
    Ms. Stout. Senator Thune, it is important to understand 
that what we apply algorithms to is a very small set of 
content, and we do provide transparency to our users where 
users get to select interest categories that then determine the 
kind of content that they are served up. But it is not an 
unlimited list or set of user generated content. It is quite 
narrow.
    Senator Thune. But I don't know that you or Ms. Miller 
really answered the question. That is, should consumers who use 
these social media platforms be able to use them without being 
manipulated by algorithms?
    Ms. Stout. Senator, yes, I agree with you.
    Senator Thune. Ms. Miller.
    Ms. Miller. Yes, Senator.
    Senator Thune. Mr. Beckerman, what is your response to the 
Wall Street Journal article that described in detail how 
TikTok's algorithm serves up sex and drug videos to minors?
    Mr. Beckerman. Senator, thank you for the question. Sex and 
drugs are violations of our community guidelines and have no 
place on TikTok. As it relates to the Wall Street Journal 
article, we disagree with that being an authentic experience 
that an actual user would have.
    Senator Thune. Your platform is perhaps more driven by 
algorithms in any other social media platform available today, 
more so even than Facebook. Unlike Facebook, TikTok's 
algorithms not constrained by a user's social network. On July 
29, 2020, TikTok's former CEO Kevin Mayer, wrote, and I quote, 
``that we believe all companies should disclose their 
algorithms, moderation policies, and data flows to 
regulators.''
    TikTok also states on its website that it makes TikTok 
source code available for testing and evaluation to guests at 
its transparency and accountability center. Has TikTok 
disclosed their algorithms, moderation policies, and data flows 
to any Federal or State regulators?
    Mr. Beckerman. Senator, yes. I mean, as we pointed out, we 
do have these transparency centers and we have done I don't 
think over 100 tours with members of Senator staff and others 
in the U.S. Government, and we would be happy to continue to be 
transparent in how that works.
    Senator Thune. And I think maybe that Senator Blumenthal 
maybe touched on this. But in keeping with TikTok's disclosure 
practices announced in July 2020, would you commit to providing 
TikTok's algorithms, moderation policies, and data flows to 
this committee so that we may have independent experts review 
them?
    Mr. Beckerman. Yes, sir.
    Senator Thune. Thank you. Ms. Miller, does YouTube engage 
in efforts to change its users' attitudes, behaviors, or 
influence its users in any way?
    Ms. Miller. Senator, when users come to YouTube, they come 
to search and discover all types of content. For example, how 
to bake bread, to watch a church service, or to do exercise. 
And as a result, they are introduced to a diversity of content 
that isn't based on a particular network that they are a part 
of.
    In so doing, there may be additional videos that are 
recommended to them based on some signals. But those signals 
will be overrided [sic] if they--to make sure that we are not 
recommending harmful content.
    Senator Thune. Back to Mr. Beckerman. All Chinese Internet 
companies are compelled by China's national intelligence law to 
turn over any and all data that the Government demands, and 
that power is not limited by China's borders. Has TikTok 
provided data to the Chinese Government on Chinese persons 
living in the United States or elsewhere outside of China?
    Mr. Beckerman. No, Senator. TikTok is also not available in 
China. And as I like to point out, that our servers with U.S. 
data are stored in the United States.
    Senator Thune. Does TikTok censor videos of tank man, the 
famous video of the young man who stood his ground in front of 
a procession of Chinese army tanks during the 1989 Tiananmen 
Square crackdown in Beijing?
    Mr. Beckerman. No, Senator. You can--you can find that 
content on TikTok if you search for it.
    Senator Thune. Mr. Chairman, I would suggest that, as has 
already been pointed out on the Committee, that I think there 
are a number of things that we need to address. Congress needs 
to be heard from in this space and particularly with respect to 
the use of algorithms and the way that users are manipulated, 
and as you all have already pointed out, particularly young 
people. So I hope that we can move quickly and directly and in 
a meaningful way to address this issue. Thank you.
    Senator Blumenthal. Thank you. I think we have strong 
bipartisan consensus on that issue. Thank you, Senator Thune. 
Senator Baldwin.

               STATEMENT OF HON. TAMMY BALDWIN, 
                  U.S. SENATOR FROM WISCONSIN

    Senator Baldwin. Thank you, Chairman Blumenthal. I would 
like to just note that the series of hearings really began with 
the revelation that internal research at Facebook revealed the 
negative impacts on teenagers' body images from using the 
company's Instagram platform. And we learned, based on research 
by the Chairman's staff, how quickly somebody on Instagram can 
go from viewing content on healthy eating to being directed 
toward postings that focus on unhealthy practices, including 
glorifying eating disorders.
    I know we don't have Facebook and Instagram before us 
today, but I am particularly concerned about the impact that 
that type of content can have on young users. I recently joined 
Senators Klobuchar and Capito, with whom we sponsored the Adam 
Weston Act legislation to support training and education on 
eating disorders, on a letter to Facebook and Instagram seeking 
more details about how they handle this issue.
    But I want to ask each of you, can you briefly outline the 
steps your companies are taking to remove content that promotes 
unhealthy body image and eating disorders and direct users to 
supportive resources instead? And in particular, how are you 
focusing on this issue with regard to your younger users? Why 
don't we start with Mr. Beckerman and TikTok?
    Mr. Beckerman. Thank you, Senator. I myself have two young 
daughters, and this is something that I care a lot about in our 
teams at TikTok hear a lot about. One, I want to show you we do 
aggressively remove content like you are describing that would 
be problematic for eating disorders and problem eating.
    Second, we work with outside groups and direct people that 
are seeking help. And one thing we have heard is that people 
who are struggling with eating disorders or other weight loss 
issues come to TikTok to express themselves in a positive way. 
And so it has been a more of a positive source. And last, we 
don't allow ads that target people based on weight loss and 
that kind of content.
    Senator Baldwin. Ms. Stout.
    Ms. Stout. Thank you, Senator Baldwin. I want to make clear 
that the content that you describe, content that glorifies 
eating disorders or self-harm is a complete violation of our 
community guidelines. Also, as I described earlier, we don't 
allow unvetted, unmoderated content from being surfaced up to 
our users.
    Discover, which is our media publisher platform, which we 
partner on with people and publishing companies like The Wall 
Street Journal or NBC News, all of that content is vetted and 
moderated ahead of time. But specifically----
    Senator Baldwin. Can I interrupt and ask, is that done 
through AI or humans?
    Ms. Stout. No. These are handpicked partners that Snapchat 
has selected to say in this closed garden of content, which is 
Discover, we will allow certain publishers and media companies 
to provide news, entertainment content, ESPN or CMT or, you 
know, The Washington Post, in fact, so users can come look at 
that content.
    It is all pre-moderated and curated, so it is not an 
unlimited source of user generated content where you could go 
down a rabbit hole, perhaps, and access that kind of hurtful, 
damaging content on body image. But I think you raise a very 
interesting question, what are the products that you are 
surfacing?
    How are we helping users find positive resources? And as a 
result, we did conduct research about the mental health effects 
of body image and self-harm, and we created a product called 
Here for You. This was created in 2020 in the height of the 
pandemic. When users are in Snapchat and they search anorexia 
or eating disorder, instead of perhaps being led to content 
that could be harmful, that content, which is against our 
guidelines, we now surface expert resources that show content 
that can help that user, maybe help them, maybe help their 
friends.
    So this is a redirection of that kind of search for 
potentially hurtful and harmful content that then steers the 
user to resources that may help them or a member of their, you 
know, circle of friends.
    Senator Baldwin. Ms. Miller.
    Ms. Miller. Senator, we take a comprehensive and really 
holistic approach on topics like these. We prohibit content 
that promotes or glorifies things such as eating disorders. It 
has no place on our platform. But we also realize that users 
come to share their stories about these experiences or find a 
community, let alone to find authoritative sources, which is 
what we raise up on searches like this.
    In addition, we also roll out programs and initiatives such 
as the With Me campaign, whereby we are encouraging users to 
spend their time, particularly during COVID, in pursuing 
healthy habits. So we look at this in a very holistic way to 
make sure that YouTube is a platform where people come and they 
have a healthy experience, and we again prohibit the type of 
content that glorifies or promotes these issues, such as eating 
disorders.
    Senator Baldwin. And just if I can follow up the same way I 
did with Ms. Stout, when you remove content, how are you 
filtering that out? Are you using artificial intelligence or 
are you using a team of humans who are--a team of people who 
are looking at that content and deciding whether to remove it 
or not?
    Ms. Miller. It is a mix, Senator. So it is a mix of when we 
develop content policies, we rely on experts to inform the 
development of these policies and then we have machine learning 
to help us capture this type of content at scale. You will see 
in our quarterly transparency report that more than 90 percent 
of content that violates our community guidelines are flagged 
originally by machines. And then there is a mix of human 
reviewers.

               STATEMENT OF HON. MARIA CANTWELL, 
                  U.S. SENATOR FROM WASHINGTON

    The Chair. Thank you. I want to thank the, Senator 
Blumenthal and Senator Blackburn for holding the Subcommittee 
hearing, and as witnesses can see, our colleagues are well-
informed and very anxious to get legislative fixes to things 
that they think are crucial to protecting individuals and 
protecting people's privacy, so I want to thank them for that.
    Yesterday, Motherboard Vice had an article that--basically 
the headline was location data from GPS data from apps are 
given even when people have opted out. So basically, they are--
I am going to enter this for the record, unless there is 
objection, but ``the news highlights a stark problem that 
smartphone users, that they can't actually be sure if some apps 
are respecting their explicit preferences around data sharing. 
The data transfer presents an issue for the location data 
companies themselves.''
    So basically, these companies are reporting information 
about location, even when people have explicitly opted out. And 
so they are continuing to collect this information. That is 
what the reported researchers and Motherboard found. So I have 
a question, do you believe that location data is sensitive data 
and should be collected only with consumers' consent? All the 
witnesses, please.
    Ms. Stout. Yes, Senator, we agree.
    Mr. Beckerman. Yes, Senator, agree.
    Ms. Miller. Yes, Senator. And for users, they have access 
to their account under my activity and my account and can 
modify their settings, delete their history, and things of that 
nature. It is all just one click away.
    The Chair. So any Federal privacy law should make sure that 
that is adhered to. I see a nod--is that----
    Ms. Miller. Yes, Senator.
    Mr. Beckerman. Yes, Senator.
    Ms. Stout. Yes.
    The Chair. OK, thank you. Do any of you share location data 
with the company that is in this article? It is Huk--I think H-
U-K. They are a major data----
    Ms. Stout. Senator, I have never heard of that company. I 
am not aware.
    The Chair. OK.
    Mr. Beckerman. Senator, I am not aware of the company, but 
we also don't collect GPS data.
    The Chair. Well, it would be--you would be affiliated with 
them in some way. I mean, they are getting this information 
anyway. So I am sorry, the last witness, if you. Do you know--
--
    Ms. Miller. Senator, I am also not aware of----
    The Chair. OK. Maybe you can--maybe you can help us for the 
record on this so that we know. But this is exactly what the 
public is frustrated about and concerned about, particularly 
when harm can be done, that, you know, they go to a website, 
they say to the website, I don't want my sensitive information 
to be shared, and then there is this conglomerate of data 
gathering on top of that, that is not honoring those wishes as 
it relates to the interface with those apps.
    So this is, I think, exactly why we need a strong privacy 
law and why we should, you know, protect consumers on this. In 
the Facebook hearing we had, we had this discussion about 
advertising, and the issue of whether advertisers knew exactly 
what the content was that was being advertised.
    Now I get that, you know, we are also seeing a migration of 
major companies like Procter & Gamble and others who are moving 
off of the Internet because there is the--they are like, I am 
done with it. I don't want my ad--because it is now run by a 
system. I don't want my ad just appearing next to certain kinds 
of content.
    And so but what was more startling is that there may be 
actual deceptive practices here where people are saying, oh, 
this content, is this when in reality it is something else. In 
some of these cases we just discussed with Facebook, 
objectionable hate speech and content that we don't even think 
should be online.
    And yet, that is not what the advertisers knew. So on your 
websites, do advertisers know what they are--what content they 
are being placed next to?
    Ms. Stout. So, Senator, I can respond to your question. 
Yes, our advertisers do know where they are advertising--their 
advertisements show up. And as I had mentioned in Discover, 
which is that closed curated garden, those advertisements 
appear next to publishers and verified users that we have hand-
selected to allow to appear.
    So on a platform like Snapchat, there is no broadcast 
disinformation or hate speech, and that is why I think Snapchat 
is in fact, a very appealing place for advertisers because they 
know their advertisements will be placed next to safe content.
    The Chair. Mr. Beckerman.
    Mr. Beckerman. Yes, Senators. Advertisers come to TikTok 
particularly because our content is known for being so 
authentic and uplifting and fun. And you know, we see ads that 
are very much like TikTok videos, which are the same themes.
    The Chair. Ms. Miller.
    Ms. Miller. Senator, we have worked with our advertising 
partners over the years to make sure that they have trust in 
the fact that advertising on YouTube is safe for their brands 
in the same way that we have worked significantly to make sure 
that users themselves have a safe experience on the platform.
    And the advertising associations have recognized the work 
that we have done in this space so that their brands are safe 
on the platform.
    The Chair. Thank you. I will probably have a follow up on 
this, but Senator Lee.

                  STATEMENT OF HON. MIKE LEE, 
                     U.S. SENATOR FROM UTAH

    Senator Lee. Thank you, Madam Chair. Ms. Miller, I would 
like to start with you if that is all right. Want to ask you a 
particular question regarding YouTube's app age rating. And 
now, Google Play has the app rating set at Teen, meaning 13 and 
up, while the Apple Store has it rated as 17 and up. Could you 
tell me why this disparity exists?
    That is, if Apple determined that the age rating for 
YouTube ought to be 17 and up, why did Google determine that 
its own app should be rated as Teen, meaning 13 and up?
    Ms. Miller. Senator, I am unfamiliar with the differences 
that you have just outlined, but I would be happy to follow up 
with you and your staff once I get more details on this.
    Senator Lee. OK. Yes, I would love to know about that. Just 
it is a simple question, and I understand it you may not be 
able to answer it right now, if--as it sounds, you don't have 
the information. But I would just like to know why that 
difference exists and whether you agree or disagree with the 
fact that Google has created its own app as 13 and up while 
Apple has rated it 17 and up. But I am happy to follow up on 
that in writing or otherwise.
    Ms. Stout, I want to address a similar issue with regard to 
Snapchat. Now, Snapchat is rated 12 and up on Apple, and it is 
rated teen on the Google Play Store. Any idea where--why there 
is that disparity there?
    Ms. Stout. Senator, that is a very good question, and I--
for some reason, I have heard somewhere the reason why Apple 
lists at 12 and up, it is an app that is intended for a teen 
audience.
    Senator Lee. Right. Why is there a disparity between the 
age rating and the content that is available on that platform?
    Ms. Stout. Senator, the content that appears on Snapchat is 
appropriate for an age group of 13 and above.
    Senator Lee. Yes, let's talk about that for a minute 
because I beg to differ. In anticipation of this discussion, of 
this hearing, I had my staff create a Snapchat account for a 13 
year old--for a 15 year old child. Now they didn't select any 
content preferences for the account. They simply entered a 
name, a birth year, and an e-mail address.
    And then when they opened the Discover page on Snapchat 
with its default settings, they were immediately bombarded with 
content that I can most politely describe as wildly 
inappropriate for a child, including recommendations for, among 
other things, an invite to play an online sexualized video game 
that is marketed itself to people who are 18 and up, tips on, 
``why you shouldn't go to bars alone,'' notices for video games 
that are rated for ages 17 and up, and articles about porn 
stars.
    Now, let me remind you that this inappropriate content that 
has by default been recommended for a 15 year old child is 
something that was sent to them by an app just using the 
default settings. So I respectfully, but very strongly beg to 
differ on your characterization that the content is in fact 
suitable for children 13 and up, as you say.
    Now, according to your own website, Discover is a list of 
recommended stories. So how and why does Snapchat choose these 
inappropriate stories to recommend to children? How does that 
happen? How would that happen?
    Ms. Stout. So, Senator, allow me to explain a little bit 
about Discover. Discover really is a closed content platform. 
And yes, indeed, we do select, and hand select partners that we 
work with, and that kind of content is designed to appear on 
Discover and resonate with an audience that is 13 and above. I 
am unfamiliar and I have taken notes about what you have said 
that your account surfaced.
    I want to make clear that what--content and community 
guidelines suggest that any online sexual video game should be 
age gated to 18 and above. So I am unclear why that content 
would have shown up in an account that was for a 14 year old. 
But these community guidelines and publisher guidelines that 
are on top of those guidelines are intended to be an age 
appropriate experience for a 13 year old.
    Senator Lee. Right. I understand that you have these 
community guidelines that are there, that they note that 
advertisers and media partners in Discover agree to additional 
guidelines. What are these additional guidelines? And I mean, I 
can guess only that they permit these age inappropriate 
articles to be shared with children. How would that not be the 
case?
    Ms. Stout. Senator, so these additional guidelines on top 
of community guidelines are things that suggest they may not 
glorify violence that any news articles must be accurate, and 
fact checked. That there is no----
    Senator Lee. Well, I am sure the articles about the porn 
stars were accurate and fact checked, and I am sure that the 
tips on why you shouldn't go to bars alone are accurate and 
fact checked. But that is not my question. And this is about 
whether it is appropriate for children ages 13 and up, as you 
have certified?
    Ms. Stout. Absolutely. And Senator, I think this is an area 
where we are constantly evolving, and if there are any 
instances where these publishers are surfacing content to an 
age cohort that is inappropriate, then they will be removed 
from our platform.
    Senator Lee. OK, so you do review them? What kind of 
oversight you conduct on this and what----?
    Ms. Stout. We use a variety of human review as well as 
automated review. And so I would very much be interested in 
talking to you and your staff about what kind of content this 
was, because if it violates our guidelines, that kind of 
content would come down.
    And just Senator one last thing, while I would agree with 
you, tastes vary when it comes to the kind of content that is 
promoted on Discover, there is no content there that is 
illegal. There is no content there that is hurtful. I mean, it 
really is intended to be a closed ecosystem where we have 
better control over the type of content that surfaces.
    Senator Lee. Madam Chair, I just have one follow-up 
question. I just realized. I will be brief----
    The Chair. Yes, go ahead, Senator Lee. And then our 
colleague, Senator Lujan.
    Senator Lee. Thank you. Thank you so much, Madam Chair. So 
Snapchat has assured its users that it doesn't collect 
identifying data on them for advertising. How does Snapchat 
then decide what content is pushed to the top of their Discover 
page?
    Ms. Stout. So, Senator, if you go into your Snapchat 
account, you have the ability to select preferences, interest 
categories. And there are several interest categories that a 
user can select or unselect if they wish. That could be they 
like, you know, to watch movies, or they enjoy sports, or they 
are fans of country music.
    At any point, it is completely transparent, and a user has 
the ability to go in and select what they like, and that 
determines the kind of content that is surfaced to them. If 
there is any content that they don't like, they can uncheck or 
check, and that really generates the kind of content that a 
user in Discover would see.
    Senator Lee. My time has expired. Thank you so much, Madam 
Chair. I really do think we have got to get to the bottom of 
this--these app ratings here are inappropriate. We all know 
that there is content on Snapchat and on YouTube, among many 
other places that is not appropriate for children ages 12 or 13 
and up.
    The Chair. Well, I thank you, Senator Lee, and I would say 
to my line of questioning, and it is not appropriate to tell 
advertisers that it is not located next to content and then it 
is next to content that is inappropriate. Senator Lujan.

               STATEMENT OF HON. BEN RAY LUJAN, 
                  U.S. SENATOR FROM NEW MEXICO

    Senator Lujan. Thank you very much, Madam Chair. Ms. Stout, 
in your testimony, you mentioned that all content on Snapchat, 
on the Spotlight page is human reviewed before it can be viewed 
by more than 25 people. Ms. Stout, yes or no, does human review 
help Snapchat reduce the spread of potentially harmful content?
    Ms. Stout. Yes, Senator, we believe it does.
    Senator Lujan. And I appreciate Snapchat's approach to this 
problem. More platforms should work to stop harmful content 
from going viral. However, far too often we find companies say 
one thing to Congress and then once attention is diverted and 
the public is distracted, they go around and do the very thing 
they were warning us against. Can I hold you to that? Will 
Snapchat continue to keep a human in the loop before content is 
algorithmically promoted to large audiences?
    Ms. Stout. Well, Senator, this is the first time I have 
testified here before Congress, so please hold me to it. But, 
at Snapchat, we have taken a very human moderation first 
approach, not just on spotlight, but across our platforms. So 
yes, indeed, human moderation will continue to play a huge part 
of how we moderate content on our platform and how we keep our 
users safe.
    Senator Lujan. I am glad to see the importance of platforms 
taking responsibility before they amplify content and 
especially publish it to a mass audience, and it is something 
that many of us share. It is why I introduced the Protecting 
Americans from Dangerous Algorithms Act as well. Online 
platforms must be responsible when they are actively promoting 
hateful and dangerous content.
    Ms. Miller, I am grateful that YouTube is making an effort 
to be more transparent regarding the number of users that view 
content in violation of your community guidelines. However, I 
am concerned with one trend. Earlier this year, I wrote a 
letter to YouTube with 25 of my colleagues on the crisis of 
non-English misinformation on the platform. We need to make 
sure all communities, no matter the language they use at home, 
have the same access to good, reliable information.
    Ms. Miller, will YouTube publish its violative view rates 
broken down by language?
    Ms. Miller. Senator, thank you for your question, and what 
you are referring to is this latest data point that we shared 
earlier this year in which for every 10,000 views on YouTube, 
19 to 21 of those views are of content that is violative. And 
we apply our content policies at a global scale across 
languages. We do not preference any one language over another. 
And this includes for the violative view rate.
    Senator Lujan. And Ms. Miller, I don't believe that's good 
enough. When we don't break algorithms down by their 
performances across different groups of people, we end up 
making existing gaps, existing biases worse. We have seen this 
with facial recognition technology that unfairly targeted 
communities of color. And according to reports, we are seeing 
this happen right now on YouTube. So I will ask again, will 
YouTube publish its violative view rate broken down by 
language?
    Ms. Miller. Senator, I would be happy to follow up with you 
to talk through these details, as I said, for all of our 
content policies and the enforcement there within and the 
transparency we provide. It is global in scope, and it is 
across languages.
    Senator Lujan. I definitely look forward to following up 
and working with you in that space. Mr. Beckerman, before 
launching TikTok for younger users, did TikTok do any internal 
research to understand the impact it would have on young 
children?
    Mr. Beckerman. Thank you, Senator. I am not aware, but for 
TikTok, for younger users, the content is curated with common 
sense networks, and it is an age appropriate experience, but I 
am not aware of any specific research.
    Senator Lujan. I would like to follow up on that as well. 
Products like TikTok can lead to addictive behavior and body 
image issues in young children, and it is critical that 
platforms work to understand these problems before they take 
place. This is a very serious issue, and it is one that is 
finally getting the attention that it deserves with revelations 
and whistleblowers that have come forth. I urge you to take 
this opportunity to begin a transparent public evaluation of 
the impact your product is having on young children.
    And in the end, I just want to follow up on something that 
many of us have commented on leading up to these important 
hearings, and I appreciate the Chair's attention to this, the 
Ranking Member, both of them have authored legislation. Our 
Chair and Ranking Member of the Subcommittee have also 
partnered on legislative initiatives. It is critically 
important that we continue moving forward and that we markup 
legislation and get something adopted. And I am certainly 
hopeful that here in the United States, we are paying attention 
to what is happening in other parts of the world.
    Again, Europe is outpacing the United States and being 
responsible with legislative initiatives surrounding protecting 
their consumers. There is no reason we can't do that here as 
well, and I just want to thank Chair Cantwell for the work that 
she has been doing in this particular space, and I definitely 
look forward to working with everyone to make sure that we are 
able to get this done here in the United States. Thank you so 
much and I yield back.
    The Chair. Senator Lujan, so you have reintroduced that 
bill in the Senate. Is that right? So, OK, thank you. Very much 
appreciate that and appreciate your leadership. So the 
witnesses--We are waiting on the return of Senator Blumenthal, 
so I could go and vote. If he doesn't come in the next minute 
or so, we will just take a short recess because we are way past 
time to get over there. But I want to thank all the members who 
have participated thus far because we have had a very robust 
discussion today.
    You can see that this is a topic that the members of this 
committee feel very, very passionately about and obviously 
believe that there is much more that we need to be doing in 
this particular area. So I appreciate everybody's attendance 
and focus, and again, want to thank Senator Blumenthal and 
Senator Blackburn for their leadership in having both of these 
hearings. And for the larger full committee, we had planned to, 
you know, move forward on many of these agenda items anyway, 
but we are very appreciative of the Subcommittee doing some of 
this work and having members have a chance to have very 
detailed interactions on these policies that we need to take 
action on. So very much appreciate that. So I see Senator 
Blumenthal has returned. Thank you so much and I will turn it 
over to you.
    Senator Blumenthal. Thank you, Chairman Cantwell, and 
thanks for your excellent work on this issue. I would like to 
ask some additional questions on legislative proposals. One of 
the suggestions that Senator Klobuchar raised was legal 
responsibility and liability, which, as you know, is now 
precluded by Section 230.
    Let me ask each of you, would you support responsible 
measures like the EARN IT Act, which I have proposed, to impose 
some legal responsibility and liability, cutting back on the 
immunity that Section 230 affords? Ms. Stout.
    Ms. Stout. Senator, we agree in Snapchat that there should 
be an update to the intermediary platform liability laws, 
CDA230, and in fact, the last time this body addressed a 
reform, which was for the SESTA/FOSTA, Snapchat was a company 
that actively participated in that and helped draft 
legislation. So we would welcome another opportunity, Senator, 
to work with you on that.
    Senator Blumenthal. Thank you. Would you support the EARN 
IT Act, which, as you know, Senator Graham and I have proposed 
it imposes liability and affords victims the opportunity to 
take action against platforms that engage in child pornography 
and related abuses?
    Ms. Stout. Of course, Senator, we completely prohibit that 
kind of activity, that illegal activity, and we actively look 
for it. And when we find it, we remove it. If you would allow 
me to get back to, it has been a while since I have looked at 
the EARN IT Act. I do recall when you and Senator Graham 
introduced it. But I believe that the spirit of your 
legislation is something we would very much support.
    Senator Blumenthal. Well, you had the opportunity before to 
say whether you supported it. So far, you haven't. Will you 
commit to supporting it?
    Ms. Stout. Senator, I again, my memory is failing me a 
little bit, but I do believe that the provisions in the EARN IT 
Act were many of the provisions that we supported, so I would 
be happy to come back with a more specific answer for you.
    Senator Blumenthal. Mr. Beckerman.
    Mr. Beckerman. Thank you, Senator. We do agree that there 
needs to be a higher degree of accountability and 
responsibility for platforms, particularly as it relates to 
content moderation, that needs to be done in a way that allows 
all platforms to moderate in an appropriate and aggressive way 
to make sure that the kinds of content that none of us want to 
see on the Internet or on any of our platforms is able to be 
removed.
    Senator Blumenthal. Do you support changes in Section 230 
to impose liability?
    Mr. Beckerman. There absolutely can and should be changes. 
But again, in a way that would allow companies like ours that 
are good actors, that are aggressively moderating our platform 
in a way that we think is responsible, to be able to continue 
to do so.
    Senator Blumenthal. Will you support the EARN IT Act?
    Mr. Beckerman. We would be happy--again, we agree with the 
spirit of it, and we would be happy to work with you and your 
staff on that bill.
    Senator Blumenthal. Well, the bill again was reported 
unanimously out of the Judiciary committee during the last 
session. It hasn't changed significantly. Did you support it 
then?
    Mr. Beckerman. I think the concern would be unintended 
consequences that would lead to hampering a company's ability 
to remove and police violative content on platforms.
    Senator Blumenthal. Is that a yes or a no?
    Mr. Beckerman. It is a maybe.
    Senator Blumenthal. Well, so far we have two maybes. Ms. 
Miller.
    Ms. Miller. Senator, I am aware of a number of proposals 
regarding potential updates to 230, and me and my team, as well 
as the other teams across Google, have been involved in the 
conversations regarding these various proposals.
    I would just like to say, though, that we see 230 as the 
backbone of the internet, and it is what allows us to moderate 
content, to make sure that we are taking down content that 
leads to potentially eating disorders, for example, what we 
have been talking about here earlier, or self-harm.
    So we want to make sure that we continue to have 
protections in place so we can moderate our platforms so that 
they are safe and healthy for users. I am aware of the EARN IT 
Act, and I know again that our staffs have been speaking, but I 
understand I think there is still ongoing discussions regarding 
some portions of the proposal.
    But we also very much appreciate and understand the 
rationale as to why this was introduced, particularly around 
the area of child safety.
    Senator Blumenthal. Well, again, is that a yes or no? Do 
you support it?
    Ms. Miller. We support the goals of the EARN IT Act, but 
there are some details that I think are still being discussed.
    Senator Blumenthal. Well, you know, as Senator Markey has 
said, this is the talk that we have seen again and again and 
again and again. We support the goals, but that is meaningless 
unless you support the legislation. And it took a fight, 
literally bare knuckle fight to get through legislation that 
made an exception under SESTA for liability on human 
trafficking. Just one small piece of reform. And I join in the 
frustration felt by many of my colleagues that good intentions, 
support for goals, endorsement of purposes is no substitute for 
actual endorsement.
    I would ask that each and every one of you support the EARN 
IT Act, but also other specific measures that will provide for 
legal responsibility. And I think I know what Ms. Miller means 
by the claim that Section 230 provides a backbone, but it is a 
backbone without any real spine right now, because all it does 
is afford virtually limitless immunity to the Internet and to 
the companies that are here. I am going to interrupt my second 
round and call on Senator Cruz.

                  STATEMENT OF HON. TED CRUZ, 
                    U.S. SENATOR FROM TEXAS

    Senator Cruz. Thank you, Mr. Chairman. Mr. Beckerman, thank 
you for being here today. I understand this is the first time 
that TikTok is testifying before Congress. And I appreciate you 
making the company available to finally answer some questions. 
In your testimony, you talked about all the things you say 
TikTok is doing to protect kids online. And that is great.
    But I want to discuss the broader issue here, which is the 
control the Chinese Communist Party has over TikTok, its parent 
company ByteDance, and its sister companies like Beijing 
ByteDance Technology. Now TikTok has stated repeatedly that it 
doesn't share the data it collects from Americans with the 
Chinese Communist Party, and that it wouldn't do so if asked. 
It has also stated that with regards to data collected on and 
from Americans that data is stored in Virginia with a backup in 
Singapore.
    But these denials may in fact be misleading. A quick look 
at TikTok's privacy policy, in fact, just last night shows 
there is a lot more than meets the eyes. For example, in the 
``how we share your information'' section, one blurb reads, 
``we may share all of the information we collect with a parent, 
subsidiary, or other affiliate of our corporate group.'' 
Interestingly, in June of this year, the privacy policy was 
updated to state that TikTok, ``may collect biometric 
identifiers and biometric information as defined under U.S. 
laws such as face prints and voice prints.''
    Mr. Beckerman, does TikTok, consider ByteDance, the parent 
company of TikTok, which is headquartered in Beijing, to be a 
part of TikTok's ``corporate group'' as that term is used in 
your privacy policy?
    Mr. Beckerman. Thank you, Senator. This is an important 
question. I would just like to take an opportunity first to 
clear up misconceptions around some of the accusations that 
have been leveled against the company. I would like to point to 
independent research. I understand that trust needs to be 
earned----
    Senator Cruz. Mr. Beckerman, I get you may have brought a 
point you want to make. My question is simple and 
straightforward, does TiKTok consider ByteDance, the parent 
company headquartered in Beijing, to be part of TikTok's 
corporate group? That is a yes or no.
    Mr. Beckerman. Senator, access controls for our data is 
done by our U.S. teams, and as independent researchers, 
independent experts have pointed out, the data that TikTok has 
on the app is not of a National Security importance and is of 
low sensitivity. But again, we do hold that to a high standard, 
and we have access control----
    Senator Cruz. Mr. Beckerman, we are going to try a third 
time because the words that came out of your mouth have no 
relation to the question you were asked. Your privacy policy 
says you will share information with your corporate group. I am 
asking a very simple question, is ByteDance, your parent 
company headquartered in Beijing, part of your corporate group? 
Yes or no, as you use the term in your privacy policy?
    Mr. Beckerman. Senator, I think it is important that I 
address the broader point in your statement.
    Senator Cruz. So are you willing to answer the question, 
yes or no? It is a yes or no question. Are they part of your 
corporate group or not?
    Mr. Beckerman. Yes, Senator, it is.
    Senator Cruz. Yes it is. OK. So under your privacy policy, 
you are explicitly stating that you may be sharing data with 
them, including biometric identifiers, including face prints, 
including voice prints. Is that correct?
    Mr. Beckerman. No Senator, in the privacy policy it says 
that, if we are to collect biometric information, which we do 
not collect biometric data to identify Americans, we would 
provide consent and opportunity for consent first.
    Senator Cruz. But you also say we may share all of the 
information we collect with a parent, subsidiary, or other 
affiliate of our corporate group, which means with ByteDance 
headquartered in Beijing, correct?
    Mr. Beckerman. Under U.S. access control, sir.
    Senator Cruz. Alright. Second, what about Beijing ByteDance 
technology, which media reports from earlier this year showed 
Beijing took a minority stake in through a State backed 
internet investment Chinese entity, and on the board of which 
now sits Wu Shuguang, a CCP official who spent most of his 
career in Chinese propaganda, including with a stint at the 
Online Opinion Bureau under the Cyberspace Administration of 
China, China's Internet regulator.
    Would you consider Beijing ByteDance technology to be a 
part of TikTok's corporate group with whom TikTok would share 
all of the information it collects?
    Mr. Beckerman. Senator, I want to be clear that that entity 
has no affiliation with TikTok. It is based for domestic 
licenses of the business in China that is not affiliated or 
connected to TikTok.
    Senator Cruz. So are you saying no or--yes or no, as to 
whether Beijing ByteDance technology is part of your corporate 
group as the privacy policy defines it, it says we may share 
all of the information we collect with a parent, subsidiary or 
other affiliate, and presumably that is where it would fall, 
other affiliate of our corporate group. Is Beijing ByteDance 
Technology a ``other affiliate'' of your corporate group?
    Mr. Beckerman. Senator, I am saying that entity deals with 
domestic businesses within China----
    Senator Cruz. You are having a hard time. You are answering 
questions I am not asking. Again, it is a yes, no, is Beijing 
ByteDance technology a ``other affiliate'' of your corporate 
group as your own privacy policy defines it?
    Mr. Beckerman. Senator, I am just trying to be clear to 
answer your question. That entity is based in China for the 
Chinese business that is not affiliated or connected with 
TikTok.
    Senator Cruz. So that is twice you haven't answered. 
Let's--last time you did it on the third time so let's try it 
again. Again, it is a yes, no.
    Mr. Beckerman. The answer is the same, Senator.
    Senator Cruz. Which is?
    Mr. Beckerman. What I just said, that that entity is----
    Senator Cruz. What you just said did not answer the 
question. Let me just repeat the question again. Is Beijing 
ByteDance technology a ``other affiliate of our corporate 
group,'' as your privacy policy defines it, yes or no?
    Mr. Beckerman. Senator, as I stated, that entity does not 
have any relation to the TikTok entity.
    Senator Cruz. So, I will point out it took three questions 
to get you to answer about your parent. You finally answered 
yes that you can share all your information with your parent 
company based in Beijing. I have asked you three times about 
this sister company that is obviously another affiliate.
    You have refused three times. That may be revealing. Often, 
as Sherlock Holmes observed about the dogs that do not bark, it 
may be revealing that the Chinese propaganda minister that is 
serving on your sister company and that has been in the 
business of online propaganda, you are refusing to answer 
whether they fall under your privacy policy. That reveals, I 
think, a great deal, unfortunately.
    Mr. Beckerman. Senator, with all due respect. I am just 
trying to be accurate here. There is a lot of accusations that 
are just not true, and I want to make sure that it is clear----
    Senator Cruz. OK, I am going to give you one more chance 
and my time is over, but look, in baseball, three strikes, you 
are out. Tonight, the Astros are going to begin winning the 
World Series. Let's see if a fourth strike, you could actually 
answer the question. And it is a simple yes, no. Is Beijing 
ByteDance Technology, a ``other affiliate of our corporate 
group,'' as your privacy policy defines that term?
    Mr. Beckerman. Senator, as I pointed out before, my answer 
is the same.
    Senator Cruz. Yes or no? You didn't answer.
    Mr. Beckerman. Senator, I appreciate your trying with 
gotcha questions. I mean----
    Senator Cruz. It is not a gotcha question. I am asking 
about your policy----
    Mr. Beckerman.--deceitful and inaccurate about----
    Senator Cruz. Are you willing to answer this question, yes 
or no?
    Mr. Beckerman. Senator, I answered the question.
    Senator Cruz. You have not answered the question. Is it 
another affiliate, yes or no?
    Mr. Beckerman. Senator, I stated a number of times that 
that entity is a domestic entity within China for licenses----
    Senator Cruz. And apples are red. You stated something that 
is not the question I asked. Is it another affiliate as defined 
under your privacy policy, yes or no?
    Mr. Beckerman. Senator, I answered----
    Senator Cruz. You are here under oath. Are you going to 
answer the question--?
    Mr. Beckerman. I answered the question----
    Senator Cruz. Or were you instructed not to answer this 
question?
    Mr. Beckerman. No, Senator, I am just----
    Senator Cruz. So, you are just not refusing to answer it 
because you don't want to?
    Mr. Beckerman. Senator, it is not affiliated with TikTok. 
If that is your question, that is the answer.
    Senator Cruz. So, your answer. I want to be clear because 
you are under oath. Your answer is that Beijing ByteDance 
technology is not a ``other affiliate of our corporate group'' 
as your privacy policy uses that term. This is a legal question 
with consequence.
    Mr. Beckerman. Senator, I understand the question. As I 
pointed out, TikTok is not available in China. That is an 
entity that is for purposes of a license of a business in China 
that is not affiliated with TikTok.
    Senator Cruz. So for the record, you are refusing to answer 
the question.
    Mr. Beckerman. I believe I answered your question, Senator.
    Senator Cruz. Yes or no, tell me which one it is--just give 
me one word yes or no.
    Mr. Beckerman. Senator, I answered--I answered the 
question.
    Senator Cruz. You are not willing to say yes or no.
    Mr. Beckerman. It was not a yes or no question. I want to 
be precise. I want to be----
    Senator Cruz. Is this company another affiliate as defined 
in your privacy policy? That is binary. There is not a maybe. 
It is yes or no.
    Mr. Beckerman. Senator, the way I answered, I am not aware 
that, that is the answer to the question.
    Senator Cruz. OK, so you are refusing to answer the 
question. That does not give this committee any confidence that 
TikTok is doing anything other than participating in Chinese 
propaganda and espionage on----
    Mr. Beckerman. Senator, that is not accurate. And again, I 
would point you to----
    Senator Cruz. If it were if it were not accurate, you would 
answer the questions. And you have dodged the questions more 
than any witness I have seen in my 9 years serving in the 
Senate. That is saying something because witnesses often try to 
dodge questions. But you answer non-sequiturs and refuse to 
answer very simple questions. That, in my experience when a 
witness does that, it is because they are hiding something.
    Senator Blumenthal. Senator Moran.

                STATEMENT OF HON. JERRY MORAN, 
                    U.S. SENATOR FROM KANSAS

    Senator Moran. Mr. Chairman, thank you very much. Let me 
turn to Ms. Stout and ask a question about data privacy. So, 
Senator Blumenthal and many others have been working on a 
consumer data privacy bill now for the last several years.
    I have introduced a bill that includes an appropriately 
scaled right for consumers to correct and erase data that is 
collected or processed by covered entities, including social 
media companies. Ms. Stout, I understand that Snap currently 
allows users the right to correct or delete their user data. 
Would you please explain Snap's decision to proactively provide 
this service?
    Ms. Stout. Thank you, Senator, for the question. And I just 
want to say we applaud the Committee and your leadership on 
this issue, and we fully support a Federal comprehensive 
privacy bill, and look forward to continuing to work with you 
and your staff on that. To address your question, yes, Senator, 
Snap has been designed with a very privacy centric focus from 
the very outset.
    We don't collect a lot of data. We believe in data 
minimization, short data retention periods. And to the effect 
that you just--you made, as you pointed out, we don't store 
content forever, and we believe in giving users transparency 
and control, which includes giving them the ability to delete 
their data if they wish or the ability to download their data. 
We have a tool within the app. Users are able to download their 
data, which gives them essentially portability of any kind of 
information that they may have agreed to share or post or put 
on their Snapchat account.
    Senator Moran. And other platforms who may not take the 
same position that Snap has, tell me what it is that you give 
up in your ability to earn revenue? What is it that you lose by 
doing that, if anything?
    Ms. Stout. So Senator, we make tradeoffs every day that 
sometimes disadvantage our bottom line. And there are no rules 
or regulations that require companies like Snap to have short 
retention periods. That is why Federal privacy legislation is 
so critical, or why we choose to voluntarily have a data 
minimization practice.
    So, oftentimes, that means that advertisers find other 
platforms perhaps more enticing because those platforms keep a 
history of anything that has ever been searched or ever been 
shared, or location data that has ever been provided. And that 
is not the case on Snapchat. So, that is a tradeoff Senator 
that we make because we believe in being more private----
    Senator Moran. Because if you did it, otherwise, what would 
you what would you gain by doing so?
    Ms. Stout. We just believe we have a moral responsibility 
to limit the data that we collect on people.
    Senator Moran. You are answering that fine. But I just I am 
curious to know, are you giving up---can you generate money? 
What do you generate by keeping that data than in some other 
way using it?
    Ms. Stout. Yes, I think we limit ourselves in our ability 
to optimize for those advertisements and make more money. And 
we are a company that has not yet turned a profit. We have 
invested every dollar back into our company and we are here for 
the long game, and our real ultimate desire is to make a 
platform that is safe for our community.
    Senator Moran. Thank you. And Mr. Beckerman, what 
responsibility do platforms have to prevent harmful social 
media trends from spreading? And how can TikTok improve its 
algorithm to better comply to TikTok's own terms of service 
that prohibit, ``content that promotes or enables'' criminal 
activity?
    Mr. Beckerman. Thank you, Senator. We do have a 
responsibility to moderate our platform, along with our 
community guidelines, and do that in a transparent way.
    Senator Moran. Where does that responsibility come from?
    Mr. Beckerman. It comes from doing the right thing. You 
know, for us, we want to be a trusted platform, we want to be a 
platform where people have a joyful experience and like coming 
to the app and that is what we are seeing. And that starts with 
our community guidelines.
    Certain content, like you mentioned illegal activities, 
misinformation, and other categories that you wouldn't allow on 
the platform, we work really hard. I think our content 
moderation teams and our safety teams are often the unsung 
heroes of the companies that are working every day, 24/7 to 
ensure that community guidelines are met, and the platform 
stays positive, joyful.
    Senator Moran. The nature of my second question, the ad-on 
question was leading because it suggests that you are not 
complying with your own terms of service that prohibit content 
that promotes or enables criminal activity. And my question 
was, how can you improve your algorithm to better accomplish 
that? Maybe you want to discount the premise of the question.
    Mr. Beckerman. No, Senator, I mean, it is an important area 
where we always want to get to 100 percent. We released regular 
transparency reports and 94 percent of our removals of 
violative content are done proactively. Much of it is done 
within 24 hours or before there is any views. But we always 
want to strive to get 100 percent, and that is something that 
we fight and work on every single day.
    Senator Moran. Thank you. Thank you, Chairman and Ranking 
Member Blackburn. I think the series of hearings that the 
Subcommittee is having are hugely important to the nature of 
our country and to its future. Thank you.
    Senator Blumenthal. Thanks, Senator Moran. Senator 
Blackburn.
    Senator Blackburn. Thank you, Mr. Chairman. Ms. Miller, I 
would like to come to you. You talk about moderation is a 
combination of machine learning and human review. And it seems 
that YouTube has no problem pulling down videos that question 
abortion, global warming, vaccine mandates, but child abuse 
videos remain on your site. So I am interested to know more 
about the specific inputs that you use in these reviews. Who 
established--who establishes these inputs and who oversees them 
to make sure that you get them right?
    Ms. Miller. Senator, thank you for your question. So we 
heavily invest in making sure that all of our users, but 
particularly kids on the platform, have a safe experience. And 
the way that we do this is in with a number of levers. For 
example, we have content policies as it relates to child safety 
on the platform. So not putting minors into risky situations, 
in videos on the platform----
    Senator Blackburn. OK, then let me jump in and ask you then 
if you are saying you don't want to put children into risky 
videos. There is a world of self-harm content on your site, and 
a few searches come up with videos such as, and I am quoting 
from searches that we have done, songs to slit your wrists by, 
vertical slit wrist, how to slit your wrist, and painless ways 
to commit suicide. Now that last video, painless ways, was age 
gated. And--but do the self-harm and suicide videos violate 
YouTube's content guidelines, if you are saying you have these 
guidelines?
    Ms. Miller. Senator, I would certainly welcome following up 
with you on that video you may be referencing because we 
absolutely prohibit content regarding suicide.
    Senator Blackburn. Ms. Miller, I have to tell you, we have 
pulled these down in my office. Our team has worked on this 
because I think it is imperative that we take the steps that 
are necessary to prevent children and teens from seeing this 
content. And I just can't imagine that you all are continuing 
to allow children to figure out how to do this on your site, 
how to carry out self-harm. So yes, why don't you follow up 
with me for more detail, and I would like that response in 
writing.
    And I also talked to a film producer friend this morning, 
and the film trailer for, ``I'm Not Ashamed,'' which was based 
on the story of Rachel Scott. She was the first victim of the 
Columbine attacks, and the film focused on her faith and how it 
helped in her life. So why would you remove this film trailer 
and block its distributor from being on your site? And you did 
this for 11 months? You did not put the trailer back up until a 
Hollywood Reporter called and said, why have you done this? You 
got an answer on that one?
    Ms. Miller. Senator, I am sorry, but I am not familiar with 
that specific removal.
    Senator Blackburn. OK, then let's review this. And we can 
submit the documentation. I had it sent back over to me. Ms. 
Stout, I want to come to you. We had an issue in Memphis with a 
48 year old man who was on your site. He raped a 16 year old 
Memphis teen and he claimed to be a music producer. He lured 
her into this relationship. And one of the news articles 
recently called Snapchat the app of choice for sexual 
predators.
    This is something that is of tremendous concern and much of 
it from what I understand, from talking to moms and talking to 
grand moms, is they use the Snap Map Location service, and I 
know you are probably going to say only your friends can follow 
you, but somehow people are getting around that.
    And these sexual predators who are following young people 
are using this map to get to their location. So we had this in 
Memphis with the rape. We have another child that was in the 
middle part of the state, that the predator followed her, and 
she tried to commit suicide because she knew her family was 
going to find out. So are you taking steps? Do you want to give 
me a written answer as to the steps that you all are taking? 
How are you going to get a handle on this? This is endangering 
young women.
    Ms. Stout. Senator, I am more than happy to give you a 
detailed written answer, and you have written to us in the past 
and I appreciate your leadership in following up on this issue. 
I want to just make crystal clear that the exploitation of 
minors is absolutely deplorable. It is our highest priority to 
prevent this type of event from happening on our platform. But 
with respect to the map, yes, indeed, location--appearing on 
the map is off by default for everyone, not only for minors, 
but for everyone.
    So in order to appear to someone on the map and share your 
location, you must be bi-directional friends with that person. 
I will say on--with respect to grooming, this is an area where 
we spend a tremendous amount of time and resources to try to 
prevent. Snapchat makes it intentionally difficult for 
strangers to find people that they don't know. We do not have 
open profiles, we do not have browsable pictures.
    We don't have the ability to understand who people's 
friends are and where they go to school. So Senator, I would be 
more than happy to follow up with you in writing and provide 
you more details.
    Senator Blackburn. Let's do that so we get some more 
detail. One question for all three of you, and you can answer 
this in writing if you choose. You have all talked about the 
research work that you do. Do you get parental consent when you 
are doing research on children? And can you provide us a copy 
of the parental consent form? We asked Facebook for this, and 
they punted the question repeatedly.
    And Ms. Miller, I think you need to provide the Committee 
clarity. You said that you all had never spoken out against 
online privacy. I think the Chairman and I may question that a 
little bit. So my question to you--and you can come back to us 
with a little bit more depth on this. Did you fight it as a 
part of the Internet Association as they were fighting privacy 
on your behalf?
    And just one thing to wrap, Mr. Chairman, going back to Mr. 
Beckerman. With the confusion there seems to be around the 
ownership of TikTok, which their parent company is ByteDance, 
and the CCP does have a seat on the board, they have a 
financial stake in ByteDance, their--Douyin is a affiliated 
entity. And we know there is a relationship with the Chinese 
Communist Party through all of this. And then I checked, and I 
know that TikTok user data is stored in the U.S. and Singapore.
    And until quite recently, Singapore data services were run 
by Alibaba, which is another Chinese company. So what we need 
to have from you, Mr. Beckerman, is some clarity on the chain 
of ownership, the transmission in the sharing processes that 
are around U.S. consumer data, especially the data and 
information of our children. And with that, I will yield back. 
Thank you, Mr. Chairman.
    Senator Blumenthal. Thanks, Senator Blackburn. We are going 
to go to Senator Sullivan. We are going to finish the first 
round. Senator Sullivan, Senator Lummis, I am going to go vote. 
I should be back by the time that Senator Lummis finishes. And 
in the meantime, Senator Blackburn will preside. Thank you.

                STATEMENT OF HON. DAN SULLIVAN, 
                    U.S. SENATOR FROM ALASKA

    Senator Sullivan. Thank you, Mr. Chairman. And Mr. 
Beckerman, I know there has been a lot of questions which I 
certainly have concerns about sharing data with the Chinese 
Communist Party, given the ownership or at least board 
influence. Senator Blackburn was just talking about that, I 
know Senator Cruz raised the issues. I want to raise a related 
issue. It is what I refer to as kowtow capitalism, kowtow 
capitalism.
    And I think that you guys are Exhibit A of kowtow 
capitalism. What is kowtow capitalism? It is American 
executives of American companies censoring Americans' First 
Amendment rights in America so as not to offend the Chinese 
Communist Party and, or to gain access to the Chinese market. 
So we see it on Wall Street. We see it in the movie studios, we 
see it with the NBA.
    So let me ask a couple of questions related to that. Could 
a TikTok user--a TikTok user could put up a video that 
criticizes the Chairman of this committee, the Ranking Member, 
any Senator, President Biden, former President Trump couldn't 
he? I mean, not like some horrible, violent suggestion, but 
just a criticism of an elected official. Is that common?
    Mr. Beckerman. Yes. Is it common--actually TikTok really 
isn't the place for politics? We don't allow political ads. And 
so political content is not typically what you----
    Senator Sullivan. You don't have like funny----
    Mr. Beckerman. It wouldn't be a violation. Yes, and 
wouldn't be a violation of our community guidelines. As long as 
it is not mis or disinformation or something hateful, then that 
would be allowed.
    Senator Sullivan. Good. That is good. That is free speech. 
I would hope you would answer that way. Could a TikTok user put 
up a video criticizing Xi Jinping. I know he is sensitive, for 
example, of being compared to Winnie the Pooh. Could a TikTok 
user put up videos that, kind of, make fun of him maybe with 
references to Winnie the Pooh? I don't know why he doesn't like 
Winnie the Pooh, but some reason you can't put Winnie the Pooh 
up anywhere in China. Can a TikTok user do that?
    Mr. Beckerman. Yes, Senator.
    Senator Sullivan. Really?
    Mr. Beckerman. Senator, our community guidelines are done 
for the United States market by our team in California and our 
moderators are done here. And that wouldn't be a violation of 
our community guidelines.
    Senator Sullivan. OK, so you--what about--in 2019, you 
admitted to censoring videos mentioning Tibetan independence. 
Can a TikTok user mention Tibetan independence?
    Mr. Beckerman. Yes, Senator.
    Senator Sullivan. So what happened in 2019 when you guys 
admitted to censoring a video related to that?
    Mr. Beckerman. I am not familiar with that incident, but I 
can assure you that that would not be a violation of our 
community guidelines and would be allowed on our platform.
    Senator Sullivan. OK. What about I think there was a TikTok 
public policy director in 2020 admitted that TikTok had 
previously censored content that was critical of the CCP in 
regard to forced labor, with regard to Uyghur Muslims. Is that 
true?
    Mr. Beckerman. That is incorrect. I mean, that would not be 
a violation of our community guidelines. That content would be 
permitted.
    Senator Sullivan. OK, so you are saying that your videos 
have never been censored by the Chinese Communist Party on any 
matter? These are--I am just reading from this, maybe these are 
all wrong.
    Mr. Beckerman. No, I can assure you that our content 
moderation teams are led by Americans. Our moderation 
guidelines are public and transparent. And content that is 
critical of any Government, frankly, as long as it meets our 
community guidelines, not mis or disinformation or hateful 
speech or something like that would be allowed. And I would 
encourage you to search for a number of these examples that you 
mentioned today on TikTok, and I am sure you could find them 
all.
    Senator Sullivan. So Tibetan independence and the Uyghur 
Muslim forced labor issues were not censored? I have wrong 
information----
    Mr. Beckerman. They are not currently censored.
    Senator Sullivan. Were they previously censored by anyone 
at TikTok? I thought--I am reading here, 2019, 2020, you 
admitted, somebody in TikTok admitted doing that. So that 
didn't happen?
    Mr. Beckerman. I am not aware of that. I can say that is 
not a violation of our guidelines now, and that is not how we 
moderate content.
    Senator Sullivan. OK. Well, listen, Madam Chair, or Mr. 
Chairman and the Ranking Member, I do think this issue of 
kowtow capitalism where American companies are censoring 
Americans, we are seeing it all the time, is an issue that this 
committee should be looking at because the Chinese Communist 
Party, of course, can crush freedom of speech in their own 
country, but they shouldn't be able to crush it in this 
country.
    And Mr. Beckerman, I am glad that you are denying any of 
this. I look forward to seeing videos somewhere, somehow on 
TikTok that are critical of the Chinese Communist Party. Not 
really hold my breath, but maybe, it is true. Maybe ByteDance 
and the CCP board members are fine with videos criticizing 
President Xi Jinping and other members of the Communist Party.
    So you are saying that is totally fine and completely 
acceptable policy for TikTok users?
    Mr. Beckerman. Yes, Senator. And just to be perfectly 
clear, there is not involvement from the Chinese Communist 
Party in moderation of TikTok, and it is all done by Americans 
from within the United States.
    Senator Sullivan. Right. Thank you.
    Mr. Beckerman. Thank you.
    Senator Blackburn. Thank you, Mr. Sullivan. And I can--I 
will let you--we are going to continue looking at these issues 
of the Chinese Communist Party's influence into U.S. companies 
and into technology, and the silencing and censoring of free 
speech of U.S. citizens online.
    Senator Sullivan. Right. It is a really important issue, I 
appreciate that. Thank you.
    Senator Blackburn. So thank you for that line of 
questioning. Yes, it is. Senator Lummis, you are recognized.

               STATEMENT OF HON. CYNTHIA LUMMIS, 
                   U.S. SENATOR FROM WYOMING

    Senator Lummis. Thank you, Madam Chairman, and I am going 
to start directly with questions. And if I have any time left, 
I would like to read a statement into the record. But I will 
start with a question for Ms. Miller. YouTube has implemented 
several features, such as auto play, that have been proven to 
make the platform difficult to stop using.
    What mechanisms has YouTube employed to ensure that 
children specifically have tools to counteract these design 
decisions? And do you believe those controls are sufficient?
    Ms. Miller. Senator, thank you for your question. Auto play 
is default off on YouTube Kids, as well as in supervised 
experiences on YouTube Main. So we have set those to default 
off. We do allow, if the default is changed to allow for auto 
play, for example, if a family is in a car and they would like 
to have the--the parents have decided they want auto play to 
continue, but we have set it to default off.
    Senator Lummis. And do you believe that sufficient?
    Ms. Miller. I think it is one of a number of tools that are 
important to make sure that kids have a healthy and safe 
experience on the platform. Another is that we do not deliver 
targeted advertising on YouTube Kids. Another is that we age 
gate content to make sure that minors do not see age 
inappropriate material.
    So it is only with a number of tools and protocols in place 
do we think that we are meeting the bar we have set for 
ourselves, that parents expect of us, experts in the field such 
as child development advise us on to make sure, again, that 
kids are having a safe experience on the platform.
    Senator Lummis. Thank you. Mr. Beckerman, after my staff 
reviewed your privacy policy, I want to list some of the items 
that TikTok will automatically collect from one of its users: 
that person's location, the device model of your phone, their 
browsing history outside and inside of TikTok, content of all 
messages sent on TikTok, their IP address, their biometric 
identifying information, and information from their phones, 
such as keystroke patterns in other apps.
    Do you believe this sort of mass data collection is 
necessary to deliver a high quality experience to your users?
    Mr. Beckerman. Senator, I thank you for that question. Some 
of those items that you listed off are things that we are not 
currently collecting, and we State that in the privacy policy, 
that if we were to, we would notify users and get their 
consent.
    Senator Lummis. And which of those that I may--named are--
have that condition?
    Mr. Beckerman. Yes. As it relates to biometric, I don't--I 
didn't write down every single thing. I would be happy to go 
through with you and your team on that.
    Senator Lummis. Perfect. We will follow up with you. My 
question is, regardless of which ones require consent of those 
I mentioned, why should any member of this committee feel 
comfortable with the vast amounts of data your company is 
collecting on our children, especially since TikTok has a 
relationship to the Chinese Communist Party?
    Mr. Beckerman. Senator, first off as it relates to data, 
TikTok actually collects less in many categories than many of 
our peers, and some of these things, as you mentioned, 
keystroke that is not collecting actually what people are 
typing. That is actually an anti-fraud, anti-spam measure that 
measures the cadence of typing because a bot, for example, 
would behave very differently than a human. And so it is not 
actually collecting what people are typing. It is an anti-fraud 
measure.
    Senator Lummis. Which of your competitors or other 
companies that you are aware of collect more information?
    Mr. Beckerman. I probably would point to Facebook and 
Instagram, for example.
    Senator Lummis. Well, I will ask the same questions of 
them. Thank you. This is for all our witnesses. Are your 
platforms specifically designed to keep users engaged as long 
as possible? You want to start, Mr. Beckerman?
    Mr. Beckerman. Senator, we want to make sure that people 
are having an entertaining experience. You know, like TV or 
movies, you know, TikTok is meant to be entertaining. But we do 
think we have a responsibility along with parents to make sure 
that it is being used in a responsible way. We have to take a 
break videos. We have time management tools. And family pairing 
is another tool where parents can help limit the time that 
their teenagers are spending on the app.
    Senator Lummis. But is the length of engagement a metric 
that your company uses in order to define success?
    Mr. Beckerman. There is multiple definitions of success, 
Senator, it is not just based on how much time somebody's 
spending on the app.
    Senator Lummis. But is that one of them? Is length of 
engagement one of the metrics?
    Mr. Beckerman. I think overall engagement is more important 
than the amount of time that is being spent.
    Senator Lummis. But is it one of the metrics?
    Mr. Beckerman. It is a metric that I think many platforms 
check on, on how much time people are spending on the app.
    Senator Lummis. Thank you. Ms. Stout, same question. Are 
your platforms designed to keep users engaged as long as 
possible?
    Ms. Stout. So, Senator, when you open up Snapchat, you 
don't open up Snapchat to a feed of other people's content 
designed to keep you consuming more and more content. You open 
up Snapchat to a blank camera, which is a blank canvas. It is a 
place where users come to talk to their friends in videos and 
in pictures.
    Senator Lummis. So but is it a metric that the company 
incorporates into your definitions of success?
    Ms. Stout. I believe we see success as if the platform is 
facilitating real live conversations and connections with 
friends. Snapchat is a place where friends come to talk to each 
other.
    Senator Lummis. But is it a metric? My question is, do you 
measure success in any way, shape, or form by how long people 
stay on your site? Is that one of multiple driving measures of 
success?
    Ms. Stout. I think the way I can answer that question is, 
it is one of many metrics.
    Senator Lummis. OK, thanks. Ms. Miller, same question. Are 
your platforms designed to keep users engaged as long as 
possible?
    Ms. Miller. Senator, our platforms are designed to allow 
users to search and discover all types of content. It is 
intended for them to have an enjoyable experience.
    Senator Lummis. I get it, but I am asking, is this one of 
the metrics by which you define success?
    Ms. Miller. Senator, we have a number of digital wellbeing 
tools designed to----
    Senator Lummis. But is this one of the metrics? Is it--is 
it one of them? I mean, there----
    Ms. Miller. It is.
    Senator Lummis. There could their numerous metrics but is 
this one of them?
    Ms. Miller. Yes. To the specific question that you are 
asking, we do look at, for example, if a video was watched 
through its entirety. That helps us determine whether or not 
that was a quality video relative to the search that the user 
had. So we do look at those types of data points to inform us 
as it relates to the experience that the user has had on the 
platform.
    Senator Lummis. Thank you. Madam Chairman, do I have time 
to enter an opening statement? Thank you, Madam Chairman. This 
generation of children will grow up under a level of 
surveillance well beyond any previous one.
    And although the recent Wall Street Journal reports focused 
on the problematic harms of Facebook, we know that the problem 
is endemic among our youth and bigger than Facebook alone. 
Children are impressionable. They are easily manipulated by 
targeting advertising to them, and they are readily influenced 
by the highly sophisticated algorithms that often surge--serve 
age inappropriate content to its youth users.
    These invisible algorithms continuously nudge our children 
in different directions, which can impact their development 
without their knowledge and without their parents' knowledge. 
These algorithms on these platforms were designed with adults 
in mind, not children. Only a tiny fraction of children 
understand the harm that can come from sharing sensitive 
information, pictures, or opinions that become part of their 
digital permanent record.
    But what is most alarming is that none of them can fully 
understand how the content fed to them by algorithms will shape 
their worldview during these formative years. So more must be 
done to promote responsible social media use. We must educate 
parents on how to teach their children to avoid the pitfalls of 
using these platforms.
    And more importantly, we must hold these platforms 
accountable for the effects that their design decisions have on 
our children. Mr. Chairman, thank you for the opportunity to 
add that opening statement to the record and thank you for your 
indulgence. I yield back.
    Senator Markey. Thank you and thank you for your leadership 
on these issues and your very insightful questions. We do have 
to protect kids in our country. You just put your finger on it. 
So let me ask this. Just going--following up on Senator Lummis 
and Senator Blumenthal, everyone on the panel here today.
    These kids are constantly posting content and their data is 
being tracked, stored, and monetized. But we know that young 
users lack the cognitive ability to grasp that their posts are 
going to live online forever. To each of our witnesses, do you 
agree that Congress should give children and teens, but more 
importantly, their parents, their ability, the right to erase 
their online data? Mr. Beckerman.
    Mr. Beckerman. Yes, Senator.
    Senator Markey. Ms. Stout.
    Ms. Stout. Yes, we do, Senator, but I would say that 
content on Snapchat does not appear permanently.
    Senator Markey. And again, I appreciate that. And to you, 
Ms. Miller.
    Ms. Miller. Yes, Senator. And users have the ability to 
delete their information as well as having auto delete tools.
    Senator Markey. So if they should have the right to delete 
it. Do agree with that, Ms. Miller?
    Ms. Miller. Yes, Senator.
    Senator Markey. OK, great. Today, apps collect troves of 
information about kids that have nothing to do with the apps 
service. For example, one gaming app that allows children to 
race cartoon cars with animal drivers has reportedly amassed 
huge amounts of kids data unrelated to the app's game, 
including location and browsing history. Why do apps gobble up 
as much information as they can about kids?
    Well, it is to make money. Congress, in my opinion, has to 
step in and prevent this harmful collection of data. Ms. 
Miller, do you agree that platforms should stop data collection 
that has nothing to do with fulfilling the app's service?
    Ms. Miller. Senator, we do limit the data that we collect, 
and this is particularly true, for example, on the YouTube Kids 
app. We limit the data collection to only rely on what is 
necessary to make sure that the platform runs.
    Senator Markey. So do you agree that that should become a 
law that all platforms have to do the same thing?
    Ms. Miller. Senator, I don't want to speak to whether or 
not it should become a law and, or the details of any proposed 
legislation, but at YouTube, we have not waited for a law to 
make sure that we have these protections.
    Senator Markey. I appreciate that. It is just it is time to 
just make up your mind, you know, yes or no on legislation. We 
need to move. Mr. Beckerman.
    Mr. Beckerman. Yes, Senator, we do need legislation. I 
think we are overdue on very strong national privacy laws.
    Senator Markey. Great. Ms. Stout.
    Ms. Stout. Yes, Senator, we absolutely collect less data, 
and it sounds as though collection of data that is irrelevant 
to the performance of the app does not appear to be within 
scope.
    Senator Markey. Today, popular influencers peddle products 
online while they flaunt their lavish lifestyles to young 
users. Influencer marketing like videos of online child 
celebrities opening new toys, getting millions of views but 
they are inherently manipulative to young kids, who often 
cannot tell that they are really paid advertisements that their 
heroes are pushing--that the hero is getting a monetary 
kickback from.
    My bill with Senator Blumenthal, the Kids Act, would ban 
this type of promotion of influencer marketing to kids. To each 
of the witnesses, do you agree that Congress should pass 
legislation to stop apps from pushing influencer marketing to 
children in our country? Yes or no? Ms. Miller.
    Ms. Miller. Senator, again, we have actually moved in this 
direction whereby we have a set of quality principles regarding 
the type of content that is made available on the YouTube Kids 
app, and in so doing, we make sure that we are not providing 
content that, for example, would have a significant prevalence 
of that type of material. And we also limit the types of ads 
that can be delivered on the app. I would--I apologize, I don't 
have the details----
    Senator Markey. Should we make it--should we make it 
illegal so that people out there who might be trying to 
influence children know that there is an enforceable penalty 
that----
    Ms. Miller. I absolutely think it is worth a discussion. I 
would need to stare at the details of such a bill.
    Senator Markey. Again, it has been around for a long time. 
Mr. Beckerman.
    Mr. Beckerman. Yes, Senator, we already limit the kinds of 
advertisements that can be served to teens, but we do agree 
that there should be additional transparency and additional 
privacy laws passed.
    Senator Markey. Ms. Stout.
    Ms. Stout. Senator, I would agree that I think for young 
people, there should be additional protections placed. So yes, 
we would be happy to look at that.
    Senator Markey. And by the way, we banned it on television 
because we know that we can't have the heroes just holding the 
product on Saturday morning and just saying, hey, kids, tell 
your parents to buy this right now. We banned it there. We have 
to ban it online. So thank you. And finally, push alerts.
    Studies show that 70 percent of teenagers report checking 
social media multiple times a day. Excessive use of social 
media has been linked to depression, anxiety, and feelings of 
isolation. The last thing an app should be doing is using 
methods like push notifications, automated messages that nudge 
users to open an app to make young users spend even more time 
online.
    To each of the witnesses, do you agree that Congress should 
pass, again, the law that Senator Blumenthal and I are trying 
to move, which would ban push alerts for children? Ms. Miller.
    Ms. Miller. I agree that additional protections should be 
in place regarding things such as push alerts for children.
    Senator Markey. OK, thank you. Mr. Beckerman.
    Mr. Beckerman. Senator, we already limit push 
notifications.
    Senator Markey. Should we--should we ban push 
notifications?
    Mr. Beckerman. I think that would be appropriate, but 
already we have already done that proactively for some push 
notifications.
    Senator Markey. OK. Ms. Stout.
    Ms. Stout. Yes, Senator. Snapchat does not utilize push 
notifications or nudges as the UK age appropriate design code 
pointed out, but we would very much be in agreement with your 
legislation.
    Senator Markey. Thank you. Thank you, Mr. Chairman.
    Senator Blumenthal. Thanks, Senator Markey. I understand 
Senator Klobuchar has a few more questions, and while we are 
waiting for her, I have a few too. So appreciate your patience. 
Let me begin by acknowledging, and I think you would 
acknowledge as well, the reason that you made many of the 
changes that you have is the UK's child safety law, the age 
appropriate design code. I think we need an American version of 
the British child safety law. And I want to ask about some of 
its provisions.
    Will you commit to supporting a child safety law that 
obligates companies to act in the best interests of children? 
It establishes in effect a duty of care that could be legally 
enforceable. Ms. Stout.
    Ms. Stout. Senator, we were very privileged to be able to 
work with the Information Commissioner's Office in the UK in 
their design of the age appropriate design code. We, of course, 
comply with the code as it has come into force this year. And, 
as I mentioned to Senator Markey, we are looking actively at 
that code to see how we could apply it outside of just the UK 
market and apply it to many of our other markets.
    So with respect to a child safety law that obligates 
companies to think about the protection in the safety of 
children that is something that Snap has already done without 
regulation. But, of course, we would be happy to work with the 
Committee.
    Senator Blumenthal. But the point is, and we would love to 
be in a world where we could rely on voluntary action by 
platforms like yourselves, but in effect, you have sacrificed 
that claim to be voluntary action or reliance on your voluntary 
action. Whether it is Facebook or your companies in various 
ways, I think you have shown that we can't trust big tech to 
police itself.
    And so when you say we already do it, well, you may decide 
to do it, but there is no legal obligation that you do it and 
there is no way to hold you accountable under current law. That 
is what we need to do. That is why I am asking you about a law. 
I am hoping that your answer is a yes, that you would support 
it. As a duty of care, it is a matter of law. You are nodding, 
and I hope that is a yes.
    Ms. Stout. Yes, Senator, and that was very much a part of 
my testimony in my opening statement, which because the time it 
takes regulation to be actually implemented, we don't believe 
that we should have to wait for that regulation. We are going 
to take the voluntary steps to best protect----
    Senator Blumenthal. Would you support it?
    Ms. Stout. Yes, Senator.
    Senator Blumenthal. Thank you. Mr. Beckerman.
    Mr. Beckerman. Yes, Senator. We have already voluntarily 
implemented much of the age appropriate design code here in the 
United States. But I do agree with you that companies need to 
do more, and I was struck by your comments in your opening 
statement about a race to the top.
    And that is very much the approach that we are trying to 
take at TikTok to do more, go above and beyond, and really be a 
place where it is seen that we are putting wellness of teens, 
and safety of teens in particular, ahead of other motives.
    Senator Blumenthal. So let me see if I can answer the 
question as I would if I were in your shoes. Yes, we strongly 
and enthusiastically support that kind of child safety law. We 
are already doing more than we would need to do under the law.
    Mr. Beckerman. Yes, Senator. And additionally, I do think 
as it relates to age verification, it is something that should 
be included and in measures like that and included in updates 
to COPPA, which are long overdue.
    Senator Blumenthal. Ms. Miller.
    Ms. Miller. Senator, I actually respectfully disagree that 
we only wait until we have legal obligations to put systems, 
practices, and protocols in place. For example, we rolled out 
YouTube Kids in 2015 to make sure that as kids were trying to 
be on the main platform, that we created a space that was 
particularly for them and their safety. We have rolled out a 
number of----
    Senator Blumenthal. I am going to interrupt you, Ms. 
Miller, because I think you misinterpret me. I am not 
suggesting that you wait. On the contrary, I am suggesting that 
you do it now. I think Mr. Beckerman and Ms. Stout perfectly 
well understood my question. Both of them would support that 
law. Would you, yes or no?
    Ms. Miller. I would support looking at any details as it 
relates to additional legal protections for kids in the U.S. As 
you may know, the age appropriate design code went into effect 
in the UK just over a month ago, so it is still early days, but 
we had already rolled out a number of the protections required 
in the UK, we rolled them out globally.
    Senator Blumenthal. Is that a yes or a no?
    Ms. Miller. Yes, I would be happy to work with you and your 
staff on a U.S. version.
    Senator Blumenthal. Would you support this legislation--
would you support a version of the UK child safety law?
    Ms. Miller. I would need to stare at the details of any 
specific bill, but I certainly support expansions of child 
safety protections.
    Senator Blumenthal. I am going to yield to Senator 
Klobuchar, and she is now ready to ask her question.
    Senator Klobuchar. Thank you. Thank you very much. Thank 
you for--a lot going on here today. Thank you for taking me 
remotely for the second round. So one of the things that I have 
tried to do in all of these hearings that we have had, 
including in the Judiciary Antitrust subcommittee that I chair, 
is taking the veil off this idea that this is just web, 
everyone has fun--that's what it is, and it is just a cool 
thing.
    Some of that is true, but it is also a huge profit making 
venture. And when you look at it that way as the most 
successful and biggest companies the world has ever known in 
the big tech platforms in terms of money, then you have to 
start looking at it, wait a minute, why haven't we done 
anything about privacy law? Or why haven't we done anything on 
Senator Markey's children's law?
    Why haven't we put in place some rules about transparency 
and algorithms, or mostly, from my perspective, done anything 
about competition policy, which is a market approach to get 
alternatives. So I just start with this question, which I have 
asked many of the platforms. The larger platforms.
    Ms. Stout, Snap reported that its advertising revenue per 
user in North America for the second quarter of 2021 was $7.37. 
How much of Snap's revenue came from users under the age of 18?
    Ms. Stout. Senator, I don't have that information for you, 
but I would be happy to take that back.
    Senator Klobuchar. OK, good. I appreciate that. I have been 
trying to get that from Facebook, of course. Just to give you a 
sense of Facebook's revenue per user from their own documents 
is $51 per user in--for the U.S. per quarter. Just to put it in 
some perspective. Mr. Beckerman, TikTok is a privately held 
company, so we don't have public documents on your advertising 
revenue per user. What is your best estimate of advertising 
revenue per U.S. user for the last quarter?
    Mr. Beckerman. Senator, I don't have those numbers, but I 
would be happy to go back and check in with the team.
    Senator Klobuchar. Do you think you can provide us that?
    Mr. Beckerman. Again, we are not a public company, but I 
will go back and see what we can find for you.
    Senator Klobuchar. OK, and again, I am trying to figure out 
the percentage from users under the age of 18, for us to get 
some perspective on how much of this is your--how much of the 
business and the future growth of the business is in kids. Ms. 
Miller, YouTube reported that its advertising revenue overall 
in the second quarter of 2021 was $7 billion. How much of 
YouTube's revenue came from users under the age of 18?
    Ms. Miller. Senator, I am--I don't know the answer to that, 
and I am not sure if we look at data internally that way, so I 
would also be happy to follow up with you. But I would like to 
note that as a company, we have long shared our revenue with 
our creators. And so over the last 3 years alone, we have paid 
out more than $30 billion to our creators.
    Senator Klobuchar. OK. I work with the Antitrust 
subcommittee. Takes me in a related path, and I recently 
introduced the bipartisan legislation, the American Innovation 
and Choice Online Act, with Senator Grassley and there is 
several people, including Senator Blumenthal, who are co-
sponsors of that legislation.
    And it is focused on a gnarly problem, which is that you 
have platforms that are self-preferencing their own stuff at 
the top. They are taking, in some cases, data that they 
uniquely have on other products and then making knockoff 
products and then underpricing the competitors. Ms. Miller, 
Roku says YouTube has made unfair demands in negotiations for 
carrying the YouTube TV app on Roku, including demanding Roku 
give preference to YouTube over other content providers in its 
search results, which is exactly what this legislation gets to, 
and give YouTube access to nonpublic data from Roku's users.
    Did YouTube make these demands for nonpublic data and 
preferencing in search results in negotiations with Roku?
    Ms. Miller. Senator, I am not involved in the negotiations 
with Roku. I know we have been having discussions with them for 
several months and we are trying to come to a resolution that 
is good for users as well as both companies, but I am not 
involved in the negotiations.
    Senator Klobuchar. OK. I will put this on the record for 
others in the company because I also I would like to know more 
generally if YouTube has ever demanded nonpublic data or 
preferencing in search results in negotiations with other 
providers. It just gives you a sense of the dominant platform 
by far in the area of search, being able to use that power over 
people who are simply trying to be on that platform.
    So I think that gets really to the core of what we are 
trying to do. I also just had a follow up on your YouTube 
banning all vaccine misinformation, which I commended you for 
at the time. How much content have you removed related to this 
policy change since you banned all anti-vaccine misinformation? 
And have you seen a change in the viewership rate?
    Ms. Miller. Senator, I feel bad--I think this is the 
question I would absolutely love to answer with more detail, 
but I know that we have removed so much video as it relates to 
COVID stuff.
    Senator Klobuchar. We will put it in writing, and we will 
get the answer that way. Thank you. My last question, Ms. 
Stout, Mr. Beckerman, I just mentioned this bill that Senator 
Grassley and I have introduced with 10 other co-sponsors aimed 
at ensuring that dominant digital platforms don't use their 
market power to thwart competition.
    Do you support some of the competition reforms in the bill 
and have you faced any challenges when it comes to competing 
with the largest digital platforms? That will be my last 
question.
    Ms. Stout. Senator, we have--we are aware of the bill that 
you introduced with Senator Grassley, and as a non-dominant 
platform, very much appreciate your legislation and the work 
that you are doing in this space. And yes, as a smaller 
platform, it is an incredibly competitive arena for us. We 
compete every day with companies that collect more information 
on users and store that information to monetize.
    So any efforts that this body, and especially the 
legislation that you have undertaken to create an equal playing 
field so that it is indeed a competitive atmosphere for 
platforms like Snapchat, we would very much welcome.
    Senator Klobuchar. OK, thanks. Mr. Beckerman.
    Mr. Beckerman. Yes, likewise, we appreciate your efforts in 
the work that you have done to promote and spur competition. It 
is something that I think we all benefit from. As it relates to 
specific challenges that we face, I would be happy to meet with 
you and your team to discuss in detail some of those issues.
    Senator Klobuchar. Thank you very much. Thank you, 
everybody.
    Ms. Miller. Senator, I found the answer if you don't mind. 
So on COVID misinfo, we have removed over a million videos 
since we started rolling out COVID misinfo, and over 130,000 
videos as it relates to COVID vaccine misinfo. So it is an area 
we have put a lot of resources behind to make sure our platform 
isn't promoting or allowing this type of content. Thank you.
    Senator Klobuchar. OK, thank you. Thanks, everybody. 
Thanks, Senator Blumenthal and Blackburn.
    Senator Blumenthal. Thank you, Senator Klobuchar. I seem to 
be the last person standing or sitting between you and the end, 
but I do have some questions. Ms. Stout and Mr. Beckerman, over 
10 million teens use your apps. These are extremely 
impressionable young people, and they are a highly lucrative 
market. And you make tens, probably hundreds of millions from 
them.
    There is a term, I am sure you have heard it, called 
``Snapchat dysmorphia,'' to describe the depression, mental 
health, and other issues associated with your apps. Snapchat 
dysmorphia. Kids use the filters that are offered and create 
destructive, harmful expectations, the filter does, and you 
study the impact of these filters on teen mental health, I 
assume. Do you study the impact of these filters before you put 
them in front of kids? Ms. Stout.
    Ms. Stout. Yes, Senator, the technology that you are 
referring to is what we call lenses, and these lenses are 
augmented reality filters that we allow users who choose to use 
them to apply them over top of selfies. And for those that are 
familiar, they are the, you know, opportunity to put on a dog 
face or----
    Senator Blumenthal. I have seen them. But they also change 
one's appearance, potentially to make one thinner, different 
colors, different skin tones.
    Ms. Stout. So these filters, Senator, are created both by 
Snapchat and our creator community. There are over 5 million of 
these augmented reality filters, and a very, very small 
percentage of those filters are what you would call 
beautification filters. Most of them are silly, fun, 
entertaining filters that people use to lower the barrier of 
conversation.
    Because again, when you are using Snapchat, you are not 
posting anything permanently for likes or comments. You are 
using those filters in a private way to exchange a text or to 
exchange a video message with a friend. So it really kind of 
creates this fun, authentic ability to communicate with your 
friends in a fun way. And those filters are one of the many 
ways in which friends love to communicate with each other.
    Senator Blumenthal. Do you study the impact on kids before 
you offer them? Have you studied them?
    Ms. Stout. So we do a considerable amount of research on 
our products. And in fact, one of--I think it was one of the 
competitive pieces of research that was revealed as part of 
your earlier hearing showed that filters or lenses on Snapchat 
are intended to be fun and silly. It is not about body 
dysmorphia----
    Senator Blumenthal. Well, we all know as parents, something 
intended to be fun and silly can easily become something that 
is dangerous and depressing. And that is the simple fact about 
these filters. Not all of them. Maybe not the majority of them, 
but some of them. And I would like you to provide the research 
that you have done. I have asked for other research, but I am 
gathering from what you have said that you haven't--you don't 
do the research before you provide them.
    Ms. Stout. No, Senator, that's not what I am saying. And in 
fact, I--particularly with respect to this question, I am not 
able to answer the kind of research we have done simply because 
I am not aware, but I will go back and look and try to get you 
an answer to answer your question.
    Senator Blumenthal. You know, for 8 years, Snapchat had a 
speed filter. It allowed users to add their speed to videos, as 
you know. The result was that it encouraged teens to race their 
cars at reckless speeds. There were several fatal and 
catastrophic crashes associated with teens using the speed 
filter. It took, I think, 8 years, and warnings from safety 
advocates, and multiple deaths for Snapchat to finally remove 
that dangerous speed filter. It was silly. It was maybe fun for 
some people.
    It was catastrophic for others. And I want to raise what 
happened to Carson Bride. His mother, Kristin, originally from 
Darien, told me about how Carson was relentlessly bullied 
through anonymous apps on Snapchat. After trying desperately to 
stop the abuse and after unanswered pleas for help, he took his 
own life. Silly, fun--how can parents protect kids from the 
relentless bullying that follows kids home from school?
    As Ms. Haugen said so movingly, it no longer stops at the 
schoolhouse door. Now 24/7. Comes into their homes, just before 
they go to sleep. What are you doing to stop bullying on 
Snapchat?
    Ms. Stout. Senator, this is an incredibly moving issue for 
me as a parent as well. Bullying is unfortunately something we 
are seeing more and more happening to our kids. And indeed, 
this is not just on the online community. They face it at 
school, they bring it at home. We have zero tolerance on 
Snapchat for bullying or harassment of any kind. And as a 
platform that reaches so many young people, we see this as a 
responsibility to get in front of this issue and do everything 
we can to stop it.
    So again, because Snapchat is designed differently, you 
don't have multiple abuse vectors for bullying in a public sort 
of way on Snapchat. You don't have public permanent posts where 
people can like or comment, thereby introducing additional 
opportunities for public bullying or public shaming. That is 
not to say that bullying doesn't happen both online and 
offline. And so we have in-app reporting tools where users can 
anonymously and quickly report bullying or any other harmful 
activity.
    And I will say our trust and safety teams that work around 
the clock actually move to remove this content on average in 
less than 2 hours. Usually, it is far more quickly than that. 
But I want to assure you, Senator--and thank you for raising 
the bullying issue. In addition to prevention, we do--I am 
sorry.
    In addition to combating this practice, we do a lot to 
prevent it, and we think that there ought to be more 
opportunities to raise awareness on the effects of bullying and 
the effects that people's words have on other people, and we 
will continue to make that commitment.
    Senator Blumenthal. Mr. Beckerman, we have heard from 
TikTok that it is a safe environment. At the same time, we see 
challenges--the blackout challenge, just to take one example, 
where in effect teens and children have actually died emulating 
and recording themselves following blackout and choking 
challenges, including a 9-year old in Tennessee who saw one on 
TikTok.
    Despite apparent efforts to discourage certain dangerous 
trends, this content is still being posted, and viewed widely 
by kids online and followed and emulated whether it is 
destruction of school property or other kinds of challenges. A 
mother who lost her child in the choking challenge shared 
questions with me, and they are questions that deserve answers 
from you and from others who are here today.
    And I am just going to ask her question, how can parents be 
confident that TikTok, Snapchat, and YouTube will not continue 
to host and push dangerous and deadly challenges to our kids?
    Mr. Beckerman. Thank you, Senator. As it relates to this 
specific challenge and particularly things that can be 
dangerous or deadly for teens, it is sad and tragic. I mean, I 
remember myself when I was in grade school, I had a classmate 
that we lost from something very similar.
    And so I know it is something that touches many of our, 
many of our lives and it is awful. As it relates to TikTok, 
this is not content that we have been able to find on our 
platform. It is not content that we would allow on our 
platform. Independent fact checkers have looked. But it is 
important that we all remain vigilant to ensure that things 
that are dangerous or even deadly, particularly for teenagers, 
don't find their way on platforms.
    And it is important that we all have conversations with our 
teenagers to make sure they stay safe offline and make sure 
they stay safe online. And it is a responsibility that we take. 
And again, I just think it is important that we distinguish 
between things that there is actual content on platforms 
encouraging things, among others were actually the content does 
not exist on the platform.
    Senator Blumenthal. But this content existed on your 
platform.
    Mr. Beckerman. Senator, we have not been able to find any 
evidence of a blackout challenge on TikTok at all. And again, 
it would violate our guidelines, but it is something that we 
proactively search, both with AI and human moderators. But we 
have found absolutely no evidence of it. And it is something 
that we have had conversations with parents and others all the 
time around, things that could be dangerous, but it is 
important that that we have conversations with our teens about 
this.
    Senator Blumenthal. And other challenges? You are saying 
none of them----
    Mr. Beckerman. So anything that is illegal or dangerous 
violates our guidelines. Our teams have been very aggressive 
and acted very quickly as things pop up on the platform, as our 
transparency reports show. Over 94 percent of content that 
violates our guidelines are removed automatically, but 
dangerous challenges have no place on TikTok.
    Senator Blumenthal. I understand that you react by taking 
them down, but they existed for the time they were there. And I 
guess my question is, what can you do to prevent those 
challenges from being there in the first place?
    Mr. Beckerman. Well, we are we are often able actually to 
be proactive in blocking things from coming on when they are 
found. We divert searches, we block content, we remove content. 
But unfortunately, something that we have seen recently are 
press reports about alleged challenges that when fact checkers 
like Snopes and other independent organizations look into it, 
they find out that these never existed on TikTok in the first 
place, and in fact, were hoaxes that originated on other 
platforms.
    And so I think it is important of all of us, particularly 
parents and teachers and those that care about our young 
people, that we, you know, look at the facts and look to see 
what content actually exists, rather than spreading rumors 
about alleged challenges.
    Senator Blumenthal. Well, I just have to tell you, Mr. 
Beckerman, we found pass out videos. We found them. So I have a 
lot of trouble crediting your response on that score. Let me 
ask you about another instance. A parent in Connecticut wrote 
to me about how their 13 year old daughter was inundated on 
TikTok with videos about suicide, self-injury, and eating 
disorders. I have heard similar stories from parents across the 
country. So, we checked. We did our own research.
    My staff made a TikTok account I was describing earlier and 
spent hours scrolling through its endless feed of videos as a 
teen would. TikTok began by showing us videos of dance trends. 
Within a week, TikTok started pushing videos promoting suicidal 
ideation and self-harm. We didn't seek out this content. We 
can't show these videos in this room because they were so 
disturbing and explicit.
    And finally, another TikTok account we created as a 13 year 
old, as a 13 year old was flooded with nothing but sexually 
obscene videos. How do you explain to parents why TikTok is 
inundating their kids with these kinds of videos of suicide, 
self-injury, and eating disorders? This is stuff occurring in 
the real world.
    Mr. Beckerman. Yes, Senator. I can't speak to what the 
examples were that--from your staff, but I can assure you that 
is not the normal experience that teens or people that use 
TiKTok would get. Those kinds of content violate our guidelines 
and are removed. But we would be happy to sit down with you and 
your staff and go through what that example was.
    Senator Blumenthal. Would you support restrictions on 
product features that are intended to foster addictive use of 
apps? In other words, bar features that lead to addictive use 
of apps.
    Mr. Beckerman. Yes, Senator, we have already done a number 
of that proactively in the form of take a break of videos, in 
terms of time management features, in terms of not having 
direct messages, for example for under 16, and our family 
pairing tools. You know, it is important that when we look at 
these issues, that we have conversations and foster these 
conversations with our teenagers.
    I mean, that is one of the reasons we built our family 
pairing to make it easy for parents. I know it can be daunting 
for parents when you have teenagers to have all these different 
tools and features to protect them. But we built this in a way 
where parents from their own app can have additional control of 
the safety, privacy, and time use for their teenagers.
    Senator Blumenthal. I have to go vote, and I don't have 
anyone to preside here for me. May I suggest taking a 5-minute 
recess? I will have some final questions if you would be 
willing to wait, and then we can close the hearing. Thank you.
    [Recess.]
    Senator Blumenthal. Welcome back, everyone. I was told that 
a couple of my colleagues wanted to come back and have an 
opportunity to question, but if they are not here within the 
next 5 minutes, they will lose that opportunity. Thank you for 
your patience. I want to give you an opportunity to answer what 
I think is probably one of the paramount questions on the minds 
of most parents today.
    And that is, how are you different from Facebook? I 
mentioned at the very start of the hearing that you would try 
to differentiate yourself from Facebook. It is not enough to 
say, we are different. But I want to give you the opportunity 
to tell parents why they should be less fearful and scared of 
Snapchat, TikTok, and YouTube. Ms. Stout.
    Ms. Stout. Senator, thank you for the opportunity. Again, I 
think it is very important to understand Snapchat is a very 
different platform. It was created by our founders to be an 
antidote to social media. Our founders saw very early on what 
traditional social media can do to your self-esteem, to your 
feeling that you have to perform or be perfect for the world at 
all times.
    And Snapchat was a decidedly different platform that was 
private, that was safe, where people could come and actually 
talk to the friends that they have in real life. You weren't 
being judged on your perfect posts. Your posts were intended to 
be ephemeral, the way a real life conversation is ephemeral. 
And it really strengthens relationships. So for parents who may 
not be sure of Snapchat, I would tell them, get on Snapchat, 
understand what your kids are doing. And in fact, we are going 
to make it easier for parents through parental controls to 
understand how indeed their kids can be safe and private on 
Snapchat.
    These parental controls, which will be rolling out very 
soon, will help parents understand who their kids are talking 
to the most, what their children's privacy or location settings 
may be, and start a conversation with their parents because 
this is a partnership. Parents and children ought to come 
together and talk to about how----
    Senator Blumenthal. Let me just interrupt--let me interrupt 
by saying, you know, everything you have said is fine, but it 
is aspirational, or it will appear so to some parents. In other 
words, just like Facebook says, well we bring communities 
together. Nothing to see here that is bad. I understand your 
goals, and for a lot of people, it may be so, but I am giving 
you the opportunity to tell us how you protect kids in a way 
that Facebook doesn't.
    Ms. Stout. Yes, so it is not only aspirational, but it is 
something that we live, and we practice every day. So in order 
to be friends with people on Snapchat, you have to be bi-
directional friends. So there is no following. There is no not 
being invited into the conversations.
    These are two-way mutual friendships where people are able 
to speak to one another. Senator, an average 14 year old on 
Snapchat only has 30 friends and we don't try to push them to 
create more friends or make more connections with strangers. We 
want Snapchatters to be connected to the people they are 
connected to in real life.
    Senator Blumenthal. Mr. Beckerman.
    Mr. Beckerman. Thank you, Senator. So let me answer that in 
three ways. First, we put people first and particularly as it 
relates to teens, we put their well-being first. As I outlined, 
and with just so many examples of things that we have done, we 
have made difficult policy and difficult product choices that 
put the well-being of teens first. And so that is everything 
from we don't allow direct messages for under 16. Under 16, 
their accounts are private. The way we have built out family 
controls.
    And so with our actions--look, trust, everybody can say 
trust us, but unless you are doing actions and putting those 
actions in place as we already have and we are constantly 
improving, we are not talking today about things that 
aspirationally we want to do, we are talking about things that 
we already have done and are continuing to build on. So that is 
one. Two, the approach of the company. Being open and humble is 
really part of our company culture, and that means that we are 
open to getting feedback from outside experts and policymakers 
and parents about ways we can improve.
    And we are humble to say, we are not always going to get it 
right, and we want to take that feedback and implement it in a 
way that really, really can earn trust, particularly as it 
relates to protecting teenagers. And the third is, I would 
encourage parents to have conversations with your teenagers. 
You know, ask them how they use TikTok, how do they feel after 
using TikTok? And you know, from what we are hearing and what 
we are seeing in the community is that TikTok is a joyful 
experience. It is inclusive. It makes teens feel better when 
you have difficult times when the pandemic or in the school 
day.
    And so just have those conversations with your teens, but 
also take the opportunity to maybe film a video, a TikTok video 
with your teenager to facilitate that conversation and 
recognize that TikTok is not a passive experience. You know, it 
is not just about watching videos, it is about the creativity 
and making and doing fun videos on the app. And that is 
something that we have seen parents come together with their 
teenagers on that. And so I think those three would answer your 
question, Senator.
    Senator Blumenthal. Ms. Miller.
    [Technical problems.]
    Ms. Miller.--prioritize profits over safety.
    Senator Blumenthal. I think we are--maybe you can start 
again. I think we missed the beginning of what you said.
    Ms. Miller. Oh, can you hear me now?
    Senator Blumenthal. Yes.
    Ms. Miller. OK. Sorry, Senator. Thank you for asking this 
question. For YouTube, our mission is to give everyone a voice 
and show them the world. And as such, when users come to 
YouTube, they are--they experience search and discovery of 
finding all types of content. But as a company, we do not 
prioritize profits over safety. We invest heavily in making 
sure that our platforms are safe for our users, and we do not 
wait to act.
    We put systems and practices and protocols in place that 
over time we have relied on when we need to adapt to the world 
around us. For example, we rolled out YouTube Kids in 2015 when 
we realized that those under 13 were trying to get on the 
YouTube Main platform. We created a safe space for them on 
YouTube Kids. In addition, we are a transparent company. We 
rolled out our first quarterly transparency report in 2018. We 
continue to update that on a quarterly basis where we share 
additional metrics.
    We rolled out the violative view rate statistic earlier 
this year. We believe that being a company that, where 
responsibility is our number one priority, where we are 
constantly trying to balance the freedom of expression with 
being responsible, is something that is ingrained in our DNA, 
and we are very proud of the work that we have done but we know 
there is always more for us to do.
    Senator Blumenthal. I am going to leave it to the parents 
of America and the world whether or not they find your answers 
sufficient to distinguish you from Facebook. I know for sure 
with certainty that we need more than just, with all due 
respect, the kind of answers that you have given to persuade me 
that we can rely on voluntary action by any of the companies in 
big tech. And so I think you are going to hear a continuing 
drumbeat.
    And I hope that you will be part of the effort for reform. 
I hope there will be release and disclosure. I want to read in 
that connection a text that just came to my staff, and I 
haven't had the opportunity really to review it myself, but I 
am going to read it. It comes from one of the families whose 
story I shared anonymously, ``I have watched the entire 
hearing. I just heard the Senator refer to our experience. Mr. 
Beckerman is lying that her experience is atypical, that is not 
true.
    My daughter's TikTok feed was inundated with suicide 
ideation videos, self-harm videos, and anorexia videos because 
she was feeling depressed and searched for videos on this 
topic. Then every time she opened the app, there was more. She 
could have been feeling better, but these videos brought her 
down again and inspired her to spend hours making her own 
similar videos. I also just wanted you to know that she is an 
honor student, athlete, president of her student council and 
all around great kid.
    Mr. Beckerman's testimony makes me so angry.'' I say that 
with all due respect, but I just want folks watching to know 
that we are not taking at face value what you have told us, and 
I certainly want to give you an opportunity to respond but you 
and big tech should know, these messages are the real life 
experiences that people are relating to us as we go to town 
meetings, as we talk to them on the phone or by e-mail.
    That is what we are hearing. That is why you are here. And 
I think there are a lot of questions still to be answered here 
and a lot of contentions to be disputed and a lot of work to be 
done in real action, real reform, and I hope Congress will do 
it. So I would give every one of you an opportunity to make a 
closing statement, if you wish or not, and then we will 
conclude the hearing. Ms. Stout.
    Ms. Stout. Chairman Blumenthal, thank you again for the 
opportunity to appear today, and I would absolutely agree with 
you that voluntary measures are not adequate in and of 
themselves, and we have never suggested that voluntary 
regulation is the way forward. We want to partner with this 
committee and with Congress on what those regulations should 
look like.
    And we, as I wrote in my testimony, believe that Congress's 
role in pushing forward regulation in this space is absolutely 
necessary. So thank you for the opportunity to talk to you 
today about how Snapchat has been prioritizing and will 
continue to prioritize the safety and well-being of young 
people on our platform. We do see this as our highest priority 
order, and we will continue to press forward. Thank you.
    Senator Blumenthal. Mr. Beckerman.
    Mr. Beckerman. Thank you, Senator. This is one of the most 
important topics for all of us, particularly as parents, as 
tech companies, and as Congress to address. We all have a 
responsibility to protect our teenagers. None of us can do it 
alone. It can't just be on the parents. It can't just be on us. 
And it certainly can't just be on Congress. But we have to come 
together and solve the issues that exist both on and offline 
for our teenagers.
    It is a responsibility that we at TikTok take very 
seriously and have taken tangible actions and steps to do, and 
will continue to do so. And you know, as we focus today on 
legislation, we support stronger privacy rules to be put in 
place. We have done many of that proactively on our own. We 
support Congress acting on age appropriate design, age 
appropriate content, the things that we have done proactively. 
We would like to see Congress act on that as well. Thank you.
    Senator Blumenthal. Thank you. Ms. Miller.
    Ms. Miller. Senator, thank you very much for inviting me to 
participate in today's hearing. I will just close by saying 
what I said at the top in my opening statement, which is that 
there is no more important issue than the safety of kids 
online, and I am personally committed, our executives are 
committed to working with you and other stakeholders to make 
sure that the experience that kids have online is one that is 
healthy and enjoyable.
    I very much appreciate the concerns that you and your 
colleagues have shared with us. We will continue to work to 
improve and earn the trust of parents and kids and all the 
other stakeholders working together in this important space. 
Thank you very much.
    Senator Blumenthal. Well, I thank each of you for being 
here today. I hope you have grasped the sense of urgency and 
impatience that many of us feel. The time for platitudes and 
bromides is over. We have seen the lobbyists at work, we have 
seen millions of dollars arrayed against us. And I think we are 
determined this time to overcome them.
    And I hope that we will look back on this hearing as a 
turning point along with others that we are going to continue 
to hold in this subcommittee. So I thank you for being here 
today, and this hearing is adjourned.
    [Whereupon, at 1:37 p.m., the hearing was adjourned.]

                            A P P E N D I X

 Response to Written Questions Submitted by Hon. Richard Blumenthal to 
                             Jennifer Stout
    Preventing Child Sexual Exploitation and Grooming. According to the 
National Center for Missing and Exploited Children (NCMEC), 2020 was 
record-breaking for reports of child sexual exploitation, and this year 
has already exceeded that number of reports. In November 2019, I led a 
bipartisan letter to Snap and other tech companies about what you are 
doing to stop child sexual exploitation.

    Question 1. What technologies do you have in place to automatically 
monitor for the grooming and enticement of children and report these 
crimes swiftly? If none, please explain why not.
    Answer. Snapchat is designed in a way that makes it difficult for 
strangers to connect with people they do not know. Snapchatters cannot 
see each other's friend lists, and, by default, you cannot receive a 
message from someone who isn't already your friend. These features 
incorporate user safety right at the start. We also offer easy-to-use 
in-app reporting tools so users can flag concerning activity and our 
team can review reports quickly. We prioritize reports related to child 
safety and act on these as quickly as possible. In addition, we also 
use PhotoDNA and CSAI Match technology to proactively identify and 
report known child sexual abuse imagery on Snapchat. When we identify 
instances of child sexual abuse material (CSAM) on Snapchat, we 
preserve these and report them to the National Center for Missing & 
Exploited Children (NCMEC), who reviews and coordinates with law 
enforcement.

    Question 2. What technologies do you have in place to detect and 
stop predators from inducing children to send explicit images, videos, 
or live-streams of themselves? If none, please explain why not.
    Answer. As referenced in Question 1, we utilize PhotoDNA and CSAI 
Match to proactively identify and report CSAM to law enforcement 
authorities. Snapchat does not offer a live-stream function. We are 
actively exploring technologies to detect novel child sexual abuse 
imagery on Snapchat. We work with a range of expert safety 
organizations and leaders to inform our approach to safety and make 
sure Snapchat is a safe environment for our users. Since 2019, we've 
created and maintained a Parents' Guide to Snapchat, offering parents 
and caregivers a step-by-step resource to understanding and navigating 
the platform. We also offer tools to assist with Snapchatters in 
crisis, and we offer specific resources on topics like non-consensual 
intimate imagery (NCII), and Reporting on Snapchat.

    Question 3. What technologies do you have in place to detect Child 
Sexual Abuse Material (CSAM) videos, both for known CSAM (such as the 
NCMEC hashlist) and new CSAM videos? If none, please explain why not.
    Answer. On the detection side, we use CSAI Match to detect known 
CSAM videos through video-hashing. This technology relies on 
organizations like NCMEC's hash database to prevent revictimization of 
survivors. On the prevention side, we have resources and information 
within the app to educate our community on the risks of sending 
intimate imagery. As referenced in Question 2, we are actively 
exploring technologies to detect novel child sexual abuse imagery on 
Snapchat.

    Question 4. Do you have specific mechanisms for users to report 
child sexual exploitation? If so, please elaborate on them; if not, 
please explain why not.
    Answer. We have an in-app reporting feature that allows for users 
to report any offending content, including reports of child sexual 
exploitation. We prioritize reports related to child safety and act on 
these reports on average within two hours. Our Trust & Safety team 
works 24/7 to review reports and if they find the content violates our 
Community Guidelines for CSAM, the violating material is removed, the 
account is suspended and preserved, and the account and relevant 
material are sent to NCMEC for further investigation and law 
enforcement referral.

    Question 5. Do you include IP addresses in all reports to NCMEC? If 
not, please explain why not.
    Answer. Yes. The Cybertip reports that Snap sends to NCMEC include 
the IP data that corresponds to the account's last successful login, 
whenever that data is available.
    When law enforcement submits appropriate follow-up legal process to 
Snap in the course of investigating a Cybertip, Snap provides all 
available IP data for the requested account for the specified time 
period.

    Question 6. Please provide your policies and protocols on how you 
respond to law enforcement when they reach out for information relating 
to child sexual exploitation reports.
    Answer. There are two common circumstances in which law enforcement 
contacts Snap for information relating to Cybertip reports submitted to 
NCMEC. First, to investigate the Cybertip report, law enforcement may 
serve Snap with authorized legal process, such as a search warrant or 
subpoena, for the account specified in the report. In these instances, 
Snap complies with valid legal processes and provides law enforcement 
all available responsive data.
    Second, law enforcement may submit questions to our Law Enforcement 
Operations team with regard to either the information contained within 
the Cybertip report or the data received pursuant to a subsequently 
served search warrant or subpoena. Although not legally mandated to 
respond in these instances, Snap's policy is to respond to all 
inquiries and endeavor to answer the questions posed based on available 
information, while complying with any applicable legal restrictions on 
voluntary disclosure.

    Snapstreaks and Stopping Addictive Use. Snapstreaks are one of the 
most popular features on Snapchat. Teens have spoken about how 
Snapstreaks are seen as proof of friendships, and they will go as far 
as waking up early, staying up late, or logging in to friends' accounts 
so they can keep up their streaks. It would appear that the purpose of 
this feature is to encourage teens to open Snapchat every day and send 
photos or videos--or else they risk their friendship.

    Question 1. Has Snap studied the impact Snapstreaks have on the 
mental health of children? And has Snap studied whether product 
features like Snapstreaks encourage problematic (or addictive) usage? 
If so, please elaborate on these studies and share them with the 
Subcommittee; if not, please explain why not.
    Answer. No, Snap has not studied the impact Snapstreaks have on the 
mental health of children nor have we conducted research specifically 
measuring the addictiveness of our features. In an effort to understand 
how our community engaged with Streaks and to prevent them from 
becoming more outsized in importance than we intended, we conducted 
research in 2018 with Snapchatters (ages 13-24) to better understand 
how they used Streaks, and how we could address any concerns to improve 
their experiences.
    Our findings showed that our user community generally finds Streaks 
to be a fun and an easy way to stay in touch with close friends. 
Streaks is a feature that was introduced early on to celebrate 
friendships and that represents how often people in a relationship send 
snaps to each other. Streaks are voluntary, not public, and are only 
visible between the two friends who have a streak. Streaks also do not 
represent time spent on the app. An emoji reflects the number of days 
you've been in contact with a friend through the exchange of Snaps. 
There is no nudge encouraging a user to jump back into the app and we 
do not utilize push notifications.
    We are sharing those research findings (see Document 1--Streaks 
Topline August 2018). And according to the research, two-thirds of our 
community said they don't feel it is very important to keep Streaks 
going. Further, the majority of our community did not indicate Streaks 
were a significant source of stress--but six percent did. Even though 
our user research indicated streaks were not a major source of stress 
for our community, we still took action to reduce any possible anxiety 
they could cause. We did this in three ways: 1) By deemphasizing the 
prominence of the Streaks emoji; 2) By lengthening the expiration 
window for a Streak from 6 hours to 12 hours, which resulted in a 50 
percent reduction in user requests to restore Streaks; and 3) Allowing 
people an automated and speedy way to restore their own Snapstreaks 
online, which further reduced pressure. We will continue to explore 
ways to ensure Streaks serve the best interest of our community and 
remain fun, and lighthearted ways to connect with your friends. In the 
near term, we are planning to roll out new features that will allow 
users to have more control over Streaks and to pause them if they need 
a break.

    Question 2. Snapstreaks create conditions where kids as young as 13 
are obsessively opening Snapchat as a proof of their friendship. Why 
has Snap decided to keep this feature for young teens, despite its 
steps to reduce addiction elsewhere?
    Answer. See answer to Question 1.

    Question 3. To test this feature, my staff created an account for a 
thirteen-year-old. We added the real account of a staff member of my 
office as a friend. We found that Snapchat started suggesting the 
account for the thirteen-year-old to the adult friends of our staff 
member. Why did it provide that recommendation?
    Answer. We have designed Quick Add with a focus on safety and 
privacy, and as part of that we have engineered the feature to focus on 
helping people find their real-life friends. It does so by focusing on 
existing connections among friends. If the account created for a 
hypothetical thirteen year-old user granted Snapchat permission to 
access their device's contacts, then these contacts found in that 
user's phone may have been the reason for the friend suggestion.
    Quick Add also contains other privacy-centric features. For 
example, we give users the ability to turn Quick Add off so that users 
who do not want to appear as friend suggestions to other users have the 
ability to control how they appear.
    Finally, we are intensely focused on continuing to make Quick Add 
even safer and more privacy-centric, especially for Snapchatters ages 
13-17. As part of that ongoing effort, we are currently rolling out a 
change for minors: To be discoverable in QuickAdd by someone else, they 
will need to have a certain number of friends in common with that 
person. That will further ensure that the person is their real-life 
friend.
                                 ______
                                 
   Response to Written Questions Submitted by Hon. Amy Klobuchar to 
                             Jennifer Stout
    Social Media and Mental Health. According to press reports, Snap 
conducted user research on how often its users ``experience feelings of 
stress, depression, and anxiety.'' \1\
---------------------------------------------------------------------------
    \1\ https://www.fastcompany.com/90461846/snapchat-introduces-new-
interventions-for-mental-health

    Question 1. Please provide a copy of those research studies.
    Answer. We have conducted research to better understand how 
Snapchatters feel about our app, and their overall relationship to 
mental health--which we did to understand what resources we could build 
to better support them.
    According to research we conducted in 2019, 95 percent of 
Snapchatters said the app makes them feel happy, more than any other 
service tested. This research was released publicly.
    Additional research we conducted in 2019 showed that spending time 
with friends, whether in person or online, is the best defense against 
feeling lonely or depressed and that friends are often a first port of 
call for those struggling with a mental health challenge. We used the 
findings of this research to develop an in-app portal called Here For 
You, which surfaces resources from expert organizations to Snapchatters 
when they search for a range of mental health-related topics. We shared 
the findings of our research publicly at the time that we rolled out 
this feature. We are attaching it for you.

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

    See attached Document 2 (Teens and Young Adults Research Finds 
Mental Health Top of Mind Among Young People).

    Question 2. Please list the titles of all research Snap has 
conducted in the last 18 months about mental health and its users.
    Answer. Below please find the titles of research Snap has 
concerning the effect of Snapchat on mental health and emotions of its 
users.

  1)  Teens and Young Adults Research Finds Mental Health Top of Mind 
        Among Young People (Document 2)

  2)  Apposphere: How the Apps You Use Impact Your Daily Life and 
        Emotions (Documents 3 and 4)

  3)  Fentanyl Research (Documents 5 and 6)

  4)  Coping During Covid: How the Pandemic Changed Video Consumption 
        (Document 7)

  5)  Family Online Safety Research, available at Managing the 
        Narrative: Young People's Use of Online Safety Tools

  6)  Demystifying the Snapchat Generation (Document 8)

  7)  Next Gen: The Reinvention of Friendship (Document 9)

  8)  Foundational Research (Document 10)

  9)  The Friendship Report (Documents 11 and 12)
                               Document 2

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 3

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 4

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 5

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 6

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 7

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 8

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                               Document 9

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                              Document 10

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

                              Document 11

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


                              Document 12

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


    Question 3. At the time a new feature was rolled out by Snap in 
2020, a representative of Snap, Inc. was quoted as saying, ``sometimes 
people come in and search for this kind of content, be it anxiety or 
depression.'' \2\ Is Snap tracking how often people search for content 
related to teens' mental health on the app? If so, please provide that 
data.
---------------------------------------------------------------------------
    \2\ https://www.fastcompany.com/90461846/snapchat-introduces-new-
interventions-for-mental-health
---------------------------------------------------------------------------
    Answer. When any user searches for specific mental-health related 
terms like ``depressed,'' ``suicide,'' ``anxiety'', and ``self care'' 
they will be led to our mental health and body positivity portal, Here 
For You. In the past 30 days (from November 15, 2021) the Here For You 
portal was accessed a few thousand times. We minimize the collection 
and retention of data related to all search queries, and do not 
specifically conduct analysis on queries related to mental health.

    Social Media and Eating Disorders. Studies have found that eating 
disorders have one of the highest mortality rates of any mental 
illness.\3\ Researchers have found a ``clear pattern of association'' 
between social media use on apps like Snapchat and disordered eating 
patterns in teens.\4\
---------------------------------------------------------------------------
    \3\ https://anad.org/get-informed/about-eating-disorders/eating-
disorders-statistics/
    \4\ Wilksch, S. M., O'Shea, A., Ho, P., Byrne, S., & Wade, T. D. 
(2020). The relationship between social media use and disordered eating 
in young adolescents. International Journal of Eating Disorders, 53(1), 
96-106.

    Question 4. Are you aware of any research studies your company has 
conducted about how your app may push content promoting eating 
disorders to teens? If so, please provide those studies.
    Answer. We have not conducted any research studies on this issue. 
The promotion and glorification of eating disorders is a violation of 
our Community Guidelines and we remove all content that we find that 
violates this rule. Because Discover, our closed content platform where 
Snapchatters get their news and entertainment, is curated and 
moderated, there is little opportunity for users to be exposed to 
harmful viral content, including content promoting eating disorders. If 
Snapchatters search for terms that suggest interest in eating disorder 
content, they are directed to Here For You, which delivers users with 
information about eating disorder signs and symptoms in partnership 
with organizations such as the National Eating Disorders Association.

    Algorithms for Content Moderation. Frances Haugen disclosed that 
Facebook has been misleading the public about how well it can detect 
and remove harmful content.\5\ Facebook's internal documents estimated 
that their automated systems removed less than one percent of 
prohibited violent content.'' \6\
---------------------------------------------------------------------------
    \5\ https://www.wsj.com/articles/facebook-ai-enforce-rules-
engineers-doubtful-artificial-intelli
gence-11634338184
    \6\ https://www.wsj.com/articles/facebook-ai-enforce-rules-
engineers-doubtful-artificial-intelli
gence-11634338184

    Question 5. Has Snap collected any data about how often Snap's 
automated content detection systems failed to capture content on 
Snapchat that violates the company's terms of service in the last 18 
months? If so, please provide that data.
    Answer. We have some data about the prevalence of content that has 
evaded automatic detection from the broadcast areas of our app. User-
generated content that fails automatic moderation on our Spotlight and 
Discover platforms moves to a secondary phase of human moderation for 
further review. Approximately 0.09 percent of public user-generated 
stories on Discover are incorrectly auto-approved. For this 0.09 
percent, Snap has teams around the world that review Snapchatter 
reports and take down such content.
    We do not have data on our private content areas, but anyone can 
report inappropriate content for 24/7 review. We are constantly 
improving our ability to capture violating content by training our 
automation with our manual team's decisions. We also utilize 
technologies such as PhotoDNA and CSAI Match as described above that 
automatically detect known child sexual abuse material for 
investigation and referral to NCMEC and law enforcement.

    Advertising and Teenagers. Snap earns revenue from advertising.

    Question 1. Please provide copies of the 100 most popular ads, in 
terms of viewership, among all users on your platform in the last year.
    Answer. See attached spreadsheet (``Snap--Top 100 Most Popular 
Ads--Confidential'') with links to the ads. Some do not have links 
because they are dynamic product ads and do not have a set ad but 
rather, are generated from a product catalog at the time of viewing.

    Question 2. Please also provide copies of the 100 most popular ads, 
in terms of viewership, among teenagers on your platform in the last 
year.
    Answer. See attached spreadsheet (``Snap--Top 100 Most Popular 
Ads--Confidential'') with links to the ads. Some do not have links 
because they are dynamic product ads and do not have a set ad but 
rather, are generated from a product catalog at the time of viewing.

    Question 3. Snap reported that its advertising revenue per user in 
North America for the second quarter of 2021 was $7.37.\7\ How much of 
Snap's revenue in the second quarter of 2021 came from users under the 
age of 18?
---------------------------------------------------------------------------
    \7\ https://investor.snap.com/news/news-details/2021/Snap-Inc.-
Announces-Second-Quarter-2021-Financial-Results/default.aspx
---------------------------------------------------------------------------
    Answer. We estimate that 13.3 percent of Q2 2021 North America 
revenue came from users between the ages of 13 and 17.

    Children on Social Media. Snap states that it prohibits children 
under 13 from creating accounts on the Snapchat app.

    Question. How many under 13 accounts has Snap removed in the last 
year?
    Answer. In the past 365 days (as of November 15, 2021), we've 
deleted 2,568 accounts for being under 13. In the last 30 days, 
approximately 3 percent of devices that attempted to register for a 
Snapchat account failed registration due to being under 13.
                               Attachment

              Snap--Top 100 Most Popular Ads--Confidential
------------------------------------------------------------------------
 rank   ad--account--name   organization--name       download--link
------------------------------------------------------------------------
1      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7771c2fe-
                                                 e3f5-4e8c-9a5e-
                                                 0f8652da8339/2e18b0b5-
                                                 d02a-40d5-a54a-
                                                 67f734a0aba5.png
2      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/3aa0d6e3-
                                                 7dd5-4e56-9a29-
                                                 6cc8e7fdac43/0827a4bf-
                                                 d1d3-4b82-af1d-
                                                 93de7d09ea09.png
3      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1b5d719b-
                                                 b010-472a-bf63-
                                                 48a3382f5fc0/3f986958-
                                                 5ffc-4abc-ad8d-
                                                 048579bafe2f.mp4
4      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/a45dbbc7-
                                                 9122-493a-a984-
                                                 d33e90348336/8580f3b1-
                                                 2cbb-4c47-912e-
                                                 c90d330f5a3b.png
5      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/0fcf411d-
                                                 8348-4040-94bc-
                                                 2c0356a9f17b/ca248a50-
                                                 f2ea-44dd-b490-
                                                 91bf114ffd57.lens
6      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/bfc78fc6-
                                                 90fc-45b5-9e7a-
                                                 95beb63c3435/cc03afdf-
                                                 a9a9-46f9-aae0-
                                                 ce4a3025e0d8.png
7      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/a45dbbc7-
                                                 9122-493a-a984-
                                                 d33e90348336/8580f3b1-
                                                 2cbb-4c47-912e-
                                                 c90d330f5a3b.png
8      Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/a6bf681c-
                                                 77f6-4e1f-b752-
                                                 c30edbf52f79/55f485a5-
                                                 efc9-4f74-afcc-
                                                 d2c73ede93f3.mp4
9      Red Ventures--      Red Ventures LLC     https://
        Arcade                                   storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/0ae8e729-
                                                 2d26-4381-a813-
                                                 da7b15c372b1/a3eb119d-
                                                 4271-40e7-bb54-
                                                 c5d960308c48.mp4
10     Discovery           Group Nine Media     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/329d1bae-
                                                 1b7f-4fa7-815b-
                                                 7a7efc61b8e7/3a3c10ac-
                                                 cde8-4d0d-b8a2-
                                                 6e2504924b65.mp4
11     U.S. Air Force and  GSD&M LLC            https://
        Space Force                              storage.googleapis.com/
        Recruiting                               ad-manager-creatives-
                                                 production-us/bdbcdc7a-
                                                 39c1-4e80-92f2-
                                                 c856419ae280/7ba5731b-
                                                 ef09-4be7-829e-
                                                 a0493d5260c1.mp4
12     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/4fda0329-
                                                 bb62-41ee-9c70-
                                                 acb8b9b13651/534dbce5-
                                                 47cc-46cd-a12c-
                                                 d3ace7f8567c.png
13     Nike--WITHIN--iOS1  Nike, Inc.           https://
        4                                        storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/b009bf46-
                                                 1031-4443-944c-
                                                 2386ac403ad8/d0a2ca30-
                                                 5dad-4b39-935a-
                                                 371ffd61fc4d.png
14     Twitch              Twitch Interactive,
        Interactive, Inc.   Inc.
        Self Service
15     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7771c2fe-
                                                 e3f5-4e8c-9a5e-
                                                 0f8652da8339/2e18b0b5-
                                                 d02a-40d5-a54a-
                                                 67f734a0aba5.png
16     Adobe ROI           Adobe                https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/50120850-
                                                 5df7-4bf6-a892-
                                                 a89c59582bb9/ff673366-
                                                 bcce-4722-90ae-
                                                 bdcd18855b26.mp4
17     Kellogg--PTT        Starcom US           https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/376cf943-
                                                 ac25-4888-9226-
                                                 e618bdffdfe9/8a398bed-
                                                 f745-4f1e-aca2-
                                                 ddf3dc2b2aad.mp4
18     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/eb4b3772-
                                                 5d50-46ea-89a3-
                                                 95bc574818fe/7577a499-
                                                 d7a2-4d53-920b-
                                                 b130284c1e14.mp4
19     Red Ventures--App   Red Ventures LLC     https://
        Store                                    storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/0ad57947-
                                                 d7b0-4bdd-b05a-
                                                 83833cd34294/6344ce54-
                                                 89cb-459d-bb52-
                                                 0d1355547a53.png
20     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/5f1c513d-
                                                 2696-42d8-b13a-
                                                 1621e4592f43/8955cd6d-
                                                 c5f1-427d-9634-
                                                 f264fff4f0c7.png
21     Discovery           Group Nine Media     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/652cfefd-
                                                 1680-4fd3-9ffd-
                                                 ada2695aaa40/bcd8a491-
                                                 e6f6-4560-9eb8-
                                                 c7874e7a5102.mp4
22     Shein-ZOETOP-EC     Baidu (Hong Kong)    https://
        APP-Adtiger-??-     Limited              storage.googleapis.com/
        APP-IOS14                                ad-manager-creatives-
                                                 production-asia/
                                                 9ed4efcf-000d-4eba-ba1b-
                                                 16c050aba582/6978752a-
                                                 73af-42e5-8113-
                                                 3c10bd420bb6.png
23     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7692cc5d-
                                                 94e4-4505-807e-
                                                 55e65293e465/d4f129cb-
                                                 d3d1-441f-adaa-
                                                 2231951c79d5.png
24     Discord Inc. Self   Discord Inc.         https://
        Service                                  storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/820783e3-
                                                 168d-4c93-9ffb-
                                                 ddad75a15335/8785038d-
                                                 631f-40fa-9262-
                                                 82e9405e9a5f.mp4
25     Target--BrandCampa  Target Corporation   https://
        ign                                      storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1cd42b6b-
                                                 888f-4c42-b37d-
                                                 c4ed4a92d8e2/1d162001-
                                                 cf6b-4251-aaae-
                                                 bfd617155126.mp4
26     Kellogg--PTT        Starcom US           https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/e4272562-
                                                 c375-4ed1-ba70-
                                                 b56ccd902cdc/4e1e093c-
                                                 4b12-4a2c-b47c-
                                                 76784a77b6b1.mp4
27     Discovery           Group Nine Media     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/d6ccf943-
                                                 38be-4fe5-8ae0-
                                                 7cc3d6bd8f5d/9f8b4a79-
                                                 89d5-4bd6-81bb-
                                                 ffdfb0520dfd.mp4
28     PoshmarkQ4          Poshmark, Inc.       https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/07896cb6-
                                                 440d-4b6d-98fc-
                                                 1f2e09975abb/8eccb0ee-
                                                 fd19-4c34-8b88-
                                                 e45be1e1f527.png
29     [J&J] Clean &       VaynerMedia          https://
        Clear                                    storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/3657612c-
                                                 817e-4443-948c-
                                                 6979f10cfd7e/12c4d744-
                                                 7c26-4c09-91c7-
                                                 8d43b48dea91.png
30     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/3793b5eb-
                                                 ad3f-45b7-80f5-
                                                 79aad76d9695/2cc8900a-
                                                 257b-4ece-bfcf-
                                                 d13402ca8f95.png
31     FY21--SOC--SC--McD  Omnicom Media Group  https://
        onald's--GCM--MMJ   Holdings Inc.        storage.googleapis.com/
        -GMA iOS 14.5                            ad-manager-creatives-
        [All Segments]                           production-us/c54261c5-
                                                 85a8-4617-885f-
                                                 cb4464c8e22a/360e807d-
                                                 ea2f-4b4f-8b00-
                                                 7247ee7c14f9.png
32     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/0eaeddbe-
                                                 8c53-4592-836f-
                                                 4624f6e52f18/7b987767-
                                                 a3a4-455c-965a-
                                                 f3a4c8932170.mp4
33     Axe--Masterbrand--  Unilever US--        https://
        The Rock--2021      Mindshare            storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/c8195b5d-
                                                 1fd8-4d5a-980c-
                                                 a70fb49c84a6/83bc3c78-
                                                 607e-4664-9de5-
                                                 cd6a154166d2.mp4
34     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/5aae8937-
                                                 ba7a-4693-b9a3-
                                                 ab68e994a06e/efb1ea17-
                                                 e159-4cb3-8ad7-
                                                 eb4a59f11438.png
35     FY21--SOC--SC--McD  Omnicom Media Group  https://
        onald's--GCM--MMJ   Holdings Inc.        storage.googleapis.com/
        -GMA iOS 14.5                            ad-manager-creatives-
        [All Segments]                           production-us/0122acf9-
                                                 5190-41ad-9a8d-
                                                 8ed1ce7c0dc8/fa09f75e-
                                                 1ef0-44ce-9deb-
                                                 c0c1f3ee74bd.png
36     MS--CRL--TSTCRNCH-  General Mills        https://
        -F22--Social                             storage.googleapis.com/
        (G1M--CTC--013)                          ad-manager-creatives-
                                                 production-us/09768189-
                                                 daa6-4a62-9d96-
                                                 13c5ce4efdf1/846483e0-
                                                 7d39-4321-af81-
                                                 16ff7db1b038.mp4
37     FY21--SOC--SC--McD  Omnicom Media Group  https://
        onald's--GCM--MMJ   Holdings Inc.        storage.googleapis.com/
        -GMA iOS 14.5                            ad-manager-creatives-
        [All Segments]                           production-us/12c23e6c-
                                                 0e97-42b5-aa72-
                                                 b1b6e9fbc020/5e6c3cc3-
                                                 9081-4c22-9c45-
                                                 afa4aceb96b2.png
38     Twitch              Twitch Interactive,  https://
        Interactive, Inc.   Inc.                 storage.googleapis.com/
        Self Service                             ad-manager-creatives-
                                                 production-us/e36a2270-
                                                 b668-4c35-bed6-
                                                 19862790b198/253a8bcc-
                                                 14d5-473d-8dc2-
                                                 96162361d3db.png
39     Hershey's | US |    UM NY--Hershey       https://
        Jolly Rancher       Social               storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/bafdaad5-
                                                 8fe4-4323-8147-
                                                 d92bfa3ecc2d/2f48cfdb-
                                                 e96b-4a01-b089-
                                                 1d8139f351f5.mp4
40     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7ce768f5-
                                                 5c94-4846-939a-
                                                 f0dfbf50ee4d/373b1452-
                                                 f2f8-41ac-a7ac-
                                                 aa543d148d18.mp4
41     PoshmarkiOSWomen    Poshmark, Inc.       https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/3342f5d6-
                                                 e175-4df4-86bb-
                                                 d982a2be1d6a/4b4d52bb-
                                                 eed2-43d5-babc-
                                                 ca6c2e5304b9.png
42     Hershey's | US |    UM NY--Hershey       https://
        Jolly Rancher       Social               storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/bafdaad5-
                                                 8fe4-4323-8147-
                                                 d92bfa3ecc2d/2f48cfdb-
                                                 e96b-4a01-b089-
                                                 1d8139f351f5.mp4
43     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/bfc78fc6-
                                                 90fc-45b5-9e7a-
                                                 95beb63c3435/cc03afdf-
                                                 a9a9-46f9-aae0-
                                                 ce4a3025e0d8.png
44     Hershey's | US |    UM NY--Hershey       https://
        Jolly Rancher       Social               storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/20e58fcb-
                                                 63e9-4b09-b5d1-
                                                 6d40cb0dc471/e90acdd6-
                                                 a906-41dd-b963-
                                                 ad730437c596.mp4
45     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/fd0c2de1-
                                                 0b4b-402a-900b-
                                                 c1f1c35c9f0f/311b310b-
                                                 c237-4693-8667-
                                                 14bf4ec1b05a.jpg
46     Vans Self Service   Vans                 https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1be8e123-
                                                 9f70-4e82-9b9d-
                                                 cac0faf464fa/b6386336-
                                                 1296-4b85-82c6-
                                                 06ff25e8ee00.MOV
47     [J&J] Clean &       VaynerMedia          https://
        Clear                                    storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/a26305a8-
                                                 3468-49ce-8002-
                                                 434681f8cb07/07524f75-
                                                 98a7-4169-948b-
                                                 f52680b67764.png
48     Cl:MDLZ--Br:SOUR    VaynerMedia          https://
        PATCH KIDS--Sub-                         storage.googleapis.com/
        Br:BASE--Cat:CONF                        ad-manager-creatives-
        ECTIONS--Sub-                            production-us/3f67e547-
        Cat:CANDY--Cur:US                        3b06-4857-959d-
        D--Geo:USA--Free:                        3778a85ae13e/33105423-
        SC                                       dd15-42da-b411-
                                                 1bd7f8abaeef.mp4
49     Discovery           Group Nine Media     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/f229e8bf-
                                                 dc7b-4841-a812-
                                                 8e8b90519b0f/c9fdab5f-
                                                 6b78-496b-883a-
                                                 b4867fff3772.mp4
50     MS--CRL--Extramore  General Mills        https://
        dinary--F21--Soci                        storage.googleapis.com/
        al (G1M--CTC--                           ad-manager-creatives-
        011)                                     production-us/8555d870-
                                                 aee3-45f7-98f8-
                                                 b577288b9c65/718c0402-
                                                 28d8-4d36-9177-
                                                 eee60a11e71e.mp4
51     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/2312d223-
                                                 4e58-4ae8-8101-
                                                 2c2d3626de28/d707d141-
                                                 3dd1-43a6-ab57-
                                                 a524536bbf60.png
52     Hershey's | US |    UM NY--Hershey       https://
        Jolly Rancher       Social               storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/20e58fcb-
                                                 63e9-4b09-b5d1-
                                                 6d40cb0dc471/e90acdd6-
                                                 a906-41dd-b963-
                                                 ad730437c596.mp4
53     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/6db9e1a5-
                                                 40fd-4ee0-900c-
                                                 533be4bf4ccd/71af20ba-
                                                 872c-4d9f-a317-
                                                 a23505186733.jpg
54     Schiefer ChopShop   Vans                 https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/cebaf9c5-
                                                 2604-45db-8c02-
                                                 8244b5ccbd23/a61f05b4-
                                                 3fde-41e2-9647-
                                                 73e4c2042d83.png
55     SDC--U.S Core       SmileDirectClub LLC  https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/c8621bb5-
                                                 ba33-47c2-a00c-
                                                 1104459f1776/b6a31a97-
                                                 a9d3-4d9c-b1c6-
                                                 c713ca2a67e4.mp4
56     MS--MLS--TOTFRZNHT  General Mills        https://
        SNCKS--Totino's                          storage.googleapis.com/
        Teen--F21 (G1M--                         ad-manager-creatives-
        TOT--020)                                production-us/b8bae064-
                                                 5812-4986-a9cd-
                                                 926936638970/ccf21fe3-
                                                 d9d6-4264-9a5b-
                                                 fec835c1fecc.mp4
57     Narrative--Episode  Pocketgems           https://
        -iOS14                                   storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/9fd61950-
                                                 73de-47ca-8e7e-
                                                 de65ebf55f41/7a0f8367-
                                                 6af3-4712-8472-
                                                 38cbc6d37cfc.png
58     MS--CRL--TSTCRNCH-  General Mills        https://
        -F22--Social                             storage.googleapis.com/
        (G1M--CTC--013)                          ad-manager-creatives-
                                                 production-us/b67a214c-
                                                 1658-4948-86da-
                                                 85cc3df7b77b/a5f70b69-
                                                 9d78-4320-8872-
                                                 c5e915a93707.mp4
59     truth campaign in-  Truth Initiative     https://
        house ad account    Foundation           storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/63d16b85-
                                                 8f7b-4ba4-837f-
                                                 937856441430/bc7669c2-
                                                 a8f2-4acb-a380-
                                                 6f4881a826ae.png
60     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/fc35c500-
                                                 0165-4c06-ac2f-
                                                 69c0885160b3/c47a345d-
                                                 6f38-43cd-8b83-
                                                 432a76e46dc3.png
61     Schiefer ChopShop   Vans                 https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/0ec066b4-
                                                 4767-480c-be4a-
                                                 913f16dad080/7fce8fd3-
                                                 4dfe-4e78-a77d-
                                                 2a2773f72cf0.png
62     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/d1fe1329-
                                                 6c15-4abc-91ba-
                                                 482267d346a8/379462dd-
                                                 c0ce-43ff-9b0f-
                                                 c42ce343eb9e.mp4
63     PoshmarkiOSWomen    Poshmark, Inc.       https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/d80618ff-
                                                 ded4-40ea-9d1d-
                                                 fa7e79f4ccee/a40d49a2-
                                                 9456-4aca-bfac-
                                                 6af581daa540.png
64     Curology            Curology, Inc.       https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/75c18aeb-
                                                 d9f0-46f5-829b-
                                                 a57c91afd4ef/fcb503da-
                                                 4add-4743-8efe-
                                                 d1a069e62f0c.mp4
65     Nike--WITHIN        Nike, Inc.           https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/eafa9847-
                                                 690f-41e9-801f-
                                                 6335a0ea7c8b/9b327822-
                                                 eb55-4984-b811-
                                                 77ce5b3ffce9.png
66     FY21--Q1-Q2--       Omnicom Media Group  https://
        McDonald's--GCM--   Holdings Inc.        storage.googleapis.com/
        GMA 2021--MMJ--                          ad-manager-creatives-
        GM3--205&206                             production-us/eea8bbcb-
                                                 1b48-44ee-a3e3-
                                                 22e9580073ea/6fcdb8fc-
                                                 a732-444f-9162-
                                                 3d8b45896361.png
67     Marine Corps 2019   Mindshare            https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/c14a8309-
                                                 ffe2-41b7-9c53-
                                                 bf8c4f77d1a7/a8b08667-
                                                 0421-4535-8aaf-
                                                 13807d483479.png
68     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1290a0e4-
                                                 9d67-47ab-bd67-
                                                 6f70b9e9ea93/7922e7da-
                                                 06b1-4373-a017-
                                                 6cdd7e90a436.mp4
69     Target--BrandCampa  Target Corporation   https://
        ign                                      storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/cbccb41e-
                                                 a211-49dd-b071-
                                                 dc7c8dac0db5/7c6510b9-
                                                 b89d-4fcb-8a6d-
                                                 9ad4f94b95fd.mp4
70     Tubi (IOS)          Tubi TV              https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7655a4ff-
                                                 5ba8-47f2-8be1-
                                                 f9d01a959ee6/f8cf97e5-
                                                 3c13-436e-b123-
                                                 f70b1db0df7c.mp4
71     Gatorade--2021--Ga  Omnicom Media Group  https://
        torade--Gatorade    Holdings Inc.        storage.googleapis.com/
        Portfolio--Q3Bran                        ad-manager-creatives-
        d--PGD CE 250                            production-us/e5f4f712-
                                                 b447-495f-b18b-
                                                 f4eedff4b4dd/714f40a8-
                                                 270f-4f14-8cdd-
                                                 65fa1497f630.lens
72     Snow                Structured Social    https://
                            LLC                  storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/25a46c6e-
                                                 46f7-4042-a90a-
                                                 6365c6f4d2d4/39278f93-
                                                 a122-41f6-906f-
                                                 a0718a11eb06.png
73     MS--MLS--TOTFRZNHT  General Mills        https://
        SNCKS--Totino's                          storage.googleapis.com/
        Teen--F21 (G1M--                         ad-manager-creatives-
        TOT--020)                                production-us/1674d695-
                                                 8c58-4ec2-b068-
                                                 f5fb4b710680/82ce9af7-
                                                 33a7-406e-a193-
                                                 680ab30c5bfd.mp4
74     MS--CRL--Extramore  General Mills        https://
        dinary--F21--Soci                        storage.googleapis.com/
        al (G1M--CTC--                           ad-manager-creatives-
        011)                                     production-us/909e77ba-
                                                 a556-42cf-a6ec-
                                                 cddabd249bab/d35fb1ff-
                                                 5043-4b17-b430-
                                                 379a7182b0e8.mp4
75     Hershey's | US |    UM NY--Hershey       https://
        Reese's             Social               storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/31e646bd-
                                                 29f9-4a2e-84a8-
                                                 211758b7ad91/65cba178-
                                                 f539-436b-9784-
                                                 91d9fdf30e16.mp4
76     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/c8250784-
                                                 47bf-491b-bce7-
                                                 f2e210a61e98/1d82bc92-
                                                 78da-4cf4-b8f4-
                                                 97a6bfcadf47.png
77     Target--BrandCampa  Target Corporation   https://
        ign                                      storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1cd42b6b-
                                                 888f-4c42-b37d-
                                                 c4ed4a92d8e2/1d162001-
                                                 cf6b-4251-aaae-
                                                 bfd617155126.mp4
78     Twitch              Twitch Interactive,  https://
        Interactive, Inc.   Inc.                 storage.googleapis.com/
        Self Service                             ad-manager-creatives-
                                                 production-us/e36a2270-
                                                 b668-4c35-bed6-
                                                 19862790b198/253a8bcc-
                                                 14d5-473d-8dc2-
                                                 96162361d3db.png
79     TYROO--SS--SnA--IE  Smile Internet       https://
        C-WSP--DIRECT--VN   Technologies PTE.    storage.googleapis.com/
                            Limited              ad-manager-creatives-
                                                 production-asia/
                                                 0df8c884-76f4-4b8e-aa85-
                                                 c3218919ff06/a4d9e911-
                                                 7251-439f-b462-
                                                 f3fc38f3a10e.png
80     Discord Inc. Self   Discord Inc.         https://
        Service                                  storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/db2fbc85-
                                                 b8cc-437d-ac92-
                                                 c0cf9f6ed40e/12792900-
                                                 cde1-48ff-819a-
                                                 402ceb59fad7.mp4
81     Kellogg--PTT        Starcom US           https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/b4d2e576-
                                                 b6cb-4f37-b142-
                                                 ca889008b346/68212318-
                                                 7dca-4007-8ab5-
                                                 d36a89117731.jpg
82     FY21--Q1-Q2--       Omnicom Media Group  https://
        McDonald's--GCM--   Holdings Inc.        storage.googleapis.com/
        GMA 2021--MMJ--                          ad-manager-creatives-
        GM3--205&206                             production-us/4ef907e8-
                                                 06fe-4be0-8608-
                                                 1e19d6c40537/fa812bd3-
                                                 fa3e-49c3-9eb9-
                                                 eb04f25c7474.png
83     2020                Subway Restaurants   https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/601e8634-
                                                 c115-4306-9cf9-
                                                 48adffd7a2d3/a23c146f-
                                                 b97f-45f9-bc9d-
                                                 0484ab50a352.mp4
84     Narrative--Episode  Pocketgems           https://
        -iOS14                                   storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/9183b902-
                                                 3b5a-4ab3-9287-
                                                 6fc010a1b147/a56c61c3-
                                                 1315-4a43-bc51-
                                                 d3a1df4300a6.png
85     Red Ventures--      Red Ventures LLC     https://
        Apple Music                              storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/4261f3b2-
                                                 3930-4a3a-ba61-
                                                 25e1976a5223/f7a1c8d6-
                                                 bd9b-4459-b5f9-
                                                 21499d49e355.png
86     Tubi (IOS)          Tubi TV              https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7655a4ff-
                                                 5ba8-47f2-8be1-
                                                 f9d01a959ee6/f8cf97e5-
                                                 3c13-436e-b123-
                                                 f70b1db0df7c.mp4
87     PoshmarkiOSWomen    Poshmark, Inc.       https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/82ce237e-
                                                 5a60-495c-a175-
                                                 7e41dcd989f3/730465ad-
                                                 7d60-4071-a6d5-
                                                 de001a40d4fc.png
88     Discovery           Group Nine Media     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/cbf02b63-
                                                 e0a3-4d35-890b-
                                                 be08beae584e/e4432b37-
                                                 0811-4698-812d-
                                                 e16f17bdb88f.mp4
89     Nike--WITHIN        Nike, Inc.           https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/f05ecd32-
                                                 a4c4-40e4-9ac3-
                                                 104a22da4fc8/36222309-
                                                 9afd-4f83-85be-
                                                 76ad14d6949b.png
90     PoshmarkQ4          Poshmark, Inc.       https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/00e231df-
                                                 0b9f-490b-9174-
                                                 c8aef9cb9d2d/e8ef89bb-
                                                 8255-401e-a6d0-
                                                 b37f85ab44ea.png
91     KOLLAB DOS          KOLLAB STUDIOS LLC   https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/726a7871-
                                                 0739-413b-8264-
                                                 24fd89dde1e0/d8d8d362-
                                                 4c0b-4165-a212-
                                                 5f7da7f709ff.MOV
92     KO--NASC--Sprite--  Universal McCann     https://
        2018--present       New York             storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/96d81533-
                                                 907e-4bbe-830a-
                                                 5d9134a63193/e6b8a9b0-
                                                 a147-42cb-b03e-
                                                 185f0e0d0ac2.mp4
93     Invisalign          Align Technology     https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1dcd7cf5-
                                                 0ed3-4486-8284-
                                                 eed6c8613bd1/f7048fe7-
                                                 f1d1-4d97-9c25-
                                                 89a03029a451.png
94     Kellogg--PTT        Starcom US           https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/fd5ac2d5-
                                                 6a6b-4d16-81ad-
                                                 4704e3d08eec/d0283e27-
                                                 ec90-4d0f-8fe4-
                                                 18479fd896e9.mp4
95     Nike--WITHIN--iOS1  Nike, Inc.           https://
        4                                        storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/eb1495c1-
                                                 4782-4736-bee2-
                                                 3e2a5638477c/e1cc15c8-
                                                 ea53-4f11-a897-
                                                 85c570c5e6ba.mp4
96     SDC--U.S Core       SmileDirectClub LLC  https://
                                                 storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/7608c5fc-
                                                 977b-42f0-8068-
                                                 0b0bcfa5d288/c095350b-
                                                 0a33-482f-9fd5-
                                                 ea5d6cd641ea.mp4
97     Narrative--Episode  Pocketgems           https://
        -iOS14                                   storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/791c339a-
                                                 48bc-4f18-b7eb-
                                                 a29a417d9b98/bd5771e0-
                                                 1c60-4b24-a838-
                                                 b27eb930de43.png
98     KO--NASC--Fanta--2  Universal McCann     https://
        019--present        New York             storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/1303f62c-
                                                 5681-48e8-ba08-
                                                 11956b41234d/9cab6bd8-
                                                 0ac0-45bb-8768-
                                                 577de9caf871.mp4
99     PetSmart Inc Self   PetSmart Inc         https://
        Service                                  storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/3659b93d-
                                                 e0db-4a8f-bf93-
                                                 5a63817f9c94/acd79865-
                                                 76c4-426b-8976-
                                                 e53b239fde1f.mp4
100    PetSmart Inc Self   PetSmart Inc         https://
        Service                                  storage.googleapis.com/
                                                 ad-manager-creatives-
                                                 production-us/f8aa6c8f-
                                                 de87-41b5-8a10-
                                                 c7f3854175dd/da301f17-
                                                 5e28-41c1-94c6-
                                                 0844193262a4.mp4
------------------------------------------------------------------------

                                 ______
                                 
 Response to Written Questions Submitted by Hon. Richard Blumenthal to 
                           Michael Beckerman
    Preventing Child Sexual Exploitation and Grooming. According to the 
National Center for Missing and Exploited Children (NCMEC), 2020 was 
record-breaking for reports of child sexual exploitation, and this year 
has already exceeded that number of reports. In November 2019, I led a 
bipartisan letter to TikTok and other tech companies about what you are 
doing to stop child sexual exploitation.

    Question 1. What technologies do you have in place to automatically 
monitor for the grooming and enticement of children and report these 
crimes swiftly? If none, please explain why not.
    Answer. TikTok is committed to leveraging advanced technology to 
prevent, detect, remove, and report online child sexual exploitation 
content, including the grooming and enticement of children. Our 
Community Guidelines include policies that prohibit video, audio, 
livestream content, and accounts that depict or promote predation or 
grooming. We use artificial intelligence to identify potentially 
violating content or accounts and we create training modules to provide 
additional support to our human moderators who are reviewing such 
content. We also have implemented age-appropriate controls designed to 
help keep minors safe, including disabling direct messaging and hosting 
live streams for users under 16 years old. In addition, users can 
utilize the in-app report feature to report a video that they believe 
contains inappropriate content. If TikTok becomes aware of violative 
content, we will take immediate action to remove content, terminate 
accounts, and report cases to NCMEC and law enforcement as appropriate. 
All of these measures combine to help protect against minor grooming 
and enticement type content.

    Question 2. What technologies do you have in place to detect and 
stop predators from inducing children to send explicit images, videos, 
or live-streams of themselves? If none, please explain why not.
    Answer. First and foremost, it is not possible to send an off-
platform video or image via our direct messaging feature on TikTok.
    Explicit images are prohibited by our Community Guidelines, which 
state that ``we do not allow depictions, including digitally created or 
manipulated content, of nudity or sexual activity.'' We utilize a 
variety of technologies, including Photo DNA, to identify such content 
to ensure that we swiftly identify and remove this content from the 
platform. As mentioned above, both our direct messaging and hosting of 
livestream features require a user to be at least 16 years of age to 
further reduce the possibility of younger users receiving inappropriate 
communications. In addition, users aged 16-17 have their direct 
messaging by default set to friends only.
    Question 3. What technologies do you have in place to detect Child 
Sexual Abuse Material (CSAM) videos, both for known CSAM (such as the 
NCMEC hashlist) and new CSAM videos? If none, please explain why not.
    Answer. Studies have shown that CSAM is linked to and spread via 
private messaging. (See, e.g., https://www.innocentlivesfoundation.org/
how-predators-have-infiltrated-social-media; https://
www.missingkids.org/blog/2020/grooming-in-the-digital-age). As 
previously mentioned, TikTok restricts direct messaging to accounts 
ages 16 and over, and it is not possible to send an off-platform 
attachment (video or image) via our direct messaging feature on TikTok.
    TikTok has invested heavily in human and machine-based moderation 
tools to find and remove violative content, including CSAM. The systems 
and measures we have put in place, including the use of Photo DNA, work 
to proactively identify illegal and harmful content, and our automated 
technology filters and detects potentially violating content. Our human 
moderators review and determine whether content violates the Community 
Guidelines. If we become aware of violative content, we will take 
immediate action to remove content, terminate accounts, and report 
cases to NCMEC and law enforcement as appropriate. The NCMEC Hash 
Integration, integrated with NCMEC hash-sharing resource, is one of 
several tools we use to detect and swiftly remove and report CSAM.

    Question 4. Do you have specific mechanisms for users to report 
child sexual exploitation? If so, please elaborate on them; if not, 
please explain why not.
    Answer. Any video can be reported and flagged for review by any 
user. Users can report content that potentially violates our Community 
Guidelines, including minor safety issues, through our in-app reporting 
feature. Finally, anyone can report inappropriate content or users via 
our webform.

    Question 5. Do you include IP addresses in all reports to NCMEC? If 
not, please explain why not.
    Answer. Yes.

    Question 6. Please provide your policies and protocols on how you 
respond to law enforcement when they reach out for information relating 
to child sexual exploitation reports.
    Answer. TikTok's Law Enforcement Guidelines (available at https://
www.tik
tok.com/legal/law-enforcement) set forth the protocols for law 
enforcement officials seeking the data of users of the TikTok platform. 
TikTok also has internal policies and procedures governing how TikTok 
handles and responds to law enforcement requests, including requests 
related to child sexual exploitation. These policies require that 
TikTok disclose user data to law enforcement only where a request is 
based on a valid legal process or in emergency circumstances.

    Dangerous Content and Algorithmically-amplified Rabbit Holes. 
Videos on TikTok have shown young users dangerous content, and when 
these videos are part of challenges, they are seen widely across the 
platform. You said TikTok was never able to find evidence of a Blackout 
Challenge on the platform, despite widespread media coverage of the 
tragic deaths from this challenge and current videos on TikTok warning 
against participation in the challenge. Moreover, my staff have found 
videos of dangerous challenges like kids trying to pass out and 
encouraging others to do the same, some of which have been up for over 
a year.

    Question 1. Please explain the discrepancy between your statements 
and the reports of deaths due to Blackout Challenges on TikTok.
    Answer. Safety of our community is our priority. We strictly 
prohibit dangerous challenges, including so-called ``blackout 
challenges'', as such challenges would violate our Community 
Guidelines. We have removed ``blackout challenge'' and similar hashtags 
on the platform, and if someone searches for ``blackout challenge'', 
there would be no results. While we still have not found evidence of a 
``blackout challenge'' trending on our platform, we remain vigilant in 
our commitment to user safety.
    To that end, a few months ago, we launched a global project to 
better understand young people's engagement with potentially harmful 
challenges and hoaxes. While not unique to any one platform, the 
effects and concerns are felt by all--and we wanted to learn how we 
might develop even more effective responses as we work to better 
support teens, parents, and educators. On November 17, 2021 we released 
the results of our research (available at: https://
praesidiosafeguarding.co.uk/reports). We have used the findings from 
the report to inform a review of our policies and processes, and we're 
making a number of improvements to build on our existing safeguards. 
For example, we have developed a new resource for our Safety Center 
dedicated to challenges and hoaxes. This includes advice for caregivers 
that we hope can address the uncertainty they expressed about 
discussing this topic with their teens. We have also worked with 
experts to improve the language used in our warning labels that would 
appear to people who attempt to search our platform for content related 
to harmful challenges or hoaxes. Should people search for hoaxes linked 
to suicide or self-harm, we will now display additional resources in 
search. More information is available in our Newsroom here: https://
newsroom.tiktok.com/en-us/helping-our-community-stay-safe-while-having-
fun-on-tiktok.

    Question 2. Please describe in detail the content moderation 
processes--both through machine learning and manual review by TikTok 
employees or agents--that TikTok uses to screen for and remove content 
related to the Blackout Challenge, passing out, self-harm, and related 
content.
    Answer. TikTok uses a combination of human and machine-based 
moderation tools to detect and remove violative content, which includes 
content related to self-harm and dangerous acts.
    All content uploaded to TikTok first runs through a level of 
automated review. Our automated technology systems continuously learn 
and adapt using data in each video, as well as from the ultimate 
moderation decisions that are made downstream based on their 
observations and predictions. Content that has a high probability of 
violating TikTok's guidelines is prioritized for review by our 
moderation team. If content is found to be violative of our guidelines, 
then the action prescribed by the relevant guideline--for example, 
removal of content--is immediately implemented.
    Another strategy we use to counter potentially dangerous content is 
to block certain key words and their variants. For example, if someone 
searches for blackoutchallenge, they will see a clickable message of 
how they can learn how to recognize harmful challenges. This means a 
user cannot access content with these terms via search, the terms will 
not be clickable from a video, and users will not be able to use these 
as hashtags in the future. Users are also prohibited from using these 
blocked terms in their account usernames.
    In addition to our proactive measures, our teams also diligently 
stay alert to trends to ensure any new potentially violative content is 
detected and removed as quickly as possible.

    Question 3. How effective are these review processes? How do you 
measure this effectiveness, given that you said 94 percent of content 
violations are removed proactively yet this content was still readily 
accessible?
    Answer. As reported in our most recent Transparency Report, for the 
period April-June 2021, approximately 81.5 million videos were removed 
globally for violating our Community Guidelines or Terms of Service, 
which is less than 1 percent of all videos uploaded. Of those videos, 
we identified and removed 93 percent within 24 hours of being posted, 
94 percent before a user reported them, and 87.5 percent at zero views.
    While we believe these statistics reflect that we have made great 
strides in our content moderation policies and procedures, we recognize 
that our work is never finished and continue to look for ways to 
improve our technology, policies, and procedures.

    Question 4. Recently, the Wall Street Journal reported a disturbing 
increase in the number of teens exhibiting tics, like physical jerking 
movements and verbal outbursts, which doctors and researchers have 
linked this to TikTok. Some hospitals are reporting a ten-fold rise in 
tic-like behaviors among teens. What is TikTok doing to combat this 
problem?
    Answer. We continue to consult closely with experts, who caution 
that correlation does not mean causation and more research is necessary 
to better understand this specific experience. Similarly, the fact-
checking organization Snopes recently concluded that ``Social media 
consumption is just one possible explanation researchers provided 
alongside other stressors, including those related to coronavirus 
lockdowns as well as diagnoses of other mental health conditions . . . 
Furthermore, the claim is based on two observational reports published 
in peer-reviewed medical journals and not the results of a regimented 
scientific study. This suggests a correlation and one possible 
explanation, but it does not prove a causation.''

    Gun Videos Violating Community Guidelines. TikTok's community 
guidelines state: ``we do not allow the depiction, promotion, or trade 
of firearms, ammunition, firearm accessories, or explosive weapons. We 
also prohibit instructions on how to manufacture those weapons.'' 
Unfortunately, content that violates these guidelines is frequently 
posted on TikTok. According to some reports, clips of guns and related 
content have been viewed nearly half a million times with hundreds of 
comments. Alarmingly, some gun dealers are taking customer orders in 
the comments section of these popular TikTok videos. Other reporting 
has highlighted how TikTok is enabling young users to find instructions 
for making their own homemade firearms using downloadable 3-D printing 
instructions.

    Question 1. Please describe in detail the processes--both through 
machine learning and manual review by TikTok employees or agents--that 
TikTok utilizes to enforce the community guidelines with respect to the 
depiction, promotion or trade of guns.
    Answer. TikTok prohibits and removes content that promotes or seeks 
to trade firearms, ammunition, firearm accessories, or explosive 
weapons, as well as instructions for manufacturing weapons. We do so to 
ensure that weapons are neither glorified when used for violence nor is 
TikTok used as a means of illegal arms sale. We do however make 
exceptions for the depiction of weapons in certain safe and controlled 
environments. As stated in our Community Guidelines: ``Content as part 
of a museum's collection, carried by a police officer, in a military 
parade, or used in a safe and controlled environment such as a shooting 
range may be allowed.''
    TikTok uses a combination of human and machine-based moderation 
tools to detect and remove violative content, which includes content 
related to the depiction, promotion, or trade of guns.
    All content uploaded to TikTok first runs through a level of 
automated review. Our automated technology systems continuously learn 
and adapt using data in each video, as well as from the ultimate 
moderation decisions that are made downstream based on their 
observations and predictions. Content that has a high probability of 
violating TikTok's guidelines is prioritized for review by our 
moderation team. If content is found to be violative of our guidelines, 
then the action prescribed by the relevant guideline--for example, 
removal of content--is immediately implemented.
    Additional strategies that we utilize include the blocking and 
redirecting to our Community Guidelines certain key words and their 
variants. This means a user cannot access content with these terms via 
search, the terms will not be clickable from a video, and users will 
not be able to use these as hashtags in the future. Users are now 
prohibited from using these blocked terms in their account usernames.
    We also continually refine our moderator guidance to clarify and 
enhance our enforcement on content involving the sale, trade, 
redirection to sale, and instructions to make firearms, firearm 
accessories, explosive weapons, and ammunition. We've also expanded our 
definition of firearms to broaden our moderation and enforcement 
methods.

    Question 2. Is there a specialized team of content moderators 
specific to this issue set? If so, approximately how many employees 
work on this issue?
    Answer. TikTok's Trust and Safety teams have experts as well as 
moderators who focus on Illegal Activities and Regulated Goods. TikTok 
has thousands of moderators working to protect the TikTok community and 
integrity for our platform.

    Question 3. What steps has TikTok taken to ensure content related 
to ghost guns and 3D printed guns do not appear on its platform? If 
none, please explain why not.
    Answer. We treat ghost guns and 3D printed guns the same as we do 
any other weapon and do not allow content that depicts methods to 
manufacture, trade or glorify these weapons or associated accessories. 
TikTok uses an AI model to detect firearms in videos. Like our approach 
to traditional weapons, we have invested in training our automated 
technologies, as well as training our moderators, to recognize these 
weapons when conducting their content evaluations.

    Question 4. For posts deemed to violate TikTok's community 
standards with respect to the depiction, promotion or trade of guns, 
what is the average time it takes TikTok to identify and remove such 
posts?
    Answer. During the second quarter of this year, 20.9 percent of the 
videos removed violated our policy that prohibits criminal activities 
or the promotion or trade of drugs, tobacco, alcohol, and other 
controlled substances or regulated goods. 95.7 percent of these videos 
were removed within 24 hours of being posted, 92.3 percent were removed 
at zero views, and 97.1 percent were removed before any reports.

    Question 5. What does TikTok view as the major hurdles to 
enforcement of its community guidelines with respect to the depiction, 
promotion or trade of guns?
    Answer. We recognize there is no finish line when it comes to 
safety and that nefarious actors will continue to try to manipulate our 
platform for their own gain. Examples of this include actors using 
coded language, or depicting dissembled or otherwise masked weapons. We 
work with a range of experts to identify such work-arounds or tactics 
to ensure they do not take a foothold on our platform. These experts 
have also offered insights on our policies and shared research with us.
                                 ______
                                 
   Response to Written Questions Submitted by Hon. Amy Klobuchar to 
                           Michael Beckerman
    Social Media and Eating Disorders. Studies have found that eating 
disorders have one of the highest mortality rates of any mental 
illness.\1\ At the hearing, I asked TikTok about research related to 
eating disorder content on TikTok. You said you were not aware of any 
research studies TikTok has conducted about how your apps push content 
promoting eating disorders to teens, but that TikTok works with 
``outside experts'' to understand these issues.
---------------------------------------------------------------------------
    \1\ https://anad.org/get-informed/about-eating-disorders/eating-
disorders-statistics/

    Question. Please share any studies TikTok is aware of, including 
involving outside researchers, related to the relationship between 
TikTok and eating disorders.
    Answer. TikTok has partnered with the National Eating Disorders 
Association (NEDA) to raise awareness about eating disorders. TikTok 
worked with NEDA and other experts to develop in-app resources to 
provide access to help from expert organizations, and these 
conversations informed the list of searches and hashtags that we 
redirect to the NEDA helpline (and for which we block results). 
Additionally, we consult with NEDA and other experts on our eating 
disorder related policies and language in our Community Guidelines.
    In September 2021, TikTok expanded on these resources with a new 
Safety Center guide on eating disorders for teens, caregivers, and 
educators (available at https://www.tiktok.com/safety/en-us/eating-
disorder/). Developed in consultation with independent experts 
including the National Eating Disorders Association (NEDA), National 
Eating Disorder Information Centre, Butterfly Foundation, and Bodywhys, 
this guide will provide information, support and advice on eating 
disorders.

    Advertising and Teenagers. Tik Tok earns revenue from showing ads.

    Question 1. Please provide copies of the 100 most popular ads, in 
terms of viewership, among all users on your platform in the last year.
    Answer. Information relating to advertising performance, such as 
viewership, is confidential and sensitive information, both to TikTok 
and to TikTok's advertisers to which TikTok has confidentiality 
obligations. Please note that TikTok makes available some of its top 
performing ads in the Creative Center it maintains for registered 
advertisers, available at https://ads.tiktok.com/business/
creativecenter/inspiration/topads/pc/en, only when the advertiser has 
granted TikTok permission. TikTok does not share total viewership of 
the ads that are available in the Creative Center.

    Question 2. Please also provide copies of the 100 most popular ads, 
in terms of viewership, among teenagers on your platform in the last 
year.
    Answer. Information relating to advertising performance, such as 
viewership among teenagers, is confidential and sensitive information, 
both to TikTok and to TikTok's advertisers to which TikTok has 
confidentiality obligations. Please note that TikTok makes available 
some of its top performing ads in the Creative Center it maintains for 
registered advertisers, available at https://ads.tiktok.com/business/
creativecenter/inspiration/topads/pc/en, only when the advertiser has 
granted TikTok permission. TikTok does not share total viewership of 
the ads that are available in the Creative Center.

    Advertising Revenue. TikTok is privately held, so we don't have 
public documents on your advertising revenue per user.

    Question 1. TikTok's best estimate of advertising revenue per U.S. 
user for the last quarter?
    Answer. TikTok generates substantially all of its revenue from 
selling advertising. However, details regarding the company's revenue 
constitute confidential and proprietary business information, which we 
do not share publicly.

    Question 2. How much of TikTok's U.S. revenue came from users under 
the age of 18? If precise data is not available, provide an estimate.
    Answer. TikTok generates substantially all of its revenue from 
selling advertising. However, details regarding the company's revenue 
constitute confidential and proprietary business information, which we 
do not share publicly.

    Algorithms for Content Moderation. Frances Haugen disclosed that 
Facebook has been misleading the public about how well it can detect 
and remove harmful content.\2\ Facebook's internal documents estimated 
that their automated systems removed less than one percent of 
prohibited violent content.'' \3\
---------------------------------------------------------------------------
    \2\ https://www.wsj.com/articles/facebook-ai-enforce-rules-
engineers-doubtful-artificial-intelligence-11634338184
    \3\ https://www.wsj.com/articles/facebook-ai-enforce-rules-
engineers-doubtful-artificial-intelligence-11634338184

    Question. Has TikTok collected any data about how often TikTok's 
automated content detection systems failed to capture content on TikTok 
that violates the company's terms of service in the last 18 months? If 
so, please provide that data.
    Answer. TikTok discloses removals of violative videos in our 
quarterly Community Guidelines Enforcement Reports. TikTok began 
disclosing total removals based on automated content detection in our 
April-July 2021 report. During 2021Q2, 81,518,334 videos were removed 
globally for violating our Community Guidelines or Terms of Service, 
which is less than 1 percent of all videos uploaded. During this time-
frame we also continued to roll out technology in additional markets 
that automatically detects and removes some categories of violative 
content. As a result, 16,957,950 of the total removals were handled 
through this technology.

    Children on Social Media. TikTok allows kids under 13 in the United 
States to use the TikTok app through TikTok's ``experience for Younger 
Users.'' \4\
---------------------------------------------------------------------------
    \4\ See https://www.tiktok.com/legal/privacy-policy-for-younger-
users?lang=en

    Question. How many under 13 accounts have you removed from TikTok--
not TikTok's experience for kids--in the last year?
    Answer. Earlier this year TikTok began proactively disclosing data 
in our Community Guidelines Enforcement Reports regarding the removal 
of suspected underage accounts. Although other platforms have made 
similar disclosures, we note that TikTok was the first company to do 
so.
    In the first quarter of 2021, 7,263,952 suspected underage accounts 
were removed globally from the full TikTok experience for potentially 
belonging to a person under the age of 13.
    In the second quarter of 2021, 11,205,597 suspected underage 
accounts were removed globally from the full TikTok experience for 
potentially belonging to a person under the age of 13.

    Harmful Content on Social Media. TikTok now has more than a billion 
users,\5\ yet TikTok's algorithm is a secretive black box to almost 
everyone. A recent investigation by the Wall Street Journal showed that 
TikTok can easily push teen users into rabbit holes of potentially 
harmful content, including content that TikTok supposedly bans, related 
to eating disorders, drugs, and violence.\6\
---------------------------------------------------------------------------
    \5\ https://newsroom.tiktok.com/en-us/1-billion-people-on-tiktok
    \6\ https://www.wsj.com/articles/tiktok-algorithm-sex-drugs-minors-
11631052944?mod=Search
results_pos9&page=1

    Question. Reports have indicated a significant portion of TikTok's 
user base is minors.\7\ Approximately how many of your U.S. users are 
under the age of 18? How many are under 13?
---------------------------------------------------------------------------
    \7\ https://www.wsj.com/articles/tiktok-tops-1-billion-monthly-
users-11632760678?mod=Search
results_pos7&page=1
---------------------------------------------------------------------------
    Answer. To learn more about TikTok's recommendation system, please 
see our Newsroom post for additional details: https://
newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you. We also 
release regular transparency reports, and we routinely provide public 
updates on our efforts to continuously improve platform safety. Our 
Content Advisory Council brings together technology and safety experts 
who help shape our forward-looking policies that not only address the 
challenges of today, but also plan ahead for the next set of issues 
that our industry will face. And we have also opened our Transparency 
and Accountability Centers, which due to constraints as a result of the 
coronavirus pandemic are currently virtual experiences. At TikTok 
Transparency and Accountability Centers, invited guests and experts 
will have the opportunity to review:

   How our trained content moderators apply TikTok's Community 
        Guidelines to review the content and accounts that are 
        escalated to them via user reports and technology-based 
        flagging.

   Our data security practices and the measures we're taking to 
        safeguard user privacy and information.

   TikTok source code, which will be made available at the 
        center for testing and evaluation. Additionally, visitors will 
        learn how our application's algorithm operates.

    We do not publicly share details on user demographics.

    Social Media and Medical Impacts. A recent Wall Street Journal 
report highlighted several instances of teenage girls going to their 
doctors' offices with tics, including physical jerking movements and 
verbal outbursts. Doctors have said that content watched on TikTok 
might be playing a factor in teens developing tics.\8\
---------------------------------------------------------------------------
    \8\ https://www.wsj.com/articles/teen-girls-are-developing-tics-
doctors-say-tiktok-could-be-a-factor-11634389201

    Question. TikTok has stated that they are looking into the 
connection between videos on TikTok and teens developing tics.\9\ What 
is the company doing right now to intervene to protect kids on TikTok 
while doctors and researchers examine this issue?
---------------------------------------------------------------------------
    \9\ https://www.wsj.com/articles/teen-girls-are-developing-tics-
doctors-say-tiktok-could-be-a-factor-11634389201
---------------------------------------------------------------------------
    Answer. We continue to consult closely with experts, who caution 
that correlation does not mean causation and more research is necessary 
to better understand this specific experience. Similarly, the fact-
checking organization Snopes recently concluded that ``Social media 
consumption is just one possible explanation researchers provided 
alongside other stressors, including those related to coronavirus 
lockdowns as well as diagnoses of other mental health conditions. . . 
Furthermore, the claim is based on two observational reports published 
in peer-reviewed medical journals and not the results of a regimented 
scientific study. This suggests a correlation and one possible 
explanation, but it does not prove a causation.''
                                 ______
                                 
  Response to Written Questions Submitted by Hon. Marsha Blackburn to 
                           Michael Beckerman
    Question 1. Do you get parental consent when you do research on 
kids? Can you provide a copy of a blank parental consent form, if 
applicable?
    Answer. TikTok has engaged external experts to conduct research on 
the topics of dangerous challenges and cyberbullying, and in both 
cases, the third parties we engaged sought parental consent.

    Question 2. Can you provide clarity on your chain of ownership and 
whether the language in your privacy policy saying you share language 
``with your parent company and affiliates'' means that you share U.S. 
user data with ByteDance or its employees or contractors?
    Answer. The TikTok platform collects personal information from U.S. 
users in accordance with its Privacy Policy. All U.S. user data is 
stored on servers located in the U.S., with backup data stored on 
servers in Singapore.
    Some ByteDance entities provide services that support the operation 
of the TikTok platform, such as providing technical support services. 
To the extent that these support functions require ByteDance entities 
to access U.S. user data, such access is remote (as U.S. user data is 
stored in the U.S. and Singapore), and steps are taken to limit such 
access, including restricting who has access to which dataset, 
implementing strong authentication measures, logging of access, 
limiting access periods, and encrypting data.

    Question 3. Why do you say in your testimony that you aren't aware 
that viral trends like ``Devious Licks'' were on your site?
    School administrators in Tennessee have suffered serious property 
damage from content like this. How does this square with comments about 
TikTok's commitment to removing dangerous content?
    How do your content moderators go about becoming aware of such 
videos to determine what to remove or deprioritize?
    Answer. Over the last year, there have been increasing concerns 
around online challenges and hoaxes. Although these concerns are not 
unique to any one platform (and, in some instances, have been focused 
on so-called challenges that TikTok has not found evidence of on our 
platform), TikTok takes seriously our responsibility to support and 
protect teens, and to help limit exposure to content about challenges 
that may be harmful or dangerous.
    Content that depicts dangerous or criminal activities violates 
TikTok's Community Guidelines, and we remove such content from our 
platform as soon as we discover it. With regard to the ``Devious 
Licks'' challenge that impacted various platforms, including TikTok, we 
moved quickly to identify and remove associated videos and audio from 
our platform as soon as we became aware of them. Overall, we removed 
more than 200,000 videos related to this trend.
    TikTok uses a combination of human moderators and machine-based 
moderation tools to detect and remove content that violates our 
Community Guidelines, which prohibit content related to dangerous or 
criminal activities. All content uploaded to TikTok first runs through 
a level of automated review. Our automated technology systems 
continuously learn and adapt using data in each video, as well as from 
the ultimate moderation decisions that are made by human moderators 
based on their observations and predictions. The automated filters 
remove certain content automatically and flag other content for 
prioritized review by our human moderation team. If content is found to 
be violative of our Guidelines, then the action prescribed by the 
relevant guideline (e.g., removal of content) is implemented. In 
addition to these proactive measures, our teams also diligently stay 
alert to trends to ensure that any new potentially violative content is 
detected and reviewed as quickly as possible.
    TikTok remains vigilant in our commitment to user safety, and to 
that end, a few months ago, we launched a global project to better 
understand young people's engagement with potentially harmful 
challenges and hoaxes. We have used the findings from the report to 
inform a review of our policies and processes, and we're making a 
number of improvements to build on our existing safeguards. More 
information is available in our Newsroom here: https://
newsroom.tiktok.com/en-us/helping-our-community-stay-safe-while-having-
fun-on-tiktok.
                                 ______
                                 
     Response to Written Questions Submitted by Hon. Rick Scott to 
                           Michael Beckerman
    Question 1. TikTok is owned by ByteDance, a Chinese conglomerate 
that has strategic partnerships with the Communist Chinese Ministry of 
Public Security, an agency of the Communist Party of China that is 
tasked with ``public security,'' or more basically ``mass 
surveillance'' of their citizens. In 2017 the Communist Party of China 
enacted a law that requires any Chinese company to provide information 
or data upon request, which may include Americans' data collected by 
TikTok.
    Does TikTok collect data on American citizens and house or route 
that data in or through Communist China?
    Answer. TikTok collects personal information from U.S. users in 
accordance with its Privacy Policy. All U.S. user data collected on the 
TikTok platform is stored on servers located in the U.S., with backup 
data stored on servers in Singapore.
    Some ByteDance entities provide services that support the operation 
of the TikTok platform, such as providing technical support services. 
To the extent that these support functions require ByteDance entities 
to access U.S. user data, such access is remote (as U.S. user data is 
stored in the U.S. and Singapore), and steps are taken to limit such 
access, including restrictions on who has access to which dataset, 
strong authentication measures, logging of access, limited access 
period, and encryption of data.

    Does TikTok comply with all of Communist China's laws that require 
it to provide data including data on American citizens?
    Answer. The TikTok platform is not available in mainland China and 
is not the focus of the Chinese counter-espionage law, national 
security law, or national intelligence law. TikTok has never been asked 
or subpoenaed by the Chinese government to provide any U.S. user data, 
and the Chinese government has never asserted any legal rights over 
such data.

    In 2019, the Washington Post ran an article about TikTok's 
censorship of ``#hongkong''. Does TikTok, at the request of the 
Communist Chinese government, censor American users' content, such as a 
post about the need for democracy in Hong Kong or a post condemning the 
treatment of the Uyghur Muslim population in China?
    Answer. TikTok has established a U.S. Safety Team that is based in 
the U.S. and led by an American, which makes independent decisions 
regarding content moderation in the U.S. TikTok does not remove content 
based on sensitivities related to China. We have never been asked by 
the Chinese government to remove any content, and we would not do so if 
asked. One can find user generated content on TikTok addressing users' 
views of diverse subjects, including democracy in Hong Kong and the 
treatment of the Uyghur Muslim population in China.

    Question 2. In July, the Wall Street Journal ran an investigation 
into TikTok's use of algorithms, which found that these algorithms 
collect massive amounts of data on each user and their interests, 
creating an echo chamber of videos that TikTok believes the user wants 
to see. TikTok will continue to feed the user more and more of related 
videos.
    It appears that the common theme with most platforms is how you get 
a user to stay on your platform for as long as possible. I think 
everyone could agree that all social media platforms can become 
addictive. Do you believe the use of your algorithms are harmful to 
users, especially teenagers and children?
    Does TikTok study the harmful impacts that using their platform has 
on different demographics that use your platform?
    If so, will you make that research public?
    Answer. TikTok takes seriously our responsibility to help maintain 
a safe environment for our community, especially for teens. TikTok 
conducts research on a variety of topics relating to safety and would 
be happy to discuss the findings with your office.

    Question 3. It is pretty fair to assume that a large portion of the 
growth and revenue that your platforms bring in is centered on ensuring 
a growing youth user base. To keep growing as a platform, you likely 
need new young users, which in turn requires you to target younger 
demographics in outreach. Current Federal law prohibits platforms such 
as yours from collecting data on children under 13 years old. Companies 
like YouTube paid $170 million to settle allegations by the FTC and the 
New York Attorney General for allegedly collecting personal information 
from children without their parents' consent.
    What portion of your user base in the United States is under 18-
years old?
    Answer. TikTok disagrees with the premise of the question that 
TikTok is required ``to target younger demographics.'' TikTok 
celebrates diversity and welcomes people of different ages, cultures, 
and backgrounds. We do not publicly share details on our user 
demographics.

    Do you believe that provisions in the Children's Online Privacy 
Protection Act (COPPA), that require parental consent for collecting 
data on children under 13, should be reviewed and possibly changed to 
increase the age limit?
    Answer. We are supportive of an effective Children's Online Privacy 
Protection Act and would be happy to meet with your office to discuss 
any specific provisions or proposals.
                                 ______
                                 
   Response to Written Questions Submitted by Hon. Cynthia Lummis to 
                           Michael Beckerman
    Question 1. In response to my question regarding information that 
TikTok automatically collects on its users, you asserted that you 
collect less information than many of your peers, namely ``Facebook and 
Instagram.'' Please identify the specific privacy policies, data 
collection practices, or security measures, that you have adopted which 
are less intrusive than your peers in this space, including, but not 
limited to, Facebook and Instagram.
    Answer. At TikTok, we respect the privacy of our users and look to 
adhere to data minimization principles. For example, we do not require 
users to provide their real name in order to create an account. We also 
do not ask our users for personal background information, such as 
education, employment, or marital status.

    Question 2. In our hearing you stated that you would follow up with 
my office, you have not. Will TikTok commit to giving my staff a 
privacy briefing?
    Answer. Yes.

    Question 3. Your privacy policy states that you may share, ``all of 
the information we collect with a parent, subsidiary, or other 
affiliate of our corporate group.'' Does ByteDance technology, or any 
other affiliate controlled in whole or in part by the Chinese Communist 
Party, receive directly or indirectly through commerce or otherwise, 
information collected about TikTok users outside of China?
    Answer. ByteDance Ltd. is not, and does not have any affiliates, 
controlled in whole or in part by the Chinese Communist Party.
    Beijing ByteDance Technology Co., Ltd. is a separately held 
subsidiary under ByteDance Ltd. that is focused on the domestic China 
market. It has no ownership or control over TikTok, and no input into 
TikTok's operations. Further, employees of Beijing ByteDance Technology 
Co., Ltd. are restricted from accessing U.S. user data.

    Question 4. Has your business conducted internal research, or 
contracted or collaborated with external entities to perform research 
examining, or which has revealed, various harms that can result from 
using your platform generally, or specifically as it relates to its use 
by minors? This harm definition should include deleterious effects to a 
user's social, mental, or physical well-being.
    If so, will you commit to sharing that research with Members of 
Congress and/or appropriate outside professionals?
    Answer. TikTok takes seriously our responsibility to help maintain 
a safe environment for our community, especially for teens. TikTok 
conducts research on a variety of topics relating to safety and would 
be happy to discuss the findings with your office.

    Question 5. You stated in your testimony that length of engagement, 
or some variation of it, was one of many metrics that your company uses 
to measure your company's success. This necessarily implies the 
existence of other metrics that your company uses to define success. 
Please identify all of those specific metrics, how the specific metric 
is calculated and tracked on the platform, and how they might align 
with or relate to your company's specific mission.
    Answer. There is no single metric or set of metrics that define 
TikTok's success. Depending on the objective, we may look at different 
metrics. For example, to help provide a safe, diverse, and entertaining 
experience on our platform, we consider a range of safety metrics (such 
as the account and content removals detailed in our Community Guideline 
Enforcement Reports) and engagement metrics (such as time spent, video 
views, and likes).
                                 ______
                                 
     Response to Written Question Submitted by Hon. John Thune to 
                           Michael Beckerman
    Question. TikTok's privacy policy states that ``We may share all of 
the information we collect with a parent, subsidiary, or other 
affiliate of our corporate group.'' Please provide a complete list of 
all parent, subsidiary, or other affiliates with whom you share the 
information that you collect.
    Answer. TikTok Inc. is a U.S. entity. Its parent company is TikTok 
LLC, which is owned by TikTok Ltd., which in turn is owned by ByteDance 
Ltd. TikTok entities within our corporate group include TikTok Pte. 
Ltd. (Singapore), TikTok Information Technologies UK Ltd. (UK), TikTok 
Technology Ltd. (Ireland), TikTok Technology Canada Inc. (Canada), 
TikTok Australia Pty. Ltd. (Australia), and TikTok Hong Kong Ltd. (Hong 
Kong), among others. In addition to these entities, affiliates under 
our Privacy Policy also include some non-TikTok entities under 
ByteDance Ltd. that provide services that support the operation of the 
TikTok platform, such as providing technical support services. To the 
extent that these support functions require ByteDance entities to 
access U.S. user data, such access is remote (as U.S. user data is 
stored in the U.S. and Singapore), and steps are taken to limit such 
access, including restrictions on who has access to which dataset, 
strong authentication measures, logging of access, limited access 
period, and encryption of data.
                                 ______
                                 
      Response to Written Questions Submitted by Hon. Ted Cruz to 
                           Michael Beckerman
    Question 1. Mr. Beckerman, in your testimony before the 
subcommittee, you acknowledged that ByteDance, the parent company of 
TikTok, is considered part of TikTok's ``corporate group'' as that term 
is used in TikTok's privacy policy. The relevant paragraphs of that 
policy are reproduced below:
How we share your information
    We are committed to maintaining your trust, and while TikTok does 
not sell personal information to third parties, we want you to 
understand when and with whom we may share the information we collect 
for business purposes.
    . . .
Within Our Corporate Group
    We may share all of the information we collect with a parent, 
subsidiary, or other affiliate of our corporate group.

    You would not, however, say whether Beijing ByteDance Technology, a 
sister company of TikTok, is included under the same ``Within Our 
Corporate Group'' paragraph, which outlines ``How [TikTok] share[s] 
your information.''

    As I noted during the hearing, media reports from earlier this year 
showed the Chinese government took a minority stake in Beijing 
ByteDance Technology, doing so through the use of the state-backed 
``Internet Investment Chinese (Beijing) Technology'' entity. 
Additionally, it was reported that as part of this deal a government 
official named ``Wu Shugang'' took one of three board seats at Beijing 
ByteDance Technology. According to those same reports, this government 
official, Wu Shugang, has spent most of his career in Chinese 
propaganda, including with a stint at the ``online opinion bureau'' 
under the Cyberspace Administration of China (CAC), China's Internet 
regulator.

    Mr. Beckerman, I'll ask again in writing--is Beijing ByteDance 
Technology, which press reports have characterized as a subsidiary of 
ByteDance, TikTok's parent company, considered to be part of your 
``corporate group'' as that term is used in TikTok's privacy policy? 
Please answer this question with a `yes' or `no'.
    Answer. Beijing ByteDance Technology Co., Ltd. is a separately held 
subsidiary under ByteDance Ltd. that is focused on the domestic China 
market. It has no ownership or control over TikTok, and no input into 
TikTok's operations. Further, employees of Beijing ByteDance Technology 
Co., Ltd. are restricted from accessing U.S. user data.

    Question 2. Mr. Beckerman, how many entities are in TikTok's 
``corporate group'' as that term is used in TikTok's privacy policy, 
and what is their relationship to each other? Please provide an 
organizational chart illustrating these relationships if available.
    Answer. TikTok Inc. is a U.S. entity. Its parent company is TikTok 
LLC, which is owned by TikTok Ltd., which in turn is owned by ByteDance 
Ltd. TikTok entities within our corporate group include TikTok Pte. 
Ltd. (Singapore), TikTok Information Technologies UK Ltd. (UK), TikTok 
Technology Ltd. (Ireland), TikTok Technology Canada Inc. (Canada), 
TikTok Australia Pty. Ltd. (Australia), and TikTok Hong Kong Ltd. (Hong 
Kong), among others. In addition to these entities, affiliates under 
our Privacy Policy also include some non-TikTok entities under 
ByteDance Ltd. that provide services that support the operation of the 
TikTok platform.

    Question 3. Mr. Beckerman, please provide a definition for parent, 
subsidiary, other affiliate, and corporate group, as those terms are 
used in TikTok's privacy policy.
    Answer. TikTok Inc. is a U.S. entity. Its parent company is TikTok 
LLC, which is owned by TikTok Ltd., which in turn is owned by ByteDance 
Ltd. TikTok entities within our corporate group include TikTok Pte. 
Ltd. (Singapore), TikTok Information Technologies UK Ltd. (UK), TikTok 
Technology Ltd. (Ireland), TikTok Technology Canada Inc. (Canada), 
TikTok Australia Pty. Ltd. (Australia), and TikTok Hong Kong Ltd. (Hong 
Kong), among others. In addition to these entities, affiliates under 
our Privacy Policy also include some non-TikTok entities under 
ByteDance Ltd. that provide services that support the operation of the 
TikTok platform.

    Question 4. Mr. Beckerman, please provide the following:

  1.  The legal name of each ``parent, subsidiary, or other affiliate'' 
        which is part of TikTok's ``corporate group.''

  2.  The location where each ``parent, subsidiary, or other 
        affiliate'' is headquartered

  3.  The location where each ``parent, subsidiary, or other 
        affiliate'' is domiciled, if domicile location differs from 
        where the parent, subsidiary, or other affiliate is 
        headquartered.

  4.  For each ``parent, subsidiary, or other affiliate,'' the laws 
        under which they were originally incorporated.

  5.  For each ``parent, subsidiary, or other affiliate,'' the laws 
        under which they currently operate, if different from the laws 
        under which they were incorporated. If there are multiple 
        jurisdictions, please list all that apply for each ``parent, 
        subsidiary, or other affiliate.''

  6.  For each ``parent, subsidiary, or other affiliate,'' the full 
        names of the leadership of each ``parent, subsidiary, or other 
        affiliate,'' including the members of the board where 
        applicable.

  7.  For each ``parent, subsidiary, or other affiliate,'' the mix of 
        capital backing the entity, including all state-owned banks or 
        financing regimes, or state-backed banks or financing regimes, 
        the names of those state-owned or state-backed banks and 
        financing regimes, and the names of those nations.

    Answer. TikTok Inc. is a U.S. entity. Its parent company is TikTok 
LLC, which is owned by TikTok Ltd., which in turn is owned by ByteDance 
Ltd. TikTok entities within our corporate group include TikTok Pte. 
Ltd. (Singapore), TikTok Information Technologies UK Ltd. (UK), TikTok 
Technology Ltd. (Ireland), TikTok Technology Canada Inc. (Canada), 
TikTok Australia Pty. Ltd. (Australia), and TikTok Hong Kong Ltd. (Hong 
Kong), among others. In addition to these entities, affiliates under 
our Privacy Policy also include some non-TikTok entities under 
ByteDance Ltd. that provide services that support the operation of the 
TikTok platform.

    Question 5. Mr. Beckerman, also in TikTok's privacy policy is a 
paragraph titled ``for legal reasons.'' The relevant text has been 
reproduced below:
How we share your information
    We are committed to maintaining your trust, and while TikTok does 
not sell personal information to third parties, we want you to 
understand when and with whom we may share the information we collect 
for business purposes.
    . . .
For Legal Reasons
    We may disclose any of the information we collect to respond to 
subpoenas, court orders, legal process, law enforcement requests, legal 
claims, or government inquiries, and to protect and defend the rights, 
interests, safety, and security of TikTok Inc., the Platform, our 
affiliates, users, or the public. We may also share any of the 
information we collect to enforce any terms applicable to the Platform, 
to exercise or defend any legal claims, and comply with any applicable 
law.
    Mr. Beckerman, please answer whether TikTok would consider demands 
made under the following to fall under the umbrella of ``subpoenas, 
court orders, legal process, law enforcement requests, legal claims, or 
government inquiries'' for which TikTok may ``disclose any of the 
information [it] collect[s]'':

  1.  China's 2014 counter-espionage law, which allows Chinese 
        authorities to seal or seize any property linked to activities 
        deemed harmful to the country.

  2.  China's 2015 national security law, which outlaws threats to 
        China's government, sovereignty and national unity as well as 
        its economy, society, and cyber and space interests.

  3.  China's 2017 national intelligence law, which obliges 
        individuals, organizations, and institutions to assist Public 
        Security and State Security officials in carrying out a wide 
        array of ``intelligence'' work, and stipulates that ``any 
        organization or citizen shall support, assist, and cooperate 
        with state intelligence work according to law.''

    Answer. The TikTok platform is not available in mainland China and 
is not the focus of the Chinese counter-espionage law, national 
security law, or national intelligence law. TikTok has never been asked 
or subpoenaed by the Chinese government to provide any U.S. user data, 
and the Chinese government has never asserted any legal rights over 
such data.

    Question 6. Mr. Beckerman, as a subsidiary of ByteDance, would 
TikTok consider protecting ByteDance from government retribution, say 
by turning over data to the Chinese Communist Party, to be in defense 
of ``the rights, interests, safety, and security of TikTok Inc., the 
Platform,'' or its ``affiliates'' ?
    Answer. The TikTok platform is not available in mainland China and 
is not the focus of the Chinese counter-espionage law, national 
security law, or national intelligence law. TikTok has never been asked 
or subpoenaed by the Chinese government to provide any U.S. user data, 
and the Chinese government has never asserted any legal rights over 
such data.
    As a U.S.-based company, TikTok Inc. is committed to abiding by the 
laws to which it is subject. Any subpoenas, legal processes, and 
government inquiries would be evaluated by legal professionals on 
behalf of TikTok and responded to in accordance with applicable laws.

    Question 7. Mr. Beckerman, is Beijing ByteDance Technology, 
considered to be one of TikTok's ``affiliates'' which, in defense of, 
TikTok could share ``any of the information'' it collects?
    Answer. Beijing ByteDance Technology Co., Ltd. is a separately held 
subsidiary under ByteDance Ltd. that is focused on the domestic China 
market. It has no ownership or control over TikTok, and no input into 
TikTok's operations. Further, employees of Beijing ByteDance Technology 
Co., Ltd. are restricted from accessing U.S. user data.

    Question 8. Mr. Beckerman, please list, in detail, the following 
for all ``affiliates'' which, in defense of, TikTok could share ``any 
of the information'' it collects:

  1.  The legal name.

  2.  The location where headquartered.

  3.  The location where domiciled.

  4.  The laws under which each was incorporated.

  5.  The laws under which each operate currently. If there are 
        multiple jurisdictions, please list all that apply.

    Answer. TikTok Inc. is a U.S. entity. Its parent company is TikTok 
LLC, which is owned by TikTok Ltd., which in turn is owned by ByteDance 
Ltd. TikTok entities within our corporate group include TikTok Pte. 
Ltd. (Singapore), TikTok Information Technologies UK Ltd. (UK), TikTok 
Technology Ltd. (Ireland), TikTok Technology Canada Inc. (Canada), 
TikTok Australia Pty. Ltd. (Australia), and TikTok Hong Kong Ltd. (Hong 
Kong), among others. In addition to these entities, affiliates under 
our Privacy Policy also include some non-TikTok entities under 
ByteDance Ltd. that provide services that support the operation of the 
TikTok platform, such as providing technical support services. To the 
extent that these support functions require ByteDance entities to 
access U.S. user data, such access is remote (as U.S. user data is 
stored in the U.S. and Singapore), and steps are taken to limit such 
access, including restrictions on who has access to which dataset, 
strong authentication measures, logging of access, limited access 
period, and encryption of data.

    Question 9. Mr. Beckerman, what product development is done solely 
in China? What product development is shared between Chinese employees 
and American employees, and what is the breakdown of how much of this 
shared product development is done in China vs. the United States?
    Answer. The team members that work on product development and 
related features are located in the U.S., China, Singapore, and Europe. 
We are unable to provide a breakdown of how much development is done in 
China versus other countries where we do product development, and any 
such percentage would be constantly changing in any event, as we 
continue to hire more engineering talent. Our global approach towards 
product development is similar to the approach of other large global 
tech companies that likewise have product development or innovation 
teams in various global locations, including the U.S., China, India, 
and Europe.

    Question 10. Of TikTok's employees in the United States, what 
number have ByteDance e-mail addresses and what percentage of the 
American workforce does that represent?
    Answer. TikTok Inc. is a large, fast-growing company that is part 
of an even larger global organization in ByteDance Ltd. TikTok Inc. 
employees have ByteDance domain e-mails but can also choose to have a 
TikTok e-mail address, which some do. Using an e-mail domain that is 
tied to the top level entity is a common practice for large 
multinational organizations.

    Question 11. What data do ByteDance employees in China have 
unrestricted access to?
    Answer. ByteDance employees in China do not have unrestricted 
access to U.S. user data.

    Question 12. What data is controlled by TikTok's ``access 
controls''?
    Answer. TikTok's data access approval policy, which sets out 
TikTok's data access controls and procedures, applies to all TikTok 
U.S. user data.

    Question 13. Of the ByteDance employees in China who have access or 
have previously accessed TikTok data, how many of them are affiliated 
with, or have some kind of relationship with, the Chinese Communist 
Party? Please list, in detail, what those relationships are and what 
data these employees had or have access to.
    Answer. TikTok Inc. does not have information about the status of 
the political affiliation of ByteDance employees in China, nor does 
TikTok Inc. have information about the political affiliation of 
employees of other ByteDance entities outside the U.S.

    Question 14. Mr. Beckerman, it was reported in July of this year 
that TikTok's parent company, ByteDance, began licensing parts of its 
artificial intelligence (AI) technologies to third parties through a 
new division called BytePlus. According to this reporting, customers 
can pay ``to use recommendation algorithms, real-time filters, and 
effects, automated translations, and computer vision tech, akin to 
what's found in TikTok . . .''
    Mr. Beckerman, how does TikTok use AI in its algorithms, filters, 
and other parts of the platform?
    Answer. TikTok uses AI in its recommendation system that powers the 
For You feed. Recommendation systems are common for entertainment and 
social media platforms like Netflix, YouTube, and Instagram. To learn 
more about TikTok's recommendation system, please see our Newsroom post 
for additional details: https://newsroom.tiktok.com/en-us/how-tiktok-
recommends-videos-for-you. TikTok also uses AI-based technology for its 
Discover page, special effects, platform safety and content moderation, 
and data security tools.
    ByteDance recently began offering recommendation technology through 
BytePlus, a new technology services subsidiary, but the recommendation 
technology provided by BytePlus is not the TikTok recommendation 
system, which has been highly customized and refined over time for the 
global TikTok business and user community.

    Question 15. Of the technologies ByteDance is licensing through 
BytePlus, which of them are also technologies used by TikTok?
    Please specify the technology being licensed through BytePlus, and 
how each is used by TikTok currently or was used previously.
    Answer. TikTok's proprietary technologies underlying its 
recommendation system and special effects models are different from the 
technologies that are licensed through BytePlus.

    Question 16. Of the technologies ByteDance is licensing through 
BytePlus, which of them are technologies developed, wholly or 
partially, by TikTok? Please specify the technology being licensed 
through BytePlus, and what role TikTok played in the development of 
that technology.
    Answer. TikTok's proprietary technologies underlying its 
recommendation system and special effects models are different from the 
technologies that are licensed through BytePlus.
                                 ______
                                 
      Response to Written Questions Submitted by Hon. Mike Lee to 
                           Michael Beckerman
    Question 1. Mr. Beckerman, one of the main draws to TikTok is its 
fast paced, short videos and its ``For You'' page that suggests videos 
to its users. The WSJ published a recent article detailing how the 
``For You'' page can send teens down rabbit holes full of explicit 
content, including sexual violence and drugs, even if they have only 
searched for explicit content once on the platform.
    What changes, if any, have been made to the algorithm that controls 
the ``For You'' page since these findings?
    Answer. At TikTok, safety and wellness--in particular for children 
and teens on our platform--is our priority. This has been the case long 
before the WSJ article. For children under 13 years old, TikTok offers 
TikTok for Younger Users, which is a limited experience where under 13 
users can watch a curated library of safe and age-appropriate videos. 
With respect to teens, we work hard to provide families with the tools 
and features to make the content decisions that are right for them.
    While we believe the WSJ study's use of bots to artificially create 
a simulated TikTok experience does not represent true user behavior or 
experience, we nevertheless take the issues raised in the WSJ article 
very seriously and are actively exploring ways, including development 
of new automated models, to address such issues with respect to our 
younger users.
    To learn more about TikTok's recommendation system, please see our 
Newsroom post for additional details: https://newsroom.tiktok.com/en-
us/how-tiktok-recom
mends-videos-for-you.

    Question 2. How do you reconcile content that is typically banned 
by your platform being recommended to teens regularly through your 
algorithm?
    Answer. TikTok disagrees with the characterization that it 
regularly recommends to teens content that is typically banned.
    TikTok's Community Guidelines define a set of norms and common code 
of conduct; they provide guidance on what is and is not allowed on 
TikTok in order to help create a welcoming and safe space for everyone. 
We publish quarterly Community Guideline Enforcement Reports that 
demonstrate that of all the content that users upload to TikTok, less 
than 1 percent of videos uploaded was found to violate the Community 
Guidelines. Additionally, of those videos, 93 percent were identified 
and removed within 24 hours of being posted.

    Question 3. Does your platform permit ``interest'' specific ads to 
be targeted to children?
    Answer. There are no ads in TikTok for Younger Users, which is a 
limited experience where under 13 users can watch videos curated for 
safety and age-appropriateness.

    Question 4. What interest related data does your platform collect 
from children's profiles?
    Answer. It is unclear what is meant by ``interest related data.'' 
TikTok for Younger Users collects a very limited set of data, including 
but not limited to ``likes'' and browsing history.

    Question 5. Does your platform collect data on children that is for 
adult products--tobacco, alcohol, recreational drug use--or for 
sexually suggestive ads?
    Answer. There are no ads in the TikTok for Younger Users 
experience. We note that TikTok's Ads Policies applicable to the U.S. 
platform prohibit ads for tobacco, alcohol, and recreational drug use, 
as well as sexually suggestive ads.
                                 ______
                                 
 Response to Written Questions Submitted by Hon. Richard Blumenthal to 
                             Leslie Miller
    Preventing Child Sexual Exploitation and Grooming. According to the 
National Center for Missing and Exploited Children (NCMEC), 2020 was 
record-breaking for reports of child sexual exploitation, and this year 
has already exceeded that number of reports. In November 2019, I led a 
bipartisan letter to you and other tech companies about what you are 
doing to stop child sexual exploitation.

    Question 1. What technologies do you have in place to automatically 
monitor for the grooming and enticement of children and report these 
crimes swiftly? If none, please explain why not.

    Question 2. What technologies do you have in place to detect and 
stop predators from inducing children to send explicit images, videos, 
or live-streams of themselves? If none, please explain why not.
    Answer. Because the answers to these questions are related, we have 
grouped together our response to Questions Nos. 1 and 2.
    YouTube's Approach. YouTube is committed to fighting online child 
sexual abuse and exploitation and preventing our services from being 
used to spread child sexual abuse material (CSAM). We invest heavily in 
fighting child sexual abuse and exploitation online and use our 
proprietary technology to deter, detect, remove, and report offenses on 
our platforms. We proactively detect and report illegal child sexual 
abuse material to the National Center for Missing and Exploited 
Children (NCMEC), and our Trust & Safety team works 24 hours a day to 
quickly respond to any child safety incident. We may terminate the 
accounts of those who are seeking to sexualize or exploit minors, and 
we report offenses involving CSAM to NCMEC, which liaises with global 
law enforcement agencies.
    We identify and report CSAM with trained specialist teams and 
cutting-edge technology, including machine learning classifiers and 
hash-matching technology, which creates a ``hash'', or unique digital 
fingerprint, for an image or a video so it can be compared with hashes 
of known CSAM. In 2008, we began using these ``hashes'' to identify, 
remove, and report copies of known images. We also use our automated 
systems and human reviews to detect new CSAM and contribute new hashes 
to a hash database maintained by the NCMEC, helping to continue to grow 
the bank of hashes of known CSAM.
    In 2014, YouTube engineers created CSAI Match, world-leading 
technology that can be used to scan and identify uploaded videos that 
contain known child sexual abuse material. We make this technology 
available to other platforms and NGOs free-of-charge to help them 
identify matches against our database of known abusive content so they 
can responsibly take action on it in accordance with local laws and 
regulations. In 2018, Google engineers created the Content Safety 
Application Programming Interface (API) which helps Google and other 
businesses and organizations prioritize potential abuse content for 
review. In the first half of 2021, our partners used the Content Safety 
API to classify over 6 billion images, helping them identify 
problematic content faster and with more precision so they can report 
it to the authorities.
    In addition to our long-standing efforts to combat CSAM video, we 
have made large investments to detect and remove content which may not 
meet the legal definition of CSAM, but where minors are still being 
sexualized or exploited. We have always had clear policies against 
videos, playlists, thumbnails and comments that sexualize or exploit 
children (more information is available at https://support
.google.com/youtube/answer/2801999). We use machine learning systems to 
proactively detect violations of these policies and have human 
reviewers around the world who quickly remove violations detected by 
our systems or flagged by users and our trusted flaggers. Our machine 
learning systems help to proactively identify videos that may put 
minors at risk and apply our protections at scale, such as restricting 
live features, disabling comments, and limiting video recommendations. 
Identifying and removing videos more quickly--often before they have 
even been viewed--means children who are being sexually abused today 
are more likely to be identified and protected from further abuse.

    Critical Partnership with NCMEC. YouTube reports all instances of 
CSAM on our platforms to NCMEC, who in turn may report the incident to 
law enforcement. Where there may be imminent harm to a child, our 
specialist team escalates the report to NCMEC's prioritization queue.
    We provide NCMEC with detailed information about the user accounts 
associated with the possession or distribution of CSAM. We are informed 
that inclusion of such information in our reports has been instrumental 
in allowing law enforcement to rescue children being victimized and 
ultimately bring those responsible to justice. When YouTube becomes 
aware of suspected grooming or the potential for hands-on abuse in 
connection with CSAM content, we include that information in a 
supplemental report to NCMEC with a high-priority status. When 
information within the report indicates that the information resolves 
to another country, NCMEC forwards the report to the appropriate law 
enforcement agency within that country.
    NCMEC maintains a database of hashes--serving as digital 
fingerprints of identified CSAM content--that is made available to 
participating service providers so that content identified on one 
platform can be swiftly removed from all platforms. Google helped NCMEC 
redesign and rebuild the CyberTipline in 2011 to make the reporting 
system easier to use and established a Google Fellowship so that NCMEC 
can update the Cybertipline and integrate it with their Child Victim ID 
program, which helps identify missing or exploited children.
    Earlier this year, building on our industry-leading transparency 
initiatives, Google launched a transparency report (available at 
https://transparencyreport
.google.com/child-sexual-abuse-material/reporting?hl=en) specifically 
dedicated to detailing our efforts to combat online child sexual abuse 
material. From January through June 2021, these efforts included the 
submission of over 400,000 CyberTipline reports to NCMEC, involving 
over 3.4 million pieces of content. YouTube in particular contributed 
more than 124,000 CyberTipline reports and over 133,000 pieces of 
content to NCMEC.

    Collaboration with NGOs. In addition to our work with NCMEC, we 
maintain strong partnerships with NGOs and industry coalitions to help 
grow and contribute to our joint understanding of the evolving nature 
of child sexual abuse and exploitation. For example, we are a leading 
member of the Technology Coalition (available at https://
www.technologycoalitio n.org/), where child safety experts across the 
industry build capacity and help companies working to increase their 
capacity to detect CSAM. The Tech Coalition announced research grants 
for five world-leading institutions (more information is available at 
https://www.end-violence.org/tech-coalition-safe-online-research-
fund#grantees) who are working on actionable research including, for 
example, how to more detect online grooming. This is part of Project 
Protect--a cross-industry initiative to combat CSAM through investment, 
research, and information sharing (more information is available at 
https://www.technology
coalition.org/2020/05/28/a-plan-to-combat-online-child-sexual-abuse/).

    Question 3. Do you have specific mechanisms for users to report 
child sexual exploitation? If so, please elaborate on them; if not, 
please explain why not.
    Answer. We want to protect children using our products from 
experiencing grooming, sextortion, trafficking and other forms of child 
sexual exploitation. In addition to the steps we take detailed in the 
answer above, we provide useful information to help users report child 
sexual abuse material to the relevant authorities. Users can flag 
inappropriate content on YouTube (available at https://
support.google.com/youtube/answer/
2802027#report_channel&zippy=%2Creport-a-channel) if they discover 
content that may depict a child in danger or an abusive situation, by:

   Flagging the video: Report videos that contain inappropriate 
        content involving minors by flagging the video for `Child 
        Abuse'; and

   Filing an abuse report: If a user has found multiple videos, 
        comments, or a user's entire account that warrants reporting, 
        the user should visit the reporting tool to submit a more 
        detailed complaint.

    We also provide information in our YouTube Community Guidelines 
(available at https://support.google.com/youtube/answer/2801999) and 
Google Safety Center (available at https://safety.google/families/) on 
how to deal with concerns about bullying and harassment, including 
information on how to block users from contacting a child. Google's 
Help Center (available at https://protectingchildren.google/
howtoreport/) also provides additional information for users on 
reporting and support for children who may be being abused, including 
contact information for the National Center for Missing and Exploited 
Children (NCMEC) and other helpful resources. Google's Protecting 
Children Site (available at https://protectingchildren
.google/#fighting-abuse-on-our-own-platform-and-services) also provides 
information on how users can report inappropriate behavior towards 
children, including grooming and other forms of child sexual 
exploitation.

    Question 4. Do you include IP addresses in all reports to NCMEC? If 
not, please explain why not.
    Answer. When we identify CSAM on our platforms, we make a 
``CyberTipline'' report to NCMEC, as described in detail in the 
responses above. A single report may contain one or more pieces of 
content depending on the circumstances. This content could include, for 
example, images, videos, URL links, and/or text soliciting CSAM.
    The reports sent to NCMEC may include information identifying the 
user, the minor victim, and/or other helpful contextual facts. We 
include both IP login information for the account and upload IP of the 
CSAM content if available in our reports to NCMEC.

    Question 5. Please provide your policies and protocols on how you 
respond to law enforcement when they reach out for information relating 
to child sexual exploitation reports.
    Answer. As noted above, we proactively look for and report illegal 
child sexual abuse material to the National Center for Missing and 
Exploited Children. Where there may be imminent harm to a child, our 
specialist team escalates the report to NCMEC's prioritization queue. 
From January through June 2021, these efforts included the submission 
of over 400,000 reports to NCMEC, involving over 3.4 million pieces of 
content.
    Once a report is received by NCMEC, they may forward it to law 
enforcement agencies around the world. Law enforcement may then send 
legal process to YouTube seeking further information. In order to 
facilitate such requests, Google provides an online system that allows 
verified government agencies to securely submit valid requests for 
further information. These agencies can then view the status of 
submitted requests using this online system, and, ultimately, download 
Google's response to their request.
    We have a robust law enforcement response process with analysts and 
lawyers dedicated to ensuring that we appropriately respond to valid 
legal processes from law enforcement and make referrals to law 
enforcement when we identify problematic or illegal activity on our 
platform, including that relating to child safety. A law enforcement 
agency may ask YouTube to preserve specific information while the 
agency applies for valid legal process to compel the disclosure of that 
information. We have a dedicated team that responds to these requests, 
and in cases of emergencies, a team that works around the clock, every 
day of the year.
    We describe our work and protocols concerning cooperation with law 
enforcement in our policies (available at https://policies.google.com/
terms/information-requests) and our publicly available Transparency 
Reports (available at https://transparency
report.google.com/).

    Advertising to Children and Teens. Your policies for advertisers on 
supervised accounts and child-directed content state ``personalized 
ads, remarketing, and other personalized targeting features are 
prohibited on YouTube,'' and you testified that YouTube does not serve 
personalized ads or engage in personalized advertising in YouTube Kids 
and YouTube supervised experiences. However, YouTube does still allow 
advertising to teenage users and on child-directed content--and holds 
an extraordinary amount of information about those users. YouTube could 
itself target ads to users it believes are more likely to click on the 
ad, even if it doesn't allow advertisers to target certain audiences: 
for example, it could target a product to a teen because it thinks a 
teen of a certain gender or with certain interests is more likely to 
view the ad.

    Question 6. Does YouTube itself do any targeting of advertising to 
teens or on child-directed content based on personal information? If 
so, please explain.

    Question 7. Please list the factors that YouTube does consider for 
such targeting of advertising to teens and on child-directed content.
    Answer. Because the answers to these questions are related, we have 
grouped together our response to Questions Nos. 6 and 7.
    At the outset, we want to note that in August 2021, we announced 
that we'll be expanding safeguards to prevent age-sensitive ad 
categories from being shown to teens, and that we will block ad 
targeting based on the age, gender, or interests of people under 18. 
We'll start rolling out these updates across our products globally over 
the coming months (more information is available at https://
blog.google/technology/families/giving-kids-and-teens-safer-experience-
online/). This builds upon our existing prohibitions on personalized 
advertising for on YouTube Kids, within YouTube Supervised Experiences, 
and for users watching ``Made for Kids'' content on our YouTube 
service.
    To provide greater information on advertising, we initially adopted 
an advertising-supported model because we wanted YouTube services to be 
available free of charge and accessible to all families, irrespective 
of their ability to pay. YouTube Kids provides guidance for third-party 
experts to create collections or works with influential individuals to 
provide content that supports academic or social-emotional learning. 
The educational and enriching content that results from such 
collaborations is expansive and diverse.
    Creators are often small businesses who rely on ad revenue to 
invest in development of quality content; they receive a majority share 
of the advertising revenue. We are committed to fostering an 
appropriate viewing environment for children and families, and ensuring 
compliance with all applicable regulations and privacy commitments 
concerning ads and children. On YouTube Kids, within YouTube Supervised 
Experiences, and for users watching ``Made for Kids'' content on our 
YouTube service, we prohibit personalized advertising. In addition, for 
all users we prohibit ads with remarketing or other third-party 
tracking pixels from being shown to users on YouTube Kids, YouTube 
supervised accounts, and on ``Made for Kids'' content.
    Only contextual ads are eligible to serve to users on YouTube Kids, 
YouTube supervised accounts, and on ``Made for Kids'' content. 
Contextual advertising is based on the content of the underlying video 
that is being watched, whereas personalized advertising makes use of, 
for example, an individual user's previous activity or demographic 
information in order to provide tailored ads.
    We want to support a healthy digital advertising ecosystem--one 
that is trustworthy and transparent, and works for users, advertisers, 
and publishers. Our Google Ads Policies are designed not only to abide 
by laws but to ensure a safe and positive experience for our users. We 
take strong enforcement action against advertising content that 
violates our policies for all of our users. We use a combination of 
manual and automated review to detect and remove ads that violate our 
policies. Our ads policies (available at https://support.google.com/
adspolicy/answer/60089
42?hl=en) cover four broad areas including prohibited content, 
prohibited practices, restricted content and features, and editorial 
and technical standards. These policies also prohibit advertising that 
includes content that promotes dangerous products or services, enables 
dishonest behavior, and includes inappropriate content like bullying, 
intimidation, and self harm.
    Further, to foster an appropriate viewing environment for children 
and families, all ads shown on YouTube Kids, against ``Made for Kids'' 
content, or to a user in our new supervised experience must comply with 
our kids advertising policies (available at https://support.google.com/
adspolicy/answer/9683742) and YouTube's general advertising policies 
(available at https://support.google.com/youtube/answer/
188570?topic=30084&ctx=topic). Under these policies, we prohibit paid 
ads that contain adult, dangerous, sexualized, or violent content. Our 
advertising policies also prohibit contests and sweepstakes, ads that 
mislead or make deceptive claims, ads that imply social status, and ads 
that refer to social media campaigns or websites. Ads that incite 
children to make a purchase, or urge their parents to buy the item are 
also prohibited. Ads for certain product categories are also 
prohibited, including: (i) regulated products, including alcohol, 
gambling, and pharmaceutical/healthcare as well as other products that 
may be dangerous to kids, such as fireworks or instructions to make 
harmful products; (ii) sensitive products that may be inappropriate for 
kids, such as those relating to dating, politics, and religion; and 
(iii) products that we have determined may not be appropriate for young 
audiences, including age-sensitive media content, beauty and fitness, 
dating and relationships, online or virtual communities, political ads, 
religious ads, and video games.
    Across Google as a whole, our enforcement measures have allowed us 
to take down 3.1 billion ads worldwide for violating our ads policies 
in 2020--that's the removal of more than 5,900 bad ads per minute. We 
annually publish our ads safety report (available at https://
services.google.com/fh/files/misc/ads_safety_report_
2020.pdf; see also this blog at https://blog.google/products/ads-
commerce/ads-safety-report-2020/), explaining enforcement actions we 
took against illegal and harmful advertising to protect users.

    Question 8. Some creators, like influencers, make videos to push 
certain products to their users without employing traditional 
advertising mechanisms. You mentioned a set of ``quality principles'' 
to regulate the content in YouTube Kids. Please list these principles 
and elaborate on the types of commercial content that are and are not 
in violation of these principles.
    Answer. YouTube Kids prohibit paid ads that contain adult, 
dangerous, sexualized, or violent content. The YouTube Kids' ads policy 
also forbids contests and sweepstakes, as well as ads that mislead or 
make deceptive claims, ads that imply social status, and ads that refer 
to social media campaigns or websites. Ads that incite children to make 
a purchase, or urge their parents to buy the item, are also prohibited. 
Ads for certain product categories are also prohibited on YouTube Kids, 
including: (1) regulated products, including alcohol, gambling, and 
pharmaceutical/healthcare, as well as other products that may be 
dangerous to kids, such as reworks or instructions on how to make 
harmful products; (2) sensitive products that may be inappropriate for 
kids, such as those relating to dating, politics, and religion; and (3) 
products that we have determined may not be appropriate for young 
audiences, including age-sensitive media content, beauty and fitness, 
relationships, online or virtual communities, political ads, religious 
ads, and video games.
    YouTube's quality principles (available at https://
support.google.com/youtube/answer/1077 4223) for children and family 
content are designed to help guide YouTube's kids and family creator 
ecosystem. These principles were developed in collaboration with child 
development specialists, and are based on extensive research on 
children's media, digital learning, and citizenship. They are meant to 
give our creators a better idea of what may be considered low or high 
quality content, but the lists are not exhaustive. Additionally, these 
principles supplement our Community Guidelines (available at https://
www.youtube.com/howyoutube works/policies/community-guidelines/
#community-guidelines), which help provide a safe viewing experience 
for our users. Our creators are still responsible for following our 
Community Guidelines concerning any content they create. For ease of 
reference, our quality principles are as follows:
High quality principles
    Age-appropriate, enriching, engaging, and inspiring content can 
come in different formats and cover a range of topics, but it should 
promote:

   Being a good person: This includes content that demonstrates 
        or encourages respect, good behavior, and healthy habits. 
        Examples include content about sharing, being a good friend, 
        brushing teeth, eating vegetables, and setting digital 
        wellbeing goals.

   Learning and inspiring curiosity: This includes content that 
        promotes critical thinking, discussing connected ideas, and the 
        discovery and exploration of the world. Content should be age 
        appropriate and designed for a young audience. It can also span 
        traditional to non-traditional learning (e.g., academics, 
        informal learning, interest-based exploration, tutorials).

   Creativity, play, and a sense of imagination: This includes 
        content that is thought-provoking or imaginative. It may also 
        encourage kids to create, make, and engage with something in a 
        meaningful and novel way. Examples include creating imaginary 
        worlds, storytelling, soccer tricks, sing-alongs, and creative 
        activities like art and crafts.

   Interaction with real world issues: This includes content 
        that depicts life lessons and strong characters, or encourages 
        building social-emotional skills, problem solving, and 
        independent thinking. It often includes a complete narrative 
        (e.g., character development, plot, resolution) and clear 
        takeaway or lesson.

   Diversity, equity, and inclusion: This content celebrates 
        and encourages representation and participation of diverse 
        perspectives and groups of people. This includes content 
        representing a range of ages, genders, races, religions, and 
        sexual orientations. It also advocates for equal treatment of 
        those differences. Examples include content that discusses the 
        benefits of diversity and inclusion, or depicts stories/
        characters where these themes are demonstrated.

    These principles help determine inclusion in YouTube Kids and how 
recommendations work in the main YouTube experience. On YouTube Kids, 
we identify and include videos and channels that are age-appropriate 
and adhere to the quality principles referenced above. We also use 
these principles to determine which high-quality content we raise up in 
our recommendations on YouTube. This means that when a user is watching 
``Made for Kids'' content on YouTube, we aim to recommend videos that 
are age-appropriate, educational, and inspire creativity and 
imagination.
Low quality principles
    We inform creators to avoid making low quality content. This 
includes content that is:

   Heavily commercial or promotional: Content that is primarily 
        focused on purchasing products or promoting brands and logos 
        (e.g., toys and food). It also includes content that is focused 
        on excessive consumerism.

   Encouraging negative behaviors or attitudes: Content that 
        encourages dangerous activities, wastefulness, bullying, 
        dishonesty, or a lack of respect for others (e.g., dangerous/
        unsafe pranks, unhealthy eating habits).

   Deceptively educational: Content that claims to have 
        educational value in its title or thumbnail, but actually lacks 
        guidance or explanation, or is not relevant to children (e.g., 
        titles or thumbnails that promise to help viewers ``learn 
        colors'' or ``learn numbers,'' but instead features mindless 
        repetitive content or inaccurate information).

   Hindering comprehension: Content that is thoughtless, lacks 
        a cohesive narrative, is incomprehensible (e.g., has shaky 
        visuals/inaudible audio), as is often the result of mass 
        production or auto-generation.

   Sensational or misleading: Content that is untrue, 
        exaggerated, bizarre, or opinion-based, and may confuse a young 
        audience. It might also include ``keyword stuffing'', or the 
        practice of using popular keywords of interest to children in a 
        repetitive, altered, exaggerated, or nonsensical way.

   Strange use of children's characters: Content that puts 
        popular children's characters (animated or live action) in 
        objectionable situations.

    We use these principles to reduce kids content that is low-quality 
(but that doesn't violate our Community Guidelines, available at 
https://support.google.com/youtube
/answer/9288567) in our recommendations on YouTube, and remove channels 
from YouTube Kids. Examples of this content include videos that are 
heavily commercial or promotional, encourage negative behaviors or 
attitudes, and more. Our efforts are ongoing and we regularly 
reevaluate and update these principles.
    We recently shared additional monetization policies (available at 
https://support.google.com/youtube/answer/1311392#kids-quality)--which 
align with the quality principles discussed above (and that are 
available at https://support.google.com/youtube/answer/10774223)--for 
channels that primarily create kids and family content on YouTube. 
Going forward, these principles will have not only an impact on 
recommendations and inclusion in YouTube Kids, but also on 
monetization.
    Channels that primarily target young audiences or are classified as 
``Made for Kids'' will need to deliver high-quality content and comply 
with kids-specific monetization policies. For example, channels that 
have predominantly low-quality kids content, such as ``Heavily 
commercial or promotional'' or ``Encouraging negative behaviors or 
attitudes'', may be suspended from the YouTube Partner Program. And if 
an individual video violates these quality principles, it may see 
limited or no ads.
    For more information about our quality principles, please see the 
guide available at https://storage.googleapis.com/support-kms-prod/
kLTcocWcVNdo4YWcM9pHbLr
XBKxtXaLrN2Wb.
                                 ______
                                 
   Response to Written Questions Submitted by Hon. Amy Klobuchar to 
                             Leslie Miller
    Preventing Unfair Competition. Streaming provider Roku says YouTube 
has made unfair demands in negotiations for carrying the YouTube TV app 
on Roku, including demanding Roku give preference to YouTube over other 
content providers in its search results and give YouTube access to non-
public data from Roku's users. \1\
---------------------------------------------------------------------------
    \1\ https://www.roku.com/blog/update-on-youtube-tv

    Question. Did YouTube ask for non-public data or preferencing in 
search results at any point in negotiations with Roku or any other 
provider?
    Answer. Our negotiations with Roku began earlier this year and we 
continue to work with them to find a resolution that benefits our 
mutual users. While we negotiate in good faith, we are providing Roku 
the ability to continue distributing both YouTube and YouTube TV apps 
to all existing users to make sure they are not impacted. We hope to 
resolve open commercial matters by the December 9, 2021 deadline.
    Our agreements to distribute YouTube include a certification 
process in which new devices need to meet our technical requirements. 
This process exists to provide a consistent and high-quality YouTube 
experience for users across different devices.
    For distribution partners like Roku whose devices provide users 
with a universal search experience that aggregates results across 
multiple content providers, YouTube requires that search results from 
YouTube are included among the results from other content providers. 
This is because most living room users might know what they want to 
watch but not where they can or should watch it. Therefore we ask to be 
included as part of the search results. Partners have complete control 
on where to display the YouTube search results. Other content providers 
have similar requirements.
    What is different about YouTube--in contrast to other content 
providers--is that we have an incredibly expansive content library with 
500 hours uploaded every minute. YouTube makes a public Application 
Programming Interface (API) available so that all partners, including 
Roku, can query that API when a user issues a search query. For 
distribution partners who don't have a universal search feature, we do 
not require them to display YouTube search results.

    Social Media and Eating Disorders. Studies have found that eating 
disorders have one of the highest mortality rates of any mental 
illness.\2\
---------------------------------------------------------------------------
    \2\ https://anad.org/get-informed/about-eating-disorders-
statistics/

    Question. Has YouTube conducted any research about how its apps may 
push content promoting eating disorders to teens? If so, please provide 
those studies.
    Answer. YouTube doesn't permit content that glorifies eating 
disorders or encourages serious harm or death. And in fact, the first 
things you will see if you search on YouTube for issues related to 
suicide or eating disorders are prominent crisis resource panels boxes 
showing where to get help, such as the National Suicide Prevention 
Lifeline. Further, earlier this year we introduced new health source 
information panels on videos to help viewers identify videos from 
authoritative sources, as well as adding health content shelves that 
more effectively highlight videos from these sources when a user 
searches for specific health topics, including topics like depression. 
For more information on these panels, please see https://blog.youtube/
news-and-events/introducing-new-ways-help-you-find-answers-your-health-
questions/.
    We have launched a number of programs to support children's mental 
health throughout 2020 and the COVID-19 pandemic including a meditation 
series (available at https://www.youtube.com/watch?v=J9nE4RE8uiQ) 
exclusively found on YouTube and YouTube Kids in partnership with 
Sesame St. and Headspace. We also recently partnered with experts 
including Mass General Brigham, National Alliance on Mental Illness, 
and the WHO to expand high quality content on topics such as 
depression, anxiety, substance use and recovery (more information is 
available at https://www.youtube.com/
playlist?list=PLbpi6ZahtOH4MCLZywkbRXiHQ
4T40RrAD).

    Advertising and Teenagers. YouTube earns revenue from showing ads.

    Question 1. Please provide copies of the 100 most popular ads, in 
terms of viewership, among all users on your platform in the last year.
    Answer. We are unable to provide the information requested at this 
time.

    Question 2. Please also provide copies of the 100 most popular ads, 
in terms of viewership, among teenagers on your platform in the last 
year.
    Answer. Similar to Question 1, above, we are unable to provide the 
information requested at this time.

    Question 3. YouTube reported that its advertising revenue overall 
in the second quarter of 2021 was $7 billion.\3\ How much of YouTube's 
U.S. revenue came from users under the age of 18?
---------------------------------------------------------------------------
    \3\ https://abc.xyz/investor/static/pdf/
2021Q2_alphabet_earnings_release.pdf
---------------------------------------------------------------------------
    Answer. YouTube does not calculate or report revenue on a per-user 
basis, nor demographic including age group.

    Algorithms for Content Moderation. Frances Haugen disclosed that 
Facebook has been misleading the public about how well it can detect 
and remove harmful content.\4\ Facebook's internal documents estimated 
that their automated systems removed less than one percent of 
prohibited violent content.'' \5\
---------------------------------------------------------------------------
    \4\ https://www.wsj.com/articles/facebook-ai-enforce-rules-
engineers-doubtful-artificial-intelligence-11634338184
    \5\ https://www.wsj.com/articles/facebook-ai-enforce-rules-
engineers-doubtful-artificial-intelligence-11634338184

    Question. Has YouTube collected any data about how often YouTube's 
automated content detection systems failed to capture content on 
YouTube that violates the company's terms of service in the last 18 
months? If so, please provide that data.
    Answer. We publish a quarterly Community Guidelines Enforcement 
report (available at https://transparencyreport.google.com/youtube-
policy/removals?hl=en). This report includes data on video, channel, 
and comment removals; appeals and reinstatements; human and machine 
flagging; and key topics including child safety. This year, we started 
releasing a new data point in our Community Guidelines Enforcement 
Report to provide even more transparency around the effectiveness of 
our systems: the Violative View Rate (VVR). That metric, which we now 
give quarterly, tells the world the percentage of views on YouTube that 
comes from content that violates our policies. We want to be open about 
that number and held accountable for driving it down.
    In Q3 2021 alone, for example, we removed 6.2 million videos for 
violating our Community Guidelines. Approximately 72 percent of those 
first flagged by our systems received 10 or fewer views and 
approximately 95 percent were detected by our automated flagging 
system. In Q3 2021, the Violative View Rate was 0.09-0.11 percent (out 
of every 10K views on YouTube, only 9-11 come from violative content).
    The VVR metric demonstrates that YouTube has invested significantly 
in our automated detection systems to identify violative content and 
our engineering teams continue to update and improve them month by 
month. We're constantly innovating to improve our machine learning and 
algorithms to identify content in violation of our policies. Among 
other AI principles, we believe AI should be socially beneficial, avoid 
creating or reinforcing unfair bias, and be accountable to people. We 
continue to be one of the leading companies investing heavily in 
responsible AI and implementation.

    Children on Social Media. YouTube allows kids under 13 in the 
United States to use YouTube through YouTube Kids.\6\
---------------------------------------------------------------------------
    \6\ See https://support.google.com/youtubekids/answer/
7554914?hl=en&ref_topic=7554316#zip
py=%2Cfirst-time-using-youtube-kids

    Question. How many under 13 accounts has the company removed from 
YouTube--not YouTube Kids--in the last year?
    Answer. Kids under 13 are not allowed to use YouTube, unless such 
use is enabled by a parent or legal guardian. When a user provides age 
information while creating a Google account, YouTube has access to that 
declared age information. We work hard to verify the ages of our users, 
requiring date of birth at sign-up and utilizing various signals in our 
network to confirm it.
    YouTube also employs machine learning to identify signals on 
YouTube channels that may indicate that the account operating the 
channel may be owned by users under 13. We rely on signals to find 
these channels, and then flag for a team to review more closely and/or 
terminate the accounts when they appear owned by an underage user.
    In the first three quarters of 2021, we terminated more than 7 
million accounts when we learned they belonged to people under 13, with 
3 million accounts terminated in Q3 2021 alone.

    Children's Privacy Online. In September 2019, the FTC settled with 
Google and YouTube for $170 million after YouTube allegedly collected 
children's data without their parents' consent so that it could show 
them targeted ads.\7\ I'm still concerned YouTube is showing targeted 
ads to kids and teens.
---------------------------------------------------------------------------
    \7\ https://www.ftc.gov/news-events/press-releases/2019/09/google-
youtube-will-pay-record-170-million-alleged-violations

    Question. Do you ever use data collected on users who are not 
logged in for targeted advertising? If so, how can you be sure that 
none of those users are under 13?
    Answer. YouTube has invested significant resources to meet our 
obligations under our settlement with the FTC. As part of our 
compliance solution, we require all creators to tell us whether or not 
their videos are ``Made for Kids.'' To help creators comply with this 
requirement, we allow creators to set their audience at the channel 
level, which will set all of their future and existing content as 
``Made for Kids'' or not, or at the video level. We provide guidance to 
assist YouTube creators determine what content is ``Made for Kids'' in 
our help center (available at https://support.google.com/youtube/
answer/9528076?hl=en#zippy=%2Chow-do-i-know-if-i-should-set-my-content-
as-made-for-kids) and include information released by the FTC 
(available at https://www.ftc.gov/news-events/blogs/business-blog/2019/
11/youtube-channel-owners-your-content-directed-children). For example, 
we state that if a video includes actors, characters, activities, 
games, songs, stories, or other subject matter that reflect an intent 
to target children, it's likely made for kids. We provide clear 
communication to creators that a failure to set their content 
appropriately may result in consequences on YouTube and have legal 
consequences under COPPA and other laws.
    When a creator's content is set as ``Made for Kids'' we limit data 
collection and use from every viewer of such content, regardless of 
whether they are logged in or not. This means we don't serve 
personalized ads on ``Made for Kids'' content. This also means we 
restrict or disable certain features like comments, notifications, and 
others. If a video or live stream is set as ``Made for Kids'' we 
disable features like autoplay on home, live chat, comments, save to 
playlist, channel memberships, amongst other features. Further, if a 
channel is set as ``Made for Kids''--in addition to the features that 
are disabled as a result of a video being identified as ``Made for 
Kids''--a creator's channel also won't have posts, stories, channel 
memberships, or the notification bell.
    While we require creators to tell us whether or not their videos 
are ``Made for Kids,'' we also use machine learning to help us identify 
videos that are clearly directed to young audiences. In cases of abuse 
or in which the creator has made an error we may override a creator's 
audience setting. If a creator has already set an audience for their 
video and YouTube detects error or abuse, the creator may see their 
video set as ``Set to made for Kids.'' When this happens, the creator 
isn't able to change their audience setting. If a creator believes a 
mistake has been made they can appeal the decision.
    We have also taken significant additional measures that go beyond 
what is required under our settlement. For example, for many years we 
have continued to invest in helping creators understand how to make 
enriching, engaging and inspiring videos for kids and families. We 
recently announced our work to implement products and policies that 
help us connect families with high-quality content on YouTube and 
YouTube Kids (more information is available at https://blog.youtube/
news-and-events/our-responsibility-approach-protecting-kids-and-
families-youtube/). In collaboration with child development 
specialists, we established a set of quality principles (available at 
https://support.google.com/youtube/answer/10774223) to help guide our 
kids and family creator ecosystem. We use these principles to determine 
which high-quality content we raise up in our recommendations on 
YouTube. This means that when you're watching ``Made for Kids'' content 
on YouTube, we aim to recommend videos that are age-appropriate, 
educational, and inspire creativity and imagination.
    Going forward, these principles will also have an impact on 
monetization. For example, channels that primarily target young 
audiences or are classified as ``Made for Kids'' will need to deliver 
high-quality content and comply with kids-specific monetization 
policies. In addition, channels that have predominantly low-quality 
kids content, such as ``Heavily commercial or promotional'' or 
``Encouraging negative behaviors or attitudes'', may be suspended from 
our YouTube Partner Program. And if an individual video violates these 
quality principles, it may see limited or no ads.
    As noted above, in the first three quarters of 2021, we terminated 
more than 7 million accounts when we learned they belonged to people 
under 13, with 3 million accounts terminated in Q3 2021 alone.

    Spanish Language Misinformation. In July, I led a letter with 
Senator Lujan to social media companies including YouTube about 
misinformation in Spanish. Your response raised concerns about your 
content moderation efforts in non-English languages.

    Question. In that letter, I asked how many employees or contractors 
YouTube has to moderate content in each of the top five languages 
spoken by U.S. users. You didn't answer that question.\8\ Please 
provide data on the number of content moderators you employ for the top 
five languages spoken by U.S. users.
---------------------------------------------------------------------------
    \8\ https://www.lujan.senate.gov/wp-content/uploads/2021/09/
YouTube-Response-Lujan.pdf
---------------------------------------------------------------------------
    Answer. YouTube is a global platform with over 2 billion users in 
more than 100 countries speaking 80 different languages; responsibility 
is our top priority. To achieve our responsibility goals, YouTube 
removes content that violates our publicly available Community 
Guidelines. YouTube's Community Guidelines are enforced consistently 
across the globe, regardless of the language or location of where the 
content is uploaded.

    Removing violative content: To enforce our Community Guidelines, we 
use a combination of machine learning and human reviewers, and have 
made significant investments in our systems to remove content that 
violates our Community Guidelines before it is widely viewed or viewed 
at all. Last year across Google, more than 20,000 people worked in a 
variety of roles to help enforce our policies and moderate content. 
These individuals are located across the globe, and possess a diverse 
set of backgrounds, including an array of linguistic capabilities and 
varied regional contexts.
    In the first quarter of 2021, a majority of content removed from 
YouTube was from users outside of the United States. More specifically, 
of the top 30 countries with the highest number of content removed in 
Q1, 83.4 percent was removed outside of the United States. It's also 
important to note that English is not the primary language in 9 of the 
top 10 countries where the videos were removed. We disclose metrics on 
a quarterly basis in our Transparency Report (available at https://
transparencyreport.google.com/).

    Raising up authoritative information: YouTube also strives to 
provide context and raise authoritative information before and during 
viewer engagement. This is especially important in the case of topics 
that are prone to misinformation, such as COVID-19.
    A range of information panels on video watch pages and in search 
results are a key mechanism to provide users additional context from 
authoritative sources on such topics. For example, when a user in the 
U.S. enters a query or views a video related to COVID-19, we display an 
information panel that links to the Centers for Disease Control page 
with information about the pandemic.
    In the U.S., these info panels are available in several different 
languages, including Spanish, French, and Chinese. Additionally, we 
work to prominently surface authoritative content in search results in 
a wide diversity of languages. For example, if a user searched for 
``COVID-19 vacuna,'' they'd receive authoritative content about the 
COVID-19 vaccine in Spanish. Millions of search queries are getting 
this treatment today.
                                 ______
                                 
  Response to Written Questions Submitted by Hon. Marsha Blackburn to 
                             Leslie Miller
    Question 1. Do you get parental consent when you do research on 
kids? Can you provide a copy of a blank parental consent form, if 
applicable?
    Answer. We care deeply about how families and children use 
technology and have engaged third parties to help us understand the 
challenges faced. Over the past several years, we have put together a 
group of leading experts on kids online safety, content quality, mental 
health, trauma, and child development to help us ensure that our 
product and policy decisions are up to date with the latest research in 
those fields.
    As an example of our use of studies, in 2019, we commissioned 
Fluent Research to conduct a study (available at https://
www.thinkwithgoogle.com/feature/digital-wellbeing-statistics/) to 
examine the role digital technology plays in the wellbeing of families 
and to help us understand how we could better support healthy tech 
habits through our products. This research was only conducted after 
individuals had explicitly provided us with their consent. This 
research, which included focus groups and survey responders in eleven 
countries, has informed our approach to product and policy design. The 
research also directly impacted our efforts to create experiences that 
help develop healthy tech habits, including the evolution of Family 
Link parental controls.
    Other examples of our use of studies include third-party research 
as to what content is beneficial to children online, what content may 
be upsetting to kids, and how young people use digital media to manage 
their mental health.

    Question 2. During the hearing, you said you never spoke out 
against online privacy legislation.
    Have you actually supported any specific bill?
    Did the Internet Association advocate against online privacy 
legislation on your behalf?
    Answer. We support comprehensive Federal privacy legislation, and 
we would welcome the opportunity to work with Congress on it. However, 
we are not waiting on Federal regulation to continue improvements to 
our privacy program. We are always working on new ways for users to 
protect their data, including by making our controls like auto-delete 
more robust and easier to use. We are also continuing to make advances 
in privacy enhancing technologies like differential privacy. Recently, 
we announced that YouTube will begin adjusting the default upload 
setting to the most private option available for teen users on YouTube 
and will also block ad targeting based on age, gender, or interests of 
users under 18.
    As lawmakers debate new rules for the internet, citizens expect 
companies to engage in legislative debate openly and in ways that fully 
account for the concerns of all of society. We know we have a 
responsibility to take this understanding into our work on Internet 
policy, and consistently strive to do so. Google is a member of the 
Internet Association, a trade association that advocates on behalf of 
over 40 Internet technology companies; the Internet Association has 
also expressed support for comprehensive Federal privacy legislation, 
as seen here: https://internetassociation.org/positions/privacy-
security-safety/privacy/.
                                 ______
                                 
     Response to Written Questions Submitted by Hon. Rick Scott to 
                             Leslie Miller
    Question 1. It is pretty fair to assume that a large portion of the 
growth and revenue that your platforms bring in is centered on ensuring 
a growing youth user base. To keep growing as a platform, you likely 
need new young users, which in turn requires you to target younger 
demographics in outreach. Current Federal law prohibits platforms such 
as yours from collecting data on children under 13 years old. Companies 
like YouTube paid $170 million to settle allegations by the FTC and the 
New York Attorney General for allegedly collecting personal information 
from children without their parents' consent.
    What portion of your user base in the United States is under 18-
years old?

    Do you believe that provisions in the Children's Online Privacy 
Protection Act (COPPA), that require parental consent for collecting 
data on children under 13, should be reviewed and possibly changed to 
increase the age limit?
    Answer. YouTube has invested significant resources to meet our 
obligations under our settlement with the FTC. As part of our 
compliance solution, we require all creators to tell us whether or not 
their videos are ``Made for Kids.'' We provide clear communication to 
creators that a failure to set their content appropriately may result 
in consequences on YouTube and have legal consequences under COPPA and 
other laws. In addition to prohibiting personalized ads on YouTube 
Kids, on our main platform, content that is made for kids does not run 
personalized ads and has certain features disabled, like comments and 
notifications.
    YouTube also employs machine learning to identify signals on 
YouTube channels that indicate that the account operating the channel 
may be owned by users under 13. We rely on signals to find these 
channels, and then flag for a team to review more closely and/or 
terminate the accounts when they appear owned by an underage user. In 
the first three quarters of 2021, we terminated more than 7 million 
accounts when we learned they belonged to people under 13, with 3 
million accounts terminated in Q3 2021 alone.
    There are many legitimate reasons that 13-16 year olds should be 
able to access services to exercise age-appropriate rights to 
expression and access to information. We're committed to constantly 
making these services safer for teens, including expanding our age-
restrictions so that users coming to YouTube must be signed-in and 
their account age must be 18 or older in order to view mature content. 
If they aren't, they see a warning and are redirected to find other 
content that is age-appropriate. Our Community Guidelines (available at 
https://www.youtube.com/howyoutube
works/policies/community-guidelines/) include guidance to uploaders 
about when content should be age-restricted. Recently we announced 
further changes to Google Accounts for people under 18 (more 
information is available at https://blog.google/technology/families/
giving-kids-and-teens-safer-experience-online/). For example, on 
YouTube we are changing the default upload setting to the most private 
option available for teens ages 13-17. We also are turning location 
history, a Google account setting, off for users under the age of 18 
(without the ability to turn it on). Finally, as referenced above we 
are expanding safeguards to prevent age-sensitive ad categories from 
being shown to teens, and we will block ad targeting based on the age, 
gender, or interests of people under 18.
                                 ______
                                 
   Response to Written Questions Submitted by Hon. Cynthia Lummis to 
                             Leslie Miller
    Question 1. Has your business conducted internal research, or 
contracted or collaborated with external entities to perform research 
examining, or which has revealed, various harms that can result from 
using your platform generally, or specifically as it relates to its use 
by minors? This harm definition should include deleterious effects to a 
user's social, mental, or physical well-being. If so, will you commit 
to sharing that research with Members of Congress and/or appropriate 
outside professionals?
    Answer. We care deeply about how families and children use 
technology and have engaged third parties to help us understand the 
challenges faced. Over the past several years, we have put together a 
group of leading experts on kids online safety, content quality, mental 
health, trauma, and child development to help us ensure that our 
product and policy decisions are up to date with the latest research in 
those fields.
    As an example of our use of studies, in 2019, we commissioned 
Fluent Research to conduct a study to examine the role digital 
technology plays in the wellbeing of families and to help us understand 
how we could better support healthy tech habits through our products 
(more information on the study is available at https://www
.thinkwithgoogle.com/feature/digital-wellbeing-statistics/). This 
research, which included focus groups and survey responders in eleven 
countries, has informed our approach to product and policy design. The 
research directly impacted our efforts to create experiences that help 
develop healthy tech habits, including the evolution of Family Link 
parental controls.
    Other examples of our use of studies include third-party research 
as to what content is beneficial to children online, what content may 
be upsetting to kids, and how young people use digital media to manage 
their mental health. We have also seen recent studies--from Pew 
Research (available at https://www.pewresearch.org/internet/2018/05/31/
teens-social-media-technology-2018/) and Common Sense Media (available 
at https://www.commonsensemedia.org/research/coping-with-covid19-how-
young-people-use-digital-media-to-manage-their-mental-health)--that 
have highlighted that digital media use can help teens communicate with 
peers and family, seek helpful resources if they are experiencing 
distress, and find opportunities for learning and entertainment that 
can help combat isolation.
    We also recognize that protecting children from abuse is a mission 
that no one company, industry, or part of society can accomplish alone. 
We also partner with others through the Technology Coalition (available 
at https://www.techno
logycoalition.org/)--an organization made up of 24 members representing 
different parts of the industry. Through the Technology Coalition we 
work with stakeholders around the world to help advance cutting-edge 
research, technology innovation, and sharing of knowledge and best 
practice on how to prevent, detect and report child sexual abuse and 
exploitation. The Tech Coalition recently announced the funding of five 
global research projects to help increase our understanding and 
awareness of how to use artificial intelligence to combat child sexual 
exploitation at scale (more information is available at https://
www.technologycoalition.org/2021/09/23/tech-coalition-safe-online-
research-fund-announces-awardees/).

    Question 2. You stated in your testimony that length of engagement, 
or some variation of it, was one of many metrics that your company uses 
to measure your company's success. This necessarily implies the 
existence of other metrics that your company uses to define success. 
Please identify all of those specific metrics, how the specific metric 
is calculated and tracked on the platform, and how they might align 
with or relate to your company's specific mission.
    Answer. YouTube's platforms are designed to allow users to search 
and discover all types of content and have an enjoyable experience. In 
response to your specific question, I stated that we look at various 
types of data points to inform us about whether a user is enjoying a 
piece of content, including if a video is watched through its entirety. 
To be clear, a user's watchtime is not used as a metric to define 
corporate success. In 2012, we incorporated watchtime into our 
recommendations systems (additional information available at https://
blog.youtube/inside-youtube/on-youtubes-recommendation-system/) to help 
people find the videos they want to watch and that will give them 
value. Recommendations play a pivotal role across our entire community, 
introducing viewers to content they love and helping creators connect 
with new audiences. Recommendations also play an important role in how 
we maintain a responsible platform connecting viewers to high quality 
information and minimizing the chances they'll see problematic content.
                                 ______
                                 
     Response to Written Questions Submitted by Hon. John Thune to 
                             Leslie Miller
    Question 1. What percentage of YouTube video views in the United 
States and worldwide are the result of an algorithm'' used by YouTube 
suggesting or playing another video after the user finishes watching a 
video? Are the algorithms used by YouTube designed to display content 
that seeks to optimize user engagement?
    Answer. Our recommendation system is built on the simple principle 
of helping people find the videos they want to watch and that will give 
them value. Users come to YouTube for video content that will help them 
learn something, be entertained, listen to music, or get the latest 
from a favorite creator. Our users get exposure to a wide diversity of 
content, andour recommendations system helps users discover this new 
and diverse content. When our recommendations are at their best, they 
connect billions of people around the world to content that uniquely 
inspires, teaches, and entertains.
    The success of YouTube's recommendations depends on accurately 
predicting the videos viewers want to watch. Today, recommendations do 
drive a majority of the overall watchtime on YouTube, but the precise 
percentage of watchtime that comes from recommendations fluctuates.
    Our system doesn't follow a set formula, but develops dynamically 
as user viewing habits change. A number of signals build on each other 
to help inform our system about what users find satisfying: clicks, 
watchtime, survey responses, sharing, likes, and dislikes. We also know 
not everyone wants to always share this information with us. So we've 
built controls that help users decide how much data they want to 
provide. Users can pause, edit, or delete their YouTube search and 
watch history whenever they want.
    Over the years, a growing number of viewers have also started to 
come to YouTube for news and information. Recommendations also play an 
important role in how we maintain a responsible platform. We have built 
our systems to connect viewers to high-quality information and minimize 
the chances they'll see problematic content. And they complement the 
work done by our robust Community Guidelines that define what is and 
isn't allowed on YouTube. For example, we know that consumption of 
borderline content as a result of recommendations is significantly 
below 1 percent, and our goal is to have views of borderline content 
from recommendations below 0.5 percent of overall views on YouTube. You 
can learn more about our YouTube's recommendations system at https://
blog.youtube/inside-youtube/on-youtubes-recommendation-system/.
    Recommendations also help us connect viewers watching ``Made for 
Kids'' content to videos that are age-appropriate, educational, and 
inspire creativity and imagination, in line with our quality principles 
for kids and family content. More information about the set of quality 
principles we developed to help guide YouTube's kids and family creator 
ecosystem is available at https://support.google.com/youtube/answer/
10774223.

    Question 2. YouTube Kids was created to provide younger users a 
more family-friendly experience, but YouTube Kids still shows ads to 
children. Given all the products and services owned and operated under 
Alphabet, Inc., in your view, is it necessary for YouTube to put ads on 
YouTube Kids?
    Answer. To ensure YouTube Kids is available free of charge and 
accessible to all consumers, we adopted an advertising-supported model.
    We offer our services for free for users, and the revenue from ads 
is a key part of what allows us to sustain a vibrant community of 
content creators. We are committed to fostering an appropriate viewing 
environment for children and families. In 2015, we created YouTube Kids 
from the ground up with kids in mind. To ensure that the app could 
remain free while also offering a kid-friendly environment, we designed 
and implemented strict policies specifically for advertising that 
appears in YouTube Kids. Creators are often small businesses who rely 
on advertising revenue to invest in development of quality content and 
they receive a majority share of the advertising revenue. To ensure 
that the app could remain free while also offering a kid-friendly 
environment, we designed and implemented strict policies specifically 
for advertising that appears in YouTube Kids (more information is 
available at https://support.google.com/youtube/answer/6168681?hl=en).
    Three core principles underlie the YouTube Kids' ads policies: (1) 
maintain an appropriate viewing environment for children and families; 
(2) prohibit personalized advertising or third-party tracking or data 
collection in the app; and (3) permit only advertisement formats that 
maintain a closed environment that we can monitor and control. We 
prohibit paid ads on YouTube Kids unless they meet these requirements. 
We also have never allowed any personalized ads on YouTube Kids.
    Our ads policy team, who is specifically trained on the YouTube 
Kids' ads policy (available at https://support.google.com/youtube/
answer/6168681), reviews each ad before it appears on YouTube Kids. 
Only ads that pass this review are eligible to be served in YouTube 
Kids, and we prohibit paid ads that contain adult, dangerous, 
sexualized, or violent content. The YouTube Kids' ads policy also 
forbids contests and sweepstakes, as well as ads that mislead or make 
deceptive claims, ads that imply social status, and ads that refer to 
social media campaigns or websites. Ads that incite children to make a 
purchase, or urge their parents to buy the item, are also prohibited. 
Ads for certain product categories are also prohibited on YouTube Kids, 
including: (1) regulated products, including alcohol, gambling, and 
pharmaceutical/healthcare, as well as other products that may be 
dangerous to kids, such as reworks or instructions on how to make 
harmful products; (2) sensitive products that may be inappropriate for 
kids, such as those relating to dating, politics, and religion; and (3) 
products that we have determined may not be appropriate for young 
audiences, including age-sensitive media content, beauty and fitness, 
relationships, online or virtual communities, political ads, religious 
ads, and video games.
    We require all ads to be clearly branded by the advertiser and/or 
product marketed in the video and the ads must be distinctive so that 
users can readily distinguish ads from general content. To reinforce 
the separation between paid ads and other content on YouTube Kids, a 
three-second ad ``bumper'' introduction appears at the start of every 
advertisement shown on YouTube Kids.
    YouTube Kids prohibits any personalized advertising or retargeting/
remarketing campaigns from being shown in the app. Only contextual ads 
are permitted. YouTube Kids also prohibits any collection of data for 
personalized advertising or retargeting. YouTube Kids does not contain 
any plug-ins or social widgets that would allow users to interact with 
a third-party site or service such as Facebook or Twitter, and it does 
not allow third-party ad networks to serve ads within the app.
    We have also adopted several policies with respect to creator 
videos that contain commercial content within YouTube Kids. For 
example, we require content creators to indicate whether their videos 
include any paid product placements or endorsements during the upload 
process. We do not allow any videos with such declared paid product 
placements from being shown anywhere on YouTube Kids. Our Help Center 
notifies creators that they are required to follow relevant local laws 
and must declare that their content includes paid promotion if they 
have received payment or goods in exchange for their video. We've also 
started to remove overly commercial content from YouTube Kids, such as 
a video that only focuses on product packaging or directly encourages 
children to spend money (more information is available at https://
support.google.com/youtube/answer/10938174).
                                 ______
                                 
      Response to Written Questions Submitted by Hon. Mike Lee to 
                             Leslie Miller
    Question 1. During the hearing I asked the following:

        ``Google Play has the app rating at Teen, meaning 13+, while 
        the Apple Store has it rated at 17+. Why is there is a 
        disparity in the age ratings between Apple and Google for 
        YouTube? And if Apple determined that the age rating should be 
        17+, why did Google determine that its own app should be set as 
        Teen?''

    At the time you said you were not sure how these decisions were 
made, but that you would follow up. Now that you've had a chance to 
review, can you explain why there is a disparity in the age rating 
between Google Play and the Apple Store?
    Have you advocated for your app rating to be increased on Google or 
decreased on Apple?
    Answer. Google Play strives to provide a positive, safe, and fun 
environment for children and families. We empower parents to set 
healthy digital ground rules with Family Link parental controls and 
special Google accounts for kids (more information is available at 
https://families.google.com/familylink/; also see https://
support.google.com/families/answer/7103338?hl=en). We also have 
developer policies in place with regard to apps that are primarily or 
in some cases partly directed to children (for additional information, 
please see https://support.google.com/google play/android-developer/
answer/9893335?hl=en&ref_topic=9877766).
    App and games ratings are provided on Google Play in order to help 
parents understand the maturity of the content available to users, 
including whether the content may include sexual content, violence, 
drugs, gambling and/or profane language (for additional information, 
please see https://support.google.com/googleplay/answer/6209544). 
Parents and users can tap ``Read More'' within an apps details page in 
order to see a description of why the app got a particular rating, 
including whether the app has interactive features.
    We approach age ratings and related restrictions from two angles--
First, the Google Play store incorporates content ratings from the 
International Age Rating Coalition (IARC) (more information is 
available at https://www.globalratings.com/), which are designed to 
help developers communicate locally relevant content ratings to users 
in the jurisdictions where they live. We use IARC because we know that 
people in different countries have different ideas about what's 
appropriate when it comes to content for kids, teens, or adults. These 
ratings are also used to aid parental blocking--a parent can filter or 
block apps for their children by not showing mature rated apps to kids. 
They can achieve this using device settings or the Family Link app. 
Second, we also require developers to abide by certain policies 
(available at https://support.google.com/googleplay/android-developer/
topic/98580
52?hl=en) based on the target audience of their apps. This is based in 
part on developers' self-reported target audience and, in part, on our 
review of the app. In terms of enforcement, Google promptly 
investigates both internal and third-party flagging of apps for 
potential violations of the Play policies, to determine what steps need 
to be taken. We encourage the Committee to contact the IARC for more 
information about the rating issue identified.

    Question 2. Does your platform permit ``interest'' specific ads to 
be targeted to children?
    Answer. On YouTube Kids, YouTube's supervised experience, and on 
our main YouTube platform for users watching ``Made for Kids'' content, 
we prohibit personalized advertising.
    We are especially careful with ads that are shown in these 
contexts, ensuring that certain categories of ads, such as for alcohol 
or drugs, tobacco, or food and beverages are never shown on YouTube 
Kids, supervised experiences on YouTube, or those watching ``Made for 
Kids'' content, regardless of age.
    Only contextual ads (not personalized ads) are eligible to serve to 
users of YouTube Kids, supervised accounts on YouTube main and on 
``Made for Kids'' content. Contextual advertising is based on the 
content of the underlying video that is being watched, whereas 
personalized advertising makes use of, for example, an individual 
user's activity or demographic information in order to provide tailored 
ads.
    In August, we announced that we'll be expanding safeguards to 
prevent age-sensitive ad categories from being shown to teens, and that 
we will block ad targeting based on the age, gender, or interests of 
people under 18 (more information is available at https://blog.google/
technology/families/giving-kids-and-teens-safer-experience-online/). 
We'll start rolling out these updates across our products globally over 
the coming months.

    Question 3. What interest related data does your platform collect 
from children's profiles?

    Question 4. Does your platform collect data on children that is for 
adult products--tobacco, alcohol, recreational drug use--or for 
sexually suggestive ads?
    Answer. Because the answers to these questions are related, we have 
grouped together our response to Questions Nos. 3 and 4.
    As a general matter, we only collect the minimum amount of data 
necessary to make the YouTube Kids experience engaging and age 
appropriate, and we never collect personal information like name, 
address, or contact information from kids. We adhere to our obligations 
under COPPA concerning data collection and storage.
    YouTube Kids also does not allow kids to share their personal 
information with third parties or make it publicly available. As 
detailed in the YouTube Kids privacy notice (available at https://
kids.youtube.com/t/privacynotice), we do collect information about the 
devices used to access the service, and app usage information such as 
preferred language. We also collect information on app activity like 
search terms and videos watched, in order to offer content that is 
likely to be of interest to them. If parents decide to sign into 
YouTube Kids with their Google Account, the App will also collect 
profile information for any profiles they create for their children, as 
well as their parental controls and customization preferences.
    We believe that products should keep user information for only as 
long as it's useful and helpful to the user--including for things like 
getting recommendations for what to watch on YouTube. That's why last 
year we introduced auto-delete controls (available at https://
www.blog.google/technology/safety-security/automatically-delete-data/), 
which give users the choice to have Google automatically and 
continuously delete their Location History, search, voice, and YouTube 
activity data after 3 months or 18 months. We continue to challenge 
ourselves to do more with less data, and we've recently changed our 
data retention practices to make auto-delete the default for our core 
activity settings (available at https://myaccount.google.com/data-and-
personalization).
    As noted in our response to Question 2, we do not personalize ads 
for users declared to be under 18, and we are especially careful with 
ads that are shown in these contexts, ensuring that certain categories 
of ads, such as for alcohol or drugs, tobacco, or food and beverages 
are never shown on YouTube Kids, Supervised Experiences, declared under 
18 users or those watching ``Made for Kids'' content, regardless of 
age.

                               [all]