[Senate Hearing 119-171]
[From the U.S. Government Publishing Office]
S. Hrg. 119-171
THE GOOD, THE BAD, AND THE UGLY:
AI-GENERATED DEEPFAKES IN 2025
=======================================================================
HEARING
before the
SUBCOMMITTEE ON PRIVACY,
TECHNOLOGY, AND THE LAW
OF THE
COMMITTEE ON THE JUDICIARY
UNITED STATES SENATE
ONE HUNDRED NINETEENTH CONGRESS
FIRST SESSION
__________
MAY 21, 2025
__________
Serial No. J-119-20
__________
Printed for the use of the Committee on the Judiciary
www.judiciary.senate.gov
www.govinfo.gov
_______
U.S. GOVERNMENT PUBLISHING OFFICE
61-677 WASHINGTON : 2025
COMMITTEE ON THE JUDICIARY
CHARLES E. GRASSLEY, Iowa, Chairman
LINDSEY O. GRAHAM, South Carolina
JOHN CORNYN, Texas
MICHAEL S. LEE, Utah
TED CRUZ, Texas
JOSH HAWLEY, Missouri
THOM TILLIS, North Carolina
JOHN KENNEDY, Louisiana
MARSHA BLACKBURN, Tennessee
ERIC SCHMITT, Missouri
KATIE BOYD BRITT, Alabama
ASHLEY MOODY, Florida
RICHARD J. DURBIN, Illinois, Ranking Member
SHELDON WHITEHOUSE, Rhode Island
AMY KLOBUCHAR, Minnesota
CHRISTOPHER A. COONS, Delaware
RICHARD BLUMENTHAL, Connecticut
MAZIE K. HIRONO, Hawaii
CORY A. BOOKER, New Jersey
ALEX PADILLA, California
PETER WELCH, Vermont
ADAM B. SCHIFF, California
Kolan Davis, Chief Counsel and Staff Director
Joe Zogby, Democratic Chief Counsel and Staff Director
Subcommittee on Privacy, Technology, and the Law
MARSHA BLACKBURN, Tennessee, Chair
LINDSEY O. GRAHAM, South Carolina
JOHN CORNYN, Texas
JOSH HAWLEY, Missouri
JOHN KENNEDY, Louisiana
ASHLEY MOODY, Florida
AMY KLOBUCHAR, Minnesota, Ranking Member
CHRISTOPHER A. COONS, Delaware
RICHARD BLUMENTHAL, Connecticut
ALEX PADILLA, California
ADAM B. SCHIFF, California
Ben Blackmon, Republican Chief Counsel
Dan Goldberg, Democratic Chief Counsel
C O N T E N T S
----------
OPENING STATEMENTS
Blackburn, Hon. Marsha
Klobuchar, Hon. Amy
Tillis, Hon. Thom
    Prepared statement
Coons, Hon. Christopher A.
WITNESSES
Brookman, Justin
    Prepared statement
Carlos, Suzana
    Prepared statement
Glazier, Mitch
    Prepared statement
McBride, Martina
    Prepared statement
Price, Christen
    Prepared statement
APPENDIX
Items submitted for the record
THE GOOD, THE BAD, AND THE UGLY:
AI-GENERATED DEEPFAKES IN 2025
----------
WEDNESDAY, MAY 21, 2025
United States Senate,
Subcommittee on Privacy, Technology,
and the Law,
Committee on the Judiciary,
Washington, DC.
The Subcommittee met, pursuant to notice, at 2:34 p.m., in
Room 226, Dirksen Senate Office Building, Hon. Marsha
Blackburn, Chairman of the Subcommittee, presiding.
Present: Senators Blackburn [presiding], Hawley, Tillis,
Moody, Klobuchar, and Coons.
OPENING STATEMENT OF HON. MARSHA BLACKBURN,
A U.S. SENATOR FROM THE STATE OF TENNESSEE
Chair Blackburn. The Senate Judiciary Subcommittee on
Privacy, Technology, and the Law will come to order.
And I want to thank all of you for being here with us this
afternoon. As you can see, everyone is curious about AI. And
today, we are going to talk about AI and how this affects our
content creators, our creative community, our children, and how
it affects all Americans when it comes to the digital space and
what happens with AI-generated deepfakes.
This hearing is titled ``The Good, the Bad, and the Ugly,''
and it is titled in that regard for a specific reason. Now in
Tennessee, we talk about how there is a lot of good that has
come from AI when you are talking about logistics, advanced
manufacturing, healthcare, and cutting-edge research, and we've
even seen the amazing role that AI has played in giving voice
to some, Randy Travis to be specific, who joined us here on the
Hill recently in introducing the NO FAKES bill. It gave Randy
Travis the ability to share his talent with the world once
again.
But despite some of these benefits, there are really some
bad and unpleasant sides to AI, specifically when it comes
to AI-generated deepfakes. These deepfakes cause tremendous
harm, and today, we are going to examine those harms and the
legislative solutions, including the NO FAKES Act, which
Senators Coons, Klobuchar, Tillis, and I introduced
specifically to address these harms.
First, these deepfakes pose significant harm to our content
creators, from Music Row to Beale Street, back over to the
Smoky Mountains in Upper East Tennessee. Tennesseans have made
their mark in the music world, and we've got one of those
artists with us today. But the proliferation of these digital
replicas created without the artist's consent poses a real
threat to their livelihoods and the livelihoods of all American
artists and creators.
The NO FAKES Act is a monumental step forward in protecting
our creative community. It provides landmark protection of the
voice and visual likenesses of all individuals and creators
from the spread of these digital replicas that are created
without their consent. And I am looking forward to speaking
with our witnesses about this critical bill and how impactful
it will be for the creative community.
And I have got to be clear, our efforts must protect all
Americans from the harms of deepfakes, and that includes our
precious children. In recent years, we have seen a deeply
troubling spike in the use of generative AI to create sexually
explicit deepfake content. Just as concerning, the National
Center for Missing & Exploited Children (NCMEC) saw a--get this
number--1,325 percent increase from 2023 to 2024 in reports
involving generative AI. We have got to do something
about that, and both the NO FAKES Act and the TAKE IT DOWN Act,
which President Trump just signed into law this week, go a long
way to providing greater protections for our children from
these deepfakes.
These deepfakes have also served as a powerful tool for
fraud. In one example, scammers used AI-generated images and
voices of a multinational firm's CEO to steal millions of
dollars. We've also seen celebrities' likenesses
misappropriated in false product endorsements.
It is clear that Congress has to act, and that's why the
three of us sitting right here on this dais have joined forces,
plus Senator Tillis, who is going to get here in a little bit,
to work on the NO FAKES Act and get it to President Trump's
desk this year. We know that the creative community, all these
content creators, our children, and all Americans deserve
nothing less than our best efforts on this issue.
And I turn to Senator Klobuchar for her opening statement.
OPENING STATEMENT OF HON. AMY KLOBUCHAR,
A U.S. SENATOR FROM THE STATE OF MINNESOTA
Senator Klobuchar. Thank you very much, Senator Blackburn.
I am very excited about this Subcommittee and the work we have
already done together for years on this issue and similar
issues when it comes to tech.
I share your hopes for AI and see that we are on this cusp
of amazing advancements if this is harnessed in the right way.
But I am also concerned if things go the wrong way. I think it
was David Brooks, a columnist, who said he has trouble writing
about it because he doesn't know if it will take us to heaven
or hell. So it is our job to head to heaven, and it is our job
to put some rules in place. And this is certainly one of them.
We want this to work for children, for consumers, for artists,
and not against them.
And you brought up the example, Chair, of Randy Travis, who
was at the event that we recently had with you and Senator
Coons and myself about the bill and how he used AI in such a
positive way. But then we know there are these risks.
And one of the things that I think is really exciting about
this week is that, in fact, on Monday, the President signed my
bill with Senator Cruz, the TAKE IT DOWN Act, into law. This
was a bill I discussed with him and the First Lady at the
inaugural lunch. It is an example of using every moment you
have to advance a cause. And then she supported the bill and
hoped to get it passed in the House. Senator Cruz and I had
already
passed it in the Senate, and we were having some trouble
getting it done over in the House. So we are really pleased
because it actually does set some track moving forward, even
though that bill is about nonconsensual porn, both AI-created
and non-AI-created.
It has had huge harmful effects, about 20-some suicides a
year of young kids who think they are sending a picture
innocently to a girlfriend or a potential boyfriend, and then
it gets sent out on their school internet. It gets sent out to
people they know, and basically they believe their life is in
ruins and don't have any other context and take their own
lives. And that is just the most obvious and frightful part of
this, but there are others as well. So I am hoping this is
going to be a first step to some of the work that we can do,
including with the bill that we are going to be discussing
today.
So AI-enabled scams have become far too common. We know
that. It takes only a few seconds of audio to clone a voice.
Criminals can pull the audio sample and personal backstory from
public sources. Just last week, the FBI was forced to put out
an alert about scams using AI-cloned voices of FBI agents and
officials asking people for sensitive payment information.
Jamie Lee Curtis was forced to make a public appeal to Mark
Zuckerberg to take down an unauthorized deepfake ad that
included her digital replica endorsing a dental product. While
Meta removed the ad after her direct outreach, most people
don't have that kind of influence.
We also need rules of the road to ensure that AI
technologies empower artists and creators rather than undermine
them. Art doesn't just entertain us. It is something that
uplifts us and brings us together. When I recently met with
Cory Wong, a Grammy-nominated artist from Minnesota, he talked
about how unauthorized digital replicas threaten artists'
livelihoods and undermine their ability to create art.
So this is not just a personal issue; it is also an
economic issue. Music and movies are among our country's best
exports to the world. When you look at the numbers and how we
have been able to captivate people around the world, that is
going to go away if people can just copy everything that we do.
And one of the keys to our success as a Nation in innovation--
and Senator Coons does a lot of work in this area--has been
that we respect copyrights and patents, and that people own the
rights to their own products.
So that is why this NO FAKES Act is so important. It
protects people from having their voice and likeness replicated
using AI without their permission, all within the framework of
the Constitution. And it protects everybody because everyone
should have a right to privacy.
I also am working in the space on AI to put some base rules
in place in my role on the Commerce Committee. Senator Thune
and I have a bill that we are reintroducing on this to set some
rules for NIST to be able to put out there for companies that
are using AI. And then I am always concerned about its effect
on democracy. But that is for a different day and in a
different Committee.
But I do want to thank Senator Blackburn for her
willingness to come out on doing something about tech,
including the work she does with Senator Blumenthal, the work
that we have done together on Commerce. And if Monday is any
sign with the first bill getting through, and they are in that
Rose Garden signing ceremony, there is more to come. And so
thank you, and I look forward to hearing from the witnesses.
Chair Blackburn. Thank you, Senator Klobuchar.
Senator Coons, you are recognized.
OPENING STATEMENT OF HON. CHRISTOPHER A. COONS,
A U.S. SENATOR FROM THE STATE OF DELAWARE
Senator Coons. Thank you so much, Chair Blackburn, Ranking
Member Klobuchar. It is a delight to work with you. And thank
you for inviting me to give some brief opening remarks about
the NO FAKES bill.
Because of you and Senator Tillis working on this together
since 2023, we have made real progress. There is momentum with
this bill. We have been adding co-sponsors. My thanks to
Senators Durbin and Hagerty, Schiff and Cassidy. We are adding
organizations that are endorsing it like YouTube and RAINN. And
as we saw at the White House on Monday, if there is bipartisan
agreement in Congress and support from the White House that
action is needed, we can make progress in complex, challenging
technical areas.
This hearing is a chance to look critically at the current
state of the NO FAKES bill so we can both build on that
momentum and answer the questions, what did we get right? What
do we need to tweak? How can we get more co-sponsors and push
to a Full Committee markup?
So I am excited to hear from our witnesses today. There are
two other Committee hearings going on right now, which is why
you will see Senators come in and out, not a lack of interest.
Senator Klobuchar. Or be late.
[Laughter.]
Senator Coons. Yes. When we were drafting this bill, its
applicability to pillars of the creative community like Ms.
Martina McBride, or to a movie star like Tom Hanks--to people
who make a living off of their voice or likeness--was clear.
But Senator Blackburn and I agreed
at the outset, the rules we were drafting should apply to
everyone. Everyone should have the power to control their
digital replica online, not just those who are superstars.
So I appreciate, Chair Blackburn, that the witnesses you
brought together today speak to the full scope of what this
bill can do
to keep the public safe from scams, just like the bill Senator
Klobuchar just got signed into law, and help wipe nonconsensual
deepfake pornography off the internet.
Second, the revised draft we introduced last month was the
product of stakeholders negotiating in good faith. Ms. Carlos,
you and YouTube came to the table with the intention of getting
to yes, and we got there. And if Google can get behind this
bill and can handle the obligations that NO FAKES imposes, so
can the other tech platforms.
Thank you. I look forward to hearing from you and returning
to questions.
Chair Blackburn. Thank you, Senator Coons.
I would like to introduce our witnesses. Martina McBride is
a Nashville-based singer-songwriter who has sold more than 23
million albums worldwide with six singles hitting number one on
the country music chart.
In addition to her 14 Grammy Award nominations, Ms. McBride
is a four-time Country Music Association Female Vocalist of the
Year, a three-time Academy of Country Music Top Female
Vocalist, and a member of the Grand Ole Opry. She first signed
to RCA Records in 1991 and has since been awarded 14 gold
records, 12 platinum honors, 3 double platinum records, and 2
triple platinum awards.
Mitch Glazier is the CEO and chairman of the Recording
Industry Association of America. We use the acronym RIAA. He
helps to represent the rights and interests of over 1,600
member labels. Prior to joining RIAA, Mr. Glazier served as
Chief Counsel for Intellectual Property to the U.S. House of
Representatives Judiciary Committee, as well as numerous other
roles in and around government, including as a commercial
litigation associate. He earned his bachelor's degree from
Northwestern University and his JD from Vanderbilt School of
Law.
Our next witness is Christen Price. Ms. Price serves as
Senior Legal Counsel for the National Center on Sexual
Exploitation, NCOSE. Correct? And she works to combat all forms
of sexual exploitation and advocate for justice for survivors
of sex trafficking, child sexual abuse, pornography, and
prostitution.
Before her work at NCOSE, Ms. Price served as legal counsel
at the Alliance Defending Freedom, where she specialized in
First Amendment law and conscience protections. Ms. Price earned
her bachelor's degree from Cedarville University and her JD
from Georgetown University Law Center.
And Mr. Justin Brookman. Mr. Brookman is the Director of
Technology Policy for Consumer Reports, where he specializes in
data privacy and security issues. Before joining Consumer
Reports, he was Policy Director of the Federal Trade Commission
Office of Technology, Research, and Investigation. Earlier in
his career, he served as Chief of the Internet Bureau of the
New York Attorney General's Office. He earned his bachelor's
degree from University of Virginia and his JD from New York
University School of Law.
And Ms. Suzana Carlos, who serves as Head of Music Policy
at YouTube. Until 2022, she served as Senior Counsel for
YouTube's Music Publishing and in senior positions at the
American Society of Composers, Authors, and Publishers--we like
to call it ASCAP--as well as Universal Music Group and EMI
Publishing. She
is also on the board of Digital Media Association, which
represents the leading global audio streaming companies and
promotes legal access and engagement of music content between
creators and users. Ms. Carlos earned her bachelor's at the
University of California, Los Angeles, and her JD from Fordham
University School of Law. Welcome to each of you.
At this time, I want to ask you all to rise and raise your
right hands.
[Witnesses are sworn in.]
Chair Blackburn. And let the record reflect that everyone
answered in the affirmative.
We will begin with our testimony. Ms. McBride, you are
recognized for 5 minutes and welcome.
STATEMENT OF MARTINA McBRIDE, MULTIPLATINUM COUNTRY MUSIC
SINGER-SONGWRITER, NASHVILLE, TENNESSEE
Ms. McBride. Chairman Blackburn, Ranking Member Klobuchar,
Senator Coons, and Members of the Subcommittee, thank you for
inviting me to speak about S. 1367, the NO FAKES Act of 2025, a
landmark effort to protect human voices and likenesses from
being cloned by artificial intelligence without consent. I am
so grateful for the care that went into this effort, and I want
to thank you and your colleagues for making this issue a
priority.
I started singing when I was 4 years old, and my voice is
at the center of my art form. Each of my recordings includes a
piece of me that is individual and unique. Songs reflect the
human experience, and I am honored that they are a part of
people's lives, from wedding vows to breakups to celebrating
milestones and even the special relationship between a mother
and daughter.
But today, my voice and likeness, along with so many
others, are at risk. AI technology is amazing and can be used
for so many wonderful purposes. But like all great
technologies, it can also be abused. In this case, by stealing
people's voices and likenesses to scare and defraud families,
manipulate the images of young girls in ways that are
unconscionable, impersonate government officials, or make phony
recordings posing as artists like me. It is frightening, and it
is wrong.
Congress just took a very important step forward to deal
with sexually explicit deepfake images by passing the TAKE IT
DOWN Act. I want to thank all the leaders, including Senators
Cruz, Klobuchar, Blackburn, and many on this Committee who
worked hard with others to push that bill into law.
The NO FAKES Act is a perfect complement to that effort by
preventing AI deepfakes that steal someone's voice or likeness
and use them to harass, bully, and defraud others or to damage
their career, reputation, or values. The NO FAKES Act would
give each of us the ability to say when and how AI deepfakes of
our voices and likenesses can be used. If someone doesn't ask
before posting a harmful deepfake, we can have it removed
without jumping through unnecessary hoops or going to court.
It gives every person the power to say yes or no about how
their most personal human attributes are used. It supports AI
technology by providing a roadmap for how these powerful tools
can be developed in the right way. And it doesn't stand in the
way of protected uses like news, parodies, or criticism.
I want to thank the technology companies like OpenAI and
Google who support this bill, as well as the legions of
creators who have worked so hard to advocate for it and the
Child Protection and Anti-Sex Trafficking and Exploitation
groups who support it and continue to fight for those who are
most vulnerable.
In my career, it has been a special honor to record songs
that shine a light on the battles that many women fight,
especially the terrible battle of domestic violence. Many fans
have told me that the song ``Independence Day'' has given them
strength. And in some cases, the song has been the catalyst
that has made them realize that they need to leave an abusive
situation. Imagine the harm that an AI deepfake could do,
breaching that trust using my voice in songs that belittle or
justify abuse.
One of the things I am most proud of in my career is I have
tried to conduct myself with integrity and authenticity. And
the thought that my voice could be deepfaked or my likeness
could be deepfaked to go against everything that I have built,
to go against my character, is just terrifying. And I am
pleading with you to give me the tools to stop that kind of
betrayal.
Setting America on the right course to develop the world's
best AI while preserving the sacred qualities that make our
country so special--authenticity, integrity, humanity, and our
endlessly inspiring spirit--that is what the NO FAKES Act will
help to accomplish. I urge you to pass the bill now.
Thank you.
[The prepared statement of Ms. McBride appears as a
submission for the record.]
Chair Blackburn. We thank you.
Mr. Glazier, you are recognized for 5 minutes.
STATEMENT OF MITCH GLAZIER, CEO, RECORDING INDUSTRY ASSOCIATION
OF AMERICA, WASHINGTON, D.C.
Mr. Glazier. Thank you so much. Thank you for having me. I
am honored to testify today alongside the groundbreaking
artist, Martina McBride, who just spoke so eloquently about the
value of someone's voice, the value of their image, and the
threats posed by abuses of deepfake technology.
I would also like to recognize the almost 400 artists and
performers and actors who have just signed a statement in
support of the NO FAKES Act with some very simple words. It is
your voice, your face, your image, your identity. Protect your
individuality. That is why we are here. That is what this is
all about.
Artists' voices and likenesses are fundamental to their
work, credibility, expression, careers. In many ways, these
deeply personal, highly valuable attributes are the foundations
of the entire music ecosystem. And unauthorized exploitation of
them using deepfakes does cause devastating harm. We have to
prevent that harm.
So my deepest thanks and the thanks of a very grateful
music community go out to all of you, to Chairman Blackburn, to
Ranking Member Klobuchar, to Senator Coons, and to all of the
Senators, Senators Tillis, Hagerty, Durbin, Cassidy, Schiff,
and I hope many more on this Committee and throughout the
Senate for introducing and supporting the NO FAKES Act.
You did it. After months, actually years, of work with each
other, stakeholders, your counterparts in the House, you have
been able to build bipartisan, bicameral, broad-based consensus
around legislation that will protect not just artists but all
victims of deepfake abuses, including child exploitation and
voice clone scams, which we will hear about from the other
witnesses today.
You have shaped a commonsense bill that has won the support
of AI companies like Google, who is here today, OpenAI, IBM, as
well as broadcasters, motion picture studios, child protection
groups, free market groups, labor unions, and virtually the
entire creative community. That is hard to do.
The NO FAKES Act provides balanced yet effective
protections for all Americans while supporting free speech,
reducing litigation, and promoting the development of AI
technology. It empowers individuals to have unlawful deepfakes
removed from UGC platforms as soon as it can be done without
requiring anyone to hire lawyers or go to court in those
situations. It contains clear exemptions for uses typically
protected by the First Amendment, such as parody, news
reporting, and critical commentary. And it encourages AI
development and innovation, targeting only malicious
applications and setting the stage for the legitimate licensing
of rights with real and meaningful consent.
NO FAKES is the perfect next step to build on after the
TAKE IT DOWN Act. It provides a civil remedy to victims of
invasive harms that go beyond the criminal posting of intimate
images addressed by that legislation, and it protects artists
like
Martina from nonconsensual deepfakes and voice clones that
breach the trust she has built with millions of fans.
American music is the most valuable music in the world. We
lead in investment, exports, and market power. Music drives the
success of other important American industries, including the
technology industry, through thriving partnerships. If we
signal to the rest of the world that it is acceptable to steal
Americans' voices and likenesses, we have the most to lose. Our
voices and our music are the most popular and will be taken the
most, destabilizing the music economy, our intellectual
property system, our national identity, and the very humanity
of the individuals who bless us with their genius.
The NO FAKES Act is a critical step in setting America up
as an example and in continuing to extend its global leadership
in innovation and creativity. It shows that we can boost AI
development while preserving every individual's autonomy and
liberties and protecting our constitutional property rights at
the same time.
We are really proud to support this legislation, and we vow
to help you pass it into law this year. Thank you again.
[The prepared statement of Mr. Glazier appears as a
submission for the record.]
Chair Blackburn. We thank you.
And Ms. Price, you are recognized for 5 minutes.
STATEMENT OF CHRISTEN PRICE, SENIOR LEGAL COUNSEL, NATIONAL
CENTER ON SEXUAL EXPLOITATION (NCOSE), WASHINGTON, D.C.
Ms. Price. Chair Blackburn, Ranking Member Klobuchar, thank
you for holding this hearing and addressing this truly urgent
matter. My name is Christen Price, Senior Legal Counsel at the
National Center on Sexual Exploitation, NCOSE, a nonpartisan
nonprofit dedicated to eradicating all forms of sexual
exploitation by exposing the links between them. Our law center
represents survivors in lawsuits against those who perpetrate,
enable, and profit from sex trafficking, including pornography
companies.
Contemporary pornography depicts and normalizes violence,
including asphyxiation, electrocution, and rape. This is
pervasive. The top four sites--Pornhub, XVideos, xHamster, and
XNXX--had nearly 60 billion total visits in 2024. One woman's
husband sexually assaulted her while she was sleeping and put
the video, tagged ``Sleeping Pills,'' on XVideos.
Pornhub hosts child sexual abuse material and sex trafficking
content with their employees admitting that traffickers use
their sites with impunity.
Forged or deepfake pornography uses AI that is trained on
this kind of abusive content, merging it with the faces of
other women and girls. A 2023 report found that deepfake
pornography increased by 464 percent between 2022 and 2023. The
top 10 deepfake pornography sites had 300 million video views
in 2023. Ninety-eight percent of all deepfake videos are
pornography related, and 99 percent of those who are targeted
are women.
The perpetrators are disproportionately male. One survey
found that 74 percent of deepfake pornography users don't feel
guilty about it. A high schooler discovered a boy she had never
met took a photo off of her Instagram, created an AI deepfake,
and circulated it through Snapchat. Two years later, she still
hasn't been able to remove all the images.
A woman whose close family friend made deepfake pornography
of her said, ``My only crime was existing online and sharing
photos on platforms like Instagram. The person who did this was
not a stranger. I was not hacked, and my social media has never
been public.''
These are serious human rights abuses, violating the person
whose face is depicted and the person whose body is shown.
Survivors report fear; isolation; shame; powerlessness;
suicidal thoughts; doxxing; harassment from sex buyers; and
difficulty attending school, maintaining jobs, and
participating in public life. This is a form of sexual
exploitation from which it is impossible to fully exit.
There is a very old idea that to protect more privileged
women from male violence, society needs an underclass of women
that men can violate with impunity. This was always a morally
inexcusable premise, and the rise of forged pornography shows
that it is also a lie. Deepfake technology allows any man to
turn any woman into his pornography. These are impossible
conditions for equality. As Andrea Dworkin stated in her book,
``The civil impact of pornography on women is staggering. It
keeps us socially silent, socially compliant. It keeps us
afraid in neighborhoods and it creates a vast hopelessness for
women, a vast despair. One lives inside a nightmare of sexual
abuse that is both actual and potential, and you have the great
joy of knowing that your nightmare is someone else's freedom
and someone else's fun.''
The harms are severe and irreversible, so deterrence is
essential. This is why NCOSE supported the bipartisan effort to
pass the TAKE IT DOWN Act, which the President signed into law
on Monday, and requires online platforms to remove
nonconsensual content within 48 hours of being notified.
NCOSE strongly supports three additional bills that
complement TAKE IT DOWN: the NO FAKES Act, the Kids Online
Safety Act, and the DEFIANCE Act. These bills help protect
individuals from the harmful effects of image-based sexual
abuse and increase pressure on tech companies to manage
websites more responsibly.
Finally, NCOSE is concerned about the recent AI State
moratorium language included in the House Budget Reconciliation
bill, as it creates a disincentive for AI companies to put
safety first.
Technological progress should not come at the expense of
human dignity. It is our collective responsibility to protect
the voice, face, and likeness of every person from unauthorized
use.
Thank you.
[The prepared statement of Ms. Price appears as a
submission for the record.]
Chair Blackburn. We thank you.
And I will note for the record that we are submitting your
full testimony into the record with all of your footnotes. I
really appreciate that. Thank you so much.
Mr. Brookman, you are recognized for 5 minutes.
STATEMENT OF JUSTIN BROOKMAN, DIRECTOR OF TECHNOLOGY POLICY,
CONSUMER REPORTS, WASHINGTON, D.C.
Mr. Brookman. Thank you, Chairwoman Blackburn, Ranking
Member Klobuchar. Thank you very much for the opportunity to
get to testify here today.
I am here on behalf of Consumer Reports, where I head up
our work on tech policy advocacy. We are the world's largest
independent testing organization. We use our ratings, our
journalism, our surveys, and our advocacy to try to create a
fairer, healthier, and safer world.
I am gratified the Committee is focusing on the problems
created by audio and video deepfakes, which, for better or
worse, are getting more realistic and convincing every day.
They are used in romance scams and grandparent scams where a
relative gets a frantic call from a distressed family member
who is in immediate need of cash. As the Chairwoman noted, they
are used in fake testimonial videos from celebrities hawking
everything from meme coins to cookware. I believe Elon Musk is
one of the most frequently impersonated celebrities online.
As Ms. Price testified eloquently, obviously, one of the
most prevalent uses is for the creation of nonconsensual
intimate images and videos. And they are increasingly used to
propagate misinformation, certainly in the political realm, but
also in the more petty personal realm. There is a story in
Maryland recently about an aggrieved teacher who created
deepfake audio of his boss saying racist and antisemitic slurs.
As this last example shows, realistic cloning tools are easily
available to the public and very cheap and easy to use.
Earlier this year, Consumer Reports conducted a study of
six voice-cloning tools that are easy to find online to see how
easy it would be to create fake audio based on a public
recording like a YouTube clip. Our study found that four of the
six companies we looked at didn't employ any reasonable
technical mechanisms to ensure they had the consent
of the person whose voice was being cloned. Instead, the
customer just had to click, like, yes, I have the person's
consent. Two required the person to read a script to help
indicate the person was on board with having their voice cloned.
Four of the companies also did not collect much identifying
information from customers, just a name or an email address to
start creating deepfake voice clones.
Given how likely abuse of these services is, I don't think
they were doing enough. And a lot of our members agree. We
got 55,000 signatures on a recent petition asking the
Federal Trade Commission and State Attorneys General to
investigate whether these services were in violation of
existing consumer protection laws.
And that brings me to solutions. So one thing, we need
strong consumer protection agencies who have the resources to
crack down on abusive emerging technologies. Last year, the FTC
brought a handful of AI cases as part of Operation AI Comply,
but they don't have the capacity right now to confront the
massive wave of scams and abuses online.
Tools and responsibilities: I think some of these AI-
powered tools are designed such that they are almost always
going to be used for illegitimate purposes, whether it is
deepfake pornographic image generators or voice impersonation.
Developers of these tools need to have heightened obligations
to try to forestall harmful uses. If they can't do that, then
maybe they should not be freely available to the public.
Platforms too need to be doing more to proactively get
harmful material off their platforms. It is a very difficult
job. It takes resources, but it absolutely needs to be done.
Transparency: People deserve to know whether the content
they are seeing online is real or fake. I know there have been
a number of bills introduced in this Congress to try to address
that. Also, there is a law recently passed in California to
start to put transparency obligations on entities that make
deepfake content.
Stronger privacy and security laws: As this Committee knows
very well, the United States generally has fairly weak legal
protections. As the Ranking Member noted, the ready
availability of information about us online makes it easier for
scammers to target us with scams. We have seen a ton of
progress at the State level on privacy and security laws, but
they are not strong enough.
Whistleblower protections and incentives: In many cases, we
only find out about abuses inside these tech companies when
someone comes forward with their story. I was glad to see
bipartisan legislation introduced on this issue protecting AI
whistleblowers in the last week.
Education: I don't want to put all the burden on consumers,
but the reality is this is the world we live in. We need to
teach people to look out for these sorts of scams. We are part
of a campaign called Pause Take9, which tries to train people
that if they get an urgent call to action, they should pause,
take 9 seconds, and think about whether it is real or not.
And finally, I want to echo the words of Ms. Price about a
lot of discussion about a moratorium on State laws policing bad
uses of AI. I want to stress this is the wrong idea. AI has
tremendous, amazing potential, but as this hearing shows, it
has some real potential harms as well. The States have been
leaders in trying to address these harms, whether it is
privacy, co-opting performers' identities, regulating self-
driving cars, rooting out hidden biases, other deepfakes. AI is
an incredibly powerful technology, but that does not mean it
should be completely unregulated.
Thank you very much, and I look forward to answering your
questions.
[The prepared statement of Mr. Brookman appears as a
submission for the record.]
Chair Blackburn. And Ms. Carlos, you are recognized for 5
minutes.
STATEMENT OF SUZANA CARLOS, HEAD OF MUSIC POLICY, YOUTUBE,
BROOKLYN, NEW YORK
Ms. Carlos. Chairwoman Blackburn, Ranking Member Klobuchar,
and Members of the Subcommittee, thank you for the opportunity
to speak with you today on the important topic of the NO FAKES
Act and AI-generated digital replicas. My name is Suzana
Carlos, and I serve as the Head of Music Policy for YouTube.
Just last month, YouTube marked the 20th anniversary of the
first video ever uploaded to our platform. It is difficult to
fathom how much the world and YouTube have changed in those 2
short decades. Today, we have over 2 billion active monthly
members on our platform across more than 100 countries, with
500 hours of content uploaded every minute. We are proud that
YouTube has transformed culture through video and built a
thriving creator economy here in the United States and around
the world.
Our unique and industry-leading revenue-sharing model
empowers our creators to take 55 percent of the revenue earned
against ads on their content. And as a result, YouTube's
creative economy has contributed more than $55 billion to the
United States' gross domestic product and supported more than
490,000 full-time American jobs in the last year alone. In the
3 years prior to January 2024, YouTube paid more than $70
billion to creators, artists, and media companies.
At YouTube Music, we built one of the world's deepest
catalogs, over 100 million official tracks, plus remixes, live
performances, covers, and hard-to-find music you simply can't
find anywhere else. We have now reached over 125 million paid
YouTube Music and Premium subscribers. And YouTube continues to
be at the forefront of handling rights management at scale,
protecting the intellectual property of creators and our
content partners, ensuring that they can monetize their content
and keeping YouTube free for viewers around the world.
In 2007, YouTube launched Content ID, a first-of-its-kind
copyright management system that helps rightsholders
effectively manage their works. Rightsholders or their agents
provide YouTube with reference files for their works they own,
along with metadata, such as title and detailed ownership
rights. And based on these references, YouTube creates digital
fingerprints for those works in question and scans the platform
to determine when content in an uploaded video matches the
reference content. Rightsholders can instruct the system to
block, monetize, or track the matching content. And over 99
percent of the copyright issues on YouTube are handled through
Content ID. It has also proven to be an effective revenue
generation tool for rightsholders, as over 90 percent of
Content ID claims are monetized.
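The matching workflow Ms. Carlos describes above, reference files, digital fingerprints, platform-wide scanning, and a rightsholder-selected action, can be illustrated with a toy sketch. To be clear, YouTube's actual Content ID implementation is proprietary; the hashing scheme, window size, function names, and threshold below are all invented for illustration only.

```python
# Toy sketch of fingerprint-style matching in the spirit of the Content ID
# description above. NOT YouTube's actual system: the hashing scheme, window
# size, and match threshold are assumptions made purely for illustration.

import hashlib

def fingerprints(samples, window=4):
    """Hash every overlapping window of a signal into a set of fingerprints."""
    return {
        hashlib.sha256(bytes(samples[i:i + window])).hexdigest()
        for i in range(len(samples) - window + 1)
    }

def match_fraction(reference, upload, window=4):
    """Fraction of the reference's fingerprints that appear in the upload."""
    ref = fingerprints(reference, window)
    up = fingerprints(upload, window)
    return len(ref & up) / len(ref) if ref else 0.0

def policy(reference, upload, threshold=0.5):
    """Apply a rightsholder's chosen action when the match is strong enough."""
    if match_fraction(reference, upload) >= threshold:
        return "monetize"  # could equally be "block" or "track"
    return "no_claim"
```

In this sketch, an upload that embeds the reference signal triggers the rightsholder's chosen action, while unrelated content yields no claim; real systems use perceptual fingerprints robust to pitch, speed, and quality changes rather than exact hashes.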
And as we navigate the evolving world of AI, we understand
the importance of collaborating with partners to tackle
emerging challenges proactively. We firmly believe that AI can
and will supercharge human creativity, not replace it.
Indeed, AI has the potential to amplify and augment human
creativity, unlocking new opportunities for artists, creators,
journalists, musicians, and consumers to engage creatively with
new tools and play an active role in innovation. We are already
seeing creators exploring new areas, including the creation of
new types of music, books, photography, clothing, pottery,
games, and other art inspired in collaboration with AI models.
And as this technology evolves, we must collectively ensure
that it is used responsibly, including when it comes to
protecting our creators and viewers.
Platforms have a responsibility to address the challenges
posed by AI-generated content, and Google and YouTube stand
ready to apply our expertise to help tackle them on our
services and across the digital ecosystem.
We know that a practical regulatory framework addressing
digital replicas is critical, and that is why we are especially
grateful to Chairwoman Blackburn, Senator Coons, Ranking Member
Klobuchar, and all the bill sponsors for the smart and
thoughtful approach adopted in developing the NO FAKES Act of
2025. We deeply appreciate their willingness to bring a variety
of stakeholders together to forge a consensus on this important
topic.
YouTube and Google are proud to support this legislation,
which tackles the problems of harm associated with unauthorized
digital replicas and provides a clear legal framework to
address these challenges and protect individuals' rights. The
NO FAKES Act appropriately balances innovation, creative
expression, and individuals' rights while offering a broadly
workable, tech-neutral, and comprehensive legal solution. By
obviating the need for a patchwork of inconsistent legal
frameworks, the NO FAKES Act would streamline global operations
for platforms like ours and empower artists and rightsholders
to better manage their likeness online. We look forward to
seeing the legislation passed by Congress and enacted into law.
We have similarly proudly supported the TAKE IT DOWN Act
because it is critical to prevent bad actors from producing and
disseminating nonconsensual explicit images. We would like to
thank Ranking Member Klobuchar, along with Senator Cruz, for
their leadership on the legislation. This is an area we
continue to invest in at Google, building on our longstanding
policies and protections to ultimately keep people safe online.
Thank you again for inviting me to participate in today's
hearing. I look forward to your questions.
[The prepared statement of Ms. Carlos appears as a
submission for the record.]
Chair Blackburn. And we thank you all for sticking to the
5-minute clock. I didn't have to gavel down a person.
[Laughter.]
Chair Blackburn. These are great content creators, I mean,
so there we go.
I am going to recognize myself for 5 minutes for questions.
And as Senator Coons said earlier, there are going to be
Members coming and going because we do have a variety of
hearings that are going on.
Ms. McBride, I want to come to you first. I think that your
perspective is such an important perspective as we talk about
this and talk on the direct impact to someone who is creating
content. And I appreciated so much that in your testimony, you
talked about how your voice and likeness, along with those of
so many other creators, are at risk. And therefore, your
livelihood is at risk.
So talk a little bit about how harmful deepfakes are in the
long term and why it is important to get legislation like this
to the President's desk. And then talk about fellow artists
that you have spoken with and their concerns on the issue.
Ms. McBride. Well, as you said, it does affect livelihood
for musicians, backup singers, voiceover actors, authors, like
so many people in the arts. For me, being established and
having done this for over 30 years, that's not necessarily my
first concern. I have the luxury of that not being my first
concern, but it is a concern for younger artists that are
coming up.
So as I said in my testimony, the thing that I am most
concerned with personally is how we work so hard to present
ourselves with integrity and a certain character. And the fact
that, you know, long term, that could be distorted or
manipulated to be the opposite of what I stand for and what I
believe or to be used to cause harm to someone through
endorsing a harmful product or, you know, far into the future,
after I am gone, somebody creating a piece of music or me
saying something that I never did. And it just kind of like
disintegrating what I have worked so hard to establish, which
is trust with my fans, with people who, you know, when I say
something, they believe it.
I think for younger artists, to your point of livelihood,
to be new and having to set up what you stand for and who you
are as a person, as an artist, what you endorse, what you
believe in, and establishing a trust with your fans, and then
on top of it, having to navigate these waters of someone coming
in and distorting all of that is devastating. Like, I don't
know how--I can't stress enough how it can impact the careers
of up-and-coming artists and even just in their ability to, you
know, speak their truth or just to live in fear of being a
victim of these deepfakes.
Chair Blackburn. Yes. Mr. Glazier, I want to come to you on
something you mentioned about the critical balance of
protecting the artists' voices and likenesses and then also
reducing litigation. And that is why we need to have this
framework. And I think helping artists stay out of court, I
mean, they're at a point where they may have to spend much of
what they have earned in order to protect themselves and to
protect their brand, if you will. Would you elaborate on
that?
Mr. Glazier. Sure, I am happy to. The bill has to be
effective and practical at the same time, both for the victim
and for the platform who is going to limit the damage to the
victim. It has to work on both ends. And that is why I think
the approach that was taken both in the TAKE IT DOWN Act and in
this act are so important, especially in areas where the
platform has less knowledge and less control because end users
are posting on the platform. And those can go viral very, very
quickly.
The ability for the platform to take it down as soon as,
you know, technically and practically feasible so that they
stop the damage and to keep it down so that the artist or any
other victim doesn't have to spend their lives monitoring a
platform and continually sending more notices and more notices
as end users keep putting up the same material over and over
and over again. We now have tools that will allow the removal
off of the platform. And once the removal is done, the damage
can be limited. There is no liability for the platform, and the
artist doesn't have to spend their time just litigating.
Where there is more knowledge and control, right, where the
platform has an employee upload it, for example, then there
should be responsibility on the platform. And those are cases
where you might need to go to court because the platform could
have prevented it, and they didn't prevent it.
So I think the bill is incredibly balanced and really
innovative in its approach to protecting free speech, reducing
litigation, but also effectively protecting the right that is
necessary.
Chair Blackburn. Senator Klobuchar, you are recognized.
Senator Klobuchar. All right. Thank you very much. I guess
I will start with Mr. Brookman, the non-Grammy winner.
[Laughter.]
Senator Klobuchar. And I want to talk to you just a little
bit about this consumer angle here, which I think is
interesting to people. And I think at its core, all of us
involved in this legislation have made it really clear that it
is not just people who are well known that will be hurt by this
eventually and that getting this bill passed as soon as
possible is just as important for everyone.
But I do so appreciate Ms. McBride's being willing to come
forward because those stories and the stories that we have
heard from, like I mentioned, Jamie Lee Curtis or the stories
that we have heard from many celebrities are very important to
getting this done.
So you just did a report on AI-generated voice cloning scams,
concluding that AI voice cloning applications, in the words of
the report, ``presents a clear opportunity for scammers.'' And
we need to make sure our consumer protection enforcers are
prepared to respond to the growing threat of these scams. I had
this happen to my State Director's husband; their kid is in
the Marines, and they got a call. They figured out that it
wasn't really him asking for stuff and money. They knew he
couldn't call from where he was deployed to. But this is just
going to be happening all over the place. And the next call
will be to a grandma who thinks it is real and she sends her
life savings in.
So I have called on the FTC and the FCC to step up their
efforts to prevent these voice cloning scams. And what are some
of the tools that agencies need to crack down on these scams,
even outside of this bill?
Mr. Brookman. Yes, absolutely. So I think the first thing
that the Federal Trade Commission probably needs is more
resources. They only have like 1,200 people right now for the
entire economy. That is down 100 just in the past couple
of months.
Senator Klobuchar. Way down from even during like the Nixon
era.
Mr. Brookman. Yes, it used to be like 1,700, and the economy
has grown like three or four times. Chairman Ferguson has said
more cuts are coming, which I think is the wrong direction. I
worked at the Federal Trade Commission for a couple of years.
We could not do like a fraction of all the things that we
wanted to do to protect consumers. So people, more capacity,
more technologists, like there is just not enough technology
capacity in government.
I was in the Office of Technology, Research, and
Investigation there. That was like five people. That is just
not enough. Obviously, with all these very sophisticated--I
mean, just, just deepfakes alone, let alone the rest of the
tech economy. The ability to get penalties and even injunctive
relief, right? If someone gets caught stealing something, the
FTC often doesn't have the ability to make them give the money
back.
Senator Klobuchar. Yes.
Mr. Brookman. I know this Committee has tried to restore
that authority, but that would be important.
And also like, you know, again, maybe it is clear, FTC
could have rulemaking authority, but also I would like to see
Congress consider legislative authority to address tools. Like,
again, if you are offering a tool that can be used only for
harm, voice impersonation, deepfake pornographic images, maybe
there should be responsibilities to make sure it is not being
used for harm.
Senator Klobuchar. Okay. Thank you.
Ms. Carlos, can you talk about what YouTube is doing to
ensure it is not facilitating these scams?
Ms. Carlos. Sure. And thank you for the question, Senator.
Senator Klobuchar. And thanks for your support for the
bill.
Ms. Carlos. Of course. So just to note first, we
obviously see great and tremendous opportunity coming from AI,
but we also acknowledge that there are risks, and it is our
utmost responsibility to ensure that it is deployed
responsibly. So we have taken a number of steps to protect
against harmful content on our platform. Primarily, we updated
our privacy policies last year to ensure that all
individuals can now submit a notice to YouTube when their
unauthorized voice or likeness has been used on our platform.
And once reviewed, it is applicable and we have confirmed that
that content should be removed, we will take it down.
We have additionally implemented watermarks on our AI
products. We originally began with image and audio watermarks
using our SynthID technology. And we have recently expanded it
to also be applied to text generated from our Gemini app and
web experience and, most recently, as part of our Veo video
tool.
Senator Klobuchar. Okay.
Ms. Carlos. We have also taken the additional step to
become a member of C2PA, the Coalition for Content Provenance
and Authenticity. And there, we are serving as a steering
member to work with the organization to create indicators and
markings that will allow the content provenance that was
created off platforms to additionally be recognized, and we are
deploying those technologies across our platform.
Senator Klobuchar. Okay. Thank you. We have mentioned the
TAKE IT DOWN Act, and thank you for the support for that.
Mr. Glazier, you talked about how this is the first Federal
law related to generative AI and that it is a good first step.
And could you talk about how if we don't move on from there and
we just stop and don't do anything for years, which seems to be
what has been going on, what is going to happen here and why it
is so important to do this?
Mr. Glazier. I think there is a very small window and an
unusual window for Congress to get ahead of what is happening
before it becomes irreparable. The TAKE IT DOWN Act was an
incredible model. It was done for criminal activity, you know--
--
Senator Klobuchar. I know.
[Laughter.]
Mr. Glazier. Yes, right, you know. You wrote it.
[Laughter.]
Mr. Glazier. It was a great model, but it only goes so far.
But we need to use that model now, and we need to expand it
carefully in a balanced way to lots of other situations, which
is exactly what the NO FAKES Act does.
Senator Klobuchar. Right.
Mr. Glazier. And I think, you know, we have a very limited
amount of time in order to allow people and platforms to act
before this gets to a point where it is so far out of the barn
that instead of encouraging responsible AI development,
instead, we allow investment and capital to go into----
Senator Klobuchar. Into----
Mr. Glazier [continuing]. AI development that hurts us.
Senator Klobuchar [continuing]. Stealing things, yes.
Mr. Glazier. So let's encourage investment the right way to
boost great AI----
Senator Klobuchar. Right.
Mr. Glazier [continuing]. Development and be first. Let's
not be the folks that encourage investment in AI technologies
that really harm us.
Senator Klobuchar. And Ms. Price, you have expressed
concerns about this 10-year moratorium on State rules. I am
very concerned, having spent years trying to pass some of these
things, and I think that one of the ways we pass things
quickly, like Mr. Glazier was talking about, is if people
actually see a reason: they don't want a patchwork, they
want to get it done. But if you just put in a moratorium, and
you look at the ELVIS law coming out of Tennessee, Ms.
McBride, and some of the other things, that would stop all of
that. My last question here before we go to another round:
could you talk about why you are concerned about what is right
in front of us now, which is this 10-year moratorium?
Ms. Price. Yes. Thank you for the question, Senator. We are
concerned about the moratorium because it is basically
signaling to the AI companies that they can kind of do whatever
they want in the meantime, and it inhibits States' ability to
adapt their laws to this form of technology that is changing
very quickly and then has this potential to cause great harm.
Senator Klobuchar. Thank you.
Chair Blackburn. And I know Senator Coons is on his way and
Senator Hawley is coming back, but Ms. Price, staying with you,
you talked about the TAKE IT DOWN Act and the importance there,
but touch on the gap that NO FAKES fills for a child who may
have something posted, but yet it doesn't fit under TAKE IT
DOWN and how this would open up an avenue of recourse for them.
Ms. Price. Yes, thank you, Senator. So under the NO FAKES
Act, because there is a private right of action, there would be
another way essentially for a victim to seek accountability
from a perpetrator or platform, which is really important
because the layers of accountability are what really deter bad
actors from engaging in harm. So having the criminal penalty,
but then also having the ability to bring the private right of
action, the civil action, is important.
Chair Blackburn. And speaking to the States and their
actions, I do want to mention that Tennessee passed the ELVIS
Act, which is like our first generation of the NO FAKES Act.
And we certainly know that in Tennessee, we need those
protections. And until we pass something that is federally
preemptive, we can't call for a moratorium on those things,
so----
Senator Klobuchar. Excellent statement.
Chair Blackburn. Of course.
[Laughter.]
Chair Blackburn. Of course. Ms. Carlos, I want to talk with
you for just a minute. And we are grateful for the support that
you all have talked about. And there is a provision in the bill
that I know is important to your platform and many others, and
that's the notification piece, notifying individuals who are
harmed. You have talked about artists being able to contact
you, but also for you all to be able to notify people, letting
them know about this, and then asking for that content to come
down and then taking that action.
As we have worked on the Kids Online Safety Act, one of the
complaints that had come to Senator Blumenthal and me from
individuals that tried to get things off was they could not get
a response. So that notification is an imperative. So talk a
little bit about how you are approaching
notification.
Ms. Carlos. Thank you. Thank you for the question. Yes, so
in looking at the framework of NO FAKES, again, we began with a
voluntary framework on YouTube, which allows individuals to
notify us when digital replica content of them is online. And
this is smartly mirrored in the NO FAKES Act. It empowers a
user to identify content and flag it to us when they believe it
should be removed for an unauthorized use of their voice or
likeness.
And as you mentioned, that notification is critical because
it signals to us the difference between content that is
authorized and harmful fakes. And it is with that notice that
we are able to review content and make an informed decision as
to whether or not it should be removed.
Chair Blackburn. And then what is your length of time for
getting it down upon receiving notification? What is your
process going to be on implementation?
Ms. Carlos. Sure. So we envision a framework similar to the
DMCA, where a web form would be easily available for any user,
quickly filled out, and then submitted to our trust and
safety team. We make every effort to review every notice on a
case-by-case basis and remove content as soon as possible.
Chair Blackburn. So are you talking hours, days? What is
your framework?
Ms. Carlos. I don't have the exact number off the top of my
head, but I do know that we try to process every notification
as quickly as possible.
Chair Blackburn. Thank you. If you will check on that----
Ms. Carlos. Sure.
Chair Blackburn [continuing]. And then get that information
back to us, I think we would like to know that because the fact
that this has taken such lengths of time for people to have any
kind of response has been very difficult for consumers, and
they feel like they are talking into outer space and nobody
is listening and nobody is responding.
Ms. Carlos. Thank you for flagging the concern. I would be
happy to follow up with you and the Committee.
Chair Blackburn. I appreciate that.
Senator Coons, you are recognized for 5 minutes.
Senator Coons. Thank you so much, Madam Chair.
I would like to first thank Ms. McBride for being here to
testify in support of NO FAKES. Could you speak to why this
bill is so important, both to protect artists like you and to
protect your fans?
Ms. McBride. Thank you. I think that it is important
because, as artists, we hopefully want to speak the truth. We
want to build a relationship with our fans in which they trust
us so they believe what we say. So when you have something like
a deepfake that either sells a product or says a statement, it
can be so harmful to that trust. You know, I mean, I just
realized sitting here that I bought a product, a collagen
supplement off of Instagram the other day because it had LeAnn
Rimes and a couple of other people, and I am sitting here
thinking, oh my goodness, I don't even know if that was really
them, right? So it is damaging to the artist and to the fan.
You know, we had a situation personally where one of my
fans believed they were talking to me, ended up selling their
house and funneling the money to someone who they thought was
me. That is so devastating to me to realize that somebody who
trusts me could be duped like that, you know?
And then also I think that eventually, somebody who is
duped by a deepfake is going to be angry enough to seek
retribution, and we are on stages in front of thousands of
people. We are in public places. So it is a danger to the
artist as well.
Senator Coons. Mr. Glazier, to followup on Ms. McBride's
testimony, what do you think are the consequences for the music
industry if we don't get NO FAKES over the finish line? What
will the consequences be for music fans and for the industry?
Mr. Glazier. The entire music ecosystem is dependent on the
authentic voice and the authentic image of the artist, right?
That is what the music industry is. If you allow deepfakes to
perpetuate, you are taking the soul out of the art. And when
you do that, you are taking the humanity out of the art. And
that is what art is.
So I think it is fairly existential that the voice of music
be the voice of music. I think that is what everything is built
on. And the idea, it is almost bizarre that we have to sit here
today talking about allowing someone to protect the use of
themselves. If there is anything that we have a right in and
should be able to control, it's the gifts that God gave us, the
voices that we have, the image that we have. And for that to be
taken from you is devastating both for the individual and
obviously for the industry itself, which is built on these very
voices.
Senator Coons. Ms. Carlos, if I might, I just want to thank
you for YouTube's partnership in getting to the place where you
support NO FAKES. Other tech companies haven't come forward. I
would be interested in what you might say or encourage me to
say to the Metas or TikToks of the world about why they should
support this bill, even though it imposes new obligations on
them.
And some have argued that NO FAKES might chill legitimate
speech by incentivizing platforms to over-remove content out of
fear of being sued. How does YouTube think about balancing its
obligations under this bill with its First Amendment
obligations to users?
Ms. Carlos. Thank you for the question, Senator. As we
mentioned, YouTube largely supports this bill because we see
the incredible opportunity of AI, but we also recognize those
harms, and we believe that AI needs to be deployed responsibly.
I believe Mr. Glazier mentioned during his opening
statement that the NO FAKES Act does carry First Amendment
exemptions: parody, satire, newsworthiness. And that is one of
the reasons that we felt comfortable endorsing this bill. We
are, at the end of the day, an open platform, and we believe
that a variety of viewpoints can succeed on YouTube. So those
would be some of the things that I would share with you to
share with those other companies, but I cannot speak directly
on behalf of why they may or may not choose to endorse the
bill.
Senator Coons. Understood. Thank you. And thank you all for
your testimony today. Thank you.
Chair Blackburn. Senator Hawley, you are recognized for 5
minutes.
Senator Hawley. Thank you very much, Madam Chair. Thanks to
all of the witnesses for being here.
Ms. Carlos, if I could just start with you. You are here on
behalf of YouTube, is that right?
Ms. Carlos. That is correct.
Senator Hawley. Can you tell me, why is it that YouTube has
monetized videos that teach people how to generate pornographic
deepfakes of women? Why does that happen on your platform?
Ms. Carlos. Thank you for the question. Protecting our
users is one of our top priorities. My general expertise is in
music policy, so I am not in the best position to answer that
question, but I am happy to follow up with you.
Senator Hawley. Do you know how many such videos there are
out there that are--these are monetized videos now on YouTube.
Ms. Carlos. I am not aware of that number. I can say that
our community policies do not allow that type of content on our
platform.
Senator Hawley. Well, Forbes magazine just reported that
YouTube has in fact promoted over 100 YouTube videos with
millions of views that showcase AI deepfake porn and include
tutorials on how to make deepfake porn, particularly porn that
targets young women. Do you have any idea how much money
YouTube has made off of this monetization?
Ms. Carlos. Thank you for bringing this to my attention. I
do not have detail on this specific news article. I am happy to
follow up with you and the Committee.
Senator Hawley. So you don't have any idea of how many ad
dollars YouTube has made off of this?
Ms. Carlos. I do not.
Senator Hawley. Are you aware that one of these websites
that was promoted by YouTube in these videos was later cited in
a criminal prosecution for AI sexual abuse material--let me be
more specific--generating AI sexual abuse material involving
children?
Ms. Carlos. Thank you again for the question, Senator. As
we mentioned earlier, YouTube has endorsed the TAKE IT DOWN
Act, and we take these issues very seriously. Again, I will
note that I represent music policy and do not have the
information to give you a fulsome response during today's
hearing.
Senator Hawley. Well, so let me ask you this then. If a
teenage girl's face ends up in an AI porn video on your
platform, what does YouTube do about it? What is her recourse
right now? What can she do to get some recompense, get some
restitution?
Ms. Carlos. Over a year ago, we updated our privacy
policy so that anybody who believes that their voice or
likeness is being used without their authorization on our
platform can submit a request for removal.
Senator Hawley. A request for removal. Is there some policy
for getting reimbursement for any profits the company may have
made, again, if these videos are monetized? I mean, does the
victim get a share of anything?
Ms. Carlos. I am not aware of those policies. I would have
to follow up with you, Senator.
Senator Hawley. Why is it that the enforcement of YouTube's
own policy here seems to only happen after videos go viral? Is
there a reason for that?
Ms. Carlos. I do not have the answer to that question for
you.
Senator Hawley. Do you know how many AI-generated deepfake
videos or deepfake content is removed before a victim
complains? Does the victim have to complain before YouTube does
anything?
Ms. Carlos. Again, my specialty is in music policy. I do
understand that we use technology such as AI to search for that
content. And when it is in violation of our policies, we will
remove it.
Senator Hawley. Let me ask you about this. YouTube training
data, has YouTube provided data for use in Google's Gemini or
other AI training programs?
Ms. Carlos. YouTube does provide data for Google's training
in accordance with our agreements.
Senator Hawley. So if an artist uploads music to YouTube,
does the company use that music to train AI models?
Ms. Carlos. As I mentioned, we do share data in accordance
with our agreements. I can't speak to the specifics of any
individual agreement.
Senator Hawley. Well, so how are people like Ms. McBride
protected? I mean, so if you are an artist and you put any
content on YouTube, does that mean that it is just free range?
I mean, they can do whatever they want with it?
Ms. Carlos. Again, it goes down to the terms of our
agreement. I will say that we have forged deep partnerships
with the music industry. We came out of the gate with forming
AI music principles with the music industry and are continuing
to experiment with them to see how AI can best benefit their
creative process.
Senator Hawley. So are there privacy protections? You are
telling me YouTube has in place privacy protections for
artists?
Ms. Carlos. They apply to all individuals on our platform.
Senator Hawley. Oh, so this is the click wrap scenario.
This is, in order to watch cute dog videos or whatever, you have
got to click the ``I consent'' and that wraps in--you basically
give consent for your stuff to be used?
Ms. Carlos. There are various types of agreements, but our
terms of service are included in that batch of agreements.
Senator Hawley. I guess my question is, where are users
told about their privacy protections if they have any, and
where do they explicitly consent?
Ms. Carlos. They agree to our terms of service, and we also
have our privacy policy available on the web.
Senator Hawley. Okay. So that is the click wrap. So in
other words, if you come onto YouTube and you want to use it,
you have got to click through. So you click it, and there you
have basically agreed to allow YouTube to give your content to
AI and allow them to train on it without any further consent. Is
that basically it?
Ms. Carlos. Again, our policies and the terms of our
agreements are what govern what goes into our training.
Senator Hawley. Well, and I am asking you the content of
that agreement. So in other words, if I am an artist and I
upload something to YouTube and yes, sure, I have clicked the
button that says, yes, I want to be able to use YouTube, are
you telling me that I don't have any further recourse? If
YouTube then goes and gives the information to AI models and
systems, there is nothing further I can do, or am I missing
something?
Ms. Carlos. If it is in accordance with our agreements, we
will share that data.
Senator Hawley. Yes, that seems like a big problem to me.
That seems like a huge, huge problem to me. And the fact that
YouTube is monetizing these kinds of videos seems like a huge,
huge problem to me.
I am glad you are here today. I wish there were more tech
companies here today, but we have got to do more. I mean,
YouTube, I am sure, is making billions of dollars off of this.
The people who are losing are the artists and the creators and
the teenagers whose lives are upended. We have got to give
individuals powerful enforceable rights in their images, in
their property, in their lives back again, or this is just
never going to stop.
Thank you, Madam Chair.
Chair Blackburn. Thank you. And that is the reason we have
the NO FAKES bill, and we are trying to push it across the
finish line.
I would like to offer a second round. Senator Klobuchar, do
you have additional questions?
Senator Klobuchar. Very, very short. I know that Senator
Coons has asked some of my questions about just people's
personal experience with this. I guess I would ask you, Mr.
Glazier, I am not sure you were asked about this. Do you agree
that using copyrighted materials to create copycat content
undermines the value of the music created by artists and could
chill creation of new art?
Mr. Glazier. Absolutely. You know, if you are able to copy
copyrighted material for any purpose without consent, you are
basically allowing the person who is copying to make the money
and to do with it what they want, but not the creator who is
supposed to actually control it and who made it to be
compensated for it and to control its exploitation. It is the
very opposite of what the Constitution calls for in creating
intellectual property.
Senator Klobuchar. Very good. One last question.
Chair Blackburn. Sure, go ahead.
Senator Klobuchar. This is the last one on the consumer
education issue that was raised. Thank you. I am sure you all
care about it. Mr. Brookman, while we should not place the
burden solely on consumers to protect themselves from AI
scams--I don't think that is going to work very well--what
steps should Congress take to help educate consumers when it
comes to AI literacy and the like? I think it is something we
could have some agreement on.
Mr. Brookman. Yes, I think spending the money for a public
awareness campaign is, I think, a really good idea. I think
people, you know, hear stories of friends of friends who it has
happened to, but a lot of people just have no idea that the
things they see online, the things they see on Facebook are
just not real. So in addition to laws----
Senator Klobuchar. You know one that says I am the fourth
richest woman in the world now?
Mr. Brookman. Oh, congratulations.
[Laughter.]
Senator Klobuchar. Yes, that is just this week. I am sorry.
I don't want to exaggerate. America.
[Laughter.]
Senator Klobuchar. And then people try to defend me by
sending out the list of the top 10 richest with like Oprah, and
I always think it is kind of sad that I am nowhere near it. But
yes, that is the latest thing that is out there.
[Laughter.]
Senator Klobuchar. Go on, Mr. Brookman.
Mr. Brookman. Yes, training people to be aware of it, to
think about it, just to, you know, watch out for social
engineering attacks, false, you know, calls for urgency. You
know, the deepfake voice right now is usually good for a little
while, but it is getting better, right, and it is going to
continue to get better. So one idea is, you know, having a
family safe word, right, a word that only you and your family
know that the scammer can't get. But like they have access to a
lot of personal data about us, so we are all vulnerable. The
numbers are going up dramatically, so just like teaching
people. Like I said, it is a shame we have to teach people to
do this, but it is the world we live in.
Senator Klobuchar. Okay. Thank you.
Thank you, Senator Blackburn.
Chair Blackburn. Senator Coons.
Senator Coons. Ms. Price, I was glad to see President Trump
sign the TAKE IT DOWN Act earlier this week. Why is NO FAKES
still necessary if TAKE IT DOWN is on the books?
Ms. Price. Thank you, Senator. NO FAKES is still necessary
because it provides a way for victims to bring a civil lawsuit
on their own behalf, and so there is an importance to having,
yes, on the one hand, the criminal piece, the criminal law
accountability and the required takedown under the FTC, but
then, of course, the victims being able to bring their own
lawsuit if they wish to do that. It is more effective for
deterrence to have multiple things.
Senator Coons. Ms. Carlos, why did YouTube come to the
table? You could have just made it a whole lot harder for the
bill to move forward if you didn't make concessions and agree
to be a part of advocating for the bill.
Ms. Carlos. Thank you for the question and thank you for
including us in that round of stakeholders. So YouTube sits in
a very unique kind of universe. You know, we not only have our
users and music partners and media partners, but we also have
creators. And that is one area where this idea of digital
replicas can cause real world harm. So in addition to
supporting NO FAKES, which gives them the individual right to
remove content, not just from YouTube, but from other
platforms, we are continuing to invest in new technology, which
we refer to as likeness ID, which will allow participating
members of our pilot to have their face and voice scanned and
matched across our platform. So we are continuing
to invest in this technology as we see it is a top issue.
Senator Coons. Thank you. Thank you very much, Senator
Blackburn, Ranking Member Klobuchar.
Chair Blackburn. Thank you, Senator Coons.
Mr. Glazier, I want you to touch on contracts. We have had
quite a discussion this week on copyright.
And as artists negotiate these contracts for their name,
their image, their likeness, recently SAG-AFTRA made a move in
some of their negotiations on this. But talk a little bit about
the importance of having a federal standard as it relates to
standard contract law.
Mr. Glazier. Yes, this goes to the very essence of consent
for the artist. And so not only does the NO FAKES Act give
control and consent to the individual about the use of their
voice and the use of their likeness, it also imposes some
guardrails around the length of those contracts, what those
contracts mean when the person is alive, what that means after
the person passes. And it also has special provisions that
protect minors who might enter into contracts, provisions that
involve parents and guardians and also court authority.
So it does a very good job of preventing abuse while giving
the power to the individual whose voice is at stake and whose
image is at stake in being able to license it.
Chair Blackburn. Thank you so much.
Ms. Price, I want you to submit to us--you can do this in
writing. When we look at the physical world, there are statutes
that exist for protecting individuals from some of the harms
that you have listed today, and we know the importance of TAKE
IT DOWN and the importance of NO FAKES, but we don't have all
of those criminal statutes that transfer to the virtual space.
And I
would like for you to give me a summary of your thoughts on
that. Your testimony is expansive and helpful. And as I said,
we have submitted that whole testimony.
Mr. Brookman, we have submitted your entire testimony also,
and we thank you for that.
But I would like to have just a little bit more from you on
that issue of those protections. We have talked about NO FAKES
and the COPIED Act and KOSA. We talk often about this
difference, and you touched on it, and I would like to have
something more expansive.
With that, we have no further Members present and no
further questions. I will remind you all that Members have 5
days to submit questions for the record, and then you are going
to have 10 days to return those answers to us.
I thank you all. Our witnesses have been wonderful today.
We appreciate your testimony for the record.
And with that, the hearing is adjourned.
[Whereupon, at 3:52 p.m., the hearing was adjourned.]
[Additional material submitted for the record follows.]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
A P P E N D I X
The following submissions are available at:
https://www.govinfo.gov/content/pkg/CHRG-119shrg61677/pdf/CHRG-119shrg61677-add1.pdf
Submitted by Ranking Member Klobuchar:
Digital Media Association (DIMA), the voice of music
streaming, statement............................................ 2