[Senate Hearing 118-731]
[From the U.S. Government Publishing Office]
S. Hrg. 118-731
TAKE IT DOWN: ENDING BIG TECH'S COMPLICITY
IN REVENGE PORN
=======================================================================
FIELD HEARING
BEFORE THE
COMMITTEE ON COMMERCE,
SCIENCE, AND TRANSPORTATION
UNITED STATES SENATE
ONE HUNDRED EIGHTEENTH CONGRESS
SECOND SESSION
__________
JUNE 26, 2024
__________
Printed for the use of the Committee on Commerce, Science, and
Transportation
Available online: http://www.govinfo.gov
__________
U.S. GOVERNMENT PUBLISHING OFFICE
61-870 PDF WASHINGTON : 2025
-----------------------------------------------------------------------------------
SENATE COMMITTEE ON COMMERCE, SCIENCE, AND TRANSPORTATION
ONE HUNDRED EIGHTEENTH CONGRESS
SECOND SESSION
MARIA CANTWELL, Washington, Chair
TED CRUZ, Texas, Ranking Member
AMY KLOBUCHAR, Minnesota
BRIAN SCHATZ, Hawaii
EDWARD MARKEY, Massachusetts
GARY PETERS, Michigan
TAMMY BALDWIN, Wisconsin
TAMMY DUCKWORTH, Illinois
JON TESTER, Montana
KYRSTEN SINEMA, Arizona
JACKY ROSEN, Nevada
BEN RAY LUJAN, New Mexico
JOHN HICKENLOOPER, Colorado
RAPHAEL WARNOCK, Georgia
PETER WELCH, Vermont
JOHN THUNE, South Dakota
ROGER WICKER, Mississippi
DEB FISCHER, Nebraska
JERRY MORAN, Kansas
DAN SULLIVAN, Alaska
MARSHA BLACKBURN, Tennessee
TODD YOUNG, Indiana
TED BUDD, North Carolina
ERIC SCHMITT, Missouri
J. D. VANCE, Ohio
SHELLEY MOORE CAPITO, West Virginia
CYNTHIA LUMMIS, Wyoming
Lila Harper Helms, Staff Director
Melissa Porter, Deputy Staff Director
Jonathan Hale, General Counsel
Brad Grantz, Republican Staff Director
Nicole Christus, Republican Deputy Staff Director
Liam McKenna, General Counsel
C O N T E N T S
----------
Hearing held on June 26, 2024
Statement of Senator Cruz

Witnesses

Andrea Powell, Advocate and Expert on Sexual Exploitation
    Prepared statement
Anna McAdams, Mother of Ms. Elliston Berry
    Prepared statement
Elliston Berry, High School Student and Victim of AI-Generated Sexually Exploitative Imagery
    Prepared statement
Francesca Mani, High School Student and Victim of AI-Generated Sexually Exploitative Imagery
    Prepared statement
Hollie Toups, Victim of Non-Consensual Intimate Imagery
    Prepared statement
Stefan Turkheimer, Vice President for Public Policy, Rape, Abuse, & Incest National Network (RAINN)
    Prepared statement
TAKE IT DOWN: ENDING BIG TECH'S COMPLICITY IN REVENGE PORN
----------
WEDNESDAY, JUNE 26, 2024
U.S. Senate,
Committee on Commerce, Science, and Transportation,
Dallas, TX.
The Committee met, pursuant to notice, at 2:47 p.m., in
University of North Texas at Dallas (Campus Hall), 7300
University Hills Blvd, Dallas, Texas, Hon. Ted Cruz, Ranking
Member of the Committee, presiding.
Present: Senator Cruz [presiding].
OPENING STATEMENT OF HON. TED CRUZ,
U.S. SENATOR FROM TEXAS
Senator Cruz. Good afternoon, everyone. Welcome. Welcome to
a field hearing of the Senate Committee on Commerce, Science,
and Transportation. This is a field hearing on the Take It Down
Act, ending big tech's complicity in revenge pornography.
I want to start by thanking the University of North Texas
at Dallas for providing the venue for today's hearing,
including the UNTD staff that have worked so hard behind the
scenes to put this event together. I owe another huge thanks to
the UNTD Dallas Campus Police and the State Troopers who are
providing security this afternoon, thank you.
And of course, thank you to our witnesses for testifying
today and to our audience for taking the time to be here in
person. If you are a victim of revenge pornography or AI-generated explicit
imagery in the Internet age, your life changes permanently. You
most likely are a woman or a young girl, and you most likely
have been targeted by someone you know.
And if you do not happen to be a celebrity, you most likely
are still struggling to have the images of you removed from the
internet. The scourge of so-called ``revenge pornography'' is
not new. It has been with us, sadly, for decades. However, new
generative AI tools have made creating realistic yet fake
explicit images and videos of real people easier than ever.
Due to advances in technology, now anyone can become a
victim. And it is increasingly affecting a particularly
vulnerable population, teenage girls. High schools across the
country are seeing an explosion of AI-generated, sexually
exploitative images of female students--thousands upon
thousands upon thousands of them.
These images are often created and spread by male
classmates. We have seen almost identical cases in New Jersey,
where a male classmate created and shared fake, sexually
explicit images of his underage female classmates.
We have seen it in Illinois, where boys altered dozens of
their female classmates' prom photos to create fake nude images.
We have seen it in Washington, where a boy at a dance used AI
to virtually strip his female classmates' dresses off.
We have seen it in California, where three separate schools
around Los Angeles, in Beverly Hills, Calabasas, and Laguna
Beach have had incidents of students using AI to create
inappropriate, sexually explicit images of their classmates. We
have seen it in Florida, where middle school boys created
sexually explicit images of their 12-year-old and 13-year-old
classmates.
And we have seen it here in Texas, where a boy targeted a
group of female friends and shared realistic but fake sexually
explicit images. In each case, this cruel filth was then sent
to classmates, further exploiting and embarrassing the victims.
I suspect that there are many, many more instances that are
never reported to law enforcement and that don't make the
nightly news. State lawmakers across the country have been
taking action, including here in Texas, but the online nature
of these would-be crimes demands a Federal solution to provide
peace of mind to all victims.
That is why earlier this month I was proud to join with
Senator Amy Klobuchar, a Democrat from Minnesota, to lead a
bipartisan group of 14 Senators, 7 Republicans and 7 Democrats,
to introduce the Take It Down Act.
Also joining me last week in introducing this legislation
were Senators Shelley Moore Capito, Richard Blumenthal, Jacky
Rosen, Cynthia Lummis, Ted Budd, Laphonza Butler, Todd Young,
Joe Manchin, Bill Cassidy, John Hickenlooper, Martin Heinrich,
and John Barrasso.
Since then, Senators John Thune, Roger Wicker, and Marco
Rubio have all joined with the effort as well. And there is a
companion bill underway in the U.S. House being led by
Representative Maria Salazar.
Our bill has garnered support from over 40 organizations
across the political spectrum, including advocacy groups that are
both left leaning and right leaning. It has garnered support from
unions, from law enforcement, and from industry.
As the support for this bill plainly indicates, this is not
and should not be a partisan issue, and it is imperative that
Congress act quickly to protect the victims. The Take It Down
Act will make it a Federal crime with real jail time for
publishing online non-consensual, sexually explicit images of
another person, with heightened penalties if the victim is a
child.
This applies even if the image is AI generated, and it also
applies in cases where the original image was created
consensually, but the depicted individual did not consent to
its publication. It also makes it a crime to threaten to
publish these images online, thereby covering the associated
problem of sextortion, which we are seeing more and more
targeting young people.
In addition, the Take It Down Act is the first of its kind
to incorporate a notice and takedown requirement for social
media and other websites that allow users to post these images.
This means that TikTok and Twitter and Snapchat and Instagram
will have processes in place to receive and to immediately act
upon complaints from victims.
Notice and takedown is a critical remedy for victims who
want the images and all the copies of them removed so they
don't spread further, especially in cases where the victim may
not even know who initially posted the images.
As the Supreme Court stated in a 2014 case concerning
restitution for the possession of child pornography, ``every
viewing of child pornography is a repetition of the victim's
abuse.'' This is no less the case for victims of nonconsensual
so-called revenge pornography, and for victims of realistic but
fake computer generated, deepfake sexually explicit images.
While many social media companies purport to have a zero
tolerance policy for posting non-consensual sexual content, in
practice it can be very difficult for victims who file reports to
have the images removed.
The Take It Down Act ensures that social media prioritizes
reports from victims, and if they don't comply, it empowers the
Federal Trade Commission to pursue enforcement actions. The
Take It Down Act has been referred to the Senate Commerce
Committee where I sit as the Ranking Member.
It is one of my top priorities that this bill be placed on
the next committee markup so that it can be passed out of
committee and that it can receive Senate floor consideration as
soon as possible.
And now I want to introduce the witnesses who are here
today for this hearing. We are honored in particular to have
testifying two 15-year-old victims of AI generated sexually
exploitative images. We have Ms. Francesca Mani of Westfield,
New Jersey, and we have Ms. Elliston Berry of Aledo, Texas.
Almost unbelievably, both of them were victims of this
heinous activity in the very same month, October 2023, while
living nearly 2,000 miles apart. Today, they are together in
person for the first time to ring the national alarm bell to
tell their stories on the growing spread of deepfake sexually
exploitative images of minors, nearly all of which are targeted
at young girls.
Being a victim of this horrendous practice is deeply
traumatic. Choosing to speak out about it on a national stage
with TV cameras requires another level of courage and verve.
Thank you to both of you for being here. I know it is not easy.
Thank you for telling your story. Thank you for having the
courage to tell your story for other young girls and other
young boys who might be victims of this going forward.
Both Francesca and Elliston are accompanied by their
equally tenacious mothers. Elliston's mother, Mrs. Anna
McAdams, will also be testifying today. I am also pleased to
have here today another Texas resident, Ms. Hollie Toups, who
was the victim of nonconsensual, intimate image publication
over a decade ago.
Let's be clear, Hollie has been fighting for over 10 years
to get her images taken down from the web. This is inexcusable.
It is well past time that Hollie and other victims of so-called
revenge pornography see justice.
Finally, I am honored to be joined by two other tireless
advocates, Ms. Andrea Powell, who has over 18 years of
experience advocating for survivors of trafficking, sexual
exploitation, and sexual abuse. And also Mr. Stefan Turkheimer
from the Rape, Abuse, and Incest National Network, or RAINN,
which is also a supporter of the Take It Down Act.
Thank you all for taking the time to be here today, and I
would now like to recognize Ms. Powell to give her opening
statement.
STATEMENT OF ANDREA POWELL, ADVOCATE AND EXPERT ON SEXUAL
EXPLOITATION
Ms. Powell. Dear Senator Cruz, thank you for having me here
today. Again, my name is Andrea Powell. I am an expert and
advocate on sexual exploitation, human trafficking, and image-
based sexual violence.
For the past 20 years, I have worked alongside more than
2,000 survivors to create safe homes, support law enforcement
and medical interventions, and support both child and adult
victims and advocates for critical policies such as today that
will give survivors access to protection, justice, and healing.
I am the author of Believe Me, an advocate's account of
finding justice for victims of sex trafficking, and a founding
partner to the first survivor-created AI facial recognition
solution to technological deepfake abuse, Alecto.AI. I am here
today to speak on the most pervasively rising form of sexual
violence harming women and girls, which is known as deepfake
abuse.
As a survivor of sexual assault as a teenage girl from
central Texas, I never thought my silence in the face of my own
abuse was a privilege. I never told anyone. That was my choice.
Today, I am standing up for survivors of what is known as
image-based sexual violence, which, by design, humiliates
victims by showing the entire world their abuse, often before
they even see it.
However, very few have the resources to ensure
their abuse images are removed and never again uploaded.
Meanwhile, their abusers continue to share and degrade them
with little to no recourse.
The term ``deepfake abuse'' is in fact a term coined by a
Reddit user in 2017, who went on to create what is arguably the
largest sexually explicit website, Mr. Deepfakes, where women
and girls are exploited. It is one of 9,000 such websites hosted
by Google. There is nothing fake about the pain that deepfake
abuse causes.
In 2022, a nurse practitioner and candidate for the Virginia
House of Delegates, Susanna Gibson, was harmed in this
way when nonconsensual real videos of her with her husband
were distributed and then reported on by The Washington Post,
creating a roadmap to her abuse. Rather than being treated as the
victim of a sex crime, Susanna was exploited by the media, the
public, and those who shared her content.
Shortly thereafter, deepfake abuse content of Susanna was
also online. She lost her job, her candidacy, experienced
terrifying threats to her and her family at their home, and
almost took her life. In order to get her content down,
Susanna, like many, could not rely upon law enforcement because
there were no laws to protect her.
Like most victims of deepfake abuse, Susanna had to create
her own digital rape kit, painfully locating each abuse image,
contacting each platform and website, hoping they would take
mercy on her and take the content down. Most did not.
Susanna now runs a nonprofit advocating for survivors like
herself. Breeze Liu considered ending her life when her former
partner created both non-synthetic and deepfake abuse images and
videos of her.
Her content was over 800 URLs across the world, and
currently over 140 of these URLs are hosted by Microsoft Asia.
Having just graduated from the University of California,
Berkeley, and working as a venture capitalist, Breeze was shocked
when law enforcement asked her if she was just a prostitute.
The truth of the matter is anyone can be a victim, whether
or not they have exchanged the content willingly, whether or
not it was created by deepfake abuse, but no one should be a
victim. Deepfake abuse is growing at astounding rates.
Estimates indicate deepfake abuse has grown 3,000 percent
since 2019, with over 300 apps, often free, available with a simple
search online. I believe this directly contributes to situations like
teen boys finding and using online abuse apps to nudify their
female classmates and teachers.
It is a digital virtual gun that is loaded and aimed at the
faces and lives of women and girls, such as those testifying
before you today. So, you have the app stores that monetize the
apps, the software developers that create it, the platforms
that host it, and the websites that explicitly create a form of
organized sexual violence that is coordinated and monetized.
Some creators make over $20,000 a year. They hire assistants,
in fact.
Deepfake abuse has become a dark culture that celebrates
the abuse and sexual exploitation of women and girls. A simple
look at the forums and websites like Mr. Deepfakes show there
is a culture emerging that validates the creators, even young
boys, and rewards the violence.
While tech platforms like Meta and Bumble are partnering
with efforts such as StopNCII to remove images, the abuse still
remains. I earlier mentioned a young woman named Breeze Liu.
Breeze went on to found the very tech solution that she
needed in her darkest hour. Alecto.AI uses AI facial
recognition software to partner with platforms and give
individuals their consent back. This gives the control and
safety back to individuals, and it saves them time and
resources on the platforms.
Alecto.AI demonstrates through survivor leadership once
again that we actually can solve this problem. Survivors should
have laws that match the crime. Perpetrators should be held
accountable.
And I stand before you today with the hope that we can make
that possible, because we can stop deepfake abuse. I know we
can.
[The prepared statement of Ms. Powell follows:]
Prepared Statement of Andrea Powell,
Advocate and Expert on Sexual Exploitation
Dear Senator Cruz and members of the Senate Committee on Commerce,
Science, and Transportation for the 118th Congress:
My name is Andrea Powell. I am an expert and survivor advocate on
sexual exploitation, human trafficking and image-based sexual violence,
including what is known as deepfake (synthetic) abuse. For the past 20
years, I have worked alongside survivors to create safe homes, support
law enforcement and medical interventions to support both child and
adult victims and advocate for critical policies that will give
survivors access to protection, justice and healing. I am the author of
Believe Me, an advocate's account of finding justice for victims of sex
trafficking and reside with my daughter in the coastal town of
Leucadia, California. I am a founding partner to the first survivor-
created AI facial recognition solution to deepfake abuse, Alecto.AI.
I am here today to speak to the most pervasively rising form of
sexual violence harming women and girls which is known as deepfake
abuse. As a survivor of sexual assault as a teenage girl in central
Texas, I never thought that my silence in the face of my own abuse was
a privilege. I never told anyone. The only people who knew were the
young men who assaulted me, myself and one scared friend. However, now
standing before you today, I am standing up for survivors of what is
known as image-based sexual violence, which by design humiliates
victims by showing the entire world their abuse, their sexual violence,
as they experience it. Sexual violence such as image-based sexual
violence is never in the past, because as of now very few have the
resources to ensure their abuse images are removed and never again
uploaded.
I would first like to speak to the very terminology that is
currently being used. The term `deepfake abuse' is in fact a term
coined by a Reddit user in 2017 who then went on to create what is
arguably the largest explicit website, MrDeepfakes.com, where women and
girls are exploited. As we tackle synthetic sexual violence, we must
use terms that are accurate and honor the pain of survivors, not the
violence of the abuser. One term is sexual digital forgery, coined by
Dr. Mary Anne Franks and my colleagues at the Cyber Civil Rights
Initiative.
In fact, there is nothing fake about the pain that deepfake abuse
causes. In 2022, a nurse practitioner and candidate for the Virginia
House of Delegates, Susanna Gibson, was harmed in this way when
her nonconsensual `real' videos of her with her husband were
distributed and then reported on by the Washington Post. Rather than
being treated as a victim of a sex crime, Susanna was fully exploited
by media, the public and those who shared her content. Shortly
thereafter deepfake abuse content of Susanna was also online for anyone
to view. She lost her job, her candidacy, experienced terrifying
threats to her and her family and almost ended her life. In order to
get her content down, Susanna could not rely upon the police because
there were no laws to protect her. Susanna could not compel the
websites hosting her abuse content to remove them because they also
were not required by law to do so. Instead, Susanna, like almost all
victims of deepfake abuse, had to create her own digital rape kit,
painfully locating each abuse image and contacting each platform and
website hoping they would take mercy on her and remove the content.
Like over half of the survivors of image-based sexual violence,
including survivors of sex trafficking as young as 13, who I have met,
Breeze Liu considered ending her life when her former partner created
both non-synthetic and deepfake abuse images and videos of her. Her
nonconsensual sexual violence images were on over 800 URLs
across the world and currently remain on over 140 URLs hosted by
Microsoft Asia. Having just recently graduated from the University of
California, Berkeley, and working as a venture capitalist, Breeze was
astounded and humiliated when, instead of helping her, law enforcement
asked her if she was a prostitute.
The truth of the matter is, though, that no one--including, for
example, a young woman who exchanged a nude photo for money--should
lose their right to consent or be subjected to the further abuse of
nonconsensual deepfake harms. Neither should a woman in sports,
politics or any public life. This is not a matter of freedom of
speech, because another person's face or body should not be an
abuser's freedom of expression. Consent is consent, and without it,
the creation and distribution (and threatened distribution) of
nonconsensual intimate images is simply sexual violence facilitated
by technology.
Deepfake abuse is growing at astounding rates. Estimates indicate
that deepfake abuse has grown by 3,000 percent since 2019 with over 300
apps--often free--that are easily found by a simple online search. I
believe this directly contributes to situations like teen boys finding
and using online apps to nudify their female classmates or teachers.
It's a digital virtual gun that is loaded and aimed at the faces and
lives of women and girls such as those testifying before you today. So,
you have the app stores that monetize the nudify apps, the software
developers that create it, the platforms that host it and the websites
that explicitly create a form of organized sexual violence that is
coordinated and monetized. Some creators make over $20,000 a year.
Deepfake abuse has become a dark culture that celebrates the abuse
and sexual exploitation of women and girls. A simple look into the
forums of websites like MrDeepfakes.com show that there is a culture
emerging that validates creators and rewards the violence. Recently in
the United Kingdom, Mr. Deepfakes was banned.
One user whose comments I found lamented that he knew it was wrong
to create deepfake abuse of an ex-wife or girlfriend but didn't see the
problem with creating deepfake abuse of celebrities. It is not that
simple. In winter of this year, an avalanche of deepfake abuse appeared
of Taylor Swift, Billie Eilish and Ariana Grande--all celebrity women
who did not consent to this abuse. The bodies used to create some of
this abuse content were in fact the bodies of the young women who were
sex trafficked and exploited online in the now infamous Girls Do Porn
case.
While technology platforms from Meta to Bumble are laudably
partnering with efforts such as StopNCII.org to remove images, the
abuse still continues. We need the law to match the crime and support
technology platforms in enforcing measures to disable deepfake abuse
from their business model. With global hashing technology, harm
moderation and AI tools, we do have a way to stop this at its source.
I don't see tech as the enemy. I see it as the solution to human
created abuse--but will they join us to stop this form of sexual
violence?
Deepfake abuse is an evolution of such technology-facilitated
sexual violence. Image-based sexual violence is not an inevitable by-
product of our online lives, nor should it become the threat that
silences women and girls in an increasingly digitized world.
Earlier, I mentioned a young woman survivor named Breeze Liu.
Breeze did not end her life when her abuse continued. Instead, Breeze
went on to found the very tech solution that she needed in her darkest
hours. Alecto AI uses AI facial recognition software that partners with
platforms big and small to allow individuals to find and request that
their content be immediately removed and prevented from being re-
uploaded. This gives the control--and safety--back to individuals where
it belongs and gives technology platforms a software that will save
them time and resources. Alecto AI demonstrates that there can and
should be a technological solution to deepfake--and non-synthetic--
image-based sexual violence.
Survivors--both minors and adults--deserve protection and justice.
Every survivor should be able to report their abuse to law enforcement
and abusers should be found and held appropriately accountable. A U.S.
Federal law should rely upon the consent of the individual, not the
intent of the abuser. Image-based sexual violence should be classified
as a serious sex offense. I do understand there are nuances for minor
offenders. That said, this is rape facilitated online. You cannot
accidentally sexually assault someone offline, and the same should be
true for the online world, where the harms quickly follow that victim
home, to school, to work and anywhere they try to exist after such a
profound and public trauma.
Survivors should also be able to rely on technology platforms and
websites to remove their abuse content immediately. This is simply a
responsible aspect of doing business. Rather than rely upon complicated
and often conflicting user agreements that are centered on the content,
we must create laws and technological solutions to support the
survivors themselves.
I have worked with and alongside many survivors of deepfake abuse
here in the United States and globally. Through them, I know this to be
true: no survivor should stand alone in the face of their own abuse and
injustice. Today, I share my testimony with the determination that they
won't have to in the United States much longer. The United States needs
Federal legislation that creates protection and justice for every
survivor. The time is now; survivors are waiting.
Senator Cruz, I stand here before you today with a hope that we are
getting closer to a world where young women and girls don't have to worry
that being online means being targets of sexual violence. I invite your
questions. I also stand, as an advocate, expert and survivor of sexual
violence, with survivors no matter their background or age. We can stop
deepfake abuse. I know we can.
Senator Cruz. Thank you, Ms. Powell, for your testimony and
thank you for your advocacy, speaking up on behalf of victims
everywhere. Mrs. McAdams.
STATEMENT OF ANNA McADAMS,
MOTHER OF MS. ELLISTON BERRY
Mrs. McAdams. Thank you, Senator Cruz.
On October the 2nd, we woke up to the news that devastated
our lives forever. The previous day, a fellow classmate decided
a fate for my daughter and her eight friends that they would
never have imagined for themselves. He decided to take
beautiful, innocent pictures off of their social media
platforms and impose nude body parts onto them using an AI app
from Google called DNGG.
On Snapchat, he made multiple accounts. He friend requested
people and blasted them with the fake nudes of the girls. It is
important to draw attention here to the fact that he didn't
just take someone else's nudes and put their faces on them.
Instead, he took their actual bodies and used AI to make
the images look even more real. These images are child
pornography. These girls were 14 years old. Their innocence was
taken that day. In just 24 hours, everyone in their social
group had seen these pictures and they were being sent out to
the entire high school.
That week became a nightmare. Some of their friends asked
him why he was doing it, and his response was that he wanted to
go out with a bang, and he wanted to ruin the girls. In today's
world, those words are some of the scariest. We live with
school shootings on a regular basis in this country.
You can only imagine the fear those words caused the girls.
Would he come to school with his intent to hurt them? Would he
kill them or hurt himself? The girls didn't want to go to
school at all that week. On Friday of that week, the girls did
go to school and the school went into lockdown. It seemed to
the girls someone was after them.
We are thankful that it was nothing more than a false
alarm, but during this lockdown, this boy decided to go on to
his account and continue to terrorize. That allowed the
school's tech team to catch him. It is so shocking to be
violated in this way.
First, having nudes sent out that you didn't consent to.
And second, then having the fear that your life might be in
danger. What he did was malicious. He chose his victims, and he
reveled in their torture. Though he caught--though he was
caught, he was still not allowed--we were still not allowed to
know his identity.
The Sheriff's Office and the school said that they had to
protect him as a minor. Our girls felt jaded by this. They were
the victims, not him. We were able to have the school do a
Title IX investigation, and once that was completed, we were
able to know his identity.
The school sent him to in-school suspension but could not
tell us when he was coming back, so the rest of the fall
semester was spent in fear that he would come back to school
without a warning.
My husband and I went to the school board. Our plea was for
them to expel him indefinitely. The school board argued that
there was nothing in the student code of conduct to cover these
AI offenses.
The perpetrator was charged with sale, distribution, and
display of harmful material to a minor. It was a class A
misdemeanor. Instead of having our day in court, Parker County
determined that he was not a harm to anybody and just gave him
probation.
When he turns 18, his record will be expunged. He will walk
away unscathed. However, our girls will forever live in fear
that when they apply for a job or go to college, these pictures
might resurface. He chose a fate for our daughters that we
cannot change.
There needs to be consequences for what he did. If we do
not have stricter laws and punishments for this type of crime,
it will only increase. We are thankful that our girls have
family, friends, counselors who support them.
They will come through this. If the girls didn't have this
support, it could have turned into something worse, something
unimaginable like mental breakdown or even suicide. This is why
the Take It Down Act is so crucial.
This bill will hold even minors accountable for jail time
for this crime, and it will require Snapchat and other social
media to take those images down within 48 hours. As of two weeks
ago, Snapchat had not responded to the warrant issued by our
Sheriff's department, nor to any of the requests that we made
as parents online.
When I met with Senator Cruz's office two weeks ago, they
were able to get a hold of Snapchat and get the accounts and
images taken down. It took eight and a half months. If we had
been Taylor Swift, they would have come down immediately.
This bill will give us a voice that we did not have before.
Thank you.
[The prepared statement of Mrs. McAdams follows:]
Prepared Statement of Anna McAdams, Mother of Ms. Elliston Berry
On October 2nd, we woke up to news that devastated our lives
forever. The previous day, a fellow classmate decided a fate for my
daughter and her eight friends that they never would've imagined for
themselves. He decided to take beautiful innocent pictures off of their
social media platforms and impose nude body parts onto them using an AI
app called DNGG.
On Snapchat he made multiple accounts. He friend requested people
and blasted them with the fake nudes of the girls. It is important to
draw attention to the fact that he didn't take someone else's nudes and
put their faces on them. Instead, he took their actual bodies and
used AI to make the images look real. These images are child pornography. The
girls were 14 years old. Their innocence was taken that day.
In just 24 hours, everyone in their social group had seen these
pictures and they were being sent out to the entire High School. That
week became a nightmare. Some of their friends asked him why he was
doing it. His response was that he wanted to go out with a bang and
wanted to ruin the girls.
In today's world, those words are some of the scariest. We live
with school shootings on a regular basis in this country. You could
only imagine the fear those words caused the girls. Would he come to
school with the intent to hurt them? Would he kill himself? The girls
did not want to go to school.
On Friday of that week the girls' school went into lockdown. It
seemed to the girls that someone was after them. We are thankful that it was
nothing more than a false alarm. During the lockdown, this boy decided
to go on to his account and continue to terrorize. This allowed the
school's tech team to catch him.
It is so shocking to be violated in this way. First, having nudes
sent out you didn't consent to and second, having the fear that your
life will be taken by someone. What he did was malicious. He chose his
victims and reveled in torturing them.
Though he was caught we were still not allowed to know his
identity. The Sheriff's office and the school said they had to protect
him as a minor. Our girls felt jaded by this. They were the victims, not
him. We were able to have the school do a Title IX investigation. Once
this was completed, we were able to know his identity.
The school sent him to in school suspension but could not tell us
when he was coming back. The rest of that fall semester was spent in
fear that he would come back to school without a warning. My husband
and I went to the school board. Our plea was for them to expel him
indefinitely. The school board argued that there was nothing in the
student code of conduct to cover this AI offense.
The perpetrator was charged with the sale/distribution/display of
harmful material to a minor--a class A misdemeanor. Instead of having
our day in court, Parker County deemed him not a harm and gave him
probation. When he turns 18 his record will be expunged. He will walk
away unscathed. However, our girls will forever live in fear that when
they apply for a job or for college these pictures might resurface. He
chose a fate for our daughters that we can't change. There needs to be
consequences for what he did.
If we do not have stricter laws and punishments for this type of
crime it will only increase. We are thankful that all the girls have
families, friends, and counselors who support them. They will come
through this. If the girls did not have that support, it could've
turned into the worst circumstance imaginable, like mental breakdowns
or suicide.
This is why the Take it Down Act is crucial. This bill would hold
even minors accountable with jail time for this crime. And, it would
require Snapchat and other social media apps to take images down within
48 hours. As of two weeks ago, Snapchat had not responded to the
warrant issued by our sheriff's department, nor to any of my requests
online. When I met with Senator Cruz's office two weeks ago, they were
able to get ahold of Snapchat and get the accounts and images taken
down. It took eight and a half months. If we had been Taylor Swift,
they would have come down immediately. This bill gives us a voice we
didn't have before.
Thank you for your time.
Senator Cruz. Thank you, Mrs. McAdams. And thank you for
your ferocious defense of your daughter. Ms. Berry.
STATEMENT OF ELLISTON BERRY, HIGH SCHOOL STUDENT AND VICTIM OF
AI-GENERATED SEXUALLY EXPLOITATIVE IMAGERY
Ms. Berry. Thank you, Senator Cruz. I was 14 years old when
a fellow classmate created AI nudes from just an innocent photo
on Instagram. I was 14 years old when I was violated all over
social media, and I was just 14 years old when I feared my
future was ruined.
October 2, 2023, I had woken up to multiple messages from a
friend notifying me that photos were circulating social media.
Not just any photos. Pictures from a past Instagram post with a
nude body created by an AI app upon mine. Fear, shock, disgust,
and disbelief were some of the many emotions that filled my
head in that moment.
I was left speechless as I tried to wrap my head around the
fact that this was occurring. Having to admit to your parents
was shameful, as I still felt responsible and began to blame
myself. As I attended school, I was fearful of the reactions
and opinions people had.
We live in a society that is built on social media, so I
convinced--I had been convinced at least the whole school had
seen these images. And to this day, the number of people that
have these images or had seen them is still a mystery.
As it took eight and a half months to get these images off
of Snapchat, that doesn't wipe the photos off of people's
devices. Every day I would live in fear that these photos would
resurface or someone could easily recreate them.
As these pictures were being passed around, I still
attended school and was expected to act like all is well. I
felt unprotected walking through school. Safety was never a
concern of mine until these photos spawned. The school was
little to no help. As they said, it was out of their control.
The spread of AI nudes is unpunishable because it isn't
considered child pornography, but as a victim of AI deepfakes,
it has created a tremendous amount of pain, and this is why I
come here and share my testimony.
My goal is to prevent any other student from undergoing
this issue. Cases of students, female and male, are appearing
every single day, and the people behind this malicious act do
not have any consequences.
My intent is to give victims a voice I never had, and
hopefully turn this horrible situation into something good.
Thank you.
[The prepared statement of Ms. Berry follows:]
Prepared Statement of Elliston Berry, High School Student and Victim of
AI-Generated Sexually Exploitative Imagery
I was fourteen years old when a fellow classmate created AI nudes
from just an innocent photo on Instagram. I was fourteen years old when
I was violated all over social media. I was just fourteen years old
when I feared my future was ruined.
October 2nd, 2023, I had woken up to multiple messages from a
friend notifying me that photos were circulating social media. Not just
any photos. Pictures from a past Instagram post with a nude body
created by an AI app upon mine. Fear, shock, disgust, and disbelief
were some of the many emotions that filled my head in that moment. I
was left speechless as I tried to wrap my head around the fact that
this was occurring. Having to admit to your parents was shameful as I
still felt responsible and began to blame myself.
As I attended school, I was fearful of the reactions and opinions
people had. We live in a society that is built on social media, so I
had been convinced at least the whole school had seen these images. And
to this day, the number of people that have these images or had seen
them is still a mystery. As it took eight and a half months to get
these images off Snapchat, that doesn't wipe the photos off people's
devices. Every day, I will live in fear that these images will
resurface, or someone could easily re-create.
As these pictures were being passed around, I still attended school
and was expected to act like all was well. I felt unprotected walking
through school. Safety was never a concern of mine until all these
photos spawned. The school was little to no help; they said ``it was
out of their control.''
The spread of AI nudes is unpunishable because it isn't considered
child pornography. But as a victim of AI deep fakes, it creates a
tremendous amount of pain. That is why I come here to share my
testimony.
My goal is to prevent any other student from undergoing this issue.
Cases of students, female and male, are appearing every single day. And
the people behind this malicious act do not have any consequences. My
intent is to give victims a voice they never had. And hopefully turn
this horrible situation into something good.
Thank you.
Senator Cruz. Thank you very much, Ms. Berry. Ms. Mani.
STATEMENT OF FRANCESCA MANI, HIGH SCHOOL STUDENT AND VICTIM OF
AI-GENERATED SEXUALLY EXPLOITATIVE IMAGERY
Ms. Mani. First off, a huge thank you to everyone who has
taken time out of their insanely busy schedules to be here. It
means a lot to me to be able to share my story and ask you to
do the right thing by putting laws in place to protect women
and child--teenagers like me.
On October 20th, I found out that a few of my classmates
created AI nude images of me and other girls in my school
without our knowledge and our consent. Now, what is consent?
Consent is clearly and willingly agreeing to something without
any pressure. Well, we did not agree to anything.
Despite not seeing the images ourselves and hearing
assurances that they have been deleted, what happened on
October 20 to me and the other girls is unacceptable. No
child, teenager, or woman should ever experience what we have
experienced. I initially felt shocked, then powerless and
angered by the absence of laws and school policies to protect
us. Now, I am determined to push for reforms.
The obvious lack of laws speaks volumes. We girls are on
our own and considering that 96 percent of deepfake AI victims
are women and children, we are also seriously vulnerable, and
we need your help. This meeting proves that the politicians
here today, regardless of being Democrats or Republicans,
genuinely care. They are fathers, mothers, uncles, brothers,
and many more.
They understand how urgent it is to fix this mess. Coming
together to tackle the misuse of AI is super important. So,
huge thank you to everyone here, and even those not here but
backing the newly introduced AI bill or thinking about
supporting it. I am 15 now. What happened to me at 14 could
happen to anyone. That is why it is so important to have laws
in place.
I am not here by choice. You know, discussing legislation
isn't exactly my ideal way to have fun on a Wednesday afternoon
in June. But I know my voice matters. Without Senator Cruz's
bill, we will continue to have teens making AI deepfake images
of girls.
But AI is not just a toy for bored teens, it is also a tool
for predators and that is really not OK. To wrap this up, this
issue is pretty black and white. Creating AI deepfake nude
images without consent should be illegal, and it is not OK. And
it is adults' job to protect us with laws, and our job to learn
how to protect ourselves by protecting our image.
In conclusion, while we have been focusing on the
challenges and potential misuse of AI, it is important to
recognize that AI is not all bad. It is a fascinating and
rapidly advancing technology that has been significantly
benefiting our society in the health care and science arena.
It is crucial, however, that we approach AI with a balanced
perspective. We need to educate ourselves about its
capabilities and limitations, ensuring that we evolve and grow
alongside this technology.
If we get the right laws in place and learn more about AI
in schools, we can totally tap into the benefits of AI without
getting burned. It is all about finding the sweet spot where
innovation meets responsibility, where progress doesn't come at
the cost of safety and ethics.
So, there you have it. My Wednesday plea for change. I hope
I didn't take too much of your time and that together we will
make the digital world a safer place, not just for me, but for
everyone out there. Thank you.
[The prepared statement of Ms. Mani follows:]
Prepared Statement of Francesca Mani, High School Student and Victim of
AI-Generated Sexually Exploitative Imagery
First off, a huge thank you to everyone who has taken time out of
their insanely busy schedules to be here. It means a lot to me to be
able to share my story and ask you to do the right thing by putting
laws in place to protect women and teenagers like me.
On October 20th, I found out that a few of my classmates created AI
nude images of me and other girls in my school, without our knowledge
and our consent. Now, what is consent? Consent is clearly and willingly
agreeing to something without any pressure. Well, we did not agree to
anything.
Despite not seeing the images ourselves and hearing assurances that
they have been removed, what happened on October 20th to me and the
other girls is unacceptable. No child, teenager, or woman should ever
experience what we have experienced. I initially felt shocked, then
powerless and angered by the absence of laws and school policies to
protect us. Now, I am determined to push for reforms.
The obvious lack of laws speaks volumes. We girls are on our own
and considering that 96 percent of Deep Fake AI victims are women and
children, we're also seriously vulnerable and we need your help.
This meeting proves that the politicians here today, regardless of
being Democrats or Republicans, genuinely care. They're fathers,
mothers, uncles, brothers, sisters, and more. They understand how
urgent it is to fix this mess.
Coming together to tackle the misuse of AI is super important. So,
a huge thank you to everyone here, and even those not here but backing
the newly introduced AI bill or thinking about supporting it.
I'm 15 now. What happened to me at 14 could happen to anyone.
That's why it's so important to have laws in place. I'm not here by
choice; you know, discussing legislation isn't exactly my ideal way to
have fun on Wednesday afternoon in June. But I know my voice matters.
Without Senator Cruz's bill, we'll continue to have teens making AI
Deep Fake images of girls. But AI's not just a toy for bored teens;
it's also a tool for predators, and that's really not okay.
To wrap this up, this issue is pretty black and white. Creating AI
Deep Fake Nude images without consent should be illegal and it's not
okay, and it's the adults' job to protect us with laws and our job to
learn how to protect ourselves by protecting our image. In conclusion,
while we've been focusing on the challenges and potential misuse of AI,
it's important to recognize that AI is not all bad. It's a
fascinating and rapidly advancing technology that has been
significantly benefiting our society in the healthcare and science
arena. It's crucial, however, that we approach AI with a balanced
perspective. We need to educate ourselves about its capabilities and
limitations, ensuring that we evolve and grow alongside this
technology. If we get the right laws in place and learn more about AI
in schools, we can totally tap into the benefits of AI without getting
burned. It's all about finding the sweet spot where innovation meets
responsibility, where progress doesn't come at the cost of safety and
ethics.
So there you have it. My Wednesday plea for change. I hope I didn't
take too much of your time and that together, we will make the digital
world a safer place, not just for me, but for everyone out there.
Thanks for listening!
Senator Cruz. Thank you very much, Ms. Mani. And let me say
thank you also to your mom who is here with you today and who
was with us in D.C. two weeks ago as well. Thank you for
sharing your story. Ms. Toups.
STATEMENT OF HOLLIE TOUPS, VICTIM OF NON-CONSENSUAL INTIMATE
IMAGERY
Ms. Toups. Thank you, Senator Cruz. Over a decade ago, I
suddenly found myself in a disturbing and unfamiliar universe.
Intimate images that were of a personal nature had been posted
on a website and shared with the world.
As I entered this alternate universe, I wasn't sure what to
do. I was overcome with an eerie feeling, knowing that there
were people in real time looking at my body without my consent.
But it was not just pictures. There were horrible comments,
threats, and personal information posted about me.
I immediately felt unsafe in my own home and uncomfortable
in my own skin. Everyone in my hometown was talking about the
website. When I was in public, I was approached by strangers
who had no boundaries. They had, after all, seen me topless.
They felt like they knew me, and they assumed I wanted the
attention.
As soon as I was able to get myself together, I e-mailed
the website explaining my pictures were posted without my
consent and requested that they be removed. This was, after
all, an egregious invasion of privacy, right? I was confident
they would agree and that would be the end of it. However, they
said they would be happy to remove my photos for a fee.
Shocked, I responded I would not be paying to remove
my photos they had posted without my consent. I would, however,
hire a lawyer. I wasn't sure, however, that I could afford a
lawyer, so I first sought assistance from law enforcement.
However, there was little they could do. At that time,
sharing intimate photos without consent on the Internet was
something new and the laws had not caught up. I reached out to
a lawyer who agreed to send a letter to the website demanding
the removal of my photos, but nothing legally could be done, he
felt, as they were protected by Section 230. It looked as if
they were free to do as they pleased.
This was not acceptable to me. How was it that they were
protected, and I was not? By this time the photos had been
shared and saved on other websites, compounding the trauma.
Someone suggested I contact a private investigator to see
what they could do.
He agreed to help me, and within a few days my pictures
were removed from the original website. As you can imagine, I
breathed for the first time in months. He warned me, however,
that it might not be over, and I found myself checking the
website to be sure that my pictures were not back up. Wake up,
pour coffee, check website, repeat.
Just when I began to feel like it was over, they were up
again. This time, on the front page. The perpetrators bragged
about their ability to freely do as they pleased because they
were posted--protected by Section 230.
From there, my photos continued to be shared from website
to website. It was like a whack-a-mole trying to get them down
as they continued to pop back up. The comments, the harassment
increased.
With the help of the investigator, I met with another
lawyer who agreed to help me. I had gathered names of other
local women posted on the original website. He agreed to assist
them too. We filed a lawsuit naming the host site as well.
Unfortunately, at that point, when I chose to stand up for
myself and fight for my privacy, the harassment got worse. The
original site was eventually shut down by the courts and the
site owner identified. After some time of wallowing in
depression, I became angry. This could not stand.
I reached out to my State representative, which led me to
connect with then Texas Senator Garcia and Texas State
Representative Gonzalez. I had the honor of contributing to
Texas passing the Relationship Privacy Act, aimed at
safeguarding Texans from this unwarranted harm.
While they were able to determine who started the website,
I was never able to confirm who posted my photos and how they
got possession of them. For months after, I checked the
Internet every day to make sure my photos were not there. Every
day it controlled me.
I often think of the messages and the threats that I
received, and of others who have gone through the same thing.
It is really hard to put into 5 minutes what that year was
like. These actions can inflict long term psychological,
personal, and social repercussions for the victims.
I have gone through a lot of therapy to get past it. I am
still to this day amazed that with the click of a button, your
body, your privacy, and your personal information can be
exposed for the world to see.
We have come a long way since my photos were posted without
my consent, and while I am grateful to see that progress has
been made to protect others and that protections are being put in
place, there is still more to do.
I think we can all agree on the importance of privacy. I
think we can all agree that things of an intimate nature should
be kept private and not disclosed to the world by individuals
who seek harm.
Additionally, bad actors should not be permitted to
create images with someone's likeness. Thank you for your time
and your attention to this.
[The prepared statement of Ms. Toups follows:]
Prepared Statement of Hollie Toups,
Victim of Non-Consensual Intimate Imagery
Over a decade ago, I suddenly found myself in a disturbing and
unfamiliar world. Intimate images that were of a personal nature, had
been posted on a website and shared with the world.
As I entered into this alternate universe, I wasn't sure what to
do. I was overcome with an eerie feeling knowing that there were people
in real-time looking at me without my consent. It was frightening. Who
did this? And why?
But it wasn't just pictures. There were comments, threats, personal
information about me. Immediately, I felt unsafe in my own home and
uncomfortable in my own skin. I was terrified, helpless and I just
wanted to climb in bed, pull the covers over my head and never come
out. I spent the next few days bouncing back and forth between panic,
anger, embarrassment, and being completely devastated.
Everyone in my hometown was talking about the website. When in
public, I was approached by strangers who had no boundaries. They had,
after all, seen pictures of me topless. They felt like they knew me,
and they assumed I wanted the attention.
At the time, I was pursuing a degree in Criminal Justice,
volunteering as guardian ad litem to youth in foster care and also
working as a teacher's aide. I was terrified of losing my job and of
putting the kids I was appointed to in danger.
As soon as I was able to get myself together, I e-mailed the
website, explaining the pictures were posted without my consent, and I
requested they be removed. This was, after all, an egregious invasion
of privacy. I was confident they would agree and that would be the end
of it. Because why wouldn't they? However, I was wrong. They replied
they would be happy to remove my photos for a fee. Shocked, I responded
that I would not pay to remove pictures they had posted without my
consent, I would however hire a lawyer.
I wasn't sure I could afford a lawyer, so I first sought assistance
from law enforcement. However, there was little they could do. At the
time, sharing intimate photos without consent on the Internet was
something new and the laws had not caught up.
I reached out to a lawyer in town. He agreed to send a letter to
the website demanding the removal of my photos, but nothing legally
could be done as he felt they were protected by Section 230. It looked
as if they were free to do as they pleased. His suggestion was
essentially, live and learn. This was not acceptable to me. I was being
used for someone's sexual and revengeful pleasure, I was being
exploited and they had attempted to extort money from me. How was it
they were protected, while I was not. I would not let this be my story,
this would not overcome me.
Someone suggested I see a private investigator to see what they
could do. As I told him my story, I completely broke down because I had
up until this point felt completely helpless. This weight I had carried
for the past few months had become too much. I, a victim's advocate,
desperately needed an advocate.
By this time, the photos had been shared with another site,
compounding the trauma. And no one could help me; no one could stop the
millions of eyes from seeing my body and saving them and passing them
on. No one could stop the harassment and threats.
He assured me he would do whatever it took to get my privacy back
and put my life back in my hands. Within a few days of speaking with
the investigator, my pictures were removed from one of the sites, and
as you can imagine . . . I breathed for the first time in months.
The investigator warned me that it may not be over, but he was
going to stick with me until it ended. I found myself checking the site
to be sure my pictures were not back up. Wake up, pour coffee, and
check the website--repeat.
Just when I was beginning to feel like it was over, they were up
again. This time, on the front page. The perpetrators bragged about
their ability to freely do as they pleased because they were protected
by Section 230. At one point, they even posted an excerpt from Section
230, bragging about their ``protections''.
From there, my photos continued to be shared from website to
website. It was like a whack-a-mole trying to get them down as they
continued to pop back up. The comments and harassment increased.
With the help of the investigator, I met with another lawyer who
agreed to help me. I had gathered names of other local women posted on
the same site, and he agreed to assist them as well.
They all had a story. Some had attempted to take their own life,
some lost jobs, some were underage. We filed a lawsuit, naming the site
host as well. Unfortunately, when I chose to stand up for myself and
fight for my privacy, the harassment got worse. But that original site
was eventually shut down and the site owner identified.
At some point, after wallowing in depression I became angry. This
could not stand. I reached out to my state representative which led me
to connect with then Texas Senator Garcia and Representative Gonzales.
I had the honor of contributing to Texas passing the Relationship
Privacy Act, aimed at safeguarding Texans from this unwarranted harm.
While we were able to determine who started the website, I never
was able to confirm who posted my photos and how they got possession of
them. That was very hard to come to terms with, having no closure or
answers.
For months after, I checked the Internet every day to make sure my
photos were not back up. Every day, it controlled me. I often think of
threats and messages I received on that website and on social media and
of others who have gone through the same thing. It is hard to put into
a few minutes what that year was like. These actions can inflict long-
term harmful psychological, personal, and social repercussions for
victims. I have gone through a lot of therapy to get past it. I am
immensely grateful for my support system during that time, and for
those who fought for me, as I am not sure where I would have ended up
without them.
It still, to this day, amazes me that with the click of a button
your body, your privacy and your personal information can be exposed
for the world to see. And it continues to this day. We have come a long
way since my photos were posted without my consent. And while I am
grateful to see progress has been made to protect others and
protections are being put in place, there is still more to do.
I think we can all agree on the importance of privacy. I think we
can all agree that things of an intimate nature should be kept private
and not disclosed to the world by individuals who seek to harm others,
and additionally that bad actors should not be permitted to create
intimate images with someone's likeness.
Thank you all for your time and for bringing this forward.
Senator Cruz. Thank you, Ms. Toups, for telling your very
difficult and harrowing story. Mr. Turkheimer.
STATEMENT OF STEFAN TURKHEIMER, VICE PRESIDENT
FOR PUBLIC POLICY, RAPE, ABUSE, & INCEST NATIONAL NETWORK
(RAINN)
Mr. Turkheimer. Thank you, Senator Cruz. My name is Stefan
Turkheimer, and I am Vice President of Public Policy for RAINN.
RAINN is the Nation's largest anti-sexual violence
organization. We created and operate the National Sexual
Assault Hotline, in partnership with more than 1,000 local
sexual assault service providers across the country.
Over 4.5 million survivors and their families have been
helped by the hotline. We also operate about 30 hotlines for
partners, including the DOD self help--Safe Helpline for the
Department of Defense.
RAINN also carries out programs to prevent sexual violence,
educate the public, ensure that perpetrators are brought to
justice, and help organizations and companies improve their
approach to prevention and response.
Thank you for the opportunity to testify today on the
critical issue of the distribution of non-consensual intimate
images, often referred to as deepfakes, revenge porn, or
nonconsensual pornography.
My testimony will outline why it is imperative for the
United States to outlaw the creation and distribution of these
images and provide survivors with the means to remove them from
the Internet to stop the cycle of harm.
As you have heard from everyone up here today, victims of
non-consensual intimate image distribution often endure
significant emotional and psychological distress. This includes
feelings of shame, guilt, anxiety, and depression. In severe
cases, distress can lead to self-harm or even suicide.
The emotional toll on victims underscores the need to
address these problems and for this bill to become law.
Distribution of intimate images without consent can irreparably
damage an individual's reputation, affecting both personal and
professional lives.
Victims may face stigma, discrimination, and ostracism from
their communities, workplaces, and social circles. The social
consequences can be devastating and long lasting. Sharing
intimate images without consent fundamentally undermines an
individual's autonomy over their own body and personal
information.
Consent is a cornerstone, as Ms. Mani was saying, of all
interactions, especially those involving intimate or private
matters. Outlawing non-consensual intimate image distribution
reinforces the importance of consent and personal autonomy.
And that is why it is so important that this actually
becomes a crime, because criminalizing the distribution of non-
consensual intimate images serves as a deterrent against
malicious behavior. It sends a clear message that such actions
are unacceptable and punishable by law. I want to underscore
the lack of big tech support.
Intimate images have been around forever. Deepfakes are
new. It is the easy creation and distribution of these images
that has created the real problem. We are standing on a precipice of
proliferation of these images. We know how harmful they can be,
but right now there is nothing requiring the tech companies to
fix the problems that they facilitate.
Last week at the press conference, I told the story of a
Federal prosecution for identity theft. A woman had been in a
brief relationship with a Navy captain and shared images of
herself in the context of that relationship.
After the relationship ended, he created a Facebook profile
of her, friended all of her friends and new coworkers, members
of her softball team, people at her gym, and shared those
photos and more with them.
There was enough evidence for a Federal prosecutor to bring
the case and for the jury to convict this person of identity
theft. But Facebook, who was asked 400 times to take down the
photos, would not because they believed the fake profile was
more real than the actual person. Big tech, they simply aren't
going to fix the problem themselves.
Big tech needs this bill. Survivors need this bill. We need
this bill. Having legal recourse provides victims with the
means to seek justice and hold perpetrators accountable. It
also validates the experiences of victims, acknowledging the
wrong done to them and offering a pathway to closure and
recovery.
I said earlier that RAINN runs the National Sexual Assault
Hotline where so many have found help. I reached out to them
this week to gather some stories about people that have been
dealing with NCII, and I was struck by how often it overlapped
with contact offenses.
This is not just a crime of the internet. It often flows
from and leads to direct physical, in-person abuse. In many
instances, technology-facilitated abuse overlaps with contact
offenses. This occurred at different points throughout the
continuum, or timeline, of the abuse. So, I am just going to
give a couple examples from
people that have brought this up.
Technology is a tool to facilitate contact offenses. The
survivor was groomed by a much older man online. They said they
were in an online relationship at 13, 14 with a predator and
didn't know he was older. Technology used during abuse. The
foster dad filmed himself hurting the visitor and also took
photos of them.
Then, the visitor states that their mother has threatened
to share explicit photos of them if they didn't--if they tell
anyone that she has been allowing men to have sex with the
visitor for money. There is someone who was sexually assaulted
3 years ago and photographed, and who is now being threatened
with those photos being spread online.
This is a significant and pervasive problem, and it is only
going to get worse without this bill. So, thank you, Senator Cruz,
for advocating for survivors and giving them a way to take back
control of their own bodies. Thank you for your time.
[The prepared statement of Mr. Turkheimer follows:]
Prepared Statement of Stefan Turkheimer, Vice President for Public
Policy, Rape, Abuse, & Incest National Network (RAINN)
My name is Stefan Turkheimer, and I am vice president of public
policy for RAINN. RAINN is the Nation's largest anti-sexual violence
organization. RAINN created and operates the National Sexual Assault
Hotline (800.656.HOPE, online.rainn.org and rainn.org/es) in partnership
with more than 1,000 local sexual assault service providers across the
country. Over 4.5 million survivors and their families have been helped
by the Hotline. We also operate about 30 hotlines for partners,
including the DoD Safe Helpline for the Department of Defense. RAINN
also carries out programs to prevent sexual violence, educate the
public, ensure that perpetrators are brought to justice, and help
organizations and companies improve their approach to prevention and
response.
Thank you for the opportunity to testify today on the critical
issue of the distribution of non-consensual intimate images, often
referred to as ``deep fakes'', ``revenge porn'', or ``non-consensual
pornography.'' My testimony will outline why it is imperative for the
United States to outlaw the creation and distribution of these images,
and provide survivors with a means to remove these images from the
Internet and stop the cycle of harm.
Image Based Sexual Violence
Non-consensual intimate images constitute a severe violation of an
individual's right to privacy. These images are typically shared with
the understanding that they will remain private. Knowingly distributing
them without consent is an egregious invasion of personal privacy that
demands legal intervention. When they are created by an app, it just
underlines the lack of consent. The effects are often the same.
Emotional and Psychological Harm
The victims of non-consensual intimate image distribution often
endure significant emotional and psychological distress. This includes
feelings of shame, guilt, anxiety, and depression. In severe cases, the
distress can lead to self-harm or even suicide. The emotional toll on
victims underscores the need for robust legal protections.
Reputation and Social Consequences
The distribution of intimate images without consent can irreparably
damage an individual's reputation, affecting both their personal and
professional lives. Victims may face stigma, discrimination, and
ostracism from their communities, workplaces, and social circles. The
social consequences can be devastating and long-lasting.
Consent and Autonomy
Sharing intimate images without consent fundamentally undermines an
individual's autonomy over their own body and personal information.
Consent is a cornerstone of all interactions, especially those
involving intimate or private matters. Outlawing non-consensual
intimate images reinforces the importance of consent and personal
autonomy.
Deterrence of Malicious Behavior
Criminalizing the distribution of non-consensual intimate images
serves as a deterrent against malicious behavior. It sends a clear
message that such actions are unacceptable and punishable by law,
potentially preventing future incidents and protecting individuals from
similar harm.
Lack of Big Tech Support
Intimate images have been around forever. Deep fakes are new. It's
the easy creation and distribution of these images that has created the
real problem. We are standing on the precipice of proliferation of
these images. We know how harmful they can be. But right now, there is
nothing requiring the tech companies to fix the problems they
facilitate. Last week I told the story of a Federal prosecution for
identity theft. A woman had been in a brief relationship with a Navy
captain and shared images of herself in the context of that
relationship. After the relationship ended, he created a Facebook
profile of her, and friended all of her friends and new coworkers,
members of her softball team, people at her gym, and shared those
photos and more with them. There was enough evidence for a Federal
Prosecutor to bring the case and for the jury to convict this person of
identity theft, but Facebook, who was asked 400 times to take down the
photos, would not because they believed the fake profile was more real
than the actual person. They simply aren't going to fix the problem
themselves.
They need this bill. Survivors need this bill. We need this bill.
Having legal recourse provides victims with a means to seek justice
and hold perpetrators accountable. It also validates the experiences of
victims, acknowledging the wrong done to them and offering a pathway to
closure and recovery.
I said earlier that RAINN runs the National Sexual Assault Hotline,
where so many have found help. I reached out to them to gather some
stories about NCII, and I was struck by how often it overlapped with
contact offenses. This is not just a crime of the internet, it often
flows from or leads to direct physical, in-person abuse.
In many instances, technology-facilitated abuse overlapped with
contact offenses. This technology-facilitated abuse occurred at
different points during the continuum/timeline of the abuse.
Technology as a tool to facilitate contact offense: ``The
visitor was groomed by a much older man online. They said that
they were in an online relationship at 13-14 with a predator
and didn't know he was older.''
Technology used during abuse: ``Foster dad filmed himself
hurting the visitor and also took photos of them.''
Technology used to maintain abuse: ``The visitor states that
their mother has threatened to share explicit photographs of
them if they tell anyone that she has been allowing men to have
sex with the visitor for money.''
Technology facilitated abuse occurring after contact
offenses: ``The user was raped 3 years ago by someone who is
now stalking them online.''
Thank you Senator Cruz, for advocating for survivors, and giving
them a way to take back control of their own bodies.
Thank you for your time and attention. I look forward to answering
any questions you may have.
Senator Cruz. Well, let me say thank you to each of the
witnesses for being here. For the advocates who spend your time
speaking out and speaking out against this abuse, thank you for
doing so. Thank you for your courage.
And for those of you who have shared your own stories of
being victimized, that takes extraordinary courage. It would be
easy to say, I don't want to talk about what happened to me,
and you have each made the decision that you are not going to
do that. You are going to take the harder path of describing
the pain and the hurt and what happened, and in doing that, you
are making a difference.
You are making a difference for young people. You are
making a difference for women. You are making a real step to
protect others and hopefully to prevent them from going through
what you have gone through.
I have to say, as the father of two teenage daughters, I am
horrified by what has happened to you, and I hope that your
telling your stories will help build momentum to drive Congress
to act. It is well past time to talk about this. We need to
act.
We need to take up this legislation. We need to vote on
this legislation. We need to pass it into law. And I believe
there needs to be real and serious consequences, criminal
consequences, for doing to you what has been done to you.
I want to ask each of you briefly a question. How important
do you think it is to hold big tech and websites accountable
for taking the content down?
Ms. Powell. So, I think it is immeasurably important
because I have met with now over 50 survivors of sexual
violence, both adults and minors, and a mix thereof, and we
identified three core areas of desperate need.
And the number one resoundingly is having their images
removed because when we think about traditional offline sexual
assault, which is also horrific, there is a post-traumatic
stress factor, right.
But you can't be post if you are always worried about the
future of your violence being exposed online. As I mentioned,
there are technological advances, Meta with StopNCII, but we
need to do more. And we also need to recognize that there are
ever emerging smaller actors and websites, and they too should
be held accountable.
So, I think without that--and I have been told directly by
tech companies that without legislation they will do nothing. And it
is not that they are heartless. It is that they are trying to
avoid Section 230 of the Communications Decency Act, which my
colleague here had mentioned earlier.
So, it is really about bringing technology and law together
to support and listen to survivors.
Senator Cruz. Mrs. McAdams.
Mrs. McAdams. I think like what she said, it is very, very
important that we hold them accountable. Like I had mentioned,
this boy is going to go on with life and never going to be
fazed by what he did, but the girls will live in that forever.
And I like what you said about being retraumatized every
time you think that those images might be out there. So, if--we
need, as just common people that live every day, we need to be
able to get a hold of these tech companies and get them to take
our images down and not have to know someone in order to
protect our children.
Senator Cruz. Ms. Berry.
Ms. Berry. I think it is extremely important because as a
victim of AI deepfakes, it--it took a toll on my mental health,
and AI crimes are just increasing, and they are getting more
accessible and more normalized. And so far, there is not much
that us as normal people can do. So, holding tech companies
accountable for allowing these images to still be up is really
important.
Senator Cruz. Ms. Mani.
Ms. Mani. I think--yes, I think it is important that they
are held accountable. However, ideally they would join us,
realizing it is the right thing to do. But if they don't join
us, it shows that they should be held accountable,
because all those photos and pictures will live on their
websites forever, and I just don't think that is OK. And just to
be clear, the owners of these websites could take down
these photos in maybe less than a minute.
Senator Cruz. Yes. Ms. Toups.
Ms. Toups. Yes, I agree. I think that them being held
accountable and also being cooperative would have stopped my
photos from being shared so many times. And I think if you
are--if they are cooperative and they have the tools to do that
initially, then it prevents a lot of what we have all gone
through here today.
Senator Cruz. Well, and I will say you are telling the
story of waking up each morning and checking each day, afraid
that they would be posted again. I think that was truly
haunting and powerful, and it underscores the repeated
victimization that this produces. Mr. Turkheimer.
Mr. Turkheimer. I think it is imperative that tech plays a
role here. And if they won't do it voluntarily, they should be
required to do so. This is a problem that is not just
facilitated by them, but it is created by them. And the
proliferation of these images is not possible without them.
And so, they have to take a role in pulling this back.
There is not a situation where someone is creating a deepfake
on their own without the deepfake websites and then sending it
in the mail. That is not what is happening here. This is all
facilitated by them, and they have to be either part of the
solution or required to fix it.
Senator Cruz. And let me ask again each of the witnesses
very briefly and--do you believe that Congress passing the Take
it Down Act, making it a crime to post these images and
creating a Federal right to have these images removed, do you
believe that legislation would make a real difference in
fighting this threat?
Ms. Powell. Absolutely. When you think of the crime of
image-based sexual violence, and in particular deepfake abuse,
you have sort of like a grid here. You have the victim, you
have the initial abuser. You have the platforms that facilitate
the distribution of that abuse. And then you have law
enforcement.
Law enforcement can't serve what is not in the kitchen. If
there is not legislation on the books on a State and Federal
level, then oftentimes they don't have the capacity to engage--
they would like to, and they can't.
This legislation creates a through line and honestly sends
an important message to tech platforms, and I don't care if
they are big or small to be honest with you, that we are taking
this seriously. Up until now, they have been in charge of what
gets taken down or not. And when good stakeholders get
involved, good things happen; otherwise they don't.
Those 9,000 websites that explicitly are designed for
deepfake abuse absolutely do so with the joy of knowing they
are causing harm, and they have to be held accountable, and so
do the platforms that host them.
Senator Cruz. And it is a great point that this applies to
big tech, but little tech and anyone else as well. Yes, Ms.
McAdams.
Mrs. McAdams. Yes. I think it is imperative to pass this
bill. What we experienced with our school district was that
they didn't know what to do, so they did nothing. Same thing
with our local Sheriff. They didn't have any laws, anything in
place, so they just did nothing.
So, this Take It Down Act is going to be the basis, the
starting point for now for us to be able to go into the schools
and work with our school student code of conduct and change
that. So, we are going to change it at every level and I think
this is the beginning part of that.
Senator Cruz. Ms. Berry.
Ms. Berry. As I mentioned before, these AI nudes are not
considered child pornography, so it doesn't count for anything.
So, allowing this bill to be passed, it allows victims to have
a sense of relief and safety, because these photos will be off
of social media within 48 hours.
But as it took 8 months, the spread of my photos could have
gone to many, many people. I still have no idea. But within 48
hours, I can be assured that it won't spread as much as it did,
and that the person behind this act will be punished.
Senator Cruz. Yes. Ms. Mani.
Ms. Mani. I have one word and the word is yes. I think it
is super important that this bill exists because it just
proves--it just helps protect us, and women and children.
Senator Cruz. Ms. Toups.
Ms. Toups. Yes, I agree that this is vital. I know that
there have been variations of laws passed across states to, you
know, try to combat this. But I think this really encompasses,
like all victims and it is catching up with the times because,
you know, when the laws were beginning to get passed, we didn't
have the AI problem. And I think the take down provision is
really vital because the sooner you can get them--you know,
ideally, they don't go up.
Senator Cruz. Right.
Ms. Toups. The sooner you can get them down, the less
likely they are to spread. And then also the mention of the
threat of the posting.
You know, that is just as bad. Living under the threat that
someone is going to post it is just as harmful to victims as
the posting. And so, I think it is a great bill and I hope that
it goes through.
Senator Cruz. Right. Mr. Turkheimer.
Mr. Turkheimer. Yes, I think it is absolutely important. I
think that big tech, little tech, middle sized tech, none of
them are really prioritizing the protection of victims. And
having a criminal recognition of this crime is a recognition of
the harm that it caused. It is a recognition of the victims
that are produced.
And to get tech to take these things seriously in order to
take them down, in order to recognize them, in order to remove
them, it requires that sort of criminal element of the charge.
And so, yes, absolutely.
Senator Cruz. Great. So, now I am going to ask questions of
each of you individually, going a little bit further into the
testimony that you prevented--presented. And I will start, Ms.
Powell, with you. Thank you for sharing your story and thank
you for advocating, advocating for victims. Can you talk about
how image-based sexual abuse has changed over the years?
Ms. Powell. Absolutely. So, many of you here today are
probably familiar with the avalanche of deepfake abuse that was
created of Taylor Swift, Billie Eilish, and Ariana Grande.
What many don't know is that many of the bodies that were
used, along with their faces in this nonconsensual sexual
violence act, were the bodies of teenage and adult young women
in what is infamously known as the ``Girls Do Porn'' case, in
which girls were lured into San Diego, and then ultimately
sexually assaulted and filmed during those sexual assaults.
While those perpetrators have been ultimately held
accountable, sort of, what that shows is what was off--what was
nonsynthetic abuse but obviously quite horrific is now being
repurposed for new abuse content of other individuals.
So, what has changed is there are these overlapping
intersections of different types of image-based sexual
violence. And what is also I think quite important to point out
is that while these nudify apps and other sources are easily
found online, as I demonstrated in my testimony, it means that
young people are being exposed to this violence in other
epidemic ways, and it is shifting the narrative to think that
things like this are OK.
And I think as we are talking about big tech and we are
talking about law enforcement, we are talking about
perpetrators, I mean, just think about the bystanders and what
kind of bystander do you want to be. And I think image-based
sexual violence is challenging that.
Senator Cruz. I think that is very powerful. Right now, in
your experience, are websites making it a priority to take
these images down?
Ms. Powell. Some websites are, particularly those who are
compelled to do so or have received excessive heat to do so,
like the Pornhub website.
But we need to make sure that it doesn't--again, it doesn't
matter your size or what is going on, you need to have a
simple, clear understanding that this is about the survivor and
their consent.
And it is not up for debate whether or not you want to take
it down. But right now, it is very much at the whim of those
sites.
Senator Cruz. And I know you also have deep expertise in
working with survivors of human trafficking. Can you share a
bit about how image-based sexual abuse manifests in trafficking
cases?
Ms. Powell. Absolutely. And this goes for both minors and
adults. Traffickers, just like everyone here, use the internet.
And they quickly learned in the early 2000s that they could
take photos of their victims and use those photos, one, to
coerce them into not leaving for fear those photos being
distributed.
Two, use those photos to market to individuals who wanted
to buy them, either without their consent or if they are a
minor. And then ultimately--and I know this personally. There
was a young woman who was mentally delayed.
She was a young--in her early 20s, and she was in my office
one day, and I was looking through a website called
backpage.com looking for victims and working with law
enforcement, and I saw her photos. Well, I knew that that was
her because she was down the hall in my office with a coloring
book.
So, I contacted the website. They basically laughed in my
face. I contacted law enforcement and they said, what is the
big deal, you got her out anyway. And what I responded with
was, but the reason they are using her photos is because she
looks very young.
And two months later, a 16-year-old pregnant girl walked
into my office, and she was the girl the trafficker was
selling. So, they were manipulating her images long after she
was out of her trafficking situation.
So, when we think about it, it is coercive control. If you
think something horrible is going to happen to you, it is more
likely you will stay in a bad situation, particularly if you
are a child, for fear of not wanting more bad things to happen.
And so, that is how I see the intersection and where I
think we need to have a coalescence of advocates and leaders in
this conversation.
Senator Cruz. Thank you, Ms. Powell. Mrs. McAdams, thank
you again for sharing Elliston's experience. Thank you for
everything you are doing to make sure that this never happens
to anyone else's daughter.
You said that you first learned these images were being
distributed on Snapchat. Can you talk more about your
experiences reporting the images of your daughter to Snapchat,
and what kind of response you got?
Mrs. McAdams. Yes. So, we started with the school, of
course, and letting them know, and then went to Snapchat. Once
we had figured out who the young man was, we pressed charges
with the Sheriff's Department.
So, we did a warrant. And then I continued to go on to
Snapchat and try to get somebody to respond. So, you can't talk
to anybody. You really can't e-mail anybody. You just send in a
form and you kind of explain what is happening, and then you
never hear back from them.
And I was told by the Sheriff's Department, don't
expect to hear back. You are probably not going to hear
anything back, even though we did a warrant. So, we really felt
like our hands were tied. We weren't going anywhere. We were
getting no response.
Senator Cruz. Well, and I will say, and you told this story
some when you and I sat down in my office along with Elliston
three weeks ago and right before we introduced this
legislation, and we sat, and you told your story to me of what
had happened. I asked in the course of the conversation, I
said, are these images still up? And you said, yes, they still
won't take it down.
I have been trying for eight and a half months and they
won't take it down. And I turned to my team that was in the
office, and I said, I want you to get on the phone with
Snapchat this afternoon, and if need be, put me on the phone
with the CEO of Snapchat and we are going to get these pictures
taken down right now.
Within an hour, they were taken down. Now, frankly, it is
ridiculous. It should not take a sitting Senator getting on the
phone to pull those pictures down. You should have the right.
If Elliston were a global music star, if she were Taylor Swift,
these would be pulled down, and they should be pulled down for
Taylor Swift.
But they also should be pulled down for every teenager in
Texas and every teenager and every victim in America. You
should have the same right. And what is infuriating also is
that it demonstrates Snapchat knows exactly how to pull it down.
It wasn't difficult.
They didn't have any confusion about how to do it. It just,
when you complained on behalf of your daughter, it was not a
priority for them. That needs to change. You also talked about
the school's response to this issue. How did the school
communicate to you and to the other parents about what
happened?
Mrs. McAdams. They really didn't. That was the hard part of
it, was we really had to be--I had to kind of be that mama
bear, constantly calling, constantly asking for meetings. You
know, my husband and I went to the school board.
I mean, I couldn't get a response back from them. It took
weeks and weeks for them to actually get back in touch with me.
And we have still been working all semester. I have been trying
to get with them to just write something in the student code of
conduct, so we are kind of ahead of this if it happens again.
So, I think that their approach to it was, we don't know
what to do, so we are going to hope you are going to go away.
And so, that approach can't be how they attack it. Elliston and
the girls were the third case that I know of. I am sure there
are more at the school, but all of those were just kind of swept
under the rug because they didn't know what to do with it. So,
that is one of my pushes.
And I will say that when we got that word within 24 hours
from your office that they had been taken down, I just cried. I
mean, finally we got somewhere, and somebody heard us. But it
shouldn't be that way. It shouldn't have to go on that long.
Like you said, anybody should be able to--as a parent,
protect my child, but also just a student be able to say, hey,
I want my pictures off of there.
Senator Cruz. Well, and so you have worked with the school
district now in helping draft policies or what is the status?
Mrs. McAdams. No, I can't get them to respond to me. So, we
have had a couple of meetings set up and then for some reason
they have fallen apart. And now we are--you know, we are into
the summer. I still have not heard back from them.
So, I had to really push to even have a follow up with the
school board. And then when I did, their response was more, we
need to protect him. We have a right to him. And that was very
maddening because he did what he did, and our daughters were
the victims. So, we really--through this whole process, I felt
like I really didn't have a voice.
Senator Cruz. Well, that is infuriating. One of the things
I hope, when the Take It Down Act is enacted into law, is
making this conduct a Federal crime and a felony. I hope one of
the things it will prompt is school districts across Texas and
across the country to adopt clear policies against criminal
conduct.
That should be one of the next steps in response to this
legislation. Can you also just describe your--you are obviously
a mom who loves your daughter deeply. What was it like as a mom
to go through this experience and what were the--what was it
like for you personally as a mom?
Mrs. McAdams. It really was devastating, to be honest. As a
parent, you always think you are going to be able to protect
your child and we had no way to navigate through this. So, as
she is crying and she is going through every day--she is a very
social person, loves everybody, never meets a stranger.
And I just saw her kind of retreat into kind of a shell and
kind of become somebody that she really isn't. Since he has not
been back this semester, I have seen her kind of come out a
little, but she is always going to live kind of in the fear
that something could appear.
And then for me as a parent, I still--I don't have
any way to protect her. Anybody could do this. And so, this is
one step closer to, as a parent, being able to protect my
child.
Senator Cruz. Well, and even though the images are down
from Snapchat right now, you don't know if someone has copies
saved on their phone. You don't know what other copies might
appear sometime in the future.
Mrs. McAdams. Yes.
Senator Cruz. Thank you. Ms. Berry, let me say and
Elliston, we have spent a little bit of time together. You have
got a lot of courage. And by the way, Elliston, both you and
Francesca, you all are doing an awesome job. You are both
really articulate, and brave, and you are telling your story
powerfully.
So, thank you for both of you in particular. Elliston, tell
us, how did you feel when you first found out what had been
done to you?
Ms. Berry. I was terrified. It was something I had never
even thought could happen, and seeing that these images were on
my phone, I was shocked.
I was scared and I couldn't even--I didn't know what to do.
It was terrifying knowing that these photos are going around
without my consent and without my knowledge. And this was
happening to my friends, and I felt hopeless.
Senator Cruz. Now, you had a friend group, and you
went through this together. How did the other girls who were
targeted, how did they deal with this?
Ms. Berry. I mean, all of us were 14, 15 and we had just
entered high school. And seeing that these images were going
around, it scared all of us. It is--we were freshmen in high
school, and it just felt like our whole four years of high
school was all coming down in that moment, because we had
always heard that your freshman year is the most important and
it keeps you on track.
And two months into the school year, it is all falling
apart. And thankfully, I am so blessed to have that support
around me. I felt supported and I didn't feel alone, and I
think having a group was very beneficial.
But some of the many victims don't have that, and they
don't have anybody to run to or to comfort them and help them.
And I think it is really important to allow everyone to know
that this is happening, and you aren't alone if this is
happening.
Senator Cruz. Well, look, and I hope your experience of
speaking out and fighting to change, change the law, I hope
that helps you gain ownership of what happened and gain
control. I think that will be part of moving on and I hope that
it is. And so, thank you for doing that.
Ms. Berry. Thank you.
Senator Cruz. Francesca, thank you also for being here.
Thank you for your courage. Can you tell us also, when you
first found out what had happened, how did it feel? What went
through your mind when you first found out what had happened?
Ms. Mani. Well, you know, at first I was shocked and, you
know, I did cry a little bit, but in the end I was just really
mad because the school, my school didn't have any updated AI
school policies.
And I think that is very important because if I had that AI
school policy, I wouldn't be here sitting and stating the
obvious because AI school policies protect us in our schools.
And I think every school in the U.S. should update their AI
school policies.
Senator Cruz. And am I right that you now know the boys who
created these images?
Ms. Mani. Yes.
Senator Cruz. And did they face any consequences?
Ms. Mani. Well, one of the boys did--from what I am aware
of, he only faced one consequence, and that was being suspended for
one day. And the other boys didn't get any punishment.
Senator Cruz. And you are still going to school with the
boys who did that?
Ms. Mani. Yes. They are attending my classes, which is
completely unfair. And I just wanted to say also that, you
know, my principal is a woman. She is a mother. And she should
be sitting here right next to me, fighting for laws, protecting
her students against what has happened. And she is--just the
whole school administration didn't handle this job correctly.
Senator Cruz. And Elliston, you also found out who the boy
was who created these images, although it took some time. Is that
right?
Ms. Berry. Yes, sir.
Senator Cruz. And he was ultimately transferred to another
school, so you are not going to school with him?
Ms. Berry. Yes. He was suspended for around a semester, and
we were told he had the ability to come back to school, but his
parents had taken him out.
Senator Cruz. And this you said was a boy you didn't know
terribly well. He was in class with you, but it wasn't--it
wasn't someone you knew very well. Is that right?
Ms. Berry. No, sir. I had classes with him, and he was just
a peer of mine, but never had an actual sort of friendship or
anything like that.
Senator Cruz. Francesca, how were the images shared with
your classmates?
Ms. Mani. They were shared through Snapchat.
Senator Cruz. And what was the experience you and your mom
had dealing with Snapchat and trying to get these images
removed?
Ms. Mani. You know, our lawyers tried to take them down, and it took a while, and they were never taken down until your team--your team helped take them down.
Senator Cruz. Ms. Toups, so your images, your experience is
a little bit different from Elliston and Francesca. And you
have been dealing with this for over a decade. That is a long
time to be dealing with this. Can you speak about how the
awareness around posting nonconsensual sexually explicit
images, how that awareness has changed over time?
Ms. Toups. Yes. We have come a long way. You know, when it
happened to me, I was so blown away that this was something
that was going on that people would do.
And I remember, you know, speaking with the police. Like it
was a shock to them. They didn't know what to do because again
there was nothing for them to reach for and rely on as far as a
law.
When I talked to my State Representative at the time, I
remember him, you know, leaning forward as I was talking and
then just leaning back and, like, just utter shock. So, it took
a lot of me having to educate people. Well, first I had to
figure out myself what was going on, and then understanding
the issue and then helping others understand.
And it was really difficult because it was still kind of
new and the people that it had happened to understandably
didn't want to talk about it because it was traumatizing. And
then, you know, when you did speak out publicly, there was a
lot of backlash and a lot of harassment that came with it.
So, it was hard to kind of get the message out there of
what was going on and get people to understand it.
Senator Cruz. And at the time it initially occurred, did
people understand how serious it was or has that understanding
grown in the past decade?
Ms. Toups. I think once they wrapped their head around what
I was explaining--because it was so foreign to them at the
time.
And so, I think once they wrapped their head around that,
they did understand the seriousness. The thing that I think was
the underlying thread for everyone is--was privacy. Like how--
like that is such an invasion of privacy. How is this happening
in the first place? But also, how come no one is responding and
stopping it?
Senator Cruz. And are the images still online today?
Ms. Toups. As we sit here today, I am not sure. At some
point I had to stop looking. You know, there was a point when
they were so vast across, you know, so many places.
And the investigator that was helping me worked really hard
to just constantly go after them--as we sit here today, I don't
know. I stopped looking because I just no longer wanted to be
controlled by that, but I wouldn't doubt it.
Senator Cruz. And can you describe a little more your
experience trying to get websites to pull them down and how
they responded?
Ms. Toups. Yes. So, my--as I stated, my initial response
was to tell the website because I just figured, you know, it is
user generated. They didn't know any better. And that didn't--
that didn't work out.
And so, the police--and then my next step was the law
enforcement, and they didn't really have any way to stop that.
And then the first lawyer sent the letter, which was ignored.
And then I went to the investigator, and he did whatever he did
to kind of get them temporarily down and then they came back
up.
And they were ultimately only ever removed from the
original site by the courts when we--when I ended up hiring an
attorney and filing a lawsuit.
Senator Cruz. And were you ever able to figure out who had
posted the images, where they came from?
Ms. Toups. I was not. We were able to figure out who
started the website and who was, you know, operating it. But I
was never able to figure out--you know, I had a lot of thoughts
of how--because some of the photos that were posted had never
been sent to anyone.
So, you know, obviously, I went into thinking, you know,
was my computer hacked? Like all the things. So that was, you
know, another scary thing. Some of them had never been shared
with anyone. But no, I don't--I still don't know.
Senator Cruz. Thank you. Mr. Turkheimer, thank you for
traveling to Texas today. I am sorry it is so hot. This is
summertime in Texas. We can't help that. But come in November
and it will be beautiful, and you can wear shorts, and you can
go play golf. That is the tradeoff we get.
Your organization, RAINN, plays
a very important role in protecting victims of sexual assault
abuse by running the National Sexual Assault Hotline. It is my
understanding that RAINN has received millions of calls from
victims of sexual violence over the years. How often does RAINN
receive calls and reports about image-based sexual abuse?
Mr. Turkheimer. That is a really good question. And because
of this field hearing, I asked people on the hotline how many
in the past few shifts they had been receiving. So just sort of
in the last week or less. And they just did a quick Excel
document for me, and there are 20--sorry, there are 36 in the
last few shifts and----
Senator Cruz. That is in about a week, is that the period
or what--?
Mr. Turkheimer. Not exactly sure what the exact period is,
but yes, it is not very long.
Senator Cruz. So, it is a few days. It is not months.
Mr. Turkheimer. No, it is not months. And they talk about
where it is happening to them, whether it is on TikTok, or
Facebook, or Snapchat. There is a lot of Reddit on here. And
what often happens in these situations is, some of these are
non-consensual image sharing.
Some of these are deepfakes, and some of these are
deepfakes that are intended to get the visitor to the hotline
who is complaining about this to create self-created CSAM.
And there are two parts to the hotline.
There is a phone number and there is also an online
hotline. The online hotline is primarily used by people that
are under the age of 18, and all of these reports are coming
from the online hotline.
Senator Cruz. So, do you have any data or sense of the
frequency that these nonconsensual images are targeting minors
as compared to adults?
Mr. Turkheimer. I don't have any really good data on it. I
just sort of have the anecdotal data on it. Everything that
comes off the hotline is anonymized, so I don't have the ages
that are associated with these. But the fact that they are
coming from the online hotline does suggest that they are
trending younger, and certainly that they are--people that are
under 18 are experiencing it.
Senator Cruz. And how often does RAINN see image-based
sexual abuse in connection to other physical crimes like sexual
assault?
Mr. Turkheimer. Yes, it is a really good question, and it
comes up relatively frequently. A lot of times people are
calling the hotline, disclosing things that happened. Their
first disclosures to the hotline and the person they are
talking to is hearing about the whole story.
And the whole story sometimes involves a sexual assault and
then pictures of that sexual assault. And then sometimes it is
the other way around where someone has been groomed and
sextorted and ends up being sexually assaulted after image-
based abuse.
So, there are sort of countless ways in which image-based
abuse, which is sexual abuse, ends up being an in-person
contact offense.
Senator Cruz. And so, you mentioned that the sextortion
component of it. Can you elaborate a little bit on any
information about how frequent this problem is?
Mr. Turkheimer. Yes. So, a lot of what happens for children
online, in grooming situations or in sextortion situations, is
the person will--the person seeking to get the survivor to
create self-created CSAM does so by pretending to be a child
themselves.
And they will send CSAM, purportedly of themselves, and try
to get the child to create their own. And oftentimes, that is
difficult to do for these people. And the reason why is because
a lot of CSAM is sort of marked through photo DNA. And so,
these sites will see it and remove it.
And so there is this market to create a new CSAM that is
evading these censors, which is why the AI creation of CSAM is
so important to these people that are looking to trade CSAM.
The second part of it is that they use the fake CSAM to get
kids to create real CSAM.
And so, you get that sort of relationship that is sort of
exploding these numbers because as people have attempted to
remove CSAM from the internet, it has created this market for
new CSAM to be made. And so, that is sort of the kind of the
economic demand that has created this supply.
Senator Cruz. Well, and we have heard testimony in the
Senate about teenagers targeted with sextortion, where they may
have been tricked into taking an explicit photo or several
explicit photos, and then the person to whom they sent it
threatens to make it public, make it public to the world. And
in some instances, those teenagers take their own lives. Have
you seen that pattern?
Mr. Turkheimer. That is something that we have seen, and it
is something that is happening more and more frequently. The
people that are creating the CSAM--or excuse me, the people
that are soliciting the CSAM and inducing and coercing teens
into creating it don't care what happens to these kids at all.
All they are trying to do is empty their bank accounts, and
they will keep doing that until there is nothing left. And a
lot of these kids don't realize that this isn't--that there are
ways back from this, and so end up taking their own lives,
which is why it is so important for them to have some sort of
agency, to have some method to combat this, which is why I
think this bill is so important.
Senator Cruz. In your experience, have you seen how the
nonconsensual posting of these images affects victims? What is
the impact on the victims?
Mr. Turkheimer. You know, we see people call the hotline
because they are feeling depressed. They are feeling--calling
the hotline because they are feeling anxious, they are feeling
targeted. They don't want to leave their homes. They are having
their life taken away from them.
And that is the common response when you feel that you have
been violated. When you feel like your body is not your own.
And it is so important for those people to have their agency
restored. For those survivors to have their agency restored. To
have something that they can do.
And right now, there isn't a tremendous amount that they
can do. It is very difficult to take back that control, which
is why it is so important and there needs to be some method
that allows them to do so.
Senator Cruz. Thank you. Ms. Powell, I noticed you nodding
emphatically during much of what Mr. Turkheimer was saying. And
so, I want to give you a chance to elaborate. I am going to
give everyone a chance if there is any kind of final closing
observation you want to make.
We are going to wrap up shortly, but I will give everyone a
chance. Just if there is something you don't want to leave
without having said, you will have that opportunity now. But
Ms. Powell, we will start with you.
Ms. Powell. Yes. I wanted to close with the story of a
young man I worked closely with.
When Matthew Herrick was an emerging actor in New York, in his
early 20s, he began a dating relationship. He quickly realized
the person he was dating was very controlling and ended it.
That individual created fake rape-fantasy profiles of him on an
app called Grindr. Matthew tried to get the content down as
over 1,100 men came to his house and his work, attempting to
assault him.
The more he ran, the more he screamed, the more they
thought it was part of the game because of the way the ads were
written. He was targeted so badly he had to leave the country.
He joined forces with, I believe, the most indomitable attorney
in this field, Carrie Goldberg. Together they went all the way
to the Supreme Court trying to get Matthew's content off
Grindr.
To this day, it is still there, and he is an
internationally renowned advocate still living in fear. And he
is one of the bravest people I know. And I will close with
this. There was another young woman, a desperate young actress
with $50 in her bank account. She was told by a photographer,
take these images and I will pay you, but no one will ever know
it is you. They even signed a document.
    Within two years, her images were all over the world, and
one became one of the most infamous nude images in Playboy
magazine. That was in the 1950s, and that young woman's name is
Marilyn Monroe.
We have been fighting this for that long, and it is time we
actually do something.
Senator Cruz. Wow, that is horrific and powerful. Mrs.
McAdams.
Mrs. McAdams. Yes, I would like to say something. Since all
this started and everything, and we have been getting all this
traction with it, I have had at least three or four moms reach
out to me whose daughters have had the same thing happen to
them.
Even to the point that, like he was talking about, somebody
shared her images, then groomed her, and then he showed up and
raped her. So, it is like, whoa, I can't even--the stories that
are coming out. And so, I just think that this is a good starting
point to maybe stopping this abuse against our children.
Senator Cruz. Ms. Berry.
Ms. Berry. I just wanted to say that no student or anybody
for that matter should fear for their safety, which is why this
bill is so important because it ensures the protection of all
victims and allows schools to take issues like this seriously.
Senator Cruz. Thank you. Ms. Mani.
Ms. Mani. I just want to say that schools around the U.S.
should update their school AI policies, so students know how to
protect themselves against it. And I just think we should have
more laws being put in place to help protect these women and
children.
Senator Cruz. Ms. Toups.
Ms. Toups. I just wanted to say thank you to yourself and
to the other Senators who have signed onto this. I understand
that, you know, the world changes and moves pretty swiftly, and
we have to continue to revamp our laws to keep up.
But I didn't expect to still see this happening, you know,
a decade after we tried to combat it the first time. So, I am
hoping that, you know, this Federal law will cover more victims
and deter more, and give--like he was saying, you know, give
people the ability to be able to be themselves again and to,
you know, take back control of their lives.
Senator Cruz. Mr. Turkheimer.
Mr. Turkheimer. Yes. I am very grateful for your advocacy
in this area, and I think that survivors past, present, and
future will be grateful for it as well. There are not great
methods for taking back control and there absolutely need to
be.
And when you have a situation where, you know, an SNL clip
that is recorded on Saturday night and uploaded on Sunday
morning immediately gets taken down, while images that they
know are child sexual abuse or images they know are of sexual
abuse just get replicated over and over and over again, that is
a situation where the platforms are prioritizing the rights of
NBC Universal over people like this, and it is going to take an
act of Congress to change that.
Senator Cruz. Well, thank you. I want to thank all of our
witnesses today for being here, and especially Elliston and
Francesca: as teenagers, in telling your stories you
demonstrated a lot of bravery today, and I think a lot of
people heard you. And that is true of all the witnesses.
It is my hope that your stories remind other girls and
other boys who are going through this that you are not alone
and that Americans across this country are standing with you to
end this horrific abuse.
As we have said throughout this hearing, image-based sexual
abuse is not new. You heard from Ms. Toups that she has been
fighting for justice in her case for over a decade, and your
resilience is inspiring. Every witness today knows this is an
exhausting and emotionally draining fight.
I am confident that the bipartisan support we have for
tackling this issue will cause Congress to act and act
decisively to address nonconsensual, sexually exploitative
images online. Perpetrators will be held accountable, and
victims will have an additional tool: enforceable notice and
takedown.
There are multiple bills pending in Congress to address
this problem. Many of them are good and positive bills that I
support. What the Take It Down Act does that is unique is,
number one, it makes it a crime, a Federal crime, and it
addresses in particular not just actual images that are posted
nonconsensually, but also deepfake images that are created
using AI; it addresses both of them. But it also contains the
notice and takedown provisions.
And as Mr. Turkheimer was noting, there is a model we used
to draw from, which is the existing laws governing copyright.
And that came from the Digital Millennium Copyright Act that
places a Federal obligation on tech platforms that if you post
a clip of SNL, if you post a clip from The Lion King, if you
post a George Strait song online, you will see it pulled down
and pulled down very, very quickly because Congress has put a
Federal obligation on these platforms to pull down copyrighted
material.
This bill takes that exact same mechanism--they know how to
do it--and applies it here to non-consensual intimate images,
so that you have the same obligation. If the individual
depicted in the image has not consented, they have a right to
have it removed. I am hopeful that this field hearing will help
build momentum.
We have a broad bipartisan coalition behind this bill, and
I am hopeful we see in the coming weeks the Commerce committee
mark it up and that we move to the floor and pass it into law.
And you don't have to take it from me, I think our witnesses
today have provided powerful testimony on why Congress needs to
act and act now.
I want to thank each of the witnesses for your courage,
your clarity, your power. The State of Texas and our country
owes each of you a debt of gratitude. Thank you for fighting to
make a difference for women, for girls, for victims throughout
the country. And with that, this hearing is adjourned.
[Whereupon, at 4:11 p.m., the hearing was adjourned.]