[Congressional Record Volume 170, Number 17 (Tuesday, January 30, 2024)]
[Senate]
[Pages S279-S280]
From the Congressional Record Online through the Government Publishing Office [www.gpo.gov]



                    Online Child Sexual Exploitation

  Mr. DURBIN. Mr. President, tomorrow, our Senate Judiciary Committee 
will hold a landmark hearing. For the first time, the CEOs of five Big 
Tech companies will testify about the crisis of online child sexual 
exploitation. This continues our committee's bipartisan work to combat 
the dangers children face online. This has been one of my top 
priorities as chair of the committee.
  Last February, we held a hearing on kids' online safety. Six 
witnesses testified about online child sexual exploitation, cyber 
bullying, addictive online platforms, and the collection and sale of 
children's sensitive personal data. At the hearing, I noted there were 
no witnesses from the tech companies present. I promised they would 
have their chance.
  Tomorrow, the CEOs of Discord, Meta, Snap, TikTok, and X will 
testify. I thank my friend Senator Lindsey Graham, the ranking 
Republican on the Judiciary Committee, for his bipartisan cooperation 
in convening this hearing. The CEOs of Discord, Snap, 
and X will testify pursuant to subpoenas issued by the committee. This 
follows their repeated refusal to testify voluntarily. I look forward 
to hearing from these companies about what they are doing to make their 
platforms inaccessible to child sex offenders.
  As recently as last week, some of these companies have launched new 
child safety measures that are long overdue. If you were watching the playoff 
football games over the weekend, you heard some of these same companies 
advertising they have now discovered a new way to protect children. 
Could it have something to do with their appearance at the hearing? We 
will see.
  But it shouldn't take a hearing before the Senate Judiciary Committee 
to finally get these companies to prioritize child safety. Because 
these changes are half measures at best, I welcome the opportunity to 
question them about what more needs to be done.
  There have been recent troubling reports about how each of these 
platforms is being used by offenders to target children or trade child 
sexual abuse material. Some reports even detail how the platforms 
promote exploitative behavior. Let's be really honest about this. Some of 
these algorithms are more powerful than any parent--that is for sure--
and some of the techniques that are used by these platforms to 
encourage and lure children into situations where they are in danger 
are well-documented and researched.
  The National Center on Sexual Exploitation has named each of these 
companies to its annual ``Dirty Dozen List'' for facilitating child 
sexual exploitation.
  I am sure every Member of the Senate has heard from constituents, 
friends, and family members about the harm Big Tech is inflicting on 
our kids. Tomorrow, the Senate Judiciary Committee will demand answers.

[[Page S280]]

  Hearings are important, but it is clear that we need legislation, 
because the tech industry has failed, on its own, to protect our kids. 
They are protecting their profits, but they are not protecting our 
children. Last year, the committee unanimously reported five bills to 
combat the crisis of online child sexual exploitation. One of the bills 
I introduced is my bipartisan STOP CSAM Act, which will end Big Tech's 
free ride and allow victims to finally hold these companies accountable 
for their failure to stop online child sexual exploitation. CSAM is an 
acronym for ``child sexual abuse material.''
  Since the earliest days of the internet, companies have been allowed 
to act with near impunity. American families harmed by Big Tech's 
decisions have no means of redress. To illustrate how dangerous this 
is, consider a change Meta made last month that carries grave 
consequences for children. Every year, Meta submits tens of millions of 
CyberTips to the National Center for Missing and Exploited Children, 
known as NCMEC, concerning CSAM found on its platforms. Each CyberTip 
involves a victim of exploitation, like a child being sexually abused 
in a photo that has been traded endlessly online or a child who is 
being coerced, extorted, groomed, or sold for sexual purposes.
  In December, Meta announced it is rolling out end-to-end encryption 
by default on its Facebook and Messenger platforms. Because of this 
change, Meta will no longer be able to use certain tools to detect 
and report child exploitation. Encryption can be a valuable tool for 
protecting privacy, but it is alarming for a company to kneecap its 
own efforts to stop online child sexual exploitation.

  According to press reports, Meta employees warned internally that 
this would greatly diminish the company's ability to identify online 
child exploitation, and child protection advocates and survivors 
immediately sounded the same alarm. NCMEC called Meta's adoption of 
end-to-end encryption ``a devastating blow for child protection.'' 
NCMEC and other advocates are imploring Meta to pause the rollout until 
it demonstrates the encryption switch won't cause children harm. That 
is all they want: for Meta to be sure it won't hurt kids.
  This highlights the unacceptable situation we find ourselves in. 
There are no tools to hold companies accountable. Instead, survivors 
and advocates are left to plead with these companies to choose safety 
over profit.
  The Phoenix 11, a group of CSAM survivors, powerfully expressed their 
rage about this situation in a letter they recently sent to the 
committee. They wrote:

       As survivors, we bear the consequences when decisions are 
     made that prioritize profit over children . . . If Meta no 
     longer reports these crimes against us, we alone suffer the 
     consequences.

  This is a profoundly disturbing situation. In no other sector of 
society would we permit one company to make an unreviewable decision 
that puts millions of American kids at risk. But for almost 30 years, 
section 230 of the Communications Decency Act has protected the tech 
industry from accountability for the damage it has done.
  You have to look far and wide to find companies or industries that 
are exempt from liability under the law, civilly or criminally. The 
tech industry is one of the few. The law was enacted to allow a 
fledgling industry to grow, but it has become an entitlement for the 
most profitable industry in the history of capitalism to pad its 
profits at the expense of kids.
  Every available metric suggests that online child exploitation is 
getting worse. In the year 2013, NCMEC received approximately 1,380 
CyberTips per day. Ten years later, in 2023, this skyrocketed to 
100,000 CyberTips per day. Think about that for a second: 100,000 
reports of sexual abuse per day.
  There has also been a dramatic increase in the number of victims per 
offender, who can use technology to ensnare a shocking number of 
children without even leaving their homes. A single defendant 
prosecuted in Minnesota sextorted over 1,100 children--one person, over 
1,000 kids.
  What does that consist of? They lure these kids and groom them to the 
point where they send photographs of themselves that are way too candid 
and expose things they shouldn't. Then the person says: If you don't 
want me to put this on the internet, you have to pay me.
  This fellow had extorted over 1,000 kids in that kind of situation 
before he was finally brought to justice. That is the status quo that 
Congress protects if we do nothing.
  Everyone needs to do their part to stop this gross injustice. That 
includes Congress finally enacting legislation that holds the tech 
industry accountable when it fails to protect children. That is why the 
Judiciary Committee will hold its landmark hearing tomorrow. That is 
why I will continue to work to bring the STOP CSAM Act and other 
critical bills to protect kids to the Senate floor.
  Mr. President, back in the day, before I was elected to office, I was 
a trial lawyer in small-town America. I made a nice living. I took 
cases of a different stripe to trial. I never thought of myself as part 
of the system of justice in this country, but it turns out I was. The 
fact that people face accountability for their wrongdoing and could end 
up losing in court is another incentive to do the right thing in your 
life. Here we have a situation that is clearly, clearly out of control. 
What is happening is beyond the reach of the most conscientious parents 
in America.
  I am lucky to have some wonderful grandkids. I have two who live in 
New York. They are 12 years old. I am really proud of them. Their mom 
worries about them--and their dad does as well--every single day, as 
the kids spend, by their parents' estimation, way too much time on 
screens. The parents try to 
encourage them to do the right thing and make sure that they never 
communicate with people they don't know or provide information or 
anything else that they shouldn't. But the parents can't be sure that 
always works--nobody can. They want to do the right thing for their 
kids.
  I told my daughter I was having this hearing. She said: Dad, when you 
get these execs in front of you, ask them what they do to protect their 
own kids--their own kids who could be exploited and they wouldn't even 
know about it.
  It is a legitimate question. I don't know if I will be asking it 
tomorrow. It depends on the circumstances. But it is something that 
every family across America would like to know: What are you doing, 
Senator, to protect our kids? It is getting worse instead of better. 
Can you change the law to help us?
  It is up to us to decide. I hope tomorrow's hearing is the beginning 
of a conversation on a bipartisan basis.
  I yield the floor.
  The ACTING PRESIDENT pro tempore. The Senator from Texas.