[Senate Hearing 119-205]
[From the U.S. Government Publishing Office]
S. Hrg. 119-205
PROTECTING THE VIRTUAL YOU:
SAFEGUARDING AMERICANS' ONLINE DATA
=======================================================================
HEARING
BEFORE THE
SUBCOMMITTEE ON PRIVACY,
TECHNOLOGY, AND THE LAW
OF THE
COMMITTEE ON THE JUDICIARY
UNITED STATES SENATE
ONE HUNDRED NINETEENTH CONGRESS
FIRST SESSION
__________
JULY 30, 2025
__________
Serial No. J-119-35
__________
Printed for the use of the Committee on the Judiciary
www.judiciary.senate.gov
www.govinfo.gov
__________
U.S. GOVERNMENT PUBLISHING OFFICE
61-893 WASHINGTON : 2026
-----------------------------------------------------------------------------------
COMMITTEE ON THE JUDICIARY
CHARLES E. GRASSLEY, Iowa, Chairman
LINDSEY O. GRAHAM, South Carolina
JOHN CORNYN, Texas
MICHAEL S. LEE, Utah
TED CRUZ, Texas
JOSH HAWLEY, Missouri
THOM TILLIS, North Carolina
JOHN KENNEDY, Louisiana
MARSHA BLACKBURN, Tennessee
ERIC SCHMITT, Missouri
KATIE BOYD BRITT, Alabama
ASHLEY MOODY, Florida
RICHARD J. DURBIN, Illinois, Ranking Member
SHELDON WHITEHOUSE, Rhode Island
AMY KLOBUCHAR, Minnesota
CHRISTOPHER A. COONS, Delaware
RICHARD BLUMENTHAL, Connecticut
MAZIE K. HIRONO, Hawaii
CORY A. BOOKER, New Jersey
ALEX PADILLA, California
PETER WELCH, Vermont
ADAM B. SCHIFF, California
Kolan Davis, Chief Counsel and Staff Director
Joe Zogby, Democratic Chief Counsel and Staff Director
Subcommittee on Privacy, Technology, and the Law
MARSHA BLACKBURN, Tennessee, Chair
LINDSEY O. GRAHAM, South Carolina
JOHN CORNYN, Texas
JOSH HAWLEY, Missouri
JOHN KENNEDY, Louisiana
ASHLEY MOODY, Florida
AMY KLOBUCHAR, Minnesota, Ranking Member
CHRISTOPHER A. COONS, Delaware
RICHARD BLUMENTHAL, Connecticut
ALEX PADILLA, California
ADAM B. SCHIFF, California
Ben Blackmon, Republican Chief Counsel
Dan Goldberg, Democratic Chief Counsel
C O N T E N T S
----------
OPENING STATEMENTS

Blackburn, Hon. Marsha
Klobuchar, Hon. Amy

WITNESSES

Butler, Alan
    Prepared statement
Goodloe, Kate
    Prepared statement
    Responses to written questions
Levine, Samuel
    Prepared statement
Martino, Paul
    Prepared statement
    Responses to written questions
Thayer, Joel
    Prepared statement
    Responses to written questions

APPENDIX

Items submitted for the record
PROTECTING THE VIRTUAL YOU:
SAFEGUARDING AMERICANS' ONLINE DATA
----------
WEDNESDAY, JULY 30, 2025
United States Senate,
Subcommittee on Privacy, Technology,
and the Law,
Committee on the Judiciary,
Washington, DC.
    The Subcommittee met, pursuant to notice, at 2:47 p.m., in
Room 226, Dirksen Senate Office Building, Hon. Marsha
Blackburn, Chair of the Subcommittee, presiding.
Present: Senators Blackburn [presiding], Klobuchar, and
Schiff.
OPENING STATEMENT OF HON. MARSHA BLACKBURN,
A U.S. SENATOR FROM THE STATE OF TENNESSEE
Chair Blackburn. The Subcommittee on Privacy, Technology,
and the Law will come to order. And Senator Klobuchar is on her
way. She'll be here in a couple of minutes, but we will go
ahead and begin since we do have five witnesses. And we thank
each of you for giving your time and being here today.
Today, we are going to put our attention on what I think is
one of the most consequential issues up for discussion when we
talk about the virtual space, and that is how does each and
every individual American preserve their privacy and their
personal data in the virtual space? The title of the hearing is
Protecting the Virtual You: Safeguarding Americans' Online
Data.
    This speaks to what is becoming a growing connection
between you in the physical space, what you are doing each and
every day in your transactional life in the virtual space, and
the digital version of yourself. And this comes through how
companies collect, track, and monetize your data. And every
single bit of that is done without your consent or your
knowledge.
In today's economy, data is currency. Everything from your
shopping habits to your health information, your children's
online activity, to your political views can be identified,
sold, and resold, often with little transparency or recourse.
Meanwhile, consumers are left to decipher lengthy privacy
policies and click ``agree'' at the bottom of the page even
before they can begin to access any online service.
The absence of a comprehensive national data privacy
framework has left millions of Americans vulnerable. While
numerous States have enacted privacy laws, the result has been
a patchwork that fails to provide the clarity, consistency, and
confidence that consumers and responsible businesses need and
deserve. For years now, I have been clear we need a national
privacy standard that is comprehensive and enforceable, one
that empowers consumers, promotes innovation, and ensures
accountability. It should prioritize transparency, minimize
data collection, and provide meaningful consent, not just a box
to check.
We have a panel of witnesses here this afternoon who all
agree that there is an urgent need for a comprehensive bill.
Now, there's probably going to be some disagreement about how
we get to that national standard, but we can agree on one
thing; it is past time for Congress to take up this issue, to
take action to pass a bill and see that bill signed into law.
We should also acknowledge how closely this issue is tied
to the safety of our children online. Senator Blumenthal and I
have worked diligently on the Kids Online Safety Act, which
would require platforms to design their product for children's
well-being in mind, not just for their bottom line. We've seen
time and again how data-driven algorithms target kids with
addictive content and expose them to harmful material. Business
models that profit from children's vulnerabilities must be
reined in.
It is absolutely disgusting that our children are the
product when they are online. And through the Open App Market
Act that I introduced with Senator Klobuchar, I have worked to
increase competition and consumer choice in the digital
marketplace. Whether it's protecting your personal data, your
right to download the apps you want, or your ability to access
services, the common thread is this: users, not tech giants,
should be in control of the individual user's life.
Today's hearing will explore core principles that should go
into a national data privacy framework that reflect American
values. We'll ask: What categories of personal data deserve
heightened protection? How can we give consumers real control
over how their data is used? And how do we ensure that AI
systems, which are only growing more powerful, are accessing
and using consumers' data and information in a responsible way?
As artificial intelligence becomes increasingly embedded in
everyday life from how we shop to how we work, communicate, and
make decisions, Americans deserve to know when, where, and how
their data's being used to shape their online experiences. We
have an opportunity and a responsibility to get this right, and
I am looking forward to your testimony today, and to the
questions that we will have as we move forward.
Senator Klobuchar, you're recognized.
OPENING STATEMENT OF HON. AMY KLOBUCHAR,
A U.S. SENATOR FROM THE STATE OF MINNESOTA
    Senator Klobuchar. Well, thank you very much, Chair Blackburn,
and thank you to all of our witnesses. And I'm really grateful
for your leadership on these issues, Madam Chair, and your
willingness to work with me, and Senator Blumenthal, and many
others.
We all know new technologies have made it easier for people
to monitor their health, collaborate with colleagues,
communicate with loved ones, and more, but Federal law doesn't
do enough, as we all know, to address the privacy concerns that
come with these innovations. Technology
companies collect an enormous amount of personal information
about our daily lives. They know what we buy, who our friends
are, where we live, where we work and travel, even how much we
would be willing to pay for something.
Yet, for too long the Big Tech companies, many of which
dominate the market that they operate in, have been telling
American consumers, ``Just trust us,'' even though their
business models are designed to collect personal information
and to use it for profit. The bottom line is that we are the
product, and that's how many tech companies make their money,
and a lot of it.
In 2024, Google and Meta earned a combined $420 billion in
advertising revenues alone, and they made a lot more money
because Americans lack privacy protections. And Americans' data
earned Meta $68 per user in a single quarter last year. Think
about that. All these people who don't realize that they're
being tracked. But a European Facebook user, with comprehensive
privacy protections, only generated $23. And that money can be
used for a lot of other things that people need right now.
And it seems like every day we hear a new story about
companies playing fast and loose with data and taking advantage
of customers. Earlier this year, a whistleblower from Facebook,
now Meta, testified to another Subcommittee about how the
company would track users so closely that it could identify
when teenage girls felt emotionally vulnerable and then target
them with ads exploiting these emotions. For example, when a
teenage girl would delete a selfie, Facebook might serve her an
ad for diet products.
Criminals also view huge troves of data as attractive
targets for hacking. We've seen major data breaches ranging
from the 2017 Equifax data breach that exposed sensitive
financial information from more than 140 million individuals,
to the hack of Change Healthcare affecting 190 million people
and causing more than 100 electronic systems vital to the U.S.
healthcare system to be shut down.
On my way here, I was on the phone with the mayor of St.
Paul, Minnesota, because they, like so many other
jurisdictions, are responding to a targeted cyberattack on
their IT infrastructure, which has shut down some of the city's
digital services and may have compromised city employee data.
    Once in the hands of criminals, data can be used for
everything from identity theft to more serious crimes, as we
all learned too tragically with the horrific murders in my
State of my good friend Melissa Hortman, the former Speaker of
the House, and her husband, Mark. We learned how accessible
personal data is, including people's addresses, because the
murderer went to the houses of the people whose addresses he
had.
Businesses are also using personal data collected across
the internet in novel ways such as to set individualized prices
designed to increase costs for consumers. Should a person--and
this is a question we have to ask as Senators--really have to
submit to this kind of intrusive data collection just to send a
message to a friend online, or to book a flight, or to order
some diapers? I don't think so.
That's why more than 20 States have stepped in. I suspect
today we'll hear from some of our witnesses about the patchwork
of State laws. I agree it's a problem, but I believe we should
have passed privacy legislation many, many years ago. I
advocated for it back then. We tried, and in fact, in 2019, I
co-sponsored a comprehensive privacy bill introduced by
Senator Cantwell, along with Cathy McMorris Rodgers, a former
Republican House Member.
    The bill would've required companies to collect only the
information necessary to provide the goods and services that
consumers sought, ensured consumers consented before their
personal data was shared with third parties, and put consumers
in control of their data by allowing them to access, correct,
and even delete personal data.
    But many of the businesses that today complain about the
burden of complying with the patchwork of State laws were
lobbying against a Federal privacy law back then--I have the
advantage of having been there, even before Maria Cantwell's
bill was introduced--and now they're back complaining about
the patchwork of laws. And I would like to change that,
but I do think it's important to know that's why we're in the
position that we are and to understand why some of these States
are looking at this going, ``Wait a minute.''
The need for Federal privacy reform is even more urgent as
AI continues to expand its role into our lives. Data is both
the gasoline and the engine for AI models. That means that
demand for our data is skyrocketing. So, it is critical that we
set guardrails to ensure the data that powers AI is responsibly
sourced, and used for legitimate means, and protected when you
want to have it protected.
Luckily, there is a bipartisan agreement that Congress
needs to act. The Commerce Committee on which Chair Blackburn
and I also sit has seen a strong bipartisan, bicameral proposal
for Federal privacy reform. Not everyone agrees with all of
it, but there has been a start out of that Committee, and
I look forward to hearing from our witnesses about why we need
these guardrails now.
Thank you, Senator Blackburn.
Chair Blackburn. I thank you and our witnesses. Ms. Kate
Goodloe is managing director at the Business Software Alliance,
where she develops policies on privacy, AI, and law enforcement
access. She also taught AI law at the GW Law School. Prior to
her time at BSA, Ms. Goodloe was a senior associate at
Covington & Burling focusing on privacy and cybersecurity. She
earned her JD from the New York University School of Law. We
welcome you.
Mr. Joel Thayer is the president of the Digital Progress
Institute and founder of Thayer, PLLC. He has represented
clients before the FCC, FTC, and Federal courts on issues
relating to telecom law, data privacy, cybersecurity, and
competition policy.
Before that, he has held positions at the App Association,
the FCC, the FTC, and the U.S. House of Representatives. Since
earning his JD from American University Washington College of
Law, he has been recognized as a Super Lawyers Rising Star for
his work in communications law and digital policy.
Mr. Paul Martino is a partner at Hunton Andrews Kurth, LLP.
He has nearly 25 years of experience in public policy and
government relations specializing in privacy, data security,
AI, e-commerce, and tech. Mr. Martino is the founder and
general counsel of the Main Street Privacy Coalition.
Before joining Hunton, he served as VP and senior policy
counsel for the National Retail Federation and co-chaired the
Privacy and Data Security Task Force at Alston & Bird. After
earning his JD from the University of California, Berkeley
School of Law, he served as Majority counsel on the Senate
Commerce Committee for then-Chairman John McCain.
Alan Butler is the executive director and president of the
Electronic Privacy Information Center. Before his role as
executive director, he managed EPIC's litigation and amicus
program where he filed briefs before the U.S. Supreme Court and
other appellate courts in privacy and civil liberties cases.
After earning his JD from UCLA School of Law, he was admitted
to the DC Bar and the State Bar of California.
Samuel Levine is a senior fellow at the Berkeley Center for
Consumer Law and Economic Justice. He previously served as
director of the Federal Trade Commission's Bureau of Consumer
Protection. Prior to his role at the FTC, Mr. Levine served as
an attorney advisor to Commissioner Chopra as an attorney in
the FTC's Midwest Regional Office, and as an assistant attorney
general in Illinois. After earning his JD from Harvard Law, he
clerked on the U.S. District Court for the Northern District of
Illinois.
We welcome each of you for being here. Now, I'm going to
ask you to rise and raise your right hand.
[Witnesses are sworn in.]
Chair Blackburn. And we will note that everyone has
answered in the affirmative. Okay. Ms. Goodloe, you are
recognized for 5 minutes, and we'll go right down the line.
STATEMENT OF KATE GOODLOE, MANAGING DIRECTOR, BUSINESS SOFTWARE
ALLIANCE, WASHINGTON, DC
Ms. Goodloe. Good afternoon, Chair Blackburn, Ranking
Member Klobuchar, and Members of the Subcommittee. My name is
Kate Goodloe, I'm managing director at the Business Software
Alliance, or BSA.
BSA members create the business-to-business technologies
used by companies across industries. Privacy and security are
core to our members' operations. I commend the Subcommittee for
convening today's hearing, and I thank you for the opportunity
to testify. The United States needs a strong, clear,
comprehensive consumer privacy law. BSA has been a longtime
supporter of adopting a Federal privacy law.
    Americans share their personal information online every
day. Whether we shop online, use apps to track our workouts,
take ride shares, or host video calls with friends and family,
we provide personal information to a broad range of companies.
Consumers deserve to know their data is used responsibly.
In our view, a Federal privacy law should achieve three
goals. First, it should require companies to handle consumers'
personal data responsibly, and assign obligations to companies
based on their role in handling that data. Second, it should
give consumers new rights. And third, it should create strong
consistent enforcement.
I want to focus on that first goal. To create the right set
of obligations, a privacy law must recognize that different
types of companies handle consumers' data. Those companies
must all adopt strong but different safeguards to effectively
protect consumers. Most importantly, not all companies are
consumer-facing.
BSA represents the business-to-business technology
providers that work for companies across the economy. An online
store that sells clothing, for example, will rely on a series of
business-to-business technology providers. It may use one to
manage customer service inquiries, another to track deliveries,
and a third to protect its data against cybersecurity threats.
Each company must protect the personal data it handles, but
companies need to take different actions to effectively protect
consumers because they play different roles in handling their
data. Laws should not create a one-size-fits-all obligation
treating an online store and its cybersecurity vendor alike;
doing so actually creates new privacy and security risks for
consumers.
Now, this is something that States get right. Twenty
States, both red and blue, have adopted comprehensive consumer
privacy laws, and those laws are remarkably consistent. All 20
reflect a fundamental distinction between two types of
companies that handle consumers' data and assign strong but
different obligations to each.
The first are controllers. These companies decide how and
why to collect a consumer's personal data, and State laws give
them obligations about those decisions, including: telling
consumers how and why they process data, responding to consumer
rights requests, asking for consent to process sensitive
personal data, and minimizing the collection and use of data in
the first place.
    The second type are processors. These companies handle
data on behalf of a controller, and State laws give them a
common set of obligations, too. Those include: processing data
pursuant to the controller's instructions, entering into a
contract with the controller, handling the data
confidentially, and giving the controller the information it
needs to conduct privacy assessments.
These roles reflect the modern economy. They're not unique
to State laws and they're not new. The distinction between
controllers and processors dates back more than 40 years, and
it underpins privacy laws worldwide. It must be part of any
Federal privacy law.
In addition to putting obligations on companies, the
Federal privacy law should create new rights for consumers and
strong consistent enforcement. Here, too, you can look to
States. There is widespread agreement on consumer rights. All
20 States give consumers rights to access, delete, and port
their personal data. Nineteen also give a right to correct
inaccurate data. States also create similar enforcement
mechanisms, with all 20 giving a leading role to the attorney
general to enforce privacy violations.
I look forward to discussing consistent aspects of these
State laws, but I want to say that consistency may not last.
This year, we've seen a striking interest in amending existing
laws to revise, expand, and change their protections, and in
new obligations coming through rulemakings.
A Federal law is needed to bring consistency to existing
protections and to create broad long-lasting protections for
consumers. A Federal law should not weaken protections already
provided by the States, but extend those protections to
consumers nationwide.
There is significant common ground between industry and
civil society stakeholders on comprehensive Federal privacy
protections. We look forward to working with Congress on these
issues.
Thank you, and I look forward to your questions.
[The prepared statement of Ms. Goodloe appears as a
submission for the record.]
Chair Blackburn. And well done right at 5 minutes. You get
a gold star on that. Mr. Thayer.
STATEMENT OF JOEL THAYER, PRESIDENT,
DIGITAL PROGRESS INSTITUTE, WASHINGTON, DC
    Mr. Thayer. I'll try to emulate it. Thank you, Chairwoman
Blackburn, Ranking Member Klobuchar, and esteemed Members of
this Committee for inviting me to testify and holding this
important
hearing. My name is Joel Thayer, and I'm the president of the
Digital Progress Institute.
It's a think tank based in Washington, DC, focused on
promoting bipartisan policies in the tech and telecom space.
Ensuring privacy for all is a founding principle of the
institute. And as such, I very much appreciate the Committee's
commitment to building out a privacy framework that further
assures that the integrity and ownership of our digital selves
remain in our domain, not with a company with a domain name.
Although our privacy from our Government is well
established, that is unfortunately not the case with respect to
companies. With the allure of free services, we provide details
about our most intimate selves to trillion-dollar tech
companies who in turn make enormous profit off the data they
collect. They know everything about us; what we like to eat,
when we sleep, where we live, where we are, our beliefs, and
even our fears. Curiously, though, they claim our age confounds
them, but let's set that aside for now.
A recent Pew study shows that 73 percent of Americans feel
they have limited to no control over how companies use their
personal information. And the reality is they don't. We sign
privacy policies that are filled with so much legal jargon that
it may as well be unintelligible to the average person, and
presto, our data is now their data.
    The problem is not just that they sell our data to third-
party advertisers, but also to those who use our data to create
fake images, curate biased newsfeeds, conduct elaborate scams,
and
even engage in espionage campaigns. In short, we are not in
control, and Americans are right to be concerned. And with the
advent of AI, this trend is only going to increase. It's no
wonder why 85 percent of people want more privacy protections.
We need government intervention here. The good news is that
protecting privacy is a bipartisan issue. Indeed, 20 States
across the political spectrum have passed privacy laws, and as
evidenced by this hearing, Congress appears poised to address
this issue again. We welcome this much needed development.
With that in mind, here are a few high-level suggestions as
the Committee evaluates paths forward. First, it's important to
define your goals and keep the framework targeted at
accomplishing its goals. One of the primary issues with
previous attempts at passing meaningful privacy laws has been
that bills attempt to do too much all at once. We have seen the
most success in legislation that has clearly articulated goals
with targeted solutions.
It's why the institute has supported targeted bipartisan
measures such as the Protecting Americans from Foreign
Adversary Controlled Applications Act--that's a mouthful--the
TAKE IT DOWN Act, the Kids Online Safety Act, the App Store
Accountability Act, and OAMA, just to name a few.
As we have seen in the EU's GDPR, overly sweeping privacy
laws have the unintended consequence of entrenching incumbents.
The GDPR should be a cautionary tale for the U.S. because it
clearly shows that privacy regulations without market
guardrails can seriously exacerbate today's competition issues
we have with Big Tech.
Second, enforcement matters. In our experience, agency
actions or attorney general enforcement are the most effective,
whereas a private right of action alone may act more as a
carrot as opposed to a stick given these companies' seemingly
endless teams of lawyers and budgets.
For instance, the Texas attorney general recently secured a
$1.4 billion settlement against Google for violating its
privacy law, whereas when consumers sued Apple under
California's privacy law, in part for sharing with medical ad
companies recorded conversations with their physicians that
included personal health information, they were only entitled
to a meager $95 million. Worse, consumers won't see about a
third of that because that's reserved for their lawyers.
Third, the broader the Federal statute, the more important
preemption will become. That's because targeted legislation is
less likely to run into differing State privacy regimes. Any
preemption framework should be clear on what it is preempting
and should reserve rights for State attorney general
enforcement.
    Key areas ripe for preemption, though, are basic
definitions, like what personal information means; the creation
of data rights, which seems to be unanimous amongst all State
privacy laws; and, of course, being specific about what data
management practices we seek to prohibit. In sum, the reality
is that if these Big Tech companies cared about user privacy,
they would protect it. Frankly, it's in their interest not to.
Congress needs to act. Once again, I would like to thank the
Subcommittee for allowing me to testify, and I welcome any
questions you may have. Two seconds on this one.
[The prepared statement of Mr. Thayer appears as a
submission for the record.]
Chair Blackburn. There you go. Well done. Mr. Martino, the
pressure is on.
[Laughter.]
STATEMENT OF PAUL MARTINO, GENERAL COUNSEL,
MAIN STREET PRIVACY COALITION, WASHINGTON, DC
Mr. Martino. Thank you, Chair Blackburn, and Ranking Member
Klobuchar, for the invitation to be here today. I am Paul
Martino, a partner at Hunton Andrews Kurth, here in Washington,
and I serve as the general counsel for the Main Street Privacy
Coalition.
Our coalition members represent a broad array of companies
that line America's main streets. They interact with consumers
each day. They're found in every town, city, and State,
providing jobs, supporting our economy, and serving Americans
as a vital part of their communities.
Collectively, Main Street businesses directly employ
approximately 34 million Americans, and contribute $4.5
trillion to our Nation's GDP. Since 2019, the coalition has
supported Federal privacy legislation that would establish a
single nationwide law to protect the privacy of all Americans.
    From where we sit here on Capitol Hill today, we can travel
to two States in 20 minutes by car or metro. Just like many
Americans who live in tri-State areas or near State lines,
should Americans' privacy rights change as they drive from DC
into Maryland or Virginia? They do right now, but many don't
know that. Americans expect their privacy to be protected the
same everywhere.
Our coalition members share a strong conviction that a
preemptive Federal privacy law will benefit consumers and Main
Street businesses alike. It would give consumers confidence
that their data will be uniformly protected across America
regardless of where they live or choose to do business. And it
would provide the certainty Main Street businesses need to
lawfully and responsibly use data to better serve their
customers online or across State lines.
Establishing a uniform national law that extends consumer
privacy rights and consistent privacy rules to all consumers
and businesses in America is a core principle for Main Street.
I will highlight two more. First, a Federal privacy law should
protect consumers comprehensively with equivalent standards for
all businesses. A privacy law should empower consumers to
control their personal data used by businesses regardless of
business type. Likewise, businesses must be permitted to
lawfully use data consumers share with them to better serve
customer needs. To meet these goals, we
recommend a Federal privacy law that creates equivalent privacy
obligations for all businesses handling consumer data. This
would be a change from past Federal privacy bills that narrowed
obligations for service providers in Big Tech, telecom, cable,
and financial industries, relieving them from the same
obligations that apply to Main Street businesses.
    For privacy laws to succeed for consumers, it is critical
for all entities handling consumer data to secure that data and
protect consumers' privacy rights. This is true regardless of
the
terms used in privacy laws that blur the reality of who
actually controls the data. The label ``controller,'' which is
applied to every Main Street business that directly serves a
customer, can create a false impression about the power of Main
Street businesses as they interact with Big Tech service
providers.
Main Street companies control their relationship to
customers, a responsibility they value, but very few can
control how nationwide service providers operate and do
business. Powerful Big Tech and ISP service providers require
Main Street businesses to sign ``take it or leave it''
contracts that dictate the terms of their service. The myth
that Big Tech processors merely follow the instructions of the
typical Main Street business is not credible. Privacy laws
should not permit any industry sector to shift its
responsibilities onto another.
Ensuring equivalent data privacy obligations across
industry sectors is also inherently pro-consumer. Consumers
have the right to expect privacy rules they can understand,
predict, and support, and that meet their expectations.
Congress can pass a law to ensure that all businesses protect
consumers' privacy, and that processors cannot hide behind
labels that make it appear they have no control at all.
Finally, Federal privacy laws should hold accountable all
entities handling personal data with the same enforcement
mechanisms. This creates an even playing field with proper
incentives across industry. The law should encourage compliance
to protect consumers more effectively than gotcha lawsuits that
threaten Main Street businesses striving to be in compliance.
This is why State privacy laws thoughtfully couple government
notice with the opportunity to quickly correct or cure
mistakes.
Thank you, and I welcome your questions.
[The prepared statement of Mr. Martino appears as a
submission for the record.]
Chair Blackburn. And you came in with a few seconds on the
clock. You're in the lead. All right, Mr. Butler, we're going
to see what you can do here.
STATEMENT OF ALAN BUTLER, EXECUTIVE DIRECTOR AND PRESIDENT,
ELECTRONIC PRIVACY INFORMATION CENTER, WASHINGTON, DC
Mr. Butler. Thank you, Chair Blackburn, and Ranking Member
Klobuchar, and Members of the Subcommittee for the opportunity
to testify today about the need to better safeguard Americans'
online data.
My name is Alan Butler and I'm the executive director at
the Electronic Privacy Information Center. EPIC is an
independent, nonprofit research organization established in
1994 to secure the right to privacy in the digital age for all
people.
Twenty-five years ago, the Federal Trade Commission issued
a report to Congress based on its research of privacy risks in
the online marketplace. The takeaway was clear: self-regulation
does not work, and we need legislation to ensure adequate
protection for Americans online.
In the decades since that report, we have seen our digital
world expand and develop in amazing ways, but without strong
privacy protections. We have seen an alarming expansion of
surveillance and data abuses online that threaten our rights
and subvert our most fundamental values of autonomy and
freedom.
The status quo is untenable. If the law allows a company to
scrape images of all of us to build a universal facial
recognition database, while another company tracks every site
we visit to build invasive profiles, and yet another company
buys and sells our logs of daily movements, do we have privacy
protection at all? I believe any reasonable person would say
no, and would demand that our lawmakers step in to fix this
broken system.
In my testimony today, I will describe the current state of
State privacy law and identify the areas where Federal
leadership would be most impactful. Privacy is a fundamental
right and Americans deserve a law that actually protects our
data. In the absence of action by Congress, States have stepped
in to advance digital rights in the information age. This has
been an important catalyst for change, but there's more work
ahead to establish robust privacy standards.
There is significant bipartisan agreement across party and
State lines about the need for privacy protection and the core
principles that should shape the law. So, our attention at the
Federal level should be on establishing clear rules of the road
to make our digital world safer and more secure. What we cannot
do is pass a weak Federal standard that prevents States from
responding to new challenges and emerging threats in the
future.
A Federal privacy law should set a consistent and robust
standard for protection while preserving flexibility for States
in the future. Over the past 7 years, 19 States have passed
comprehensive data privacy laws and many States have also
passed bills aimed at preventing specific privacy harms. Most
of these State laws follow a common framework, and have many of
the key components of any modern privacy law.
But unfortunately, these laws do very little to actually
limit abusive data practices and to protect privacy. In a
recent report, EPIC analyzed these laws in detail and graded
each of them. Eight received Fs, and none received an A.
So, what went wrong? The tech industry has invested heavily
in State lobbying to water down the substantive protections,
narrow their scope and add exceptions that swallow the rules.
But over the last 2 years, we have seen stronger State
proposals building off the bipartisan framework that Congress
created in 2019 and 2021.
The Maryland Online Data Privacy Act, for example, passed
last year. It builds on existing State laws and incorporates
strong data minimization protections, and a ban on the sale of
sensitive data. Inspired by Maryland's success, 10 States have
introduced bills with strong data minimization rules this year.
Several States that originally passed weak privacy laws have
revisited and amended their laws to strengthen their
protection.
Any Federal privacy proposal should have a strong data
minimization rule, include heightened protections for sensitive
data, and establish robust enforcement mechanisms. Data
minimization offers a practical solution to our broken internet
ecosystem. Instead of allowing data collectors to dictate
privacy terms, data minimization rules set clear standards to
limit the processing of our data. Companies can collect the
data they need to provide the services we want. This standard
better aligns business' conduct with what consumers expect and
stops abusive data practices like third-party tracking and
profiling.
Enhanced protections can also ensure that our most
sensitive data remains confidential and secure. So much
information about us that has traditionally remained private is
now captured in digital form; our health records, our
movements, our biometrics, and genetic markers, even the data
about our children. These records are frequently targeted by
hackers and scammers, and should be locked down and secure.
Strong privacy standards should also be backed up by robust
enforcement, including the three-tiered approach that we saw in
the Federal bill. And while State and Federal enforcement is
essential, the scope of data collection online is simply too
vast for any one entity to regulate, and that is why private
rights of action with enforceable court orders are so
important.
EPIC has been calling on Congress to pass a strong privacy
law to protect all Americans for the past 25 years. We are
grateful that the Subcommittee is turning its attention to this
important issue, and we urge Federal lawmakers to learn from
States' experience.
I thank you for the opportunity to testify today, and I
look forward to your questions.
[The prepared statement of Mr. Butler appears as a
submission for the record.]
Chair Blackburn. And Mr. Levine, you're recognized.
STATEMENT OF SAMUEL LEVINE, SENIOR FELLOW, UC BERKELEY CENTER
FOR CONSUMER LAW & ECONOMIC JUSTICE, NEW YORK, NEW YORK
Mr. Levine. Thank you, Senator. My name is Sam Levine, and
I'm a senior fellow at Berkeley Center for Consumer Law and
Economic Justice. Until January, I led the FTC's Bureau of
Consumer Protection.
Today, protecting Americans' personal information is about
much more than privacy. It's about whether we can afford
essential goods, whether we can be profiled based on our
political or religious beliefs, and whether the next generation
will grow up addicted to screens. I'll be focusing on three
real-world threats that unchecked privacy abuses are fueling:
threats to economic fairness, democratic freedoms, and the
safety of kids and teens.
Let's start with economic fairness. On a recent earnings
call, Delta Airlines executives boasted they could soon raise
prices on plane tickets, not by adding value, but through a new
formula; stop matching competitors' prices, unbundle basic
services and charge each passenger the most they're willing to
pay.
Investors cheered the news, calling this the Holy Grail, but
we should call it what it is; personalized price gouging. And
it's only possible because weak privacy protections are
allowing companies to track our behavior and predict how much
we can be pushed to pay.
This practice, also known as surveillance pricing, is
spreading. More and more businesses are looking to price
everyday goods from groceries to hardware the way airlines are
pricing tickets. And let's be clear, their goal is not to lower
prices, it's to charge each person as much as possible and the
people hit hardest will be those with the fewest options; a
parent buying baby formula, a senior filling a prescription, or
a family booking last-minute travel to a funeral.
Unchecked data collection is moving us from a world of one
product, one price, to one person, one price. And if we don't
act, the shift will be costly. Unchecked data collection is also
putting our democratic freedoms at risk. Last year, the Federal
Government alleged that an entity was tracking Americans'
movements and profiling them into categories like Wisconsin
Christian churchgoers, likely Republican voters, and restaurant
visitor during COVID quarantine.
This was not a foreign adversary. This was a U.S. data
broker. The FTC sued to halt these practices. That lawsuit
should be a wakeup call. No American should be profiled based
on their politics, their religion, or their stance on COVID
lockdowns. Yet, without strong data protections, that's exactly
what brokers are doing. Political and religious freedom cannot
thrive in a society where our movements, beliefs, and behaviors
are tracked, recorded, and then sold to the highest bidder.
We need to act. We also need to act to protect our next
generation. Over the past two decades, Big Tech has been
running a massive experiment on our children: what excites
them, what enrages them, and what holds their attention. The
result is a youth mental health crisis.
Weak data privacy is powering these harms. Social media
companies collect personal data to power their ad-driven
business models. More screen time means more revenue, and more
insights into how to keep kids hooked. It's a dangerous
feedback loop that profits from addiction and it's getting
worse.
Today, companies are building AI chatbots engineered to
earn kids' trust and keep them engaged. And that means serving
up content that's provocative, obscene, and sometimes
dangerous. One bot reportedly told a teen that self-harm feels
good. Another offered lessons on how kids can hide drugs and
alcohol, and how to set the mood for sex with an adult.
You might expect these incidents to prompt a pause, but the
opposite is happening. The same tech giants that have been
putting kids at risk for years are now racing to roll out AI
chatbots, and respectfully, they are doing so because Congress
is not telling them they need to stop. That must change.
Across each of these threats, the common thread is weak data
protection, but we can fight back. Strong privacy laws can
stop companies from using
personal data to set individualized prices, ban the profiling
of Americans based on sensitive information, and end the
surveillance that's fueling an endless cycle of harm to kids
and teens.
Thank you for holding this important hearing today, and I
look forward to taking your questions.
[The prepared statement of Mr. Levine appears as a
submission for the record.]
Chair Blackburn. And you win the gold medal.
Mr. Levine. Thank you.
Chair Blackburn. Yes. I think it was 23 seconds left. We're
going to move to questions, and Senator Klobuchar, I will let
you begin.
Senator Klobuchar. Okay, very good. Thank you very much.
So, as I discussed earlier, there've been a number of
bipartisan proposals for Federal data privacy law that have
been introduced over the years, including the American Privacy
Rights Act, and the American Data Privacy and Protection Act.
I guess, Mr. Butler, I will start with you. Why is it so
essential that we put reforms like these in place for consumers
across the country?
Mr. Butler. Well, thank you for the question, Senator
Klobuchar. I mean, we've seen what happens without Federal
leadership on privacy. Surveillance tools have become embedded
in every website and app that we visit. And without a Federal
standard, companies really don't have the incentive to innovate
on privacy protection and a few Big Tech firms dominate the
marketplace.
So, we're fueling harms to individuals, we're fueling harms
to the market, and we're just allowing ourselves to be
inundated by these surveillance and abusive data collection
practices.
Senator Klobuchar. Thank you. And Ms. Goodloe, in your
testimony, you highlight that there's broad consensus on many
privacy principles across the 20 States that have them, both
Democratic- and Republican-led. I think Mr. Butler was
mentioning how some of the early laws were weaker. There have
been some improvements. What are the significant areas of
bipartisan consensus that should be at the core of Federal
privacy legislation?
Ms. Goodloe. Thank you for the question. We see a lot of
consensus on the right set of rights to give to consumers both
affirmative rights like the ability to access, correct, and
delete their information, and on giving them rights to opt out
of certain activities, including the sale of their data,
profiling, and targeted advertising. I think there is consensus
among most of these State privacy laws on that set of
important issues.
There's also a core set of obligations on companies, the
controllers. It's things like asking for consent to process
sensitive data. We have 17 States that require companies that
are processing sensitive data to conduct privacy assessments,
looking at the sensitive issues arising from that processing.
And when it comes to processors, there is broad consensus
that they have a separate set of obligations: to handle data on
behalf of a controller pursuant to its instructions and to do
so confidentially.
Senator Klobuchar. Okay, thank you. Mr. Levine, while at
the FTC, you prosecuted unfair and deceptive acts and practices
related to data privacy, as well as other privacy laws like
those intended to protect young children.
Despite your efforts to use every legal tool at your
disposal to protect privacy, what gaps exist that are the most
critical for Congress to fill through a comprehensive data
privacy bill?
Mr. Levine. Well, thank you for the question, Senator. And
as you alluded to in your remarks, we currently live under a
privacy regime where companies have taken the position that
they can basically do whatever they want so long as they
disclose it in their privacy policy.
Over the last 4 years, the FTC, we took a number of steps
to try to push back against that. We told GoodRx they couldn't
share sensitive medication information with Facebook even if
consumers clicked ``Yes.'' We told BetterHelp they couldn't
share with advertisers what mental health treatments people
were seeking. We told Amazon Ring that its employees couldn't
spy on people who were using their security cameras.
But I can tell you, Senator, that every case we brought,
when I would meet with counsel for those companies, they would
tell us the same thing, ``Well, we put it in our privacy
policy, so it's legal.''
I think our enforcement, the FTC's enforcement, and State
enforcement, and privacy enforcement would be far more
effective with bright-line rules on what companies can collect,
how they can use it, and with whom it can be shared. Without
that, you're going to continue to see a whack-a-mole approach
that doesn't do enough to protect Americans' privacy.
Senator Klobuchar. Thank you. Very good. Mr. Thayer, I've
long advocated for common-sense rules to require the platforms
to allow competing businesses the same access to the platform
that they give themselves. Senator Blackburn has advocated for
similar reforms in app store markets, but as you mentioned in
your testimony, dominant platforms use privacy concerns as a
pretext to avoid opening up their platforms to fair
competition.
How can interoperability requirements be implemented
without putting user privacy at risk?
Mr. Thayer. Thank you for the question, Senator, and also
thank you for your work that you do on this. And also, Senator
Blackburn, you guys have been real champions on this issue, and
I think it really does highlight the significant concentration
that this market involves, where we have basically four
players, maybe three in some markets, or maybe even two in
others, particularly in app stores, where you really are at the
behest of or at the whim of whatever these companies want you
to do. So, you're basically stuck with whatever privacy
policies they decide on.
And so, a good example of this is the lawsuit we're seeing
at the DOJ, with AG Gail Slater at the helm, where she's been
arguing on the remedies case. And the first argument that you
got from Google was like, ``Hey, you can't do this sharing
arrangement because it'll violate privacy.'' But in reality,
what they really care about is scale. They want to harbor the
data. They don't really care about the privacy at all. It's
really all a ruse.
Senator Klobuchar. And how can a strong Federal privacy law
help ensure that interoperability opens up digital markets to
competition?
Mr. Thayer. So, I really point to the idea of a general
statute versus a specific statute. And as you know, Senator,
the antitrust laws are pretty broad, and so is Section 5 of
the FTC Act. Being able to designate exactly what we're
interested in and target the actual acts that we're concerned
with will help regulators down the road.
And this is precisely what Mr. Levine was alluding to when
bringing that broader framework out. If we say interop is
something that we all believe could balance out the scales,
then it gives the regulator the
ability to assess it in that way instead of using vague
statutes.
Senator Klobuchar. Okay. Last question, Mr. Martino. As you
know, I was close friends with John McCain, and miss him very
much. In your written testimony, you say that businesses should
not be responsible for the data privacy practices of other
entities whose actions they cannot control, including the Big
Tech platforms on which we know many businesses now have to
rely to reach consumers.
How can Congress ensure that responsibility is aligned
properly with the entities best suited to protect consumer
privacy?
Mr. Martino. Thank you, Senator Klobuchar. Well, I think
the core principle we have here is that businesses need to have
equivalent requirements, equivalent standards to protect data.
There's a chart in my testimony that----
Senator Klobuchar. You like charts, huh?
Mr. Martino. Yes, I like charts [holds up documents]. I
didn't make it real big though, [laughter]. Sorry. But it
shows some of the State law requirements for the Big Tech
service providers. And you'll notice there are a couple red Xs
here on things that I think consumers would expect and
businesses like Main Street businesses would expect their
service providers to do, which is provide data security.
The State laws for the most part, except for Colorado, I
believe, don't require the Big Tech service providers to
actually secure the data they're processing on behalf of
businesses. They're only required to assist the controllers in
their own data security and if they have their own breach, but
there's a lack of parity there.
Another place that I'll mention where there's a red X and
again, you know, only I think Colorado and Connecticut have
done this, but processors use lots of subprocessors or
subcontractors, and they have requirements that any
subprocessors they share the data with have to meet the same
standards as the processor.
But, you know, they don't give the Main Street business an
opportunity to object to those subprocessors, to those
subcontractors. Only in two States that I'm aware of. And that
is a big difference between, for example, what happens in
Europe and what happens in the U.S. And so, if you have a
processor that you don't want to downstream pass on data to--
you know, think of some of the past breaches and privacy
violations we've seen before, you know, the Main Street
business should have the ability to object to that. So, we ask
for the similar requirements that Main Street businesses have
to live by.
Senator Klobuchar. Okay. Thanks. And thank you. Sorry to go
over.
Chair Blackburn. No, thank you. It is perfectly fine that
you went over. This is the first of our hearings that are going
to look at this virtual space. And as you all know, Senator
Klobuchar and I have done a lot of work in trying to secure
American citizens' privacy in the virtual space.
And as we work through this on this Committee, I think that
foundational to the conversations is who owns an internet
user's data and what is the scope of that ownership? Where does
it begin? Where does it end? And let's just go down the line,
Ms. Goodloe, starting with you, and everybody keep it under a
minute and answer that question so that we've got that for the
record.
Ms. Goodloe. Thank you for the question. Our companies
provide business-to-business technologies to other companies.
In many cases, their business customers own the data that they
store with business-to-business providers, and yet there may be
personal data that individuals own as well.
And those individuals should have rights like to access,
correct, and delete that information no matter whether it's
stored with a consumer-facing company or the business-to-
business provider processing it on behalf of that consumer-
facing company.
Chair Blackburn. Okay. Mr. Thayer?
Mr. Thayer. Given the lack of appropriate consent regimes,
I would say that the user owns that data, because the way we
have things set up right now, the data subject doesn't even
know that they've given over some of that data.
And so, at the end of the day, I think the reality is that
we have to have privacy regimes in place to outline those
particular contours of ownership. But it is 100 percent the
case that you own your data, and it shouldn't be the other way
around.
Chair Blackburn. Okay.
Mr. Martino. Thank you, Senator, for your question. It's a
very good question. There are some nuances here I think that
are important. First, Main Street businesses understand it's
the user's data and the user has the right to correct it,
delete it, remove it from their system. But there are some
kinds of data that are considered shared.
And so, for example, if you make a purchase in a store,
well, the store needs to keep a record of that purchase if you
want to do a return or an exchange for their inventory. So, is
it the consumers'--that this consumer made this purchase on
this date? Is that personal information? Yes. Is it also
information the business needs and can't just get rid of? Yes.
And so, I think when it comes down to ownership, we just
have to understand that in modern commerce and e-commerce, some
information will need to be retained, but only for as long as
it's necessary to retain it. And I think hopefully that answers
the question.
Chair Blackburn. Okay.
Mr. Butler. Thank you for the question, Chair Blackburn. We
believe that we all have a fundamental right to control when
our data is used and how it is collected. But individual
mechanisms of consent and control don't provide a complete
solution to this problem, and that's why we feel that it is so
important to have rules of the road that protect people's
privacy by default and align business data collection and use
practices with what consumers reasonably expect.
Chair Blackburn. Okay.
Mr. Levine. Thank you, Senator. I very much agree with Mr.
Butler. Data about people should be owned by people, but at the
same time, as Alan said, we don't want a world in which people
are solely responsible for protecting their own privacy. That's
why we need strong Federal protections that don't put the onus
on people, but put the onus on companies to make sure they're
not abusing people's privacy.
Chair Blackburn. Yes. It was over a decade ago that now
Senator Welch and I were in the House on the Energy and Commerce
Committee--I know Mr. Martino remembers all of this--and we had
bipartisan legislation to establish a data privacy framework.
And of course, Big Tech fought it just all the way to today. We
still don't have it in law.
So, Mr. Thayer, talk for a minute about why Big Tech has
found it so vitally important to kill any effort to have
Federal online privacy?
Mr. Thayer. Because it's against their financial interest
to actually be regulated. I mean, that's the basic--that's the
obvious answer, but in reality, what you're pointing out, and I
think everyone on this Subcommittee has experienced, it doesn't
matter how tailored you make your legislation, it doesn't
matter how measured. They will find some reason and put
something forward.
If you want to do antitrust reform, for instance, they'll
say there's a privacy violation. If you say there's privacy,
then we don't have to worry about competition. It's always this
game of Whack-a-Mole. And so, at the end of the day, they like
the way things are because it benefits them. The market is
basically created for them.
And so, I think this is exactly why we have strong
advocates fighting for things like the Kids' Online Safety Act,
where you have parents begging Congress to do something, and
we're seeing the harms play out right in front of us. I think
at this point, we've recognized that Big Tech is in the
``emperor has no pants'' moment and we are all starting to see
that; that we absolutely need the reforms.
And so, things like the Open App Markets Act are going to
be very helpful to quell any of those privacy concerns. The
Kids Online Safety Act, I think, will really do a lot through
measured, targeted approaches that will ultimately help kids. But
again, I think that the waves are changing, and I'm very
hopeful. From the things that I'm seeing at the DOJ, especially
from the first Trump administration through the Biden
administration to the new Trump administration out of the gate,
it seems as if everyone has identified that these companies are
bad actors and they should not be trusted.
So, I hope whatever advocacy I can provide would be to
outline this: really, just don't fall for the red herrings.
Ultimately, the side of right is to protect consumers, and Big
Tech has no interest in doing that.
Chair Blackburn. Ms. Goodloe, I want to come back to you.
In your testimony, you talked about State laws and the
importance of some of those State laws. I want you to define a
couple of the common elements that you have seen in the State
laws that could be transferred into a Federal law that should
be broadly supported and accepted.
Ms. Goodloe. Thank you for the question. I think the States
provide a lot of common ground for Congress to look to as it
works toward Federal privacy legislation. That common ground
exists on things like the consumer rights that we've talked
about today, rights to access, correct, delete, and port your
data to another service, rights to opt out of the sale of your
data, targeted advertising, certain types of profiling.
And States are unanimous on recognizing there are different
types of companies that handle consumers' data. One set of
obligations should be assigned to controllers who decide how
and why to collect a consumer's data, how to use it. And one
set of obligations should be put on the processors that handle
the data on behalf of controllers.
I also want to take a moment to respond to something that
Mr. Martino brought up about what those processors do when they
employ other subprocessors. Because in many cases, what
processors do is collect a series of other subprocessors,
package them together, and provide the result to their business
customers at scale so that small businesses can enjoy the
economies of scale of being able to use cutting-edge
technologies.
That means you are providing the same service to hundreds
or thousands of business customers. And letting one object to a
package of subprocessors doesn't work. That's why we haven't
seen the majority of States adopt that, which could actually
increase security risk to consumers when one of those
subprocessors has a breach and they have to go and ask
permission to change over the data.
But I think we do see broad agreement among the States
about the right set of consumer rights and obligations on
businesses to safeguard consumers' data, and to do so
effectively along with a common enforcement system that is a
regulatory-led enforcement system to ensure we have consistent
expectations for companies that want to comply with privacy and
security obligations.
Chair Blackburn. Mr. Martino, you wanted to respond?
Mr. Martino. Yes. Just on the point. And one thing to keep
in mind with the ADPPA, that was the predecessor to the APRA.
The way the definitions worked, a subprocessor was also defined
as a processor. So, once it got to a processor, there could be
this endless train of data sharing that the Main Street
business has no control over.
While that might be great for efficiencies of the services
that the main processor is providing, you know, there's no
check on the downstream. And so, that's why all that we've been
pushing for was a simple notice to the Main Street business of
the subprocessors you are using and the right to object. It's
not like an opt-in; it's not that they can't go to them and
can't provide these efficiencies.
So, that's just a--I mean, it's an ``in the weeds'' point.
But I think it's an important point because it's the Main
Street businesses that will be held liable under most of these
constructs because the same requirements aren't applying to the
processors and the same enforcement mechanisms aren't applying.
I'll make one last point. In the APRA, the private right of
action largely applied only to what are called the
``controllers'', but of course, these Main Street businesses
that can't really control the Big Tech companies. And it hardly
applied to the processors and it didn't apply at all to the
third parties.
So, I think we have to look at not just that these State
laws have requirements, but who's subject to them and who's
liable for those violations.
Chair Blackburn. Okay. You had additional questions?
Senator Klobuchar. Yes.
Chair Blackburn. Go ahead.
Senator Klobuchar. It's really an extraordinary panel, so
thank you. I guess I would start with you again, Mr. Butler.
Over time we've seen that these data privacy frameworks move
away from a notice and consent regime to focus on data
minimization and transparency, consumer control, and opt-out
rights.
Why is notice and consent insufficient for protecting user
privacy?
Mr. Butler. Thank you for the question, Senator Klobuchar.
I think notice and consent really takes us back to that self-
regulation point that was made in the FTC report 25 years ago,
because that's essentially what it is, right? It's a rule set
that says so long as you disclose in general terms what you're
doing, then the law permits it.
And of course, the incentives there are clear, you put in
your disclosure everything you could ever potentially----
Senator Klobuchar. That I never read.
Mr. Butler [continuing]. Do with that data----
Senator Klobuchar. Says the Senator who decided every
morning this week I'm going to spend 5 minutes pushing
``unsubscribe'' on my email, and I am still getting--I cut it
in half what I'm getting. Yes, it's a nightmare.
Mr. Butler. And it doesn't shift business practices.
Senator Klobuchar. I know, but it's just really sad. Okay.
Continue on, Mr. Butler.
Mr. Butler. And it doesn't shift business practices, and it
doesn't change anything about the surveillance that surrounds
us and the data collection that pervades, which is why a set
of data minimization rules that better align business
practices with the expectations of users, and link the
collection and use of data to the services that people are
actually requesting, I think is a much easier way to solve the
problem.
Than, as I mentioned earlier, the individual control
concept, which then requires us all to make thousands of
choices every second of every day and face popups and
questions in detailed settings.
Senator Klobuchar. Right. And then you pop the wrong one,
suddenly you're in something else.
Mr. Butler. Exactly.
Senator Klobuchar. Mr. Levine, what barriers does today's
notice and consent regime for data privacy, which I was just
talking to Mr. Butler about, create for enforcers who are
trying to protect consumers?
Mr. Levine. That's a great question, Senator, and I alluded
to it earlier. It's not only our data privacy cases; in so
many of the enforcement actions we brought at the FTC over the
last 4 years, we said, ``Look, you surprised consumers. You
misled consumers. You abused their data. You shared what
medication they were taking with Facebook.'' And the company
says, ``Hold up. We put it all in our privacy policy, and the
consumer clicked, `I accept,' before proceeding to use the
service.'' This is a total fiction. It's a total fantasy that
consumers can protect themselves by reading privacy policies.
And to Mr. Thayer's excellent point, we can draw a direct
line between Congress's inability, in my opinion, to pass
privacy laws and Big Tech lobbying. This is the most valuable
industry in the history of the planet, and they have built
their revenue not by selling cars, not by selling oil, but by
collecting our data and predicting our behaviors. That's how
they've built their valuations. They don't want restrictions
on what they can collect, and that's why I think it's so
important that Congress defy what they want and actually pass a
strong bill.
Senator Klobuchar. Very good. Mr. Martino, in your written
testimony, you say that businesses should not be responsible
for the data practices. We already went over that, but I guess
my second question about that is just when you look at the
differences between--as we look at how we craft this Federal
law, and the States, and what's stopped us before, well, how do
you think we're going to get around that to get to a place
where we can get something done?
Mr. Martino. That's a great question. Thank you, Senator. I
do think that we start with where the strong consensus of State
laws has been. They have outlined, as Ms. Goodloe pointed out,
a set of requirements. Our issue has really been with who gets
exemptions, who's subject to liability for violations, and
whether the law is taking care of it.
I would say one of the things you can take from the State
laws is that they realize there is this imbalance in
negotiating power between smaller Main Street businesses and
large Big Tech companies. So, they have taken the route of
putting statutory requirements in.
We're just asking that you build on that framework and add
a few more. One of the key issues on the APRA, and the ADPPA
before it, was that there was a big debate over data
minimization standards. And when the bill, the ADPPA, was
originally drafted, it applied to both covered entities, which
are like the controllers or Main Street businesses, as well as
the processors.
But processors and Big Tech did not support that bill until
that data minimization standard was changed to apply only to
covered entities or controllers, and that is a fundamental
difference.
I think while there are very good requirements at the State
level, and most of the States come from the same place, it is
not the case that everyone in the marketplace is handling data,
protecting data for consumers, and honoring their rights to the
same level that is being put on the consumer-facing businesses.
And we think Americans expect that their privacy is the same
everywhere, as I said in my testimony. And we should have
requirements that make that happen.
In terms of the politics, you know, if Big Tech's been
fighting some of the previous bills that weren't so heavy on
them, it's going to be more challenging if bills are more
fairly balanced and have equivalent standards.
Senator Klobuchar. One of our best arguments on both sides,
as we look at the politics of this, is affordability. And Mr.
Levine, I know you did this study on how the collection of this
data can affect affordability. So, I look at some of the fresh,
new arguments we can make to convince our colleagues, which is
always fun to do, but we're doing better and better. Could you
tell us about that?
Mr. Levine. Well, I think it is a new argument, Senator,
because it's a new practice. Some people think of privacy as a
discrete issue; I have nothing to hide, you have nothing to
hide. Privacy is much deeper than that. And what we are finding,
and what the FTC study found, is that companies are using these
reams of data they've collected, which they've historically
used to target people with advertisements.
We know that's been very profitable, but they're suddenly
realizing they could target people with individual prices. And
they go around and they tell Members of Congress and State
houses, ``Oh, we're just doing this because we want to lower
prices and send people discounts.''
This is ridiculous. They are paying companies like
McKinsey, high pricing consultants, to use AI optimization and
reams of consumer data to set individual prices. And they're
not doing it to lower their profits. They're not doing it to
lower their prices. They're doing it so that they can raise
prices on the Americans who are most desperate for goods and
services.
We have always seen that pricing abuses can start in the
airline industry. That is what we are seeing now with Delta,
and I have a lot of concern this is going to spread throughout
the economy, and the early results of our FTC study show that
it already is.
Senator Klobuchar. And we've seen the same thing with rent,
by the way----
Mr. Levine. Absolutely.
Senator Klobuchar [continuing]. Collecting of data on rent.
Chair Blackburn. Let me jump in on this, because we really
appreciate having all of you here. On surveillance pricing.
Just a show of hands, do you think surveillance pricing should
be banned?
[Hands raised.]
Mr. Levine. Yes.
Chair Blackburn. Okay.
Mr. Thayer. How would you define surveillance pricing?
Chair Blackburn. I know, I know. I just wanted a response.
Mr. Martino?
Mr. Martino. For the record, my hand was not up, it was
down--I was asking a question as to what you meant, but yes.
Chair Blackburn. I want you to talk then a little bit about
shared data, order history, loyalty programs, and then how long
you keep that, and how you incent that keeping of the data,
because that's a choice that somebody makes to enter into that
loyalty program.
Mr. Martino. Absolutely, Senator, and in doing so, let me
just first address what Mr. Levine said. I know there's the
concern that pricing practices in one industry may come down to
retail.
I think there's a very significant difference between the
retail industry and, let's say, some other industries. And it
really comes down to competition, where you have robust
competition like you do in the retail industry and very low
profit margins. The goal in retail is volume. It's business.
It's attracting new customers. It's growing the business
because you have very little profit on each item.
And what that leads to is, I think, a market constraint.
So, almost like a de facto regulation in terms of having such
severe competition that your competitor is one click or tap
away on an app, or one stop away. And so, the mindset of
retailers and Main Street businesses is: how do I attract more
customers? How do I do that?
Well, you have to do that with excellent customer service.
I mean, the only way to really differentiate yourself is to do
that. And so, loyalty plans are one way that is done. There's a
report that I cited in the testimony called the Bond Brand
Loyalty Report. They do it every year. They've been doing it
the last 14 or 15 years.
They survey consumers, and 85 percent of them say they will
continue to buy products from a brand that has a great loyalty
program. So, yes, loyalty programs are one of those very
important features.
It's also important to note that loyalty plans are
inherently privacy protective in the sense that they're not
foisted on consumers without their choice. You have to opt in
to a loyalty program. You have to be delivered the deal and
decide whether you want to do that or not. And the State laws
recognize this as well.
The only protections for loyalty plans are based on bona
fide loyalty plans where a consumer has voluntarily opted in to
participate in them. So, I think there are ways that, you
know, one of our principles is that businesses and consumers
should be able to freely develop a business relationship.
And if businesses on Main Street can develop those
relationships, whether it's a very small business offering a
buy one get two free, or buy five cups of coffee, get the sixth
one free, they should be able to have those kinds of
relationships as long as they're privacy protective. And we
think they are in terms of making sure they're voluntary.
And it's important to also note that the loyalty programs
are subject in the State laws to every other requirement in the
law. So, whether it's a right to opt out or a right to delete,
the consumers have those rights. So, we think there are good
business ways to do it.
And loyalty is something that's been around in the retail
industry for centuries. And we could go to general store
examples and things from 1890, but the same thing applied back
then that applies now.
Chair Blackburn. Okay. Mr. Levine?
Mr. Levine. Well, thank you, Senator. You know, it's one
thing to join a loyalty program and say you can track my
purchase history in exchange for getting coupons, fine, but
what the FTC study showed is that these consultants are telling
companies: look at what consumers are Googling, look at what
they're searching online, look at their location, look at how
they're sorting products.
A bunch of California law enforcers actually sued Target
for increasing in-app prices while consumers were inside a
Target store. So, they didn't know that they could pay lower
prices when they're not at the store.
Briefly, with respect to loyalty programs, again, I think
if consumers voluntarily turn over information, that's fine.
But what we saw in this Delta earnings call is Delta saying we
can stop matching prices because of our brand strength, because
of our customers' loyalty.
And in a world of surveillance pricing, my fear is that
companies are going to prey on the consumers who are willing to
pay the most--you might say the most loyal--rather than giving
them discounts. That's what we're already seeing in the airline
industry.
Chair Blackburn. All right. Mr. Martino, come back?
Mr. Martino. I'll keep it to a 10-second response. What
applies to Delta doesn't apply to Main Street businesses. You
have to look at the size of the market, the competition in the
market. The airline industry is notorious for having very few
competitors, not millions of businesses across America.
Chair Blackburn. Years ago, as we were starting in on this
debate, I would have people take out their key chain and look
at their fobs that were on there, and those are programs they
were choosing to share information with because of the
incentive that would come back to them.
Those times have changed. I want to go to the issue of AI,
because we are looking at these AI models that are collecting
more and more personal data. They are tracking and monitoring
search history. And as we look at the prevalence of AI--we've
had a hearing on the NO FAKES Act to protect the name, image,
likeness, and voice of individuals, and that in essence is a
form of privacy.
But one of the questions that will come before us as we
look at developing a Federal privacy standard is how you hit
that sweet spot of being strict enough to have that preemptive
Federal enforcement, but yet, flexible enough to allow the
innovation of new technologies that we see, things that are
going to run on quantum rails, things that are going to be AI
applications.
So, Ms. Goodloe, let me come to you on that, and then I'd
like Mr. Thayer for you to give me a response also.
Ms. Goodloe. This is such an important question, and thank
you for asking that. I think there are a couple of different
ways to look at the need for a Federal privacy law and its
intersection with AI technologies. I think the first thing to
look at is a recognition that AI can involve many different
types of data. Some of that data may be personal if an AI
system is using personal data that relates to consumers. But a
lot of the data used to train AI systems is not personal data.
For example, AI systems may be trained to detect weather
patterns based on data that's just about the weather and not
about people. But when it comes to AI systems that may be
processing personal data, that's where a Federal privacy law is
very important to create the right set of safeguards so that
consumers know their data will be handled responsibly and in
trustworthy ways.
One key issue is exactly what you pointed out, the need to
make sure that a law is flexible enough to allow those products
to continue to innovate over time. And I think this is one of
the struggles that we've seen as the conversation about data
minimization has evolved. That is such an important
conversation, but you have to get it right because a standard
needs to allow for technologies to get better over time. I
expect all of the technology that I use today to be better next
year and even better the year after that. And so, it is
important as you look at these protections to make sure they're
flexible over time and to think through the uses that you want
to apply to create the right set of safeguards.
Chair Blackburn. Okay. Mr. Thayer?
Mr. Thayer. Thank you, Senator. And I think Kate put it
very well. There is that nuance when it comes to AI, right? You
do have that anonymized data, but there also is the question of
what the consumer expected when they gave that data over. And I
fail to remember exactly who said it, but it really is true
that data is the new oil. What runs the AI machine is data.
So, the question is where are they
getting it and how are they using it? So, I think at the front
end, the consumer has to know how is this data going to be
used? Is it going to be used to train an AI system? Are there
elements of transparency in terms of how this data is going to
be used down the road?
That comes down to really being upfront with the consumer
on where the data is going. And I think that's when it goes
back to my testimony when I said that we just feel like it's
out of control. We don't feel like we know exactly what happens
when we put the data into any application or any use of search.
So, a big part of this is going to be transparency, and
specifically, what data these AI systems are training on. Are
they training on PII, are they training on anonymized data?
Where are they pulling it?
And Senator, as you well know, there are ancillary issues
like intellectual property that are also wrapped up in all of
this as well. So, the big question, with respect to privacy,
really comes down to what rights citizens have when it comes to
protecting their data on the front end, so that it's not used
on the back end to do all the parade of horribles that we've
already heard about today.
So, that's how I see it in most cases. I think at the end
of the day, the consumer has to know exactly what their data is
going to be used for, and also, on the AI system side, what are
they training on?
Chair Blackburn. So, you're looking for specificity in that
utilization?
Mr. Thayer. Specificity, and at the very least, being able
to have the consumer be empowered to say, I do not want my data
to be used for X, Y, and Z. So, it's both.
Chair Blackburn. Opt-in, opt-out.
Mr. Thayer. Yes.
Chair Blackburn. All right. Do you have any other----
Senator Klobuchar. Oh, I'll just--I think Senator Schiff is
coming. I thought maybe, you know, since we have a glass
ceiling for only women asking questions here.
Chair Blackburn. We kind of like that.
Senator Klobuchar. I mean, Hawley came by, and Blumenthal.
They've all had other hearings. They're great, and they've been
really helpful to us. Maybe I'll just ask two more and see if
he can make it.
Chair Blackburn. Okay. Go ahead.
Senator Klobuchar. So, Mr. Thayer, in your written
testimony you referenced a European study that found that after
the passage of GDPR, the General Data Protection Regulation----
Chair Blackburn. I'm going to have to jump in here because
they need me to go to VA to vote.
Senator Klobuchar. That's where he is.
Chair Blackburn. Yes, that's where Senator Blumenthal is.
So, I will say my thank yous to you-all, in case I don't get
back before this closes, and remind you all that we're going to
have questions for the record.
As you can see, we have lots of questions, and we are ever
so grateful that you-all have come before us. I'll go vote.
Senator Klobuchar [presiding]. Okay. Thank you very much.
And thank you, again, for putting together this hearing.
So, I was talking about the GDPR. We know we don't like
everything that Europeans are doing on tech, but there are some
good examples of good things they've done. What about the
GDPR? Were Big Tech platforms able to take advantage of it to
entrench their position, and how can we avoid doing the same in
the U.S.? How can we design data privacy standards that rein in
abuses? What are the good things we can get out of that? I know
there are things we could simply do here that they agreed to in
Europe that we're still fighting out over here.
Mr. Thayer. So, it's a fantastic question, and I think it
really comes down to defining your goals. That was like the
first big issue. But in terms of what happened with the GDPR--
and to be clear, there are elements of the GDPR that I think a
lot of States have latched onto, particularly Texas--they pull
in this analytical framework between data controllers and data
processors. Being able to articulate exactly who has the
responsibility is a big part of it.
Senator Klobuchar. I just want to have the record reflect
that Texas used the European model, but keep going.
[Laughter.]
Mr. Thayer. I fell right into it. But I think where things
went a little bit awry is where there was this weird
responsibility that the controllers basically had with respect
to contractual regulation. I think it's Article 24 of the GDPR
where the controller basically has to dictate specifically--
well, first, has to make the assessment of whether or not the
processor is even GDPR compliant. And that gives the controller
a lot of authority over what that smaller company most likely
can and can't do. I think that's one area we may want to stay
away from.
But my overall point was that you need privacy alongside
strong antitrust and competition enforcement. I think the two
things go hand in hand. And so, what Congress is currently
looking at, which I think is very important, is that it seems
like you guys want to walk and chew gum, which I very much
appreciate, where you have these competition reform bills that
are currently being discussed.
You are a sponsor of that, Senator, which is the Open App
Markets Act. I think that goes a long way in quelling some of
those concerns. But one of the things I would caution against
is creating an overly generalized authority and allowing the
controller to have the pure mandate or at least the pure
control of what the smaller companies are doing. I think that's
one way you can avoid some of the pitfalls.
Senator Klobuchar. Okay. Last two questions, Mr. Martino,
and they're related, and then I'll turn to Senator Schiff.
We're very excited you're here. Yes, thank you. Mr. Martino,
you can followup on that, but could you talk about the
challenges small businesses have operating across State lines,
quickly, because I want to give Senator Schiff a chance here.
Mr. Martino. Certainly, Senator. First, let me just
followup real quickly. I wanted to add a point to what Mr.
Thayer was saying. It's just that there are some things that
are problematic in the GDPR. And some of the expectations put
on controllers envision a construct where the controller is the
big company and they're getting these smaller processors to do
what they want. And that's not what's developed here in the
U.S., where you have very few, almost monopolistic, Big Tech
companies who are doing the vast majority of the processing
consumers need----
Senator Klobuchar. Understand.
Mr. Martino [continuing]. Including transmission, including
broadband, and cable.
And think about how a Main Street business might only have
a choice of one broadband provider, and imagine trying to
negotiate that contract. I mean, they do the same as we do when
we try to argue about a cable bill or a broadband bill. So,
we've all had that experience.
In terms of the multi-State operations, it's a point that I
know I put in my original testimony. Many of us live in areas
that are tri-State or where multiple States are close by. There
is travel across State lines. There's shopping, and then
certainly online; you know, if there's a boutique store in
Minnesota and, while you're here in Washington doing your job,
you want to make a purchase from it, you are engaging in
interstate commerce.
And so, it's really important. These privacy laws tend to be
set up to apply to the location where the consumer is. So, if
you're in DC, which doesn't have a privacy law, are they
complying with privacy law there? What these small businesses
have to do--I mean, there's a de facto national standard,
because they have to comply with all these different States,
but they're constantly changing. New laws are coming online.
So, Congress can do a really helpful job by passing a uniform
national standard.
Senator Klobuchar. Yes. And last question here, Mr. Butler.
You've advocated for a Federal privacy law as well, one that
sets a floor. Obviously, this is all going to be political
negotiations, but could you talk about why you would take that
approach?
Mr. Butler. Sure. Thank you for the question, Senator
Klobuchar. You know, as Mr. Martino alluded, I think the vast
majority of businesses in this country just want to know what
the rules are. And Congress's traditional role in privacy laws
has been to set the baseline standard, but allow States to
address new challenges and threats as they emerge. And that's
been true. I have the list here; I could rattle
off the list of acronyms, but if you look at Federal privacy
statutes, by and large, they don't set a ceiling on the level
of protection that States can provide.
But what's really essential here is for Congress to step in
and say, ``Here's what the consistent standard is.'' And I
think if they do that, then we'll have a consistent standard.
Companies will know what to comply with,
and States still have the flexibility in the future to address
new issues.
Senator Klobuchar. Thank you. Senator Schiff.
Senator Schiff. Thank you. Thank you for----
Senator Klobuchar. The filibuster [off mic].
Senator Schiff [continuing]. Too. I understand that you
did, and I'm grateful for that and for all your leadership on
this issue.
Nearly, a decade ago, California became the first State in
the Nation to adopt a comprehensive consumer privacy law, the
California Consumer Privacy Act. This was shortly followed by
the establishment of the California Privacy Protection Agency,
which has served Californians for the last 5 years by
implementing and enforcing the State's privacy laws.
Other States have looked at California's example and
followed our lead, especially as new technologies have emerged--
AI, facial recognition, algorithmic targeting--each posing more
sophisticated threats to Americans' privacy. At the end of the
day, California has proven you can be the fourth largest
economy in the world and be home to the most innovative
technology companies on the planet. And you can still protect
consumers' fundamental right to privacy.
To this end, I'd like to enter into the record a letter
from the California Privacy Protection Agency on the importance
of a Federal privacy law that creates robust baseline
protections while allowing States like California to continue
to adopt stronger protections and respond to the rapidly
changing technologies being built in our own backyard. May that
letter be entered in the record?
Senator Klobuchar. Of course, it will. Yes. We just have,
you know, procedural things.
Senator Schiff. Yes, thank you. The horrific political
assassinations last month targeting Minnesota lawmakers that I
know Ranking Member Klobuchar has already referenced were
aided, in part, by a data broker and website the shooter used
to look up politicians' addresses.
A recent investigation also revealed that a data broker
owned and operated by at least nine major U.S. airlines
secretly sold Americans' information collected through flight
records to U.S. Customs and Border Protection, and U.S.
Immigration and Customs Enforcement.
Starting on January 1, 2026, 40 million Californians will
be able to go to a single webpage hosted by the California
Privacy Protection Agency and request that their data be
deleted from over 500 data brokers if they choose. Federal
legislation that preempts California's Delete Act without
meaningful consideration of State-level protections could mean
that Californians will lose this touch-of-a-button ability to
know how their data is being used and have a voice in it.
Mr. Butler and Mr. Levine, how can a Federal privacy law
include better regulation of data brokers, including their
registration and a central clearinghouse, and allow Americans
to prevent their personal information from being sold to
outside entities, like we have done in California with the
soon-to-be-implemented Delete Act?
Mr. Butler. Thank you, Senator Schiff, for the question. I
think that California really has taken the lead here on
tackling the problems of data brokers in this specific context.
And I think both the requirement of registering, given that
the average consumer really has no way to know what data
brokers exist and who might have access to their information,
and also providing a centralized mechanism to
allow for deletion of data held by these entities are really
important protections, especially because this is a massive
problem that requires scaled solutions, right?
This isn't a situation where an individual consumer can be
expected to go to every single one of hundreds or thousands of
data brokers and submit individualized requests. So, I think
both of those are really important protections that have been
developed in California.
Senator Schiff. Mr. Levine? Am I pronouncing your name
correctly?
Mr. Levine. You are. Thank you, Senator. I fully agree with
Mr. Butler on the need for a floor rather than a ceiling,
consistent with other Federal privacy laws. You know, I'll make
a quick point. I started my career at a State attorney
general's office in the run-up to the financial crisis. It was
State AGs desperately trying to stop subprime mortgages, the
innovative products of the day. And it was Federal banking
regulators, cheered on by big banks, that were actively trying
to stop them.
So, as I hear Big Tech companies today go around Washington
saying we need to hit delete on all of these important State
laws, like the one you referenced, Senator, I recall similar
conversations two decades ago, and I recall what happened in
our country as a result. Two quick points specifically on data
brokers. You know, the first is that we brought a series of
enforcement actions under Chair Khan at the FTC. We banned data
brokers from sharing sensitive location data, and we prohibited
them from building profiles of consumers based on sensitive
geolocation data. I think that's a really important precedent.
Congress also acted, I think, in the last Congress with
the--I'm going to get this wrong--Protecting American Data from
Foreign Adversaries Act, PAFACA, which gave the FTC enforcement
authority. I think it's regrettable that 6 months into this
administration, we've not seen a single enforcement action. I
hope that changes.
Senator Schiff. Madam Chair, do I have time for one more?
Senator Klobuchar. Oh, yes.
Senator Schiff. Okay. Thank you. Over the past few months,
I've led a number of letters along with my colleagues to the
Trump administration in response to alarming reports that
various agency officials have ordered States to hand over the
personal data of millions of Medicaid enrollees, as well as
SNAP recipients, and applicants to the Department of Homeland
Security. These actions are a remarkable departure from
established Federal privacy protections and should alarm
everyone. I've demanded the administration reverse these
actions, which likely violate several Federal and State privacy
laws, including the Privacy Act of 1974, HIPAA, and the Social
Security Act.
Mr. Levine, what precedent does it set when Federal
agencies under the administration simply bypass established
privacy laws that have protected Americans for decades and
demand that States hand over their residents' most sensitive
information with little or no explanation? And how does this
compare to privacy protections in other democratic nations? Are
we seeing the U.S. now fall behind international standards for
protecting citizens' data?
Mr. Levine. Thank you, Senator. I think we have the right
standards here, at least with respect to government. It's not
clear whether government officials are following them, and that
makes me very worried. One of my consistent messages as an
enforcer, to Big Tech companies and to everyone, is you need to
follow privacy laws. And if you don't, there are going to be
consequences.
And when you have reports, and I've not verified them
myself, but when you have reports of Federal officials and
Federal agencies brazenly violating hard-won privacy
protections around Federal data, resulting in potential loss of
healthcare, loss of jobs, loss of housing for Americans, I
think that's deeply disturbing. And it raises a real question
of how Congress is going to pass a privacy law to bind the
private sector when the Federal Government isn't following its
own rules.
So, I completely share your concern, and I hope to see
changes in that from this administration.
Senator Schiff. And, finally, if I could very quickly, Mr.
Butler, you mentioned that there were a list of other privacy
laws where Congress had set a floor, not a ceiling. Can you
share a few of those with us?
Mr. Butler. Absolutely. And I'm happy to supplement the
record with that as well.
But just to note that basically every major
Federal privacy law sets either a floor or a conflict
preemption standard. And that includes the Electronic
Communications Privacy Act, the Right to Financial Privacy Act,
the Cable Communications Privacy Act, the Video Privacy
Protection Act, the Employee Polygraph Protection Act, the
Telephone Consumer Protection Act, the Driver's Privacy
Protection Act, the Gramm-Leach-Bliley Act, and the Fair Credit
Reporting Act.
These are not ceiling preemptions. They don't limit States'
abilities to adapt, evolve, and protect their citizens more.
Senator Schiff. Oh, thank you. Thank you, Ranking Member. I
appreciate it.
Senator Klobuchar. Okay. Very good. Well, thank you. And
this is a lot of great testimony and answers. I just can't tell
you how inspired I am by this work and Marsha's willingness
to put this panel together, the good questions, and just, you
know, I always think maybe we can do this. Maybe we can
actually get a privacy standard and then, you know, I get
excited and then it's hard.
But as this gets more and more important, with the advent
of AI and just the patchwork, maybe we can get some more
incentives going to try to get to a better place on this,
despite how everything would seem. And what gives me hope is
just the people that are involved in this Subcommittee, people
we work with on Commerce, and their ability to kind of take
risks in terms of what everyone wants them to do, and try to
find some common ground on this issue, which we have done
several times.
So, I just want to thank all of you for the testimony, and
the hearing record will remain open for 1 week. And the hearing
is adjourned.
[Whereupon, at 4:22 p.m., the hearing was adjourned.]
[Additional material submitted for the record follows.]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
A P P E N D I X
The following submissions are available at:
https://www.govinfo.gov/content/pkg/CHRG-119shrg61893/pdf/CHRG-119shrg61893-add1.pdf
Submitted by Chair Blackburn:
Consumer Technology Association (CTA), letter................... 2
Submitted by Senator Schiff:
California Privacy Protection Agency (CPPA), letter.............. 4
[all]