[Senate Hearing 116-118]
[From the U.S. Government Publishing Office]
S. Hrg. 116-118
DATA BROKERS AND THE IMPACT ON FINANCIAL DATA PRIVACY, CREDIT,
INSURANCE, EMPLOYMENT, AND HOUSING
=======================================================================
HEARING
BEFORE THE
COMMITTEE ON
BANKING, HOUSING, AND URBAN AFFAIRS
UNITED STATES SENATE
ONE HUNDRED SIXTEENTH CONGRESS
FIRST SESSION
ON
EXAMINING DATA BROKERS' INDUSTRY PRACTICES AND STANDARDS AND THE IMPACT
THEY HAVE ON ACCESS TO, AND ELIGIBILITY FOR, CREDIT, INSURANCE,
EMPLOYMENT, AND HOUSING
__________
JUNE 11, 2019
__________
Printed for the use of the Committee on Banking, Housing, and Urban
Affairs
Available at: https://www.govinfo.gov/
__________
U.S. GOVERNMENT PUBLISHING OFFICE
39-485 PDF WASHINGTON : 2021
--------------------------------------------------------------------------------------
COMMITTEE ON BANKING, HOUSING, AND URBAN AFFAIRS
MIKE CRAPO, Idaho, Chairman
RICHARD C. SHELBY, Alabama SHERROD BROWN, Ohio
PATRICK J. TOOMEY, Pennsylvania JACK REED, Rhode Island
TIM SCOTT, South Carolina ROBERT MENENDEZ, New Jersey
BEN SASSE, Nebraska JON TESTER, Montana
TOM COTTON, Arkansas MARK R. WARNER, Virginia
MIKE ROUNDS, South Dakota ELIZABETH WARREN, Massachusetts
DAVID PERDUE, Georgia BRIAN SCHATZ, Hawaii
THOM TILLIS, North Carolina CHRIS VAN HOLLEN, Maryland
JOHN KENNEDY, Louisiana CATHERINE CORTEZ MASTO, Nevada
MARTHA McSALLY, Arizona DOUG JONES, Alabama
JERRY MORAN, Kansas TINA SMITH, Minnesota
KEVIN CRAMER, North Dakota KYRSTEN SINEMA, Arizona
Gregg Richard, Staff Director
Joe Carapiet, Chief Counsel
Brandon Beall, Professional Staff Member
Alexandra Hall, Professional Staff Member
Laura Swanson, Democratic Staff Director
Corey Frayer, Democratic Professional Staff Member
Cameron Ricker, Chief Clerk
Shelvin Simmons, IT Director
Charles J. Moffat, Hearing Clerk
Jim Crowell, Editor
C O N T E N T S
----------
TUESDAY, JUNE 11, 2019
Page
Opening statement of Chairman Crapo.............................. 1
Prepared statement........................................... 30
Opening statements, comments, or prepared statements of:
Senator Brown................................................ 3
Prepared statement....................................... 31
WITNESSES
Alicia Puente Cackley, Ph.D., Director, Financial Markets and
Community Investment, Government Accountability Office......... 4
Prepared statement........................................... 32
Responses to written questions of:
Senator Menendez......................................... 163
Senator Warren........................................... 163
Senator Schatz........................................... 166
Senator Cortez Masto..................................... 169
Pam Dixon, Executive Director, World Privacy Forum............... 5
Prepared statement........................................... 49
Responses to written questions of:
Senator Menendez......................................... 171
Senator Warren........................................... 176
Senator Schatz........................................... 183
Senator Cortez Masto..................................... 186
Additional Material Supplied for the Record
Letter submitted on behalf of Acxiom by Jordan Abbott, Chief
Ethics Officer................................................. 202
Letter and responses to written questions of the Banking
Committee submitted by Bob Liodice, Chief Executive Officer,
Association of National Advertisers............................ 204
Letter submitted by CoreLogic.................................... 210
Letter submitted by Jim Nussle, President & CEO, Credit Union
National Association (CUNA).................................... 211
Letter submitted by Brad Thaler, Vice President of Legislative
Affairs, National Association of Federally-Insured Credit
Unions......................................................... 213
DATA BROKERS AND THE IMPACT ON FINANCIAL DATA PRIVACY, CREDIT,
INSURANCE, EMPLOYMENT, AND HOUSING
----------
TUESDAY, JUNE 11, 2019
U.S. Senate,
Committee on Banking, Housing, and Urban Affairs,
Washington, DC.
The Committee met at 10:03 a.m. in room SD-538, Dirksen
Senate Office Building, Hon. Mike Crapo, Chairman of the
Committee, presiding.
OPENING STATEMENT OF CHAIRMAN MIKE CRAPO
Chairman Crapo. This hearing will come to order.
Providing testimony to the Committee today are experts who
have researched and written extensively on big data: Dr. Alicia
Cackley, the Director of Financial Markets and Community
Investment at the Government Accountability Office; and Ms. Pam
Dixon, Executive Director of the World Privacy Forum. We
appreciate both of you being here.
As a result of an increasingly digital economy, more
personal information is available to companies and others than
ever before. I have been troubled by Government agencies' and
private companies' collection of personally identifiable
information for a long time.
There have been many questions about how individuals' or
groups of individuals' information is collected, with whom it
is shared or sold, how it is used, and how it is secured.
Private companies are collecting, processing, analyzing,
and sharing massive amounts of data on individuals for all kinds of
purposes. Even more troubling is that the vast majority of
Americans do not even know what data is being collected, when
it is being collected, how it is being collected, by whom, and
for what purpose.
In particular, data brokers and technology companies,
including large social media platforms and search engines, play
a central role in gathering vast amounts of personal
information, often without ever interacting with individuals,
particularly in the case of data brokers.
In 2013, the GAO issued a report on information resellers,
which includes data brokers, and the need for the consumer
privacy framework to reflect changes in technology and the
marketplace.
The report noted that the current statutory consumer
privacy framework fails to address fully new technologies and
the growing marketplace for personal information.
The GAO also provided several recommendations to Congress
on how to approach the issue to provide consumers with more
control over their data.
In 2018, 5 years later, GAO published a blog post summarizing
its 2013 report, highlighting the continued relevance of the
report's findings.
The Federal Trade Commission also released a report in 2014
that emphasized the big role of data brokers in the economy.
The FTC observed in its report that ``data brokers collect and
store billions of data elements covering nearly every U.S.
consumer,'' and that ``data brokers collect data from numerous
sources, largely without consumers' knowledge.''
In her report ``The Scoring of America,'' Pam Dixon
discusses predictive consumer scoring across the economy,
including the big role that data brokers play. She stresses
that most consumer scores today lack protections similar to
those that apply to credit scores under the Fair
Credit Reporting Act.
Dixon says, ``Consumer scores are today where credit scores
were in the 1950s. Data brokers, merchants, government
entities, and others can create or use a consumer score without
notice to consumers.''
Dr. Cackley has also issued several reports on consumer
privacy and technology, including a report in September 2013 on
information resellers, which includes data brokers. She says in
her report that the current consumer privacy framework does not
fully address new technologies and the vastly increased
marketplace for personal information. She also discusses
potential gaps in current Federal law, including the Fair
Credit Reporting Act.
The Banking Committee has been examining the data privacy
issue in both the private and public sectors, from regulators
to financial companies, to other companies that gather vast
amounts of personal information on individuals or groups of
individuals to see what can be done through legislation,
regulation, or by instituting best practices.
Enacted in 1970, the Fair Credit Reporting Act is a law within
the Banking Committee's jurisdiction that aims to promote the
accuracy, fairness, and privacy of consumer information
contained in the files of consumer reporting agencies. Given
the exponential growth and use of data since that time and the
rise of entities that appear to serve a similar function as the
original credit reporting agencies, it is worth examining how
the Fair Credit Reporting Act should work in a digital economy.
During today's hearing, I look forward to hearing more
about the structure and practices of the data broker industry
and technology companies, such as large social media platforms;
how the data broker industry has evolved within the development
of new technologies, and their interaction with technology
companies; what information these entities collect, how it is
collected, and whom it is shared with and for what purposes;
what gaps exist in Federal privacy law; and what changes to
Federal law should be considered to give individuals real
control over their data.
I appreciate each of you joining us today and look forward
to getting some further information about these questions.
Senator Brown.
OPENING STATEMENT OF SENATOR SHERROD BROWN
Senator Brown. Thank you, Mr. Chairman. I appreciate your
continuing these important, bipartisan efforts to protect
Americans' sensitive personal information.
We are looking today at a shadowy industry known as ``data
brokers.'' Most of you probably have not heard of these
companies. The biggest ones include names like Acxiom,
CoreLogic, Spokeo, and ZoomInfo--and maybe one you have heard
of, Oracle. According to some estimates, 4,000 of these
companies collect and sell private information, but,
stunningly--and I am not sure I have ever used that word in
this Committee--stunningly, not one of them has been willing to
show up and speak in front of this Committee today. Not one.
These companies expect to be trusted with the most personal
and private information you could imagine about millions of
Americans. They are not even willing to show up and explain how
their industry works. Some define this as cowardice. It is hard
to disagree with that. I think it tells you all you need to
know about how much they want their own faces and names
associated with that industry.
As Maciej Ceglowski told us at our last hearing, ``the
daily activities of most Americans are now tracked and
permanently recorded by automated systems at Google or
Facebook.''
Most of that private activity is not useful without data
that anchors it to the real world. Facebook, Google, and Amazon
want to know where you are using your credit cards, where you
buy your brand-name appliances, if you are recently divorced,
and how big your life insurance policy is--the kind of data
that big tech gets from data brokers. They then combine it with
your social media activity to feed into their algorithms.
You might have noticed it seems like every product or
service you buy comes with a survey or a warranty card that
asks for strangely personal information. Why are all these
nontech companies so interested in your data?
It is simple: Data brokers will pay these companies for any
of your personal information they can get their hands on so
they can turn around and sell it to Silicon Valley. It is hard
for ordinary consumers to have any power when, unbeknownst to
them, they are actually the product bought and sold.
It reminds me of a time when corporations that had no
business being in the lending industry decided to start making
loans and selling them off to Wall Street. We know what
happened. Manufacturers or car companies decided that consumer
credit would be a great way to boost their profits. When big
banks and big tech are willing to pay for something, everyone
else will find a way to sell it to them, often with devastating
results.
For example, Amazon is undermining retailers and
manufacturers across the country through anticompetitive
practices. At the same time, it scoops up information from the
very businesses it is pushing out of the market.
Then there is Facebook, almost single-handedly undermining
the profitability of newspapers across the country. It also
gobbles up personal information that the New York Times allows
data brokers to collect from its readers.
Just like in the financial crisis, a group of shadowy
players sits at the center of the market, exercising enormous
influence over consumers and the economy while facing little or
no rules at all. Then they do not show up.
Chairman Crapo and I are committed to shining a light on
these companies and keeping an unregulated data economy from
spiraling out of control. Yesterday it was reported that a
Department of Homeland Security contractor allowed photos of
travelers and their license plates to be exposed to potential
identity thieves.
One of the principal differences between the two political
parties in this town is the suspicion that Democrats have of
private power and the suspicion Republicans typically have of
Government power. I think you are seeing two parties come
together on our suspicion of what these data brokers are doing.
The Chairman and I agree that protecting sensitive
information like this is timely and important. I look forward
to the witnesses' testimony.
Thanks.
Chairman Crapo. Thank you, Senator Brown, and I appreciate
our partnership on this issue.
We will go in the order I introduced you, and, Dr. Cackley,
you may begin. But before you do, let me just remind both of
you that we would like you to keep your initial remarks to 5
minutes so that we can have plenty of time for the Senators to
engage with you.
Dr. Cackley.
STATEMENT OF ALICIA PUENTE CACKLEY, Ph.D., DIRECTOR, FINANCIAL
MARKETS AND COMMUNITY INVESTMENT, GOVERNMENT ACCOUNTABILITY
OFFICE
Ms. Cackley. Thank you. Chairman Crapo, Ranking Member
Brown, and Members of the Committee,
I am pleased to be here today to discuss GAO's work on
consumer privacy and information resellers, also known as
``data brokers.''
My remarks are primarily based on our September 2013 report
on privacy issues related to information resellers, as well as
more recent work on internet privacy, data protection, facial
recognition, and financial technology.
My statement will focus on two main issues: the lack of an
overarching Federal privacy law and gaps that exist in the
current consumer privacy framework.
No overarching Federal privacy law governs the collection,
use, and sale of personal information among private sector
companies, including information resellers. There are also no
Federal laws designed specifically to address all the products
sold and information maintained by information resellers.
Instead, Federal privacy laws covering the private sector are
narrowly tailored to specific purposes, situations, types of
information, or entities, such as data related to financial
transactions, personal health, and eligibility for credit.
For example, the Fair Credit Reporting Act requires that
sensitive consumer information be protected and restricts how
it is shared. But the law only applies to information used to
determine eligibility for things like credit, insurance, and
employment. Similarly, the Gramm-Leach-Bliley Act restricts how
certain financial information is shared, but it only applies to
entities that fall under the law's specific definition of a
``financial institution.'' Other privacy statutes address other
specific circumstances, but there is no Federal statute that
comprehensively addresses privacy issues in the private sector.
GAO has stated previously that gaps exist in the U.S.
consumer privacy framework. We have reported that Federal law
provides consumers with limited ability to access, control, and
correct their personal data, particularly data used for
marketing purposes. Similarly, individuals generally cannot
prevent their personal information from being collected, used,
and shared. Yet information that resellers collect and share
for marketing purposes can be very personal or sensitive. For
example, it can include information about physical and mental
health, income and assets, political affiliations, and sexual
habits and orientation.
Another area where there are gaps in the consumer privacy
framework is with respect to new technologies. For example,
Federal law does not expressly address when companies can use
facial recognition technology to identify or track individuals,
nor does it address when consumer knowledge or consent should
be required for its use. Similarly, no Federal privacy law
explicitly addresses the full range of practices for tracking
or collecting data from consumers' online activity or the
application software for mobile devices. And the rise of
financial services technologies, known as ``FinTech,'' raises
new privacy concerns, for example, because new sources of
personal data are being used to determine creditworthiness.
In summary, new markets and technologies have vastly
changed the amount of personal information private companies
collect and how they use it. But our current privacy framework
does not fully address these changes. Laws protecting privacy
interests are tailored to specific sectors and uses, and
consumers have little control over how their information is
collected, used, and shared with third parties for marketing
purposes. As a result, the current privacy framework warrants
reconsideration by Congress in relation to consumer interests,
new technologies, and other issues.
Chairman Crapo, Ranking Member Brown, and Members of the
Committee, this concludes my statement. I would be pleased to
answer any questions you may have.
Chairman Crapo. Thank you.
Ms. Dixon.
STATEMENT OF PAM DIXON, EXECUTIVE DIRECTOR, WORLD PRIVACY FORUM
Ms. Dixon. Thank you. Chairman Crapo, Ranking Member Brown,
and Members of the Committee, thank you for your invitation and
for the opportunity to talk about something very, very
meaningful today: the Fair Credit Reporting Act, data brokers,
and privacy.
Fifty years ago, this Committee struck a blow for consumers,
for transparency, and for fairness when it passed the Fair
Credit Reporting Act. This Committee talked with stakeholders.
They found best practices. And before the famous HEW Report
came out, the Committee report that defined what became fair
information practices, this Committee created the Fair Credit
Reporting Act. It was and still is the most important American
privacy law that we have. But it is not as important as it was.
There are three reasons why.
First, credit scores and other scores are being sold and
used in consumers' lives, and these are unregulated.
Second, the technology of prediction, what can be called
``predictive analytics,'' otherwise known as AI and machine
learning, has advanced profoundly. Especially in the last 3 to
4 years, new kinds of predictive abilities have come forth, and
we have new levels of accuracy in prediction, so that what used
to be the accuracy of the credit score is now also the accuracy
of an unregulated credit score, and this introduces new
problems for consumers.
Third, these scores are created without due process for
consumers. How on Earth do we deal with this? This is why
Congress must expand the Fair Credit Reporting Act to regulate
currently unregulated scores, especially in the financial
sector, that are being used in meaningful ways in consumers'
lives.
We have other solutions to discuss and other issues to
discuss. I look forward to your questions. Thank you.
Chairman Crapo. Thank you very much, Ms. Dixon.
I would like to ask each of you to answer my first three
questions, and then I want to get into more discussion. But I
would like you, if you possibly can, to limit your answers to
yes or no answers to the first three. I know you will be
tempted to elaborate, but I will give you that chance.
First, do you agree that data brokers collect and process
vast amounts of personal information on nearly every American
to the extent that they hold more information about individuals
than the U.S. Government or traditional credit bureaus?
Ms. Dixon. Yes.
Ms. Cackley. Yes.
Chairman Crapo. Second, do you both agree that most
Americans have no knowledge of these activities and in most
cases no rights to access, correct, or control the information
collected about them?
Ms. Dixon. Yes.
Ms. Cackley. Yes.
Chairman Crapo. And then, third, can certain processing and
uses of this information have significant impact on their
financial lives?
Ms. Dixon. Yes. Absolutely.
Ms. Cackley. Yes.
Chairman Crapo. All right. Now we will get to where you can
elaborate. You have both authored reports, as did the FTC in 2014,
that highlight the gaps in the Fair Credit Reporting Act and
other privacy laws. You have both testified about that in your
introductory remarks. These gaps allow data brokers to evade
certain requirements that should be imposed on them.
What are the steps that we can take? You indicated, Ms.
Dixon, that we need to expand the Fair Credit Reporting Act,
and you essentially said the same thing, Dr. Cackley. But what
specifically does this Committee need to do with regard to
that?
Ms. Dixon. Thank you. In regards to the Fair Credit
Reporting Act, I think very small changes would be very
meaningful. Let me give you an example. Right now, as you know,
as you well know, the Fair Credit Reporting Act in regards to
credit scores applies to individuals. So when we are--you know,
that is regulated at the individual level.
However, if you look at the new forms of credit scores that
are available, they are scored at the household level where the
Fair Credit Reporting Act does not apply. So you take a ZIP+4,
and you score a household and give them, let us say, a score of
720. The household has a very accurate score of 720. Then that
becomes an unregulated form of credit score. And, you know, 10
years ago, these scores were quasi-accurate. That has changed.
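As an illustration of the mechanism Ms. Dixon describes, here is a minimal sketch of deriving a household-level score from a ZIP+4 aggregate. The records, field names, and figures are hypothetical, invented for the example, and do not reflect any vendor's actual product:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records of (ZIP+4, score-like attribute). A broker
# would typically draw on marketing data rather than regulated
# credit files, but the aggregation step looks the same.
records = [
    ("83702-1234", 690), ("83702-1234", 745),  # one household
    ("83702-5678", 612), ("83702-5678", 655),  # another household
]

by_household = defaultdict(list)
for zip4, score in records:
    by_household[zip4].append(score)

# The "household score" attaches to a ZIP+4, not to any named
# individual, which is why, per the testimony, the Fair Credit
# Reporting Act's individual-level protections may never attach.
household_scores = {z: round(mean(s)) for z, s in by_household.items()}
print(household_scores)  # {'83702-1234': 718, '83702-5678': 634}
```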
Chairman Crapo. Thank you.
Dr. Cackley?
Ms. Cackley. So the Fair Credit Reporting Act has a certain
number of elements to it that are very helpful. It gives
consumers access, control, the ability to correct information,
and safeguards privacy. But it only applies in certain
situations for eligibility decisions. It would be possible to
think about a broader set of sensitive personal information
that the Fair Credit Reporting Act could cover, which would
give consumers those same things, access, control, and the
ability to correct, over more of their sensitive personal
information than is currently the case.
Chairman Crapo. All right. And I am going to use the term--
well, Ms. Dixon, you used the term ``unregulated credit
scores.'' There is a set of data that is collected about
individuals and, as you indicate, households, and this data is
turned into some kind of an analysis that allows those who use
the data to influence and manipulate individuals in the
marketplace.
Historically, as you have both indicated, the Fair Credit
Reporting Act has focused primarily on credit bureaus, but the
scope of who is collecting this data and how it is being used
has exploded, as you both also discussed.
The question I have is: Isn't this unregulated score that
we are talking about that is created for people and then
managed by AI, isn't that impacting people's credit? Isn't it
impacting their financial decisions? Isn't it significantly
focused on that type of influence and manipulation of
individuals?
Ms. Cackley. I think it certainly can be. The scores may
not be credit scores, but they may apply to decisions that
companies are making about what kinds of products they offer
people, and at what price they offer things. This is based on a
score that the consumer does not necessarily see, cannot tell
is correct, and cannot make any attempt to improve if they do
not even know it exists.
Chairman Crapo. And to influence them to make such a
transaction. I will let you go ahead, Ms. Dixon. I am running
out of time here, but go ahead, please.
Ms. Dixon. Thank you. We call any score that is not
regulated by the Fair Credit Reporting Act a ``consumer
score,'' and we define that term in the written testimony. Consumer
scores are quite dangerous when they are used in eligibility
circumstances.
So, for example, consider the line between lead generation,
which is allowable--you do not have to pull a credit score to
generate a lead for a marketing product or a financial
product. However, if you are just maybe marketing a financial
product and you have something that is equivalent in accuracy
to a credit score, all of a sudden this changes the equation.
There is not even a micrometer in between, you know, what a
regulated score would be and a nonregulated score.
So if you have essentially something that looks like a
credit score and that acts like the credit score and is being
used like the credit score, well, it is the same thing as a
duck. If it quacks, it is a duck.
So I think we have to look at the financial products that
are being marketed with quasi-credit scores very closely. That
is of high concern. But there are other categories. In ``The
Scoring of America,'' we identified literally hundreds of types
of scores: consumer lifetime value scores where consumers are
segmented according to how valuable they are in terms of their
purchasing power. There are frailty scores, which are more
medical in nature. But the scores abound, and my concern is
that people lose meaningful opportunities in their lives when
scores are used in eligibility circumstances not covered by the
Fair Credit Reporting Act, such as admissions to colleges and
what-not. Imagine having a wonderful high school background and
working very hard to achieve the American dream, and then all
of a sudden some score says that you will not be as qualified a
candidate, having nothing to do with your academic achievements
but just somehow with maybe the neighborhood you grew up in. I
find this disturbing.
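As an illustration of the consumer lifetime value segmentation Ms. Dixon mentions, here is a minimal sketch. The spend figures, margin, horizon, and tier thresholds are all assumptions invented for the example, not any scorer's actual model:

```python
# Hypothetical purchase histories: customer id -> annual spend in dollars.
annual_spend = {"c1": 180, "c2": 2400, "c3": 75, "c4": 950, "c5": 5200}

def clv_segment(spend, horizon_years=5, margin=0.2):
    """Crude lifetime value: projected margin over a fixed horizon,
    bucketed into tiers that can drive differential treatment
    (queue priority, offers shown, service levels)."""
    clv = spend * horizon_years * margin
    if clv >= 2000:
        return "high value"
    if clv >= 500:
        return "mid value"
    return "low value"

for cid, spend in annual_spend.items():
    print(cid, clv_segment(spend))  # e.g. c2 -> high value, c3 -> low value
```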
Chairman Crapo. Agreed.
Senator Brown.
Senator Brown. Thank you, Mr. Chairman.
Ms. Dixon, you noted that tens of thousands of consumer
scores affecting millions and millions of consumers are used to
predict our behaviors, in secret, as you said. Are you
surprised that neither Chairman Crapo nor the rest of us were
able to bring in data brokers to speak and testify? Are you surprised
they were not willing to testify about their business practices
before this Committee today?
Ms. Dixon. Actually, I am surprised, and I am actually--I
wish they were here, and I wish the credit bureaus were here as
well, because we need to have good industry step forward and to
give us their best practices that they use. If there is no good
industry to step forward with best practices, then this
Committee cannot rearticulate what it did 50 years ago. And I
do not understand why these industries are not willing to
discuss what is happening, and I also do not understand why we
cannot see our scores. Why?
Senator Brown. I am not sure that they did not show up. I
guess I would like to--I am not sure I have done this before
either. I would like to ask anybody in the room that represents
the data brokers to raise their hand. Lobbyists, lawyers,
people paid by the data broker industry, any of you here? Any
of you here that want to raise your hand? I guess is the
question.
OK. And if you are, I mean, I will give you the opportunity
of a lifetime. If you are, we will set up a different chair,
and you can sit next to Ms. Dixon and Dr. Cackley. OK. All
right. I guess no surprise there, Mr. Chairman, and that does
illustrate how--because I know they are watching. I mean, this
is really important to their industry. It is very important to
their bottom line, whether they are watching here or whether
they are watching the live stream. But we will move on.
Ms. Dixon, it seems that data predictions create a vicious
cycle where the predictions end up often dictating the
outcomes. For example, could people who have been
systematically targeted by predatory lenders, having lower
credit scores, therefore be likely only to see advertisements
for other predatory financial products? I assume that happens.
Are there other examples you can think of quickly?
Ms. Dixon. Yes, the predatory example is one we get phone
calls about in our office from people who received
advertisements for financial products, and they did not
understand that they could have gone out on the market and
affirmatively looked for the best offer. So these predatory
marketing devices based on unregulated scores are very
significant.
Other significant scores are scores that predict repayment
of debt. So, for example, it is the poorest consumers who are
targeted the most for debt repayment, all sorts of things like
this. The consumer lifetime value scores impact how well you
are treated by businesses, by how long you are standing in
line, but the most meaningful circumstance that I can think of
is when kids are applying to schools and they are getting
scores that dictate whether or not they are going to be
accepted to a school based not on their academics but based on
all of these other things, like a pseudo credit score, like
what neighborhood they grew up in. There are neighborhood risk
scores which are the modern-day redlining, and I find them
deeply objectionable, because if we are going to be scored by
where we live, how have we advanced, and how are all the laws
that were meant to protect against such things operating, if
this is still happening today?
Senator Brown. Thank you for that. So companies that--
particularly your analogy to redlining, bank redlining,
insurance redlining, now these companies redlining, are you
worried that companies would offer discounts for products and
services in exchange for sensitive data, which would lead--you
sort of implied this--to a two-tiered system where the wealthy
can afford privacy and everyone else will have to sacrifice
sensitive information to get access to basic internet services?
Ms. Dixon. That is certainly part of it. I think it goes
even more broadly than that. One of the big issues is that you
get locked into a filter bubble of sorts, a marketing bubble,
and it is not that people mean to get locked into these, but if
you are receiving offers, especially for financial tools and
services, and a consumer does not go outside of the offers they
receive, they can pay more for autos; they can pay more for
products; they can pay more for, for example, a TV. Simple
things. But if you are a consumer on a fixed income, a
television that costs $2,000 instead of $200 makes a meaningful
difference in a person's life. That is what worries me the
most.
Senator Brown. There is one follow-up, not a question but a
comment, Mr. Chairman. Thanks for your forbearance. It is the
whole idea that people prey on people who are less able to
fight back. Yesterday I was in Des Moines, not running for
President but in
Des Moines, and I was at a manufactured housing neighborhood,
and a large hedge fund from Salt Lake City has begun to buy up
manufactured housing neighborhoods. There are six of them in my
State. There are a number of them in Iowa. They are in a half
dozen States at least. They come and they buy these. People
have paid $50,000 or $60,000 or $70,000 for their manufactured
home. They pay $200 to $300 a month for the rent on the land,
and this hedge fund is raising rents over about a period of a
year, a year and a half, up to 70 percent, and people have
nowhere to turn. And it is like these companies out there are
just looking: Where can we come in, extract the most money at
the lowest cost against people that are the most--have the
least ability to fight back without political connections? And
it is just happening across our economy.
Thank you.
Chairman Crapo. Thank you, Senator Brown.
Senator Scott.
Senator Scott. Thank you, Mr. Chairman. I will note that
some people go to Des Moines not to run for President, but
perhaps Vice President.
[Laughter.]
Senator Scott. I apologize. I meant----
Senator Brown. Mr. Chairman, Senator Scott is a really
smart guy, but that was not the smartest thing he ever said.
[Laughter.]
Senator Brown. Go on.
Senator Scott. Senator Brown, I realize you do not actually
run for Vice President by the number of votes you get, but I
think there is a process by which people say they are qualified
to do things--like ask Ms. Dixon a question.
So one of your comments that you made sounded--I spent
about 25 years in the insurance industry, so one of the
comments you made sounded a little bit like redlining, and I
would love for you to unpack that a little bit, but just to
make sure I heard you correctly. So in unregulated ways, credit scores
that consumers themselves do not know about, that consumers
have not seen, heard, or contributed to, are being used in ways
that will impact their financial well-being to include perhaps
even the likelihood of jobs that they may or may not be
qualified for, that to me sounds fairly nefarious, but it
sounds a whole lot like redlining. Can you unpack--if that is
not what you meant, please clarify what you did mean. And if it
is what you meant, please drill down a little bit so that we
can have a little more clarity to what you are talking about.
Ms. Dixon. Thank you. It is a really complex issue, and in
``The Scoring of America'' and in my written testimony, I have
articulated it more fully with footnotes.
Senator Scott. We have that part.
Ms. Dixon. Yes. So thank you for your question, because it
is complex and it is difficult to abstract into a few words.
Let me try and make a big effort here. All right----
Senator Scott. I will give you 3 minutes if you need it.
Ms. Dixon. Let us go for it.
Senator Scott. OK.
Ms. Dixon. So there are amazing real-time analytic
products. Actually, in our update to ``The Scoring,'' we have
looked at this. So, for example, financial service companies,
you can look across the United States and see, pretty much in
real time, the marketplace activity of people who are spending
and buying and what that looks like. You can drill
down to the census block level and see how well a neighborhood
is performing. There is, for example, a product that gives you
what is called an ``up-front score,'' what the score of that
neighborhood is. And I will send you, as a follow-up, a series
of screen shots of this so you can see it.
Senator Scott. Thank you.
Ms. Dixon. But let us say that you are applying for a
university position, and your neighborhood has a very poor
score. Well, now that can be taken into consideration. We have
the College Board doing this. They have an adversity score that
is doing exactly this. So I find this difficult. The lines are
narrow----
Senator Scott. Just to interrupt you, Ms. Dixon. I read an
article I guess a couple weeks ago, Mr. Chairman, about this
new SAT score that would take into consideration challenges.
Are you suggesting that that score could--the neighborhood
score could have an impact on one's SAT score and college
admittance?
Ms. Dixon. I do not believe it will have an impact on a
person's SAT score. I do believe that it can have a much
further and much larger impact----
Senator Scott. Ms. Dixon, are you familiar with the new
iteration of the SAT score which takes into consideration the
family challenges in----
Ms. Dixon. Yes.
Senator Scott. OK.
Ms. Dixon. Yes, I am, and that is what I am referring to.
So while that score is meant to provide context, here is the
problem. One of the factors that it uses is a neighborhood risk
score, and that neighborhood risk score is a secret score.
Consumers do not get to see it. Currently, with the College
Board adversity score, the students themselves are not allowed
to see it. It is a secret score.
Now, let us bring this score into transparency. Let us
apply some of the principles of the Fair Credit Reporting Act.
Let us give people access to the score. Let them know what
factors went into the score. Let us make it fair. That is my
point.
And right now this does not fall under the Fair Credit
Reporting Act at all. It does not fall into any eligibility
circumstance, not yet. But that is what I am saying. We need to
have fairness. Technology is going to advance, and it is
important that it does. We need to stay competitive in the
United States within machine learning and AI. It is very, very
crucial for our economic future. But we need fairness and
transparency, and we really need the Fair Credit Reporting Act
to be guiding best practices and saying: look, technology,
yes, but the uses need to be right. That is the deal.
Senator Scott. Thank you.
Mr. Chairman and Ranking Member, I would love for us to do
all that we can to compel some of the companies in the industry
to participate in a future hearing.
Chairman Crapo. You have both of our agreement already on
that, Senator Scott.
Senator Scott. Thank you, sir.
Chairman Crapo. Thank you.
Senator Reed.
Senator Reed. Well, thank you, Mr. Chairman. And thank you
to the witnesses for their testimony.
In previous hearings, echoing some of the comments of my
colleagues, in particular Senator Kennedy, we have suggested that a lot of the
information should be viewed as being owned by the person, not
by these data brokers. And we have to create real opportunities
to protect your data. We have got some legal statutes in place
like the Fair Credit Reporting Act, HIPAA, et cetera, where it
is clear by statute. And then we have got some information that
is very public. It is published, like legal notices in
newspapers, et cetera. And then there is all the information
that is just accumulated by being on a computer.
It comes back down to, I think, three principles. This is
my view. One is that consumers, people, should have the ability
to opt out of any information collection system. Then, second,
this information should be at some point expunged, 6 months, a
year, et cetera. And then if that is violated by anybody, a
data broker or a collector or anyone else, the consumer should
have the right to go to court and say, ``You have ruined me.''
So let us start with both your comments on how do we get
sort of an effective opt-out. You know, my sense is that
someone using or going to a website, it is hard to figure out
where the opt-out is. Sometimes they do not even offer that.
Should we in the U.S. Congress say you have to have a very
prominent opt-out, do not collect my data? Let us start with
Dr. Cackley and then Ms. Dixon.
Ms. Cackley. So an opt-out possibility is certainly
something that is available and is used in certain
circumstances. I think there are more circumstances where it
could be helpful. I do not know that that as a solution alone
would do the trick, if you think about all of the
times when you go online and you are supposed to read the
disclosures and click on things.
Senator Reed. No one reads the disclosures.
Ms. Cackley. Yeah, exactly, and so it may be that no one
will read the opt-out either.
Senator Reed. That is why the opt-out cannot be hidden in
the disclosures. It has to pop right up here saying, ``Click
yes or no.''
Ms. Cackley. Absolutely. Right. I think if someone knows
that they do not want their data to be collected and they can
opt out right away, that is a way to do it. In other
circumstances, people may not understand what the opt-out is,
really----
Senator Reed. I think if you start with the major
platforms, the Googles, et cetera, if they cannot collect
the data, then that data is not going to get down the road to
the brokers because they do not have it.
Ms. Cackley. Absolutely.
Senator Reed. And that is the first place, I think, to
begin.
Ms. Dixon?
Ms. Dixon. Thank you. I was honored to serve at the OECD as
part of their AI expert group. I just finished helping them
write the global guidelines on AI, and something that I learned
in that process even more so than I already had is that our
data world, our data ecosystems have become so profoundly
complex that I am not at all persuaded anymore that opt-out is
possible, because if you recall, you know, the Russian nesting
dolls where you have the big doll and then all the--you open
the doll and there is another doll. And then you open it up
again and there is another doll. This is what data is like.
So let us say we do opt out of, you know, a platform. Well,
what about all of the financial transactions? The financial
transactions and our retail purchase histories are actually the
basis of a lot of data broker analysis. And then it gets worse.
As you get into the dolls, here is one that really is very,
very challenging, and that is this. Data brokers right now, if
they did not collect another piece of data on us--here is
something really to think about--they could simply create data
about us because that is the state of the technology. And I do
not know how to create an opt-out that is that far removed from
us.
However, that being the case, I do believe there are things
we can do, especially if we focus on restricting negative uses
that harm consumers and really look at the endpoints of that
process, and also at the beginning and say, hey, what are the
standards you are using? What can we do to make good standards?
And at the end, what are the standards for use? How can we
control these two points?
But I think there is a role for opt-out, for example,
especially for human subject research, where there must be
meaningful consent. As a tool, I think it has lost a lot of its
power.
Senator Reed. You have studied this longer than I, but I
think it is a place to begin, and it is not a perfect solution,
but, you know, you cannot make the perfect the enemy of the
good. If it gives people a little more protection, I think it
should be pursued.
The other aspects of this, too, as you pointed out, with
this synthetic--they create the synthetic data. Sort of purging
it periodically might also help this. Again, I think you have
put your finger on this dilemma now. The complexity, the
ability to gather data indirectly, not directly, is profound.
But if we do not take some simple steps, it gets worse. It does
not get better.
Thank you.
Chairman Crapo. Thank you.
Senator Schatz.
Senator Schatz. Thank you, Mr. Chairman. Thank you to the
testifiers.
Ms. Dixon, you know, we are talking about some reforms to
the Fair Credit Reporting Act, and what worries me a bit is
that, as important as I think it is to bring data brokers back
into the fold in terms of how the statute governs their
behavior, the Fair Credit Reporting Act does not actually work
as it relates to the credit bureaus. The credit bureaus put the
onus on the consumer. The consumer has to pay to correct or
monitor his or her own data, and so that statute is broken. And
so to the extent that we are going to put all of these shadow
data brokers under FCRA, I think we have to be clear-eyed about
how imperfect that system is for millions and millions of
Americans. I would like you to comment on that.
Ms. Dixon. Well, I agree with you. That is why I said that
even our best American privacy law is not as important as it
used to be. It does have cracks and fissures. However, it does
something very important. It makes it so that things are not
secret. You and I, we can look at our credit score. This is
huge. This is a huge improvement from pre-2000 when it was
illegal to do so. We can see our bureau report and correct it.
We cannot see our other scores, and this is problematic.
Senator Schatz. Fair enough. Let me ask you a sort of
technical question. What is the relationship between data
brokers and credit bureaus? In other words, are some of these
credit bureaus getting into the data broker business? Have some
of them acquired data brokers? What is their relationship?
Ms. Dixon. Yes, so, for example, Equifax and Experian, a
lot of times what they will do is they will have part of their
business as a formal regulated credit reporting business, and
then other aspects of their business are unregulated----
Senator Schatz. Which is what they would characterize as
the ``marketing side.''
Ms. Dixon. Yes, I am aware that they call it ``marketing.''
However, I call it the ``consumer scoring side.'' But, yes,
your point is absolutely correct. And, additionally, you
mentioned that there is, you know, also first party. One of the
things that has been happening is there are a lot of data
privacy concerns, and there is a real move now for a lot of
different types of businesses to purchase data brokers and
bring them in so that they are dealing with first-party data.
So now we have a fracture in the data broker business model
where you cannot just say, ``Well, here are the data brokers.
Let us regulate them.'' That is not possible anymore. Maybe 25,
30 years ago, but not now. I think we really have to look at
practices and say, hey, are you using the data for these
purposes, especially in regards to eligibility.
Senator Schatz. But the challenge, to follow up on what
Senator Scott talked about in terms of digital redlining, is
that to the extent that they are using data sets that are
essentially, in combination, a proxy for race, and to the
extent that those algorithms are not transparent, then even if
we put them under FCRA and even if the FTC or the CFPB were
authorized to go after them, just to make the case would be
incredibly difficult. Am I correct there?
Ms. Dixon. I believe you are correct, and that is why we
proposed a standards bill that really looks at creating new
standards to start to build a mesh network to fill in these
gaps. Because you are correct, there are important gaps here.
Senator Schatz. And under FCRA and in the sort of old days,
you used to have shadow shoppers to try to figure out whether
there was discrimination in terms of impact as opposed to in
terms of intent. And yet it seems to me that there could be a
way where we could subject all of these data brokers to a
regime where they had to--they did not have to provide the code
for their algorithm, but they had to provide a regulator with
the ability to utilize the algorithm and see if the--and run a
bunch of reps and figure out if, statistically speaking, it
was, in fact, a proxy for race or if there was a disparate
impact on protected classes.
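A minimal sketch of the audit regime Senator Schatz outlines: the regulator runs matched test profiles through the vendor's scoring function and compares selection rates across groups against the EEOC's four-fifths rule of thumb. The vendor_score function here is a stand-in for a real black-box model, and all weights, distributions, and thresholds are invented for the example:

```python
import random

def vendor_score(profile):
    # Stand-in for the broker's black-box model; a regulator would
    # invoke the real algorithm without seeing its code. Here the
    # neighborhood score leaks into the result as a proxy variable.
    return 0.5 * profile["income_decile"] + 0.5 * profile["zip_score"]

def selection_rate(profiles, threshold=5.0):
    selected = [p for p in profiles if vendor_score(p) >= threshold]
    return len(selected) / len(profiles)

def make_group(zip_mean, n=10_000, rng=random.Random(0)):
    # Matched test profiles: identical income distribution; only the
    # neighborhood score differs between the two groups.
    return [{"income_decile": rng.randint(1, 10),
             "zip_score": rng.gauss(zip_mean, 1.0)} for _ in range(n)]

group_a = make_group(zip_mean=6.0)
group_b = make_group(zip_mean=4.0)

ratio = selection_rate(group_b) / selection_rate(group_a)
# Four-fifths rule of thumb: a ratio below 0.8 flags disparate impact.
print(f"selection-rate ratio: {ratio:.2f}",
      "FLAG" if ratio < 0.8 else "ok")
```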
Ms. Dixon. I think that is right. And, you know, it is not
that algorithms are bad. It is not that scoring is bad. It is
how it is used----
Senator Schatz. And some of this could actually alleviate
the problem of the credit bureaus in terms of the 3 or 4
million people who have bad credit scores that are incorrect.
And so if you can come up with an alternative that is
nondiscriminatory, it provides a real opportunity.
I will just offer one last thought, and I would like both
of your comments for the record. We are working on legislation
and I am working on legislation to establish a duty of care,
because I think the problem is in a sectoral approach some of
these companies are--I do not know if they are a FinTech
company or a tech company or under the HIPAA regime, and they
sort of evade the various regulations because it is not clear
where they belong. And in any case, once the data has been
collected, either voluntarily or not, either through the
Internet of Things or because at one point you clicked ``I agree''
because you signed up for a social platform, the question is:
What is the obligation of the company who is in possession of
your data? And the duty of care is the simplest way to say
cross-sectorally you may not intentionally harm any person
whose data you are in possession of. And that is why the duty
of care is such a clean way to address all of this because,
otherwise, we are going to be always a decade behind whatever
these new-fangled companies are attempting to do to us. But if
I could take that for the record, please.
Ms. Dixon. Yes, I think that that is a potentially very
good approach. I think Vermont did something like this at the
State level where they said you cannot purchase data with the
intent to defraud or discriminate. So I do think that ensuring
that fairness is percolating throughout the system is a really
good remedy.
Senator Schatz. Thank you.
Chairman Crapo. Senator Cortez Masto--oh, did you want to
have Dr. Cackley----
Senator Schatz. No. I was going to take those for the
record.
Chairman Crapo. So he will let you respond in writing, is
what he is saying.
Senator Schatz. Thank you.
Chairman Crapo. Senator Cortez Masto.
Senator Cortez Masto. Thank you. I appreciate that. But I
would like to hear what Dr. Cackley had to say as well.
Chairman Crapo. All right.
Ms. Cackley. So in terms of a duty of care, as a basic
part of a comprehensive privacy law, that would be a good
element to include. What we have reported is that given the
gaps that the sectoral approach allows in terms of privacy, we
have recommended that Congress really consider a more
comprehensive approach and include within it several different
elements, and a duty-of-care element should certainly be part
of that consideration.
Senator Cortez Masto. Yeah, I like that idea, too. I think
it is very innovative. Along with that, transparency would be
key, right? Whatever regulated credit score or unregulated
credit score is being used, if it is based on an algorithm that
is identifying their factors, the consumer should have access
to that, correct?
Ms. Cackley. Access, control, ability to correct, all of
those are important elements, yes.
Senator Cortez Masto. OK. So, Ms. Dixon, I understand in
2015 Allstate Insurance began selling consumer driving data,
and Allstate Chairman and CEO Tom Wilson said that the property
casualty insurance company hopes to profit from the sale of
telematics data and then pass on savings to consumers by
lowering premiums.
Is Allstate unusual in its plans to capture this
information about people's driving to earn additional
profit? And, I am just curious, how many insurers have adopted
telematics? And what has been the impact, if you know?
Ms. Dixon. So my understanding is that they are no longer
the only insurance company doing this. There are now several
insurance companies. And there are also health insurance
companies who are saying, hey, give us access to a variety of
your data and we will give you commensurate lower rates when
applicable.
So I think that these are rather uncomfortable things, and,
to put it mildly, I would really like to see guardrails on how
these are used. I do not think we can stop what is happening in
prediction. Prediction is getting cheaper, and it is getting
more accurate. So we cannot stop it. However, I think we can
take a multifactorial approach to the problems, the real
problems that these situations impose. Do we want consumers
giving away their data in order to, you know, have a better
premium? And I think that you should be able to have
protections without giving away your data. We need good rights
here.
Senator Cortez Masto. Right.
Ms. Dixon. And to do that, we are going to have to have
good rules of the road that encompass new technology but keep
the values: let us be able to make a decision and not be
financially penalized for it. And should an insurance company
be able to sell this data? That is a question we need to have
as a matter of public discussion. It should not be decided just by
industry. It needs to be a multistakeholder conversation about
that.
Senator Cortez Masto. And this type of data is what goes
into what you have identified as the neighborhood risk scores
that----
Ms. Dixon. That is part of it.
Senator Cortez Masto.----companies could use, correct?
Ms. Dixon. Oh, there are so many scores, but, yes----
Senator Cortez Masto. But that could be part of it, there
is so much data.
Ms. Dixon. Absolutely.
Senator Cortez Masto. And the other concern, as I
understand it, is that because of the new technology and
algorithms, this information with respect
to unregulated credit scores could end up providing higher
accuracy levels than the regulated credit scores, such that the
banks or other financial institutions would start using those
unregulated credit scores more so than the regulated. Is that
right?
Ms. Dixon. Well, I think that banks in particular are very,
very careful about these kinds of uses. Of the people that we
have interviewed, they have been very, very careful. Actually,
some of the people I worry about the most are the people who
are not in banks and who want to pull a credit score product to
do marketing. And instead of actually going through the
regulation and making a firm offer of credit or insurance, they
will just kind of skirt around the edges and pull the, you
know, unregulated credit score and then make these offers.
As someone discussed earlier today, especially if it is a predatory offer,
this is where things get very problematic. If you have a
consumer who is identified in the credit score 400 to 500 level
and someone does not want to make a firm offer of credit or
insurance but they want that number and they want to use that
number to market a product maybe for bill consolidation or for
payday loans, then I think we all need to be very interested in
protections for that.
Senator Cortez Masto. Thank you. And I notice my time is
also up. I will also just submit this for the record: facial
recognition and data that comes from that. It is topical right
now, and the question would be: Should that information be
shared with third parties like data brokers to be utilized? I
am curious about your thoughts on companies in general--I
think it was just in the paper today that airlines were looking at
using this type of facial recognition data. So I will submit
that for the record.
Thank you so much for this conversation today. I appreciate
it, Mr. Chairman.
Chairman Crapo. Thank you.
Senator Warner.
Senator Warner. Thank you, Mr. Chairman. Let me, first of
all, just associate myself with both your comments and the
Ranking Member's comments. It is pretty remarkable that you
invited the data industry, the data brokers to come, and they
did not show up. I think that is a very telling statement.
I know folks have talked about the Fair Credit Reporting
Act. I know we have talked about a variety of issues. I have
been thinking a lot about this in terms of the social media
companies. You know, the data brokers are really just one piece
of the overall growing data economy, and we are talking a lot
about third-party vendors. Obviously, I have got concerns as
well about first-party vendors, the Amazons, the Facebooks, the
Googles.
Would you both agree that, candidly, most Americans do not
have the slightest idea of what kind of data is being collected
about them and what that data is worth?
Ms. Cackley. I think it is definitely true that most
Americans do not understand the breadth of data that is
collected about them. They may be aware in certain instances
where they have checked yes or provided something, but they do
not know the true extent of it.
Senator Warner. Ms. Dixon?
Ms. Dixon. Thank you. The complexity of data flows right
now is extraordinary, and you are correct, first parties, third
parties, everything is blending. And if you look at even just
identity, you can have an identity that overlaps in 20
different data ecosystems. And as a result, it has become very
difficult for anyone to map the data.
There is this amazing chart that was produced by the
advertising industry for itself, actually, and it maps this
extraordinarily. It looks like the Tokyo subway lines. I mean,
it is incredibly complex. And I do not know that it is possible
to fully map our data anymore.
So if that is the case, how on Earth do we cabin practices
so that there is almost like a set of routine uses where here
are the acceptable uses for companies, end of discussion, boom;
and then outside of this, not acceptable uses? We are going to
have to find our way to something like that, and we might have
to distinguish it by sector and by perhaps even individual
companies. But I would like to see that very fairly
adjudicated. I am really interested in seeing people talk with
each other to figure this out. We need to have very meaningful
discussions to figure out where the data is going and how we
can best protect it. But I do not think people know about----
Senator Warner. One of the things that you touched on
briefly, one of the areas I have got some bipartisan
legislation that would try to focus on some of the manipulative
practices, the so-called ``dark patterns,'' where, you know, in
layman's terms, you have six sets of arrows clicking on--you
know, pointing you toward the ``I agree'' button and you can
never find the ``unsubscribe'' button, and there are a host of
practices that go on in the industry where people give up this
information, oftentimes unwittingly, through extraordinarily
sophisticated psychological tools used by the companies and
others to get it.
I know my time is winding down. I would just like your
commentary. I believe consumers ought to have a right to know
what data is being collected about them. I believe we need to
take it a step further and also have some basic valuation in
terms of how much that data is worth. And I am an old telecom
guy. For a long time, it used to be really hard to bring
competition in the telco market until we instituted, by
Government regulation, number portability. I believe that same
concept, data portability, ought to be brought into the data
economy so that if you are not liking how you are being
treated--I think about it mostly in the social media context,
but there are a variety of areas, in the credit-scoring areas
as well, where, you know, if we had that knowledge of what data
was being collected, what it is worth, and then if you did not
like the way Facebook was treating you or some other
enterprise, you were easily able to move all of your data in
one swipe to a new company or a new platform. I think you could
bring some additional competitive practices to the area.
In these last couple seconds, data valuation, data
knowledge, and data portability, ideas? Comments? Suggestions?
Ms. Dixon. I really like the idea of data interoperability
so there is more freedom----
Senator Warner. With portability, you have got to have
interoperability or it does not work.
Ms. Dixon. Yes. But I think it is going to be something
that will end up working out in time, but it should be a high
priority.
Ms. Cackley. So this is not something that we have looked
at specifically, but I think to the degree that you are talking
about comprehensive legislation that really covers all of the
different platforms and parties, then that kind of
interoperability would be----
Senator Warner. We would like to share with both of you
some of the work we have been doing, and I think there could be
broad-based bipartisan support.
Thank you, Mr. Chairman.
Chairman Crapo. Thank you.
Senator Kennedy.
Senator Kennedy. Thank you, Mr. Chairman.
If I go on the internet and I search and I look at social
media and I buy something on Amazon, let us say, who--I mean,
my actions, my behavior is recorded. We call that ``data.'' Who
owns it?
Ms. Dixon. I have a white paper I am going to send to you.
We spent a lot of time thinking about this issue. So the issue
of data ownership is quite difficult to parse, but let me give
you my best shot and let us have a discussion.
Senator Kennedy. Well, I would like to have a discussion,
but first I would like to have an answer.
Ms. Dixon. Here is the answer: I view data in our current
data ecosystems as a common pool resource. I think a lot of
different entities can lay claim to that data. However, no one
gets to own it, and--well, in some cases they can.
Senator Kennedy. You do not think that I own my data?
Ms. Dixon. It depends on where you have used it and where
it is. I think there are some----
Senator Kennedy. How about you?
Ms. Cackley. I do not think there is an answer to who owns
your data once you have taken an action, especially once you
have in some way interacted with another company.
Senator Kennedy. Well, let us suppose that Congress passed
a law that said the consumer owns his data and he or she can
knowingly license it. What would be wrong with that?
Ms. Cackley. I do not think there would be anything wrong
with it. I think it would have an impact on who could then
collect your data or whether data could be collected.
Senator Kennedy. No, I could license my data knowingly.
Ms. Cackley. Right.
Senator Kennedy. Now in terms of knowingly licensing my
data that I own, what sort of disclosures should a social media
company, for example, make to me in terms of how it is going to
use my data? Right now they make disclosures, but they do not
inform the consumer. I have said before some of those things
are 7, 8, 9, 10 pages, written by lawyers, you could hide a
dead body in them, and nobody would find the body. I mean,
nobody reads them. That is not knowing consent. What would a
social media company have to tell me in order for me to know
what they are doing?
Ms. Dixon. May I offer an example from the medical field?
So under HIPAA, there are very meaningful mechanisms prior to a
consumer agreeing to release their information outside of the
protection of HIPAA. However, one of the concerns that has come
up with this is that it has become very, very easy for
consumers, patients, to ``donate'' their data. And what has
happened is that people have donated their data and taken it
out of the protections of HIPAA without meaningful consent.
Senator Kennedy. Ms. Dixon, I am not trying to be rude. I
am trying to get answers. Here is my question: If I own my data
and I license it, I need to understand what licensing it means.
What needs to be disclosed to me?
Ms. Dixon. My understanding, looking at other fields--
because this is not something I have studied at length. My
understanding is that is a serious agreement, and it would
require massive disclosures. I think you could almost put a
graveyard in that disclosure, you know, compared to----
Senator Kennedy. And you do not think it is possible to
write a disclosure that the consumer would understand? Is that
what you are saying?
Ms. Dixon. In this area, I would have to really look at
that. Again, this is not an area of research for me, but I----
Senator Kennedy. What do you think, Doc?
Ms. Cackley. I think it would be very complicated. It is
not an area that we have looked at either, but if Congress were
to pass a law that allowed consumers to license their own data,
that would require a large number of regulations to go
along----
Senator Kennedy. So you both think that we should just
allow companies to do what they want with our data, that this
problem is impossible to solve?
Ms. Cackley. No, no. I do not think I meant that at all. I
just meant that it would have to be worked through. It is not
an easy fix.
Senator Kennedy. No, I do not think there are any easy
fixes around here.
Ms. Dixon. And I do not mean that either. I believe that we
should have rules of the road, and we should have agreed-upon
rules on what----
Senator Kennedy. I agree with that, too, and everybody--we
have had a lot of interesting discussions about this, but, no
offense to you two, the experts never offer a solution. To
me the solution is the consumer owns his data. You can license
it. Licensing has to be knowing and intentional. You can move
your data. Portability should be an option. I can change my
mind about licensing it. And companies will adapt to that. They
will have no choice.
Thank you, Mr. Chairman.
Chairman Crapo. Senator Menendez.
Senator Menendez. Thank you, Mr. Chairman.
I have the same concerns as Senator Kennedy because we seem
to be living in an age of data breaches. Just last week, we
learned of a breach concerning a medical billing company,
American Medical Collection Agency, that may have exposed the
personal, financial, and even medical data of 20 million
patients who were customers of Quest Diagnostics and LabCorp.
So let me ask you, Ms. Dixon, people are rightly concerned
that some of their personal data is now exposed and could be
used against them. Can data brokers legally compile, aggregate,
or sell data that has been acquired through an illegal hack?
Ms. Dixon. I am not an attorney, so I think that is a
question an attorney could better answer for you. But my best
guess is that you cannot properly use information that has been
disclosed in an unauthorized manner for your own business
purposes. That seems like it would be really out of bounds.
Senator Menendez. Dr. Cackley, do you have any idea?
Ms. Cackley. I do not know the answer, but I can certainly
find out.
Senator Menendez. Yeah, well, I would appreciate that.
Should people be concerned that data not otherwise covered
by HIPAA is ending up in the hands of data brokers even in the
absence of a hack? Are billing companies like American Medical
Collection Agency selling non-HIPAA data to brokers?
Ms. Dixon. This is an ongoing area of grave concern for us.
There are actually scores built on health data. There is a frailty
score that can predict very closely how sick you are and when
you might possibly die. I think that there are all sorts of
scores and products related to----
Senator Menendez. I am not sure I want to check on that
data myself.
Ms. Dixon. Yeah. Me either. But----
Senator Menendez. But that is pretty frightening, isn't it?
Ms. Dixon. It is. You know, health data that is not covered
under HIPAA has become an area of increasing concern, so----
Senator Menendez. Well, let me ask you this: When hackers
gain access to non-HIPAA data like in the Quest data breach,
can data brokers apply machine learning to these data points to
infer or reconstruct sensitive HIPAA-protected medical data?
Ms. Dixon. I actually do not think that they need to
acquire unauthorized data to do that. They can just look at our
purchase histories and get an awful lot of data about us. But
in terms of what is happening with this entire area, the data
breaches of medical data actually can lead to forms of identity
theft and medical identity theft that are very, very difficult
to cure and can have extremely meaningful consequences in
people's lives.
Senator Menendez. Well, let me ask you, then, HIPAA is
nearly 25 years old, and the 2009 HITECH Act provided updates
concerning health information technology. But I am
still concerned that we are playing catch-up when it comes to
protecting patients. You know, of all the information that
should be private and privileged to you, your health standing
should be extraordinary--there are all types of consequences in
that, in employment and discrimination, in a whole host of
things. Are there gaps in HIPAA and other data security laws
that need to be addressed to better protect people today in
this 21st century threat? What coordination is missing between
existing legal protections?
Ms. Dixon. I do think there are gaps, and the biggest gaps
that exist right now are those between the
sectoral protections, and I do not think the answer is to just
rip out the sector protections that exist, such as the Fair
Credit Reporting Act or HIPAA or Sarbanes-Oxley, et cetera, but
to find a way to fill those gaps in. For example, victims of
medical identity theft can use their Fair Credit Reporting Act
rights to get their financial information corrected. But under
HIPAA, because such a right does not exist in the statute, it
is not possible for them to get a deletion similar to the FCRA
in their health file, so they can actually
carry around inaccurate information which can really have an
impact on their treatment and insurance costs. And there is not
a solution yet. So this is the kind of gap we need to address.
Senator Menendez. All right. Last, there was one breach
that compromised the personal information of 20 million
patients. That is pretty troubling. One data broker has data on
300 million consumers. We are still reeling from the Equifax
breach, which affected 145.5 million consumers. If the
information of 300 million consumers were to be compromised, we
might start calling private information public information
because at the end of the day that is the result of it.
What are the ramifications for a consumer if a data broker
is breached? And should we hold them to a higher standard of
security, especially because their volume is so consequential?
Ms. Dixon. Data broker breaches are very significant. So my
assessment of this is that the various State data breach laws
are doing a pretty good job, especially in some cases where the
data breach law is quite strong, in forcing disclosures and
notices. But I think we need to do more to ensure that
consumers are duly notified about all of the sensitive and
health-related information that is held about them.
The problem with the data brokers is that they will say,
oh, wait, wait, we do not have a direct relationship with the
consumers; we cannot notify them. And I think that is a gap
that needs to be resolved. Now, the State of Vermont has
resolved that gap.
Senator Menendez. Well, they could reach back to the entity
that provided them the data in the first place, and they could
notify, could they not?
Ms. Dixon. I believe that that could happen. And it has
happened in some----
Senator Menendez. I just think they should be held to a
higher standard of security because the consequences for the
incredible numbers of Americans who are subject to having
their privacy and their health care data breached are just
beyond acceptance.
Thank you, Mr. Chairman.
Chairman Crapo. Thank you.
Senator Rounds.
Senator Rounds. Thank you, Mr. Chairman, and thank you for
holding this hearing today.
In listening to this, I would like to see whether I have
grasped some of the challenges we have here. It would
appear to me that we are talking about, first of all, the
question of the security of the data that is actually being
collected. Second of all, it appears that we are questioning
whether or not there is an appropriate way for individual
consumers or individuals to actually find out and to have
access to what these organizations, these nonregulated
organizations actually have. And, finally, it appears that this
may very well be a work-around with regard to the information
that is being collected and then disseminated, compared to what
a regulated entity would have.
In a nutshell, are those the three areas? And would there
be other areas that you would also identify? I would ask each
of you for your thoughts.
Ms. Cackley. Those are certainly three of the main points
that have come up today. I think the other piece that we have
not touched on maybe as much is outside of the data brokers
themselves. There are other technologies with privacy issues,
you know, mobile devices, facial recognition technology--we did
mention that--with financial technology. All of these are areas
of concern that fall outside potentially the protections of
FCRA in particular.
Senator Rounds. The use of machine learning and artificial
intelligence in this process. OK.
Ms. Dixon?
Ms. Dixon. So my focus has really never been on the
technologies as an endpoint. My focus has always been on, OK,
so we have technological processes that are going to continue
through time, but what does that actually mean in practice. I
have always looked at the practice. So your assessment of where
the sticking points are is accurate. The thing I would add is
this: as we move forward and prediction gets cheaper, I think
prediction is going to be coming to a mobile phone near us,
like ours. And I think we
have to be very cautious about looking at categories of
technologies and labeling them as bad. Similarly, in industry,
I think we have to be very careful and say, OK, what are the
practices that we want to go after here and want to address
because they are harming consumers. And if we can do that in a
truly multifactorial way, I think that will be helpful.
Wherever these practices exist, wherever they are, we need to
be addressing them because they are meaningful and have
impacts.
Senator Rounds. There is a difference between the way that
we have looked at data and data collection and privacy in the
United States versus the way that it has been done in some
other parts of the world. Here in the United States we follow
Gramm-Leach-Bliley, but in Europe they take a different
approach--the GDPR, which seeks to achieve a more comprehensive
framework, but one that would be rather challenging.
Can you share with me the thought process or your analysis
of the differences or the advantages, one versus the other,
between the way that we handle it today in the United States
versus what they are doing in Europe with the GDPR in its
current form?
Ms. Cackley. So we have not looked at GDPR directly yet,
but I can say that there are definitely some elements of GDPR
that embody the Fair Information Practice Principles, which
are the basis of some of our privacy regulation already. There
are other pieces of GDPR that are not in the U.S. privacy
framework, and one of the main ones, I would say, is the right
to be forgotten, which really is not encompassed in the U.S.
privacy framework.
Senator Rounds. Ms. Dixon?
Ms. Dixon. The GDPR, as you know, was built on the EU
Directive 95/46, so it has a lot of bureaucratic history behind
it. If you look at what they were trying to do and all the
derogations and what-not, it is a really complex and thought-
out structure.
I think that it does provide for baseline privacy
protections, but they do not have the sectoral system and they
do not have government privacy protections. There is one thing
I will say: in our country, the Privacy Act is very effective
in regulating certain aspects of government information
collection. They do not have anything like that.
Senator Rounds. Thank you. I see my time has expired, Mr.
Chairman. Thank you to both of you for your answers today. And,
Mr. Chairman, once again thank you for the opportunity here
today with this hearing on this very important topic.
Chairman Crapo. Thank you, Senator Rounds.
Senator Sinema.
Senator Sinema. Thank you, Mr. Chairman. And thank you to
our witnesses for being here today.
At the Committee's last hearing on privacy, I spoke about
the importance of privacy to Arizonans. We are practical people
who want the modern conveniences that technology brings, but we
value our privacy. So I am committed to making sure that
Arizonans know how our data is being used so that we can make
informed decisions.
Arizonans also do not like assumptions being made about us
or how we choose to live our lives, particularly if some of
those assumptions are wrong, which is why current privacy and
consumer scoring laws concern both me and many Arizonans.
In 2013, the FTC completed and published a 10-year
congressionally mandated study on the accuracy of credit
reports. The FTC found that one in five consumers had an error
on at least one of their three credit reports. So, Ms. Dixon,
first, thank you for being here. I want to talk quickly about
credit scores as a starting point and what happens if you or I
were one of those consumers.
How drastically could an error in a credit report
negatively affect an Arizonan's credit score?
Ms. Dixon. Yes, that effect would be profound. So, for
example, for victims of identity theft, if someone has run up
your credit and it is not actually your error, you could be
seen as not making your payments, et cetera, and you can
literally move from a 780 score to a 620 in very short order.
It only takes about a month. And then what you have is a
situation where, if you are about to buy a home--and these are
from the calls we get. This is not just a hypothesis here. The
home you are about to buy, all of a sudden you cannot qualify
for a mortgage because of identity theft.
So, yes, any error from any source that is in your credit
report is serious business.
Senator Sinema. So, Ms. Dixon, you said this could
potentially prevent an Arizonan from buying a home. Would it
also get in the way of financing an education or starting a
small business or expanding one's business?
Ms. Dixon. Absolutely.
Senator Sinema. Wow, that is really troubling.
Under the Fair Credit Reporting Act, if an Arizonan thinks
his or her credit report or score is inaccurate, they can
appeal it with the bureau. Is that correct?
Ms. Dixon. That is correct.
Senator Sinema. And if so, how?
Ms. Dixon. Yes, there is a very specific procedure outlined
in law where the bureaus must respond, and there is a series of
steps that they can take, and both the Federal Trade Commission
and the CFPB have numerous help lines and hotlines to help
everyone through, and the State AGs do as well. There are very
well documented recourses for consumers in this situation.
Senator Sinema. Well, that is good. So we have established
it is important to have an accurate credit score and there is a
process to appeal it and fix it. But, increasingly, businesses
are using so-called consumer scores that rank, rate, and
segment consumers based on public, private, and government data
that is packaged and sold by data brokers and others. Sometimes
this data is inaccurate, often outdated, or incomplete.
So are all consumer scores made available to consumers just
like credit scores are?
Ms. Dixon. Actually, almost none of them are. In fact,
despite trying to get consumer scores and asking companies for
my own consumer score, I have had almost no success. It is
almost impossible to get them.
Senator Sinema. But then how would an Arizonan know if his
or her consumer score was inaccurate if they cannot get access
to it?
Ms. Dixon. That is the same question I have. They would not
know.
Senator Sinema. Wow. So let us say that an Arizonan were
able to find out that his or her consumer score is inaccurate.
Are all consumer scores covered under the FCRA so that there is
a similar appeals process to resolve inaccuracies?
Ms. Dixon. No consumer scores that are unregulated are
currently covered under the FCRA. Unless it is a formal credit
score as articulated by the FCRA and used in an eligibility
circumstance, it is not covered.
Senator Sinema. Well, that is very concerning, but thank
you for sharing that information with us.
Mr. Chairman and Ranking Member Brown, it is clear that we
have a lot of work to do here. We have got to update our
privacy laws to reflect new trends that are occurring in both
business and technology to make sure that Americans have the
right to correct their record, whether it is their credit score
or their consumer score, on who they are, how they have lived
their lives, and whatever mistakes or inaccuracies might be
occurring in their records.
So I thank you for being here, our witnesses, and I look
forward to working with the Committee on this. And, Mr.
Chairman, I yield back.
Chairman Crapo. Thank you.
That concludes the first round, but Senator Brown and I
would like to do a second round, and you are welcome to join in
with us, Senator, if you would like.
There are so many questions. One that I want to get back
to, which has been brought up by several Senators, is this
notion of the tension between doing a comprehensive bill like
the GDPR in Europe or taking a sectoral approach as we do in
the United
States. And I think we all can understand there is sort of a
push and a pull on both sides of that question.
It seems to me, though, that we do not have a choice, at
least at a basic level, but to deal with all data collection in
the same way.
blending. It used to be that we could clearly distinguish what
a credit bureau did and the credit report that a credit bureau
prepared. Now we have massive amounts--I think Senator Brown
referenced the 4,000 number, but I do not even know what the
number is--of entities that are collecting data. My
understanding is that many of the apps on my iPhone collect
data even when I am not using them and report it to others,
data that often is not even related to the app. And it seems to
me that all of that data is
in one way or another not just blending but being utilized for
many, many different purposes, one of which is credit, one of
which is retail sales, one of which is college applications,
one of which is mortgages. I mean, the list can go on and on
and on.
So I guess I would like to have each of you just briefly--
because I have got some more questions, but briefly indicate whether
you believe that at some basic point the United States needs to
have a comprehensive set of standards and requirements that
would cover some basics, like when data is being collected, who
is collecting it, whether there is an opt-in or an opt-out,
what rights to manage or even remove one's data exist?
Ms. Cackley. Yes, I think that is where we are right now,
that the sectoral approach leaves too many gaps. You may not
need to completely change to a comprehensive framework; you
could merge elements of a comprehensive and sectoral approach
in some ways. But a comprehensive framework that gives basic
privacy rights and abilities for consumers to know what their
data is and how to correct it, how to control it, is definitely
something that needs to be addressed.
Chairman Crapo. Thank you.
Ms. Dixon?
Ms. Dixon. Let me share with you that I have been seeking
an answer to the question you just asked for about 27 years, so
here is what I have come up with, and it is just--it is my
opinion. What if the sectoral system were a feature, not a bug,
born from thoughtful deliberation about very focused issue
areas with a lot of buy-in? What if we have not been able to
pass comprehensive legislation because our system requires more
buy-in than other systems? These are just the hypotheses that I
am working with.
So if that is the case--and, also I have to tell you, I am
quite concerned about the deep disruption to privacy law that
would occur if there were massive preemption. But that being
the case, what if there were a way to do a surgical strike and
to
provide guardrails in the areas that need it the most, that
would fill in the sectoral gaps? That is what I am very
interested in.
So I am thinking of something that had really important
principles--fair information practice principles--and then
the adaptation of those principles for the gaps that exist. So
I do think that standards have been a neglected part of the
privacy conversation. I have no idea why we do not have more
standards in privacy.
This mobile phone has loads of standards that attach to it,
but for our privacy and for data brokers, where are the
standards? Well, let us create some. Let us start there. I am
all for starting cautiously and working with best practices,
but we need to give things teeth and abide by the larger principles.
So a nice amalgamation of all of the above, something that
is multifactorial. I do not think we have silver bullets
available to us anymore.
Chairman Crapo. Well, thank you. And just one other quick
question, and then I will turn to Senator Brown. We have talked
a lot about the problems we are trying to address here, whether
harm is caused by the use of data, whether credit is impacted,
whether people are redlined or denied access to products or
opportunities. It seems to me that when you approach the issue
from that perspective, which is a very legitimate approach,
there is another issue that is--I do not know if I would
call it a ``harm.'' Maybe it is. But there is simply a privacy
issue. A lot of Americans, I believe, do not want to have to
prove that they were harmed. They do not want people collecting
data on them, or they do not want certain data collected. It is
sort of the right to be forgotten or the right to opt out of
certain segments of data collection.
Is that a legitimate right that we should try to protect?
Ms. Dixon. It is a legitimate option that we need to be
able to have. Take the adversity score: I think any child who
is applying to college should be able to say, hey, wait, I do
not want my neighborhood being part of that. Do they have to
prove harm? I do not think they should have to. They should be
able to say, hey, no, this is not something I want. It is
legitimate.
Chairman Crapo. Dr. Cackley?
Ms. Cackley. I think that is right, that it is important
for people to be able to make a choice about what data they
share and what data they do not.
Chairman Crapo. Senator Brown.
Senator Brown. Thank you.
Ms. Dixon, this is the last round of questions, blessedly,
for both of you. And please be really brief on these because I
have several questions.
Should Federal regulators and supervisors have full access
to every company's predictive models so they can evaluate them
for bias and other legal compliance issues?
Ms. Dixon. I believe they would have to hire about a
million people if they did that. I am not sure of the answer to
that question, but I have a lot of thoughts on this, and I will
send you written follow-up.
Senator Brown. OK. That would be good, including if there
is a list of companies whose models you believe should be
available to regulators for review.
Ms. Dixon. I will send that to you.
Senator Brown. OK. A technology expert at our last hearing
stated, ``While our online economy depends on collection and
permanent storage of highly personal data, we do not have the
capacity to keep such large collections of user data safe over
time.'' Do you agree with that statement?
Ms. Dixon. I think it is very difficult to keep user data
safe 100 percent of the time.
Senator Brown. Should companies be required to expunge
certain types of user data after, say, 60 or 90 days?
Ms. Dixon. You know, I think there are very good arguments
for that, and there is a continuum for that. And I will respond
to that in writing.
Senator Brown. OK. Thanks. Do companies who currently use
personal data
for profit see existing penalties as little more than the cost
of doing business? That is often the case in this town, that a
few-million-dollar fine on a multi-billion-dollar company is
the cost of doing business. How strong do penalties and other
enforcement mechanisms need to be in order to hold these
companies accountable?
Ms. Dixon. I do not know the answer to that question.
However, I do think that having very good enforcement is an
important stick, and I think we need carrots and sticks to make
things right.
Senator Brown. Is holding executives personally accountable
one way?
Ms. Dixon. I do not know about that.
Senator Brown. Does that mean no or you just do not know?
Ms. Dixon. It means that I literally do not know the answer
to that.
Senator Brown. A technology expert at our last hearing
stated, ``While it is possible in principle to throw one's
laptop into the sea and renounce all technology, it is no
longer possible to opt out of a surveillance society.'' Do you
agree with that statement?
Ms. Dixon. Absolutely. I do not believe that an opt-out
village exists.
Senator Brown. So what would a meaningful consent contract
between users and tech companies or users and data brokers or a
meaningful opt-out policy look like?
Ms. Dixon. So it needs to be multifactorial and not just
rely on consent, because consent is a really difficult vehicle
for that. I have a lot of very complete thoughts on that, and I
will follow up in writing.
Senator Brown. OK. You are going to be busy in the next few
days.
Ms. Dixon. That is all right. I have a lot on this.
Senator Brown. And the last question. As you point out in
your testimony, household data can serve as a proxy for an
individual credit score. Some data that seems innocuous, like
Instagram posts, can actually yield predictive data about a
user's mental health. How do we know what data is inherently
sensitive and what data is innocuous but can become sensitive
when it is used to make predictions?
Ms. Dixon. Right. One of the most difficult things that I
have had to grapple with as a privacy expert and someone who
cares so much about privacy is that it is so difficult to say,
here, this is sensitive data, and here, this is not. It
is all becoming sensitive depending on how it is analyzed, and
that is why privacy protections have had to become much more
multifactorial and much more subtle in responding to this new
issue.
Senator Brown. In part, that movement, if you will, from
initially not sensitive to sensitive is a result of just the
power of--the quantity and quality of--computing power,
correct?
Ms. Dixon. We were in a digital era. We are really moving
into the predictive era, and it changes everything.
Senator Brown. OK. Very good.
Thank you, Mr. Chairman.
Chairman Crapo. Thank you, and that does conclude the
questioning for today's hearing.
For Senators who wish to submit questions for the record,
those questions are due to the Committee by Tuesday, June 18th,
and we ask the witnesses to respond to those questions as
quickly as you can once you receive them.
Again, we thank you both for not only your time here today
but the attention and analysis that you have given to this
issue and will give to the issue as we proceed.
With that, this hearing is adjourned.
[Whereupon, at 11:32 a.m., the hearing was adjourned.]
[Prepared statements, responses to written questions, and
additional material supplied for the record follow:]
PREPARED STATEMENT OF CHAIRMAN MIKE CRAPO
Providing testimony to the Committee today are experts who have
researched and written extensively on big data: Dr. Alicia Cackley,
Director of Financial Markets and Community Investment at the
Government Accountability Office; and Ms. Pam Dixon, Executive Director
of the World Privacy Forum.
As a result of an increasingly digital economy, more personal
information is available to companies than ever before.
I have been troubled by government agencies' and private companies'
collection of personally identifiable information for a long time.
There have been many questions about how individuals' or groups of
individuals' information is collected, with whom it is shared or sold,
how it is used and how it is secured.
Private companies are collecting, processing, analyzing and sharing
considerable data on individuals for all kinds of purposes.
Even more troubling is that the vast majority of Americans do not
even know what data is being collected, by whom and for what purpose.
In particular, data brokers and technology companies, including
large social media platforms and search engines, play a central role in
gathering vast amounts of personal information, and often without
interacting with individuals, specifically in the case of data brokers.
In 2013, the GAO issued a report on information resellers, which
includes data brokers, and the need for the consumer privacy framework
to reflect changes in technology and the marketplace.
The report noted that the current statutory consumer privacy
framework fails to address fully new technologies and the growing
marketplace for personal information.
The GAO also provided several recommendations to Congress on how to
approach the issue to provide consumers with more control over their
data.
In 2018--five years later--GAO published a blog summarizing its
2013 report, highlighting the continued relevance of the report's
findings.
The Federal Trade Commission also released a report in 2014 that
emphasized the big role of data brokers in the economy.
The FTC observed in the report that ``data brokers collect and
store billions of data elements covering nearly every U.S. consumer,''
and that ``data brokers collect data from numerous sources, largely
without consumers' knowledge.''
In her report ``The Scoring of America,'' Pam Dixon discusses
predictive consumer scoring across the economy, including the big role
that data brokers play.
She stresses that today, no protections exist for most consumer
scores similar to those that apply to credit scores under the Fair
Credit Reporting Act.
Dixon says, ``Consumer scores are today where credit scores were in
the 1950s. Data brokers, merchants, government entities and others can
create or use a consumer score without notice to consumers.''
Dr. Cackley has also issued several reports on consumer privacy and
technology, including a report in September 2013 on information
resellers, which includes data brokers.
She says in her report that the current consumer privacy framework
does not fully address new technologies and the vastly increased
marketplace for personal information.
She also discusses potential gaps in current Federal law, including
the Fair Credit Reporting Act.
The Banking Committee has been examining the data privacy issue in
both the private and public sectors, from regulators to financial
companies to other companies who gather vast amounts of personal
information on individuals or groups of individuals, to see what can be
done through legislation, regulation or by instituting best practices.
Enacted in 1970, the Fair Credit Reporting Act is a law in the
Banking Committee's jurisdiction which aims to promote the accuracy,
fairness and privacy of consumer information contained in the files of
consumer reporting agencies.
Given the exponential growth and use of data since that time, and
the rise of entities that appear to serve a similar function as the
original credit reporting agencies, it is worth examining how the Fair
Credit Reporting Act should work in a digital economy.
During today's hearing, I look forward to learning more about the
structure and practices of the data broker industry and technology
companies, such as large social media platforms; how the data broker
industry has evolved with the development of new technologies, and
their interaction with technology companies; what information these
entities collect, and with whom it is shared and for what purposes;
what gaps exist in Federal privacy law; and what changes to Federal
law, including the Fair Credit Reporting Act, should be considered to
give individuals real control over their data.
I appreciate each of you joining us today to discuss this important
issue.
______
PREPARED STATEMENT OF SENATOR SHERROD BROWN
I appreciate Chairman Crapo continuing these important, bipartisan
efforts to protect Americans' sensitive personal information.
Today, we're looking at a shadowy industry known as ``data
brokers.'' Most of you probably haven't heard of these companies. The
biggest ones include names like Acxiom, CoreLogic, Spokeo, ZoomInfo,
and Oracle. According to some estimates, 4,000 of these companies are
collecting and selling our private information, but not one of them was
willing to show up and speak in front of the committee today. Not one.
These companies expect to be trusted with the most personal and
private information you could imagine about millions of Americans, but
they're not even willing to show up and explain how their industry
works. I think that tells you all you need to know about how much they
want their own faces and names associated with their industry.
As Maciej Ceglowski told us at our last hearing, ``the daily
activities of most Americans are now tracked and permanently recorded
by automated systems at Google or Facebook.''
But most of that private activity isn't useful without data that
anchors it to the real world. Facebook, Google, and Amazon want to know
where you're using your credit cards, whether you buy name-brand
appliances, if you're recently divorced, and how big your life
insurance policy is. That's the kind of data that big tech gets from
data brokers, and they then combine it with your social media activity
to feed into their algorithms.
You might have noticed it seems like every product or service you
buy comes with a survey or a warranty card that asks for strangely
personal information. Why are all these nontech companies so interested
in your data?
It's simple--data brokers will pay those companies for any of your
personal information they can get their hands on, so they can turn
around and sell it to Silicon Valley. It's hard for ordinary consumers
to have any power when, unbeknownst to them, they're actually the
product being bought and sold.
It reminds me of a time when corporations that had no business
being in the lending industry decided to start making loans and selling
them off to Wall Street. Manufacturers or car companies decided that
consumer credit would be a great way to boost their profits. When big
banks and big tech companies are willing to pay for something, everyone
else will find a way to sell it to them, often with devastating
results.
For example, Amazon is undermining retailers and manufacturers
across the country through anti-competitive practices, and at the same
time, it's scooping up data from the very businesses it's pushing out
of the market.
Then there's Facebook--it has almost single-handedly undermined the
profitability of newspapers across the country. It's also gobbling up
personal information that The New York Times allows data brokers to
collect from its readers.
Just like in the financial crisis, a group of shadowy players sits
at the center of the market, exercising enormous influence over
consumers and the economy while facing little or no rules at all.
Chairman Crapo and I are committed to shining a light on these
companies, and to keeping an unregulated data economy from spiraling
out of control. I look forward to the witnesses' testimony, and to
continuing to work with Chairman Crapo in a bipartisan manner.
______
[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
RESPONSE TO WRITTEN QUESTION OF SENATOR MENENDEZ FROM ALICIA
PUENTE CACKLEY
Q.1. Can data brokers legally compile, aggregate, or sell data
that has been acquired through an illegal hack?
A.1. GAO has not conducted work to determine the extent to
which data brokers are collecting, compiling, aggregating, or
selling data that was acquired through illegal hacks, or the
legality of such actions. However, we reported in March 2019
(GAO-19-230) that, except in certain circumstances, companies
are generally not required to be transparent about the consumer
data they hold or how they collect, maintain, use, and secure
these data. Further, we recommended more than a decade ago that
Congress consider whether to expand more broadly the class of
entities explicitly required to safeguard sensitive personal
information, including considering whether information
resellers should be required to safeguard all sensitive
personal information they hold (GAO-06-674). Even so,
statutes like the Computer Fraud and Abuse Act provide some
protection by making the knowing unauthorized access of
computers a crime, and FTC has used its enforcement authority
to address some instances of unfair or deceptive behavior in
the sale of information or its use in advertising. Notably, in
2014, FTC alleged that a data broker sold hundreds of thousands
of loan applications that contained sensitive data, including
consumers' names, addresses, phone numbers, employers, Social
Security numbers, and bank account numbers (including routing
numbers) to entities that it knew had no legitimate need for
such data. FTC alleged that, as a result, at least one of those
purchasers used the information to withdraw millions of dollars
from consumers' accounts without their authorization. FTC and
the involved companies settled this case in 2016, which
included monetary judgments and a permanent ban for all
defendants on selling, transferring, or otherwise disclosing
consumers' sensitive personal information.
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR WARREN FROM ALICIA
PUENTE CACKLEY
Q.1. In response to the Equifax data breach, I opened an
investigation into the causes, impacts, and response to the
exposure of personal data of nearly 150 million Americans.
Equifax and other credit reporting agencies collect consumer
data without permission, and consumers have no way to prevent
their data from being collected and held by private companies.
My investigation found that Equifax failed to adopt standard
cybersecurity measures, in large part because Federal law
incentivizes pursuit of profits over the protection of
sensitive data.
Your written testimony notes, ``[The Fair Credit Reporting
Act (FCRA)] protects the security and confidentiality of
personal information collected or used to help make decisions about
individuals' eligibility for credit, insurance or employment.
FCRA limits resellers' use and distribution of personal
data.''\1\ This law, however, is not specifically designed to
address cybersecurity threats.\2\ In your view, how should
Federal regulators address this gap in the oversight and
enforcement of privacy safeguards?
---------------------------------------------------------------------------
\1\ Written testimony of Alicia Cackley to the U.S. Senate
Committee on Banking, Housing, and Urban Affairs, June 11, 2019,
https://www.banking.senate.gov/imo/media/doc/Cackley%20
Testimony%206-11-19.pdf.
\2\ Letter from Acting Federal Trade Commission Chair Maureen
Ohlhausen to Senator Elizabeth Warren, October 3, 2017.
---------------------------------------------------------------------------
A.1. There is currently no comprehensive Federal statute to
address consumer privacy, which is one reason that Federal
regulators are limited in their ability to address potential
gaps in current law. In a 2013 report (GAO-13-663), we
recommended that Congress consider updating the consumer
privacy framework to reflect the effects of changes in
technology and the marketplace--changes that have included new
and greater cybersecurity threats. Criteria for developing such
a framework could include the Fair Information Practice
Principles--and a key principle is that personal information
should be protected with reasonable security safeguards against
risk such as loss or unauthorized access, destruction,
modification, or disclosure.
Q.1.a. How would legislation to establish and provide Federal
authority and resources to monitor data security practices of
credit reporting agencies and data brokers benefit consumers?
A.1.a. Stronger Federal oversight of data security practices
could help to ensure that consumer reporting agencies and data
brokers better safeguard all sensitive personal information,
which could protect consumers from identity theft and other
effects of data breaches. To strengthen such oversight, our
February 2019 report on consumer reporting agencies (GAO-19-
196) recommended that Congress consider giving FTC civil
penalty authority to enforce Gramm-Leach-Bliley Act's (GLBA)
safeguarding provisions. In addition, we have long held that
data protections should apply broadly. For example, in 2006
(GAO-06-674), we noted that much of the personal information
maintained by information resellers that did not fall under
FCRA or GLBA was not necessarily required by Federal law to be
safeguarded, even when the information is sensitive and subject
to misuse by identity thieves. We therefore recommended that
Congress consider requiring information resellers to safeguard
all sensitive personal information they hold.
Q.1.b. In your view, would legislation to impose strict
liability penalties for breaches involving consumer data at
credit reporting agencies and data brokerages lead to
improvements in consumer data security? Would consumers benefit
if such penalties were imposed on data brokers?
A.1.b. GAO has not reviewed the issue of how strict liability
penalties for breaches involving consumer data at consumer
reporting agencies and other information resellers would affect
consumer data security or consumers. However, we have
highlighted the importance of providing agencies with civil
penalty authority, which can also be a strong enforcement tool.
In our February 2019 report on oversight of consumer reporting
agencies (GAO-19-196), we recommended that Congress consider
giving FTC civil penalty authority to enforce GLBA's
safeguarding provisions. Currently, to obtain monetary redress
for these violations, FTC must identify affected consumers and
any monetary harm they may have experienced. However, harm
resulting from privacy and security violations (such as a data
breach) can be difficult to measure and can occur years in the
future, making it difficult to trace a particular harm to a
specific breach. FTC currently lacks a practical enforcement
tool for imposing civil money penalties that could help to
deter companies from violating data security provisions of GLBA
and its implementing regulations. Such deterrence could benefit
consumers because companies may be motivated to develop
stronger procedures for data security that would protect
consumer data from theft and security breaches.
Q.2. Despite there being laws in place to regulate consumer
credit reporting, your written testimony notes that there are
``no Federal laws designed specifically to address all the
products sold and information maintained by [data
brokers].''\3\ Given the limited ability of individuals to
access, control, and correct their personal data, as well as
the limited legal framework to regulate data brokers, would the
inadequacy of current laws be addressed by regulating data
brokers under the Fair Credit Reporting Act?
---------------------------------------------------------------------------
\3\ Written testimony of Alicia Cackley to the U.S. Senate
Committee on Banking, Housing, and Urban Affairs, June 11, 2019,
https://www.banking.senate.gov/imo/media/doc/Cackley%20
Testimony%206-11-19.pdf.
---------------------------------------------------------------------------
A.2. GAO has not conducted work specifically assessing the
advantages and disadvantages of regulating all information
resellers (data brokers) under the Fair Credit Reporting Act.
In 2013 (GAO-13-663), we noted gaps in Federal privacy law--
including that it did not always cover consumer information
used by information resellers for marketing purposes or other
uses not covered by provisions of the Fair Credit Reporting
Act. We recommended that Congress consider strengthening the
consumer privacy framework to address these gaps, but we did
not recommend a specific regulatory scheme for doing so.
Q.2.a. Credit reporting agencies make billions of dollars
collecting and selling information about consumers, but
consumers have little ability to control how their personal
information is collected and used by these agencies. How would
legislation to give consumers more control over personal
financial data and to create a uniform, Federal process for
obtaining and lifting credit freezes benefit consumers? Would
consumers benefit if such legislation also applied to currently
unregulated parts of the industry, such as data brokerages?
A.2.a. While consumers currently do not have a uniform, Federal
process for credit freezes, the Economic Growth, Regulatory
Relief, and Consumer Protection Act required the three
nationwide consumer reporting agencies to place and lift
freezes at no cost to the consumer. Freezes must be placed
within 1 business day, and lifted within 1 hour, of receiving a
telephone or electronic request. However, consumers must
contact each of the three agencies individually and request the
freeze. Consumers obtain a PIN from each company, which enables
them to lift or remove a freeze at a later date. Before the
2018 Act, consumers typically had to pay $5-$10 per agency to
place a credit freeze. In our March 2019 report (GAO-19-230) on
data breaches and limitations of identity theft services, some
experts had noted cost and inconvenience as some of the
limitations to a credit freeze.\4\ The new law addresses these
concerns to some degree by making credit freezes free and
requiring these consumer reporting agencies to lift freezes
expeditiously on request.
---------------------------------------------------------------------------
\4\ GAO, Data Breaches: Range of Consumer Risks Highlights
Limitations of Identity Theft Services, GAO-19-230 (Washington, DC:
March 27, 2019).
---------------------------------------------------------------------------
In terms of less-regulated segments of the information
reseller industry--most notably, companies or data not covered
by FCRA--our 2013 recommendation to Congress (GAO-13-663)
suggested updating the consumer privacy framework in ways that
could address this gap. In particular, two key elements we said
such legislation should consider are (1) the adequacy of
consumers' ability to access, correct, and control their
personal information in circumstances beyond those currently
accorded under FCRA; and (2) whether there should be additional
controls on the types of personal or sensitive information that
may or may not be collected and shared.
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR SCHATZ FROM ALICIA
PUENTE CACKLEY
Q.1. Are data sets collected by data brokers getting into the
bloodstream of credit, employment, and housing decision
making, in a way that evades FCRA?
A.1. GAO has not conducted work to determine the extent to
which information collected by data brokers is being used to
make credit, employment, and housing decisions in ways that do
not comply with the Fair Credit Reporting Act (FCRA). However,
in a 2018 report on financial technology (GAO-19-111), we
evaluated consumer protection issues related to FinTech
lenders' use of alternative data--that is, data not
traditionally used by the national consumer reporting agencies
in calculating a credit score--to make loan decisions.\1\ Five
of the 11 FinTech lenders we interviewed said they used
alternative data to supplement traditional data when making a
credit decision, with one using it exclusively. These lenders
told us that they obtain the data from borrowers, data
aggregators, national databases, or other sources. Consumers
may face risk of harm due to inaccurate credit assessments when
FinTech lenders use alternative data to underwrite loans.
Inaccurate data or models could classify borrowers as higher
credit risks than they actually are. This could result in those
borrowers paying unnecessarily high interest rates (and
increase risk of default), or it could result in creditworthy
borrowers being denied credit. While FCRA requires that
borrowers have an opportunity to check and correct inaccuracies
in their credit reports, borrowers could face challenges
checking and correcting alternative data, which typically are
not shown in credit reports. Further, it may not be transparent
to consumers and regulators what specific information
alternative credit-scoring systems use, how such use affects
consumers, and what consumers might do to improve credit access
and pricing.
---------------------------------------------------------------------------
\1\ GAO, Financial Technology: Agencies Should Provide
Clarification on Lenders' Use of Alternative Data, GAO-19-111
(Washington, DC: Dec. 19, 2018).
---------------------------------------------------------------------------
Q.2. Under current law, do companies that collect and sell
information about consumers have any duty to consumers about
how that information will be used?
A.2. The legal obligation to consumers related to the use of
consumer information varies based on the content and context of
that use. No comprehensive Federal privacy law governs the
collection, use, and sale of personal information by private-
sector companies. While there are Federal laws addressing
commercial privacy issues, they are generally narrowly tailored
to specific purposes, situations, types of information, or
sectors or entities--such as data related to financial
transactions, personal health, and eligibility for credit.
These laws include provisions that can restrict how certain
companies use consumer information they collect or sell--by,
for example, limiting the disclosure of certain types of
information to a third party without an individual's consent.
For example, FCRA--which applies to personal information
used for certain eligibility determinations--gives consumers
the right, among other things, to opt out of allowing consumer
reporting agencies to share their personal information with
third parties for prescreened marketing offers. Another example
is the Gramm-Leach-Bliley Act, which imposes certain sharing
and disclosure restrictions on financial institutions or
entities that receive nonpublic personal information from such
institutions. For instance, a third party that receives
nonpublic personal information from a financial institution to
process consumers' account transactions generally may not use
or resell the information for marketing purposes. Similarly,
other laws, such as the Health Insurance Portability and
Accountability Act of 1996 and the Children's Online Privacy
Protection Act of 1998, also restrict how consumer information
can be used, but they too apply narrowly to specific entities
or types of information.
Q.3. If consumers are discriminated against or harmed because
of how that data is used, who is responsible?
A.3. While the responsible party, if any, is going to vary
based on the facts and circumstances of each case, our January
2019 report on internet privacy (GAO-19-52) examined some
examples of Federal Trade Commission (FTC) enforcement actions
taken against companies related to internet privacy.\2\ In
these enforcement actions FTC alleged each company's practices
were unfair, deceptive, a violation of the Children's Online
Privacy Protection Act (COPPA), a violation of a settlement
agreement, or a combination of these reasons. In that report we
found that between July 1, 2008, and June 30, 2018, FTC filed
101 internet privacy enforcement actions, 15 of which included
COPPA enforcement actions against a variety of companies. Of
the 101 internet privacy actions, we reported that 51 involved
internet content providers, 21 involved software developers, 12
involved the sale of information or its use in advertising, 5
involved manufacturers, 1 involved an internet service
provider, and 11 involved a variety of different products, such
as those provided by rent-to-own companies or certification
services. In nearly all 101 cases companies settled with FTC,
which required the companies to make changes in their policies
or practices as part of the settlement. We reported that during
that 10-year period, FTC levied civil penalties against 15
companies for alleged violations of COPPA regulations totaling
$12.7 million. These civil penalties ranged from $50,000 to $4
million with an average amount of $847,333. We also reported
that FTC can seek to compel companies to provide monetary
relief to those they have harmed and during that period FTC
levied civil penalties against companies for violations of
consent decrees or obtained monetary relief to consumers from
companies for a total of $136.1 million. These payment orders
ranged from $200,000 to $104.5 million and the average amount
was $17 million.\3\
---------------------------------------------------------------------------
\2\ GAO, Internet Privacy: Additional Federal Authority Could
Enhance Consumer Protection and Provide Flexibility, GAO-19-52
(Washington, DC: Jan. 15, 2019).
\3\ However, this sum does not represent the amount of money that
consumers actually received or that was forfeited to the U.S. Treasury.
In some cases, including the payment order for $104.5 million, FTC
suspended the judgment because of the defendants' inability to pay.
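---------------------------------------------------------------------------
    [As an illustrative aside, the figures above can be cross-
checked arithmetically. The following minimal Python sketch is
not part of the GAO report; every number in it is taken from
the answer text, and the category labels are shortened for
brevity.]

    # Consistency check on the enforcement figures cited above.
    counts = [51, 21, 12, 5, 1, 11]   # content providers, software
                                      # developers, info sale/ads,
                                      # manufacturers, ISP, other
    assert sum(counts) == 101         # matches the 101 actions reported

    # 15 COPPA penalties with a stated average of $847,333 imply a
    # total of roughly $12.7 million, as reported.
    implied_total = 15 * 847_333
    print(f"Implied COPPA penalty total: ${implied_total:,}")
    # Implied COPPA penalty total: $12,709,995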
Q.4. If a data broker is breached and a consumer suffers harm
from identity theft, who is liable?
A.4. As with the broader case of consumer harm, liability in
identity theft cases is a matter of the facts and circumstances
of each individual case. GAO has not examined liability
specifically with regard to data breaches. However, as noted
above, in our January 2019 report (GAO-19-52) we found that 12
of FTC's internet privacy enforcement actions between July 1,
2008, and June 30, 2018, involved the sale of information or
its use in advertising. Notably, in 2014, FTC alleged that a
data broker sold hundreds of thousands of loan applications
that contained sensitive data, including consumers' names,
addresses, phone numbers, employers, Social Security numbers,
and bank account numbers, including the bank routing numbers,
to entities that it knew had no legitimate need for such
data.\4\ FTC alleged that, as a result, at least one of those
purchasers used the information to withdraw millions of dollars
from consumers' accounts without their authorization. FTC and
the involved companies settled this case in 2016, which
included monetary judgments and a permanent ban for all
defendants on selling, transferring, or otherwise disclosing
consumers' sensitive personal information without consent.\5\
---------------------------------------------------------------------------
\4\ See Complaint, Federal Trade Commission v. Sitesearch
Corporation, dba LeapLab et al., No. 2:14-cv-02750-NVW (D. Ariz. Dec.
22, 2014), https://www.ftc.gov/system/files/documents/cases/
141223leaplabcmpt.pdf; see also Complaint, Federal Trade Commission v.
Ideal Financial Solutions, Inc., et al., No. 2:13-cv-00143-MMD-GWF (D.
Nev. Jan. 28, 2013), https://www.ftc.gov/sites/default/files/documents/
cases/2013/02/130220ifscmpt.pdf.
\5\ See Stipulated Final Order for Permanent Injunction and
Settlement of Claims, Federal Trade Commission v. Sitesearch
Corporation, dba LeapLab, a Nevada corporation; et al., No. CV-14-
02750-PHX-NVW (D. Ariz., Feb. 5, 2016), https://www.ftc.gov/system/
files/documents/cases/160218leaplaborder_0.pdf; see also Order Granting
in Part Motion for Summary Judgment and Motion for Default Judgment,
Entering Final Judgment, and Closing Case, Federal Trade Commission v.
Ideal Financial Solutions, Inc., et al., No. 2:13-cv-00143-JAD-GWF (D.
Nev. Feb. 23, 2016), https://www.ftc.gov/system/files/documents/cases/
160309ideal
financialorder.pdf.
---------------------------------------------------------------------------
Q.5. Do you think Federal law should require companies that
collect and use consumer data to take reasonable steps to
prevent unwanted disclosures of data and not use data to the
detriment of those consumers?
A.5. While GAO has not taken a position on whether Federal law
should require all companies to take measures to protect all
consumer data and to not use that data to the detriment of
consumers, we have previously recommended in GAO-13-663 that
Congress consider strengthening the current consumer privacy
framework. In making our recommendation, we noted that current
privacy law is not always aligned with the Fair Information
Practice Principles. One of these principles directly addresses
unwanted disclosures: ``security safeguards'' is the principle
that personal information should be protected with reasonable
security safeguards against risks such as loss or unauthorized
access, destruction, use, modification, or disclosure. Other
principles address not using a consumer's data to the detriment
of that consumer: for example, ``use limitation'' is the
principle that data should not be used for other than a
specified purpose without consent of the individual or legal
authority.
In addition, GAO has made a number of specific
recommendations for modifying Federal law that relate to
protecting consumer data held by private companies.
LIn May 2019 (GAO-19-340), we recommended that
Congress consider providing the Internal Revenue
Service (IRS) with explicit authority to establish
security requirements for paid tax return preparers'
and Authorized e-file Providers' systems.\6\
\6\ GAO, Taxpayer Information: IRS Needs to Improve Oversight of
Third-Party Cybersecurity Practices, GAO-19-340 (Washington, DC: May 9,
2019).
LIn February 2019 (GAO-19-196), we recommended that
Congress consider providing the Federal Trade
Commission with civil penalty authority for the
safeguarding provisions of the Gramm-Leach-Bliley Act,
which would help the agency act against data security
violations by financial institutions.\7\
---------------------------------------------------------------------------
\7\ GAO, Consumer Data Protection: Actions Needed to Strengthen
Oversight of Consumer Reporting Agencies, GAO-19-196 (Washington, DC:
Feb. 21, 2019).
LIn June 2006 (GAO-06-674), we recommended that
Congress consider requiring information resellers to
safeguard all sensitive personal information they
hold--not just information covered under the
safeguarding provisions of the Fair Credit
Reporting Act and Gramm-Leach-Bliley Act.\8\
---------------------------------------------------------------------------
\8\ GAO, Personal Information: Key Federal Privacy Laws Do Not
Require Information Resellers to Safeguard All Sensitive Data, GAO-06-
674 (Washington, DC: June 26, 2006).
---------------------------------------------------------------------------
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR CORTEZ MASTO FROM
ALICIA PUENTE CACKLEY
Q.1. What does it mean for financial markets now that FINRA can
essentially predict and decide on investor behavior in real
time or near real time? What does it mean for other financial and
technical sectors?
A.1. In a March 2018 GAO forum (GAO-18-142SP), we highlighted
the use of artificial intelligence (AI) in financial services,
including market surveillance oversight activities.\1\ At the
time of the forum, the Financial Industry Regulatory Authority
(FINRA) was developing a prototype AI-based system, called the
Dynamic Surveillance Platform, which used supervised machine
learning capabilities to learn and detect different patterns of
market anomalies to enhance the ability to detect instances of
potential illegal manipulation of the securities and options
markets. With new AI-based tools, as well as future data
enhancements to increase the visibility of each trading
transaction offered by a new consolidated audit trail being
developed, regulators were hopeful that employing machine
learning capabilities would help identify future intentional
manipulation of the markets.
---------------------------------------------------------------------------
\1\ GAO, Technology Assessment: Artificial Intelligence: Emerging
Opportunities, Challenges, and Implications, GAO-18-142SP (Washington,
DC: March 28, 2018).
---------------------------------------------------------------------------
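    [As an illustrative aside, the following minimal Python
sketch shows the general idea of supervised learning for trade
surveillance. It is hypothetical: the feature names, numbers,
and nearest-centroid rule are invented for illustration and do
not describe FINRA's actual Dynamic Surveillance Platform.]

    # Supervised anomaly detection, in miniature: learn from
    # labeled trades, then flag new trades that look anomalous.
    def centroid(rows):
        """Component-wise mean of a list of feature vectors."""
        return [sum(r[i] for r in rows) / len(rows)
                for i in range(len(rows[0]))]

    def dist_sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Features: [order-to-trade ratio, cancel rate, price impact]
    labeled = [([2.0, 0.1, 0.01], "normal"),
               ([3.0, 0.2, 0.02], "normal"),
               ([40.0, 0.9, 0.20], "anomalous"),
               ([35.0, 0.8, 0.15], "anomalous")]
    normal_c = centroid([f for f, y in labeled if y == "normal"])
    anom_c = centroid([f for f, y in labeled if y == "anomalous"])

    def classify(trade):
        """Label a new trade by its nearer class centroid."""
        return ("anomalous"
                if dist_sq(trade, anom_c) < dist_sq(trade, normal_c)
                else "normal")

    print(classify([38.0, 0.85, 0.18]))  # anomalous
    print(classify([2.5, 0.15, 0.01]))   # normal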
During the forum, industry participants and regulators
highlighted both benefits and challenges offered by the use of
AI tools in the marketplace. Benefits included enhanced
surveillance monitoring (by an entity internally as well as
externally by financial regulators) and tools to better detect
and prevent improper market conduct and enforce existing laws
and regulations in the marketplace. At the same time,
challenges and growing pains associated with technological
advances of AI-based tools also exist. For instance, banking
regulators and other industry observers said that banks are
reluctant to move quickly in implementing AI tools for lending
operations due to concerns about meeting requirements under
existing laws and regulations (e.g., requirements stemming from
fair lending laws that prohibit discriminatory practices on
lending, whether intentional or not, based on race, gender,
color, religion, national origin, marital status, or age).
Q.2. What are some of the gaps in currently existing law with
respect to how enforcement agencies deal with this multitude of
laws and what should we be thinking about in the Banking
Committee as we prepare to potentially consider broader privacy
legislation drafted by the Commerce Committee?
A.2. Many existing privacy statutes in the United States were
developed before the advent of many current technologies and
before companies were collecting and sharing such vast
quantities of consumer personal information. We reported in a
2013 review of information resellers (GAO-13-663) that we
believed that gaps exist in the current statutory privacy
framework, and we believe this remains true today.\2\ In
particular, the current framework does not fully address
changes in technology and marketplace practices that
fundamentally have altered the nature and extent to which
personal information is being shared with third parties.
Moreover, while current laws protect privacy interests in
specific sectors and for specific uses, consumers generally
have little control over how their information is collected,
used, and shared with third parties for marketing purposes.
---------------------------------------------------------------------------
\2\ GAO, Information Resellers: Consumer Privacy Framework Needs to
Reflect Changes in Technology and the Marketplace, GAO-13-663
(Washington, DC: Sept. 25, 2013).
---------------------------------------------------------------------------
If Congress considers broader privacy legislation to
strengthen the consumer privacy framework, we believe that
among the issues that should be considered are:
Lthe adequacy of consumers' ability to access,
correct, and control their personal information in
circumstances beyond those currently accorded under the
Fair Credit Reporting Act;
Lwhether there should be additional controls on the
types of personal or sensitive information that may or
may not be collected and shared;
Lchanges needed, if any, in the permitted sources
and methods for data collection; and
Lprivacy controls related to new technologies, such
as web tracking and mobile devices.
At the same time, we recognize that different legislative
approaches to improving privacy involve tradeoffs and believe
that any strengthened privacy framework should also seek not to
unduly inhibit the benefits to consumers, commerce, and
innovation that data sharing can accord.
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR MENENDEZ FROM PAM
DIXON
Q.1. In the hearing, you stated it is of ``grave concern'' that
data not covered by HIPAA is ending up in the hands of data
brokers.
Q.1.a. Are medical billing companies selling non-HIPAA data to
brokers?
A.1.a. We are most familiar with third-party medical billing
companies that inappropriately use HIPAA data for fraudulent
purposes. We are less familiar with medical billing companies
selling non-HIPAA data. The risk of HIPAA data misuses,
however, is significant by itself.
One major pattern is for medical billing companies to use
HIPAA data fraudulently to bill Medicare/Medicaid directly,
apart from their original billing tasks. In another model,
medical billers may simply overcharge for services. These
activities are a form of medical identity theft, and they
typically result in fraudulent changes to the health file. The
Office of the Inspector General wrote a brief but seminal
report about billing companies in March 2000.\1\ In the report,
the OIG
noted the complex problems with medical billing, including
problems with transparency and auditing. There continue to be
many cases relating directly to problems with medical
billers.\2\
---------------------------------------------------------------------------
\1\ See https://oig.hhs.gov/oei/reports/oei-05-99-00100.pdf.
\2\ See, for example, the 2015 Medicaid case: https://
www.justice.gov/usao-wdnc/pr/owner-medical-billing-company-indicted-
health-care-fraud-and-aggravated-identity-theft; and the more recent
case from July 2019: https://www.justice.gov/usao-sdoh/pr/medical-
billing-company-owner-sentenced-prison-health-care-fraud.
---------------------------------------------------------------------------
OIG has established voluntary compliance guidance for
medical billing, but the guidance dates from 1998.\3\ The
Healthcare Business Management Association (HBMA) has
established medical billing credentialing and training for
companies, which currently function as a set of best
practices.\4\ We believe much more can be done here; for
example, we would like to see many more credentialed members of
HBMA, and more encouragement from Congress for either
certification or some additional form of oversight for medical
billing companies.
---------------------------------------------------------------------------
\3\ See https://www.oig.hhs.gov/fraud/docs/complianceguidance/
thirdparty.pdf.
\4\ See https://www.hbma.org/content/certification/hbma-compliance-
accreditation-program/accredited-companies.
---------------------------------------------------------------------------
Medical billing oversight deserves an update from OIG and from
Congress; it would be a particularly productive area for reform.
Q.1.b. How pervasive of a problem is medical identity theft?
A.1.b. We first identified medical identity theft as a problem
in testimony to NCVHS in 2005, then wrote the first known
report on the topic in 2006.\5\ We continue to research the
field, and can now give you precise quantifications of the
problem, State by State.
---------------------------------------------------------------------------
\5\ See https://www.worldprivacyforum.org/2006/05/report-medical-
identity-theft-the-information-crime-that-can-kill-you/.
---------------------------------------------------------------------------
In January 2020 we will publish our State of Medical
Identity Theft report, which follows our 2017 Geography of
Medical Identity Theft report.\6\ Accompanying that report, we
published an interactive data visualization of medical identity
theft in the United States, by State.\7\
---------------------------------------------------------------------------
\6\ See https://www.worldprivacyforum.org/2017/12/new-report-the-
geography-of-medical-identity
theft/.
\7\ World Privacy Forum, Medical Identity Theft Mapped by State:
Data Visualization. https://www.worldprivacyforum.org/2017/12/medical-
identity-theft-reports-to-the-consume-financial-protection-bureau/.
---------------------------------------------------------------------------
In our 2020 report, we again have found pervasive incidents
of medical identity theft across the United States, with some
States showing more serious problems. We have included two
screen shots of our pre-publication data to give you a visual
view of the numbers. The numbers from 2013-2018 are final, and
the numbers for 2019 run to Dec. 1. Our January report with the
final 2019 numbers will have nearly identical statistics to the
screenshots attached here.
As you can see from the data, medical identity theft is now
present in all States. The data have been adjusted for
population rate. We note persistent patterns of medical
identity theft through the southeastern corridor, with hot
spots in Texas, Georgia, Florida, South Carolina, and Nevada.
We note that New Jersey was a hot spot, but has seen
improvement in recent years, as has Illinois.
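    [As an illustrative aside, the population-rate adjustment
mentioned above can be shown with a minimal Python sketch; the
State names, complaint counts, and populations below are
invented for illustration and are not WPF data.]

    # Normalize raw complaint counts to a rate per 1 million
    # residents so States of different sizes can be compared.
    states = {"State A": (450, 8_900_000),   # (complaints, population)
              "State B": (120, 1_100_000)}
    for name, (complaints, population) in states.items():
        rate = complaints / population * 1_000_000
        print(f"{name}: {rate:.1f} complaints per 1M population")
    # State B's smaller raw count yields the higher per-capita rate.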
[GRAPHIC NOT AVAILABLE IN TIFF FORMAT: Medical Identity Theft
Complaints, 2013-2019; Medical Identity Theft Complaints, 2019,
Rate per 1 Million Population]
Q.1.c. When patients are victims of medical identity theft,
what recourse do they have to correct errors on their files?
A.1.c. Patients can use their rights under the FCRA to correct
the financial aspects of their healthcare provider records.
However, patients do not have commensurate rights under HIPAA
to delete or correct errors in their medical records. Under
HIPAA, patients can request the addition of an amendment to
their records. An amendment request does not have to be honored
by the healthcare
provider. Amendment requests do not mandate the removal or
correction of information; they simply allow consumers to
dispute the information. Healthcare providers typically do not
delete information in a health file.
There are some workarounds. A responsible healthcare
provider can remove inaccurate information from a patient's
record and leave only a numeric cross reference to the
information introduced by the fraudulent activities. For
example, if a patient was fraudulently billed for having
cancer, the patient's health record would reflect that error.
The healthcare provider could remove that and other related
information introduced by the fraudulent activity, and
sequester it into a new ``John or Jane Doe'' file, leaving only
a numeric cross reference. This is one of the several best
practices for handling errors in records resulting from medical
identity theft.
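    [As an illustrative aside, the sequestration workaround
described above can be sketched in minimal Python; the record
fields and identifiers are invented for illustration and do not
reflect any actual health record system.]

    # Move fraudulent entries out of the patient's file into a
    # "Jane Doe" file, leaving only a numeric cross reference.
    patient = {"id": "P-1001", "entries": [
        {"num": 1, "dx": "annual physical", "fraudulent": False},
        {"num": 2, "dx": "cancer (impostor's bill)", "fraudulent": True}]}
    doe_file = {"id": "Jane Doe", "entries": []}

    def sequester(record, doe):
        """Replace fraudulent entries with numeric cross references."""
        cleaned = []
        for e in record["entries"]:
            if e["fraudulent"]:
                doe["entries"].append(e)
                cleaned.append({"cross_ref": e["num"]})  # pointer only
            else:
                cleaned.append(e)
        record["entries"] = cleaned

    sequester(patient, doe_file)
    print(patient["entries"])  # fraud entry replaced by {'cross_ref': 2}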
However--this issue needs to be addressed legislatively so
that there is a national standard for how to assist victims in
correcting their health records after medical identity theft
has introduced errors. Ultimately, a national-level solution
will improve data for the entire health system as well as help
victims. This is a gap that needs to be addressed.
Q.1.d. Typically, how often do these cases go unresolved?
A.1.d. Anecdotally, many cases go unresolved. We are aware of
many patients over the years who have chosen to ignore the
problems, because they simply could not resolve them. Part of
the way we know this is from ongoing phone calls over the years
since the first publication of our report in 2006. We have
found that there is a high degree of variability in healthcare
providers' responses. We believe a uniform procedure for
correction could improve outcomes for victims and providers
alike.
Q.2. You also mentioned that we need to do more to ensure that
consumers are notified when a data broker suffers a breach that
exposes consumers' sensitive information.
Q.2.a. Given that data brokers often do not have a direct
relationship with consumers, what do you think is the best way
for Congress to ensure that consumers are notified when their
data is exposed by a breach?
A.2.a. Data brokers should have specific requirements to make
breach notification to consumers. It is not reasonable for
data brokers to claim they cannot find a way to contact
consumers who are not their direct customers when they
nevertheless hold lists and APIs filled with highly
identifiable personal data on these same consumers, including
email addresses, home addresses, phone numbers, and sometimes
social media handles.
data brokers have the information on hand to make appropriate
breach notification--even those that do not have a direct
relationship to the consumer.
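    [As an illustrative aside, the following minimal Python
sketch shows that a broker's own records typically contain the
contact channels needed for breach notice; the field names and
records are invented for illustration.]

    # Derive notification channels from the breached records
    # themselves; no direct customer relationship is needed.
    breached = [
        {"name": "J. Doe", "email": "jdoe@example.com",
         "home_address": "123 Main St", "phone": "555-0100"},
        {"name": "R. Roe", "email": None,
         "home_address": "456 Oak Ave", "phone": None}]

    def channels(rec):
        """List every usable contact channel present in a record."""
        out = []
        if rec.get("email"):
            out.append(("email", rec["email"]))
        if rec.get("home_address"):
            out.append(("mail", rec["home_address"]))
        if rec.get("phone"):
            out.append(("sms", rec["phone"]))
        return out

    for rec in breached:
        print(rec["name"], channels(rec))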
Q.2.b. Is there a way for consumers to better control how their
data is shared with brokers, perhaps by requiring some sort of
affirmative consent?
A.2.b. Requiring consent in some circumstances and providing a
uniform opt-out with enforcement procedures and penalties for
noncompliance would be helpful for better controlling data
management among data broker companies.
Currently, there is not a uniform, comprehensive, or simple
way for consumers to control how their data is shared with
brokers, nor to opt out. Not all data brokers provide an opt
out, and the opt outs that do exist can be difficult for most
consumers to find. Opting out of all data brokers operating in
the United States is not possible today. Even if it were
possible, most consumers would need to spend an extraordinary
amount of time to find and request data broker opt outs. A
central data broker
registration point would be helpful to solve this problem.
Vermont passed a modest but important data broker
registration law that did not include opt-out requirements.
However, the registration law is still helpful so that
consumers know what data brokers are operating in their State.
A handful of other States have passed some limited opt-out
requirements, for example, some States allow members of the
judiciary and law enforcement the right to opt out of data
broker databases.
Both data broker registration and opt-out requirements have
roles to play in improving consumer control.
Q.3. The World Privacy Forum's website says ``Some commercial
data brokers allow some categories of consumers to opt out of
some limited uses and disclosures of personal information.''
That quote does not inspire confidence in consumers that they
have control over their data.
Q.3.a. Does the data broker industry have a comprehensive and
uniform opt-out policy for consumers?
A.3.a. No. The data broker industry does not have a uniform or
comprehensive opt-out policy for consumers. The data broker
industry has a poor record of how they handle opt outs. Here
are some of the key issues:
LOpt-outs often require additional identity
information, including digital scans of Government IDs,
which consumers are rightly concerned about giving to a
data broker.
LSome sites charge opt-out fees. For example, the
DMA charges a fee to consumers to opt out. Consumers
should be able to opt out free of charge.
LData brokers--many of them--make the opt-outs so
difficult that the hurdle is too high for any but the
most persistent and determined consumer. See the FTC
complaint we wrote regarding this issue.\8\ There
are also a lot of nudges to redirect people from opting
out.
---------------------------------------------------------------------------
\8\ See https://www.worldprivacyforum.org/2009/04/public-comments-
request-for-declaration-regarding-fairness-of-opt-out-methods-and-
investigation-into-acxiom-ussearch-publicrecordsnow-and-usa-people-
search-consumer-opt-outmethods-for-compliance-with/.
---------------------------------------------------------------------------
LWe have worked with many survivors of crime and
domestic violence regarding data broker issues. When we
work with individuals to try to opt out, we find that
it takes people about 40 hours on average to get
through all of the opt-outs. And that is a first pass
of just the larger data brokers that do allow opt-outs.
LNot all opt-outs ``take.'' The rates for opt-out
failure vary widely by site.
LFCRA compliance among data brokers is woefully low;
data brokers that are offering background checks often
disclaim responsibility by noting that consumers can
only search for themselves. How are these sites
ensuring no FCRA violations are occurring? Where is the
oversight on this?
LAnd on top of all of this, can consumers even find
all of the data brokers to opt-out from?
Q.3.b. What is the best approach for giving consumers power
over their data given that current data broker opt-out options
are ``quite limited'' and that it is nearly impossible to tell
the effect an opt-out will actually have?
A.3.b. First, it is important to institute multifactorial
solutions. Data brokers present complex problems and challenges
for consumers. There is no single ``silver bullet'' solution
that will capture everything.
Second, there are many small solutions which, if put in
place, would facilitate meaningful improvements for consumers
regarding data brokers. If a thoughtful grouping of these
solutions were enacted together (opt-out plus registration plus
data breach requirements plus oversight, et cetera), it would
be helpful.
Third, self-regulation has utterly failed in the data
broker industry. We do not need to spend any more time on it;
it has not worked, and it is not likely to work.
Fourth, data brokers have many business models. It is a
complex sector, and the definitional boundaries are challenging
to set. There is no longer a single definition of a data
broker. It makes sense at this point to consider a variety of
regulatory strategies to match the type of data broker. For
example, People Search data brokers should be required to
provide opt-outs to consumers. Data brokers creating aggregate
credit scores should be subject to the FCRA in their uses of
household-modeled scores. (The FCRA will need to be expanded
for this to happen.)
Solutions that will help:
1. LLegislation that requires data brokers to not use or
disclose consumer data for any fraudulent or criminal
purpose, and requires data brokers to not use consumer
data in a discriminatory way or for any discriminatory
purpose.
2. LLegislation requiring data brokers to provide an opt-out
to consumers. All People Search data brokers should be
required to provide an opt-out.
3. LLegislation mandating a comprehensive, unified opt-out in
content and format.
4. LLegislation providing for a unified registry of all
categories of data brokers (the Vermont State statute is
an exemplar).
5. LExpansion of the FCRA's definitions of eligibility to
ensure that household or aggregate credit scoring and
other meaningful consumer scores are regulated.
6. LLegislation that requires all data brokers to provide
data breach notification to consumers.
7. LLegislation that requires data brokers to maintain
security standards, actively sets requirements for
meeting security targets and benchmarks, and requires
demonstrated security improvements.
Q.3.c. What happens to a consumer's data once they have opted
out?
A.3.c. Consumers' data, after they have placed an opt-out
request, is most frequently suppressed in some way. The opt-out
data is frequently still held by the data broker, but when data
brokers ``suppress'' the data, they do not allow it to be
visible to the public for a period of time.
A number of data brokers require opt-outs to be repeated
after a period of time, and there are no rules of the road for
what period of time will be involved. It can be 1 year, 2
years, 3 years, et cetera. Consumers are on their own to keep
track of how often they will have to go through the opt-out
process.
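    [As an illustrative aside, the suppression behavior
described above can be sketched in minimal Python; the field
names and the 1-year window are invented for illustration.]

    # The broker keeps the record but hides it from public-facing
    # products until the suppression window lapses, after which
    # the consumer must opt out again.
    from datetime import date, timedelta

    record = {"name": "J. Doe",
              "data": {"home_address": "123 Main St"},  # still held
              "suppressed_until": date.today() + timedelta(days=365)}

    def publicly_visible(rec, today=None):
        """A record resurfaces once its suppression date passes."""
        today = today or date.today()
        until = rec["suppressed_until"]
        return until is None or today > until

    print(publicly_visible(record))                    # False: hidden
    print(publicly_visible(record, date(2099, 1, 1)))  # True: resurfaces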
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR WARREN FROM PAM DIXON
Q.1. In response to the Equifax data breach, I opened an
investigation into the causes, impacts, and response to the
exposure of personal data of nearly 150 million Americans.
Equifax and other credit reporting agencies collect
consumer data without permission, and consumers have no way to
prevent their data from being collected and held by private
companies. My investigation found that Equifax failed to adopt
standard cybersecurity measures, in large part because Federal
law incentivizes pursuit of profits over the protection of
sensitive data.
Q.1.a. Your written testimony notes, ``Credit scores and
predictions are being sold that are not regulated by [The Fair
Credit Reporting Act (FCRA)]'' and that ``The technology
environment is facilitating more scores being used in more
places in consumers' lives, and not all uses are positive.''
Your proposed solutions include bringing unregulated forms of
credit scoring under the FCRA and studying new areas of
eligibility that need to fall under the FCRA. Given the limited
ability of individuals to access, control, and correct their
personal data, as well as the limited legal framework to
regulate data brokers, would the inadequacy of current laws be
addressed by regulating data brokers under the Fair Credit
Reporting Act?
A.1.a. It would be of great help for Congress to clarify that
aggregate credit scores are already regulated under the
FCRA, and to study new areas of eligibility. These actions
would provide for significant improvements in solving some of
the more egregious issues related to credit and other ``grey
area'' eligibility decisions. These changes, should Congress
take action, would remedy certain aspects of the current
problems. I agree that these changes would not address every
challenge posed by data broker activities. But these changes
would capture a good portion of some of the more serious and
systemic problems consumers are facing.
In 2013, WPF testified before Congress about non-FCRA or
unregulated credit scores, warning that they were problematic
and could create consumer harm. In 2014, we wrote a report
called The Scoring of America that more fully documented the
non-FCRA credit scores. We have found that in 2019, unregulated
credit scores are now widespread and are being used on data
broker lists and in electronic data append services. We are
deeply concerned that the use of unregulated credit scores is
poised to create substantial, widespread consumer harm as the
use of these scores becomes an entrenched business practice.
I would like to respond in additional detail to your
questions.
First, regarding issues relating generally to data
availability, even though these nominally unregulated credit
scores use third-party data, which now circulates in abundance,
this use does not automatically mean the scores are
unregulated. The
alternative credit scores such as those offered by PRBC are
regulated credit scores. Alternative data is considered
regulated just as if it were credit bureau data. This creates a
strong basis for determining that it is not just the use of
traditional credit bureau data that causes the applicability of
the FCRA to a score. Using third-party data therefore does
not, by itself, exempt a score from FCRA regulation.
Second, household-level scores may still be applied to an
individual consumer. Even though companies and credit bureaus
creating and using unregulated versions of credit scores make
great efforts to explain that the scores are ``aggregated'' to
household-level, census block-level, or ZIP+4 data, that does
not mean the data will not be used as a proxy for a credit
score of an individual living at that address.
If an aggregate credit score is applied to an individual at
a decision-making point that would be regulated if it were a
traditional credit score, then the credit score, even if it is
an aggregate, ZIP+4 modeled score, still must be regulated
under the FCRA because it is being applied to an individual. We
stress that as long as a person's home address is known, then a
ZIP+4 credit score can be applied to that person as an
individual. Additionally, any person who gives a general ZIP
Code at a point of purchase, for example, could be scored in
near real-time and decisions can be made about that person as
an individual based on the ZIP Code of the neighborhood they
live in. In this way, too, unregulated credit scores may be
applicable to individuals.
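    [As an illustrative aside, the proxy problem described above
can be sketched in minimal Python; the ZIP+4 keys and score
values are invented for illustration and are not any company's
actual product.]

    # An aggregate score keyed to a ZIP+4 microgeography becomes,
    # in practice, a score for any individual whose address (or
    # even point-of-purchase ZIP Code) is known.
    aggregate_scores = {"89101-1234": 580,   # modeled, "household-level"
                        "10065-5678": 790}

    def score_for_individual(home_zip4):
        """The neighborhood number, applied to one known person."""
        return aggregate_scores.get(home_zip4)

    print(score_for_individual("89101-1234"))  # 580, for one person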
Note the following exemplars:
A. LEquifax Aggregated FICO Scores.\1\
---------------------------------------------------------------------------
\1\ See https://www.equifax.com/business/aggregated-fico-scores/.
---------------------------------------------------------------------------
B. LTransUnion offers TransUnion Audiences. This is what the
company calls a summary-level view of credit profiles
at a geographic (ZIP+4) level. This is TransUnion's
version of an unregulated credit score, and the scoring
is offered as a service.
L``Our consumer finance audiences are aggregated and de-
personalized using ZIP+4 microgeographies to achieve a
high level of targeting effectiveness while maintaining
regulatory compliance.''\2\
---------------------------------------------------------------------------
\2\ TransUnion Audience Buying Guide, https://www.transunion.com/
resources/transunion/doc/insights/buying-guides/TU-digital-audience-
buying-guide-july-2018.pdf.
---------------------------------------------------------------------------
Land
L``TransUnion audiences are sourced from anonymized,
aggregated consumer credit data, delivering valuable
credit behavior intelligence. Built from TransUnion's
consumer database consisting of more than 230 million
U.S. records, aggregated credit data provides a
summary-level view of credit profiles at a geographic
(ZIP+4) level. TransUnion audiences target the
consumers most likely to have the financial ability to
qualify and respond.''\3\
---------------------------------------------------------------------------
\3\ Nielsen Data as a Service Data Partners, TransUnion. http://
sites.nielsen.com/daas-partners/partner/transunion/.
C. LAnalytics IQ offers a GeoCreditIQ product,\4\ which is
its version of an unregulated consumer score. Analytics
IQ states that:
---------------------------------------------------------------------------
\4\ Analytics IQ. https://analytics-iq.com/what-we-do/. For a more
detailed description, see: https://analyticsiq.com/downloads/
analyticsiq-productsheet-geocreditiq.pdf.
L``Credit-related data, even summarized at a geographic
level, should always come directly from the source--
U.S.-based credit bureaus. That is the approach
AnalyticsIQ takes to create the foundation of our
GeoCreditIQ data. By working directly with the bureaus,
our GeoCreditIQ data is extremely accurate and
predictive. With GeoCreditIQ marketers get the best of
both worlds. The data correlates highly to actual
credit scores, however, it is less restrictive and very
powerful in everyday marketing activities.''\5\
---------------------------------------------------------------------------
\5\ Analytics IQ GeoCreditIQ brochure, https://analytics-iq.com/
downloads/analyticsiq-product
sheet-geocreditiq.pdf.
D. LExperian offers its Premier Aggregated Credit Statistics
score. ``The Premier Aggregated Credit Statistics
product is derived from the credit profiles of more
than 220 million credit-active consumers and averaged
at the ZIP-Code level.''\6\ Experian states that this
score is ``Beneficial to virtually any industry,
including debt collections, education, government,
financial services, capital markets and data
analytics.''\7\ Experian states that customers can
``Get unprecedented insight into the credit health of
neighborhoods across the United States.'' And it also
states that it can be used for debt collections, which
typically is applied at an individual level. It has
used its data to score the top 25 neighborhoods with
the most mortgage debt, for example.\8\ Experian's ZIP
Code credit score is offered as a service.
---------------------------------------------------------------------------
\6\ Experian Premier Aggregated Credit Statistics. Available at
https://www.experian.com/consumer-information/premier-aggregated-
credit-statistics.html.
\7\ Supra note 5.
\8\ Experian Blog Post, ZIP Codes with the Highest Mortgage Debt,
July 22, 2019. https://www.experian.com/blogs/ask-experian/research/
zip-codes-with-the-highest-mortgage-debt/.
E. LNextMark sells a data broker list of ``Summarized Credit
Scores FICO-Like Mailing List.''\9\ The data card
states: ``Summarized Credit Scores are used to help our
clients target segments of the population at varying
levels of credit worthiness. It is carefully built upon
the historic financial transaction data of hundred of
millions of consumers, aggregated at the ZIP+4 level.''
The data card has further recommendations for use:
---------------------------------------------------------------------------
\9\ Nextmark, https://lists.nextmark.com/
market;jsessionid624D63468C12F73E52082D474F1C4
9C9?page-order/online/datacard&id=281247.
---------------------------------------------------------------------------
L``Recommendations for Banking, Insurance and Automotive
Industries:
LOverlay summarized credit scores on your database to
determine credit worthy, or subprime for special
finance offers.
LRecommendations for mortgage industry:
LSubprime Program: Identify consumers with debt and credit
challenges: Choose summarized credit FICO-like ranges
of less than 600, specific loan dates and loan amounts
or LTV. . . .''
F. LThe Dataman Group has ``Modeled Credit Score Prospect
Lists.''\10\ The lists include a profitability score,
and use layers of data to score at the household
level.
---------------------------------------------------------------------------
\10\ Dataman Group, Modeled Credit Score Lists, https://
www.datamangroup.com/modeled-credit-score-lists/.
---------------------------------------------------------------------------
L``This new ConsumerView Profitability Score list select
helps identify households likely to pay their debts and
ranks households by profitability, allowing marketers
to target the best prospects based on:
LProfitability
LApproval Rates
LResponse Rates
LThe scores align very closely to bonafide Credit
Scoring--and with this file--no preapproval is needed!
LThe ConsumerView Profitability Score combines a robust
scoring model that offers high levels of refinement for
selecting the most profitable prospects combined with
our top-notch Consumer Database. This gives you greater
precision in predicting, identifying and targeting
prospects at the Household Level.''
These are just a few exemplars of the ways in which unregulated
credit scores are being used today.
Third, credit scores may only be pulled for purposes
strictly defined in the FCRA; they cannot be used for general
marketing purposes. It is already established policy, and law,
that credit scores cannot be used for general marketing
purposes except in situations expressly defined by the FCRA.
Given that unregulated credit scores are accurate proxies for
regulated credit scores, the use of aggregate ZIP+4 credit
scores for expansive marketing purposes currently violates
established law and public policy about uses of credit scores.
If credit scores were meant to be used for expansive marketing
purposes, then the FCRA would permit such uses.
And finally, despite the apparent applicability of the FCRA
to aggregate credit scores, we do not see mechanisms that have
been made available to consumers for making the uses of these
scores transparent. We do not see prominent efforts by credit
bureaus to allow consumers to see their ZIP+4 credit scores,
nor household scores, nor to reveal who has requested their
unregulated credit score. We do not see mechanisms for
consumers to correct errors in their unregulated scores, or to
prevent other abuses the FCRA and ECOA were designed to
address. We do not know how or if the credit bureaus are
affirmatively tracking, monitoring, and policing the uses of
unregulated credit scores, and we are greatly concerned that
these scores may easily be both applied at an individual
level and used for eligibility purposes. We do not
see the credit bureaus and others reporting publicly their
technological proof of compliance with the FCRA regarding the
unregulated credit scores.
Unfortunately, consumers are not able to avoid the harms
involved with unregulated credit scoring. The lists and
databases of millions of consumers appended with their
unregulated credit scores are compiled without consumers'
knowledge or ability to correct the data. Financial,
educational, employment, and other opportunities based on a
person's unregulated ZIP+4 or household credit score may have
profound impacts on individuals, but those individuals will not
be able to use existing FCRA tools to remedy the problems posed
by this category of credit scores.
If Congress clarified the FCRA to bring aggregate credit
scores clearly under the auspices of the FCRA, with no
interpretational grey areas, it would provide meaningful,
significant improvement. Aggregate credit scores would no
longer be able to be used for marketing purposes, these types
of credit scores would not be able to be quietly applied
illegally to individual consumers, and an avenue of growing
harm would be closed.
Q.1.b. Credit reporting agencies make billions of dollars
collecting and selling information about consumers, but
consumers have little ability to control how their personal
information is collected and used by these agencies. How would
legislation to give consumers more control over personal
financial data and to create a uniform, Federal process for
obtaining and lifting credit freezes benefit consumers? Would
consumers benefit if such legislation also applied to currently
unregulated parts of the industry, such as data brokerages?
A.1.b. When identity theft remedies were being put in place
from the mid-1990s through the early 2010s, I observed in real-
time how these remedies beneficially impacted consumers through
the many phone calls that came in to World Privacy Forum. After
State security freeze laws were enacted, consumers with
multistate identity theft issues experienced significant
relief, as did single-state victims of identity theft. Security
freeze laws have worked well for consumers, particularly those
with serious identity theft in their present or past. If a
uniform Federal process took the strongest and best of the
State laws and created rapid setting and lifting of security
freezes, that could be beneficial.
It would be beneficial for security freezes to apply across
data brokerages as well. This would assist in cases of identity
theft, and it would assist with safety considerations. We have
found that in particular, victims of crime, including domestic
violence and stalking among other crimes, as well as elected
officials and law enforcement officers, have safety
considerations that apply to data broker data.
Q.2. Your written testimony calls for legislation to facilitate
setting due process standards that would fill in meaningful
gaps in privacy protections. Along with Professor Jane Winn,
you suggest legislation that would give the Federal Trade
Commission additional authorities to regulate practices in
connection with personal data. Relatedly, I have introduced
legislation to give the Federal Trade Commission more direct
supervisory authority over data security at credit reporting
agencies.
Q.2.a. How would legislation to establish and provide Federal
authority and resources to monitor data security practices of
credit reporting agencies and data brokers benefit consumers?
A.2.a. Legislation that would provide Federal authority and
resources to monitor data security practices of CRAs and data
brokers could benefit consumers in several ways: by setting
guardrails for the data broker sector generally, by giving
consumers more agency in the overall process, and by requiring
data brokers and CRAs to manage data using processes documented
to facilitate ongoing improvements in outcomes.
By way of background, the current debate over what Federal
information privacy legislation should look like is often based
on the assumption that there are only two models to choose
from: a market-based approach or a hierarchical rights-based
approach. Applying Nobel Laureate Elinor Ostrom's principles of
governance design (Nives Dolsak, Elinor Ostrom & Bonnie J.
McCay, The Commons in the New Millennium (2003)) and a pragmatic
understanding of scientific knowledge as socially constructed
makes it possible to find a middle path between a market
approach and a hierarchical approach to information governance.
Successful examples of governance mechanisms that lie on
this middle path include privacy standard setting processes, as
you noted in your question. Such collaborative standards-
setting efforts should not be confused with privacy self-
regulation, which is one example of a market approach that
lacks accountability because, as the economist Anthony Ogus
pointed out in Rethinking Self-Regulation (Oxford Journal of
Legal Studies, 1995), private self-regulation is per se
captured from its inception.
The term ``voluntary consensus standards'' has a specific
meaning that is already defined in law. The U.S. Food and Drug
Administration has been using voluntary consensus standards
that comply with due process requirements as articulated in the
Office of Management and Budget (OMB) Circular A-119 for more
than 20 years, which has resulted in more than 1,000 recognized
standards applicable to medical devices. The World Trade
Organization (WTO) Agreement on Technical Barriers to Trade is
a core document that outlines how standards may be set by
independent parties in a fair and appropriate manner that does
not create transactional or other barriers. These ideas have
applicability to data ecosystems and privacy risks.
Within the framework of due process guarantees set out in
OMB Circular A-119, Federal regulators today have the power to
recognize compliance with voluntary, consensus standards as
evidence of compliance with the law for specific, limited
regulatory purposes. Federal regulators may only use voluntary
consensus standards to create such safe harbors if the
standards can be shown to have been developed through processes
whose openness, balance, consensus, inclusion, transparency and
accountability have been independently verified.
When the interface between Federal legislation and
voluntary, consensus industry standards is working correctly,
then the private sector (inclusive of all private sector
stakeholders) takes the lead in developing appropriate,
context-specific standards for solving policy problems. Next,
regulators take the lead in assessing whether those private
standards meet the needs of the American public as well as the
industry players that developed them. These assessments will
ideally be conducted in an ongoing manner, and can
realistically include monitoring that is in real time or near
real time. Finally, courts stand ready to serve as
independent arbiters of the behavior of both industry and
Government.
Beyond the standards approach, another important set of
measures relates to governance that ensures ongoing improvement
targets are set and achieved. See my response to B, below.
Q.2.b. In your view, would legislation to impose strict
liability penalties for breaches involving consumer data at
credit reporting agencies and data brokerages lead to
improvements in consumer data security? Would consumers benefit
if such penalties were imposed on data brokers?
A.2.b. Credit Reporting Agencies and data brokers have a
heightened responsibility to ensure data integrity on all
fronts, including responsibilities related to data security,
data integrity, and data breaches. Strict liability
requirements can have a place in highly sensitive data settings
to ensure the highest standards of data integrity are being
met.
Much has been learned in the last 25 years about data
protection and digital ecosystems. Data protection laws that
have already been enacted in 123-plus countries have grown to
have significant similarities, even when aspects of the law
have been adapted to unique country-level conditions. See, for
example, the work of Graham Greenleaf on this topic. Data
breach requirements are spreading globally.
However, despite all of the work on privacy and data
protection, baseline governance principles that have
demonstrated worth in other settings such as environmental,
manufacturing, and law enforcement contexts, have generally not
yet been applied in the privacy realm. This is a rich area for
exploration regarding legislation.
By themselves, strict liability requirements are not enough
to create reliably good results in the long term if the goal is
to substantively improve outcomes for consumers and for the
businesses that must comply with data breach laws. A
comprehensive governance system is needed that will facilitate
the creation of specific and appropriate benchmarking and
improvement processes to achieve improvement goals.
Here, we point to the expansive and demonstrably productive
work of W. Edwards Deming, including his system (and
principles) of management\11\ and his process cycle of
continual improvement.\12\ If legislation were to go beyond
strict liability and also enshrine such types of ongoing
improvement processes as part of the principles of governance
within a privacy or data breach context, it would go far toward
creating a more mature and effective approach to data systems
and processes. Over time, while strict liability will have
certain baseline compliance effects, it is primarily a tool for
deterrence. It does not fully work to complete the job of
bringing businesses up to significant levels of improvement.
For this to happen, affirmative governance structures also need
to be in place. Given that privacy is still catching up to the
business systems thinking of other sectors, enshrining
ideas of continual improvement would be helpful in creating an
environment where better systems of data governance can be
created.
---------------------------------------------------------------------------
\11\ See https://deming.org/explore/fourteenpoints.
\12\ Plan, Do, Study, Act; https://deming.org/explore/p-d-s-a.
---------------------------------------------------------------------------
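    [As an illustrative aside, the Plan-Do-Study-Act cycle cited
above can be sketched in minimal Python as applied to a data-
security metric; the 72-hour target and the measurements are
invented for illustration.]

    # Each cycle plans a target, measures actual performance,
    # studies the gap, and acts on the result; improvement comes
    # from iterating the loop.
    def pdsa(target_hours, measured_hours):
        gap = measured_hours - target_hours           # Study
        act = "adopt process" if gap <= 0 else "revise process"
        return gap, act

    measurements = [96, 84, 72]   # breach-response hours per cycle
    for cycle, measured in enumerate(measurements):   # Do
        gap, act = pdsa(72, measured)                 # Plan: 72h target
        print(f"cycle {cycle}: actual={measured}h, gap={gap}h, {act}")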
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR SCHATZ FROM PAM DIXON
Q.1. Are data sets collected by data brokers getting into the
bloodstream of credit, employment, and housing decision
making, in a way that evades the FCRA?
A.1. Yes, data sets regarding consumers that are held by data
brokers are being used for credit, employment, and housing
decision making in ways that may evade the FCRA. Going one step
further, data broker data is being used to create consumer
scores being used in eligibility situations, and this also
evades the FCRA, or closely skirts it. In our Scoring of
America report we documented many of the various data streams
that data brokers utilize in gathering consumers' personal
data, and we documented the scores themselves.
In particular, aggregate or modeled credit scores are
challenging with regard to FCRA compliance. These
are scores that are typically modeled at the ZIP+4, census
block, or household level. They are often marketed as comparable
to regulated credit scores. When household credit scores are
applied to the individual, I believe this violates the FCRA.
When the household credit scores are used in eligibility
circumstances at the individual level, this, too, I believe is
a violation of the FCRA. In my testimony, I discussed the FICO
Aggregate Credit Score. It is not the only such score in this
category.
Tracking the proliferation of aggregate and modeled credit
scores is one way to see the significant potential for skirting
of the FCRA. Questions abound:
LHow many of these scores are being used in
eligibility circumstances?
LHow are these scores being used in marketing or
other circumstances?
LHow are the companies policing the use of these
scores?
LTo whom or what entities have the scores been sold?
LHow can the companies producing aggregate credit
scores affirmatively demonstrate that their product is
only being used in full compliance with the FCRA?
There are limited ways available to track data broker data.
However, one of the ways to get a glimpse of it is to review
the data broker data cards that are available via the list
broker or data broker websites. Examples include:
LNextMark List Finder: https://lists.nextmark.com/
market.
LExact Data Consumer Lists: https://
www.exactdata.com/consumer-mailing-lists.html.
LInfoUSA Consumer Lists: https://www.infousa.com/
lists/consumer-lists/.
LDataman Consumer Lists: https://
www.datamangroup.com/national-consumer-database/.
LExperian Consumer Sales Leads: https://
www.experian.com/small-business/sales-leads.jsp.
This is a very small selection of offerings of detailed
consumer data available via lists. I note that this is just one
aspect of data brokering. It happens to be the easiest to
demonstrate at this time; however, many other data broker
activities occur out of sight, for example, data APIs, which
provide the ``list'' on demand and will likely replace older
list methods fairly soon.
And to reiterate, it is crucial to understand that the
production of consumer scores is a way to condense raw data
broker data into numeric shorthand. Unregulated consumer scores
can be as challenging to the FCRA as the original raw data, and
can cause harms when misused in eligibility circumstances.
Q.2. Under current law, do companies that collect and sell
information about consumers have any duty to consumers about
how that information will be used? If consumers are
discriminated against or harmed because of how that data is
used, who is responsible?
A.2. There is not yet a broad, comprehensive rule governing
duties of care regarding the use of consumer data. There are
some sectoral protections in place. Additional
pressures from the States have created a very narrow pathway
for some rules in some circumstances. We note that California's
law, the CCPA, has numerous exemptions and loopholes, and thus,
even in California there is not a broad law that will apply
routinely to all data brokers. Because of this, there is no
question that there are meaningful gaps in consumer protection
at the State and Federal level.
At the Federal level, the answer to the questions of duty
and responsibility depends on what entity is holding the data,
what sectoral regulations are in place, and for unregulated
companies, what the privacy policy of that company states. For
example, HIPAA-covered entities do have a duty to patients
about how protected health information will be used. Entities
engaging in FCRA-covered activities also have some duties to
consumers about information use. As good as the FCRA is, in
some ways, as I mentioned in testimony, it has lost some of its
effectiveness due to what has become the ``household'' vs.
individual loophole. In the public sector, the Privacy Act does
make some stipulations about data use.
For companies that are not regulated under a sectoral
regime, the FTC can enforce privacy policies that are posted by
companies under its FTC Act Sec. 5 authority; but this has its
limits, and does not provide for a proactive requirement of
certain duties to consumers regarding data use.
Vermont, in enacting its first-in-nation 2018 data broker
legislation, made incremental steps at a State-level toward
creating at least some duty regarding consumer data when it
required data brokers to not use consumer data for committing
fraud, or in a discriminatory way. This is not a comprehensive
protection, but it remains an important exemplar.
Q.3. If consumers are discriminated against or harmed because
of how that data is used, who is responsible? If a data broker
is breached and a consumer suffers harm from identity theft,
who is liable?
A.3. The answer to both of these questions will depend on the
circumstances of the discrimination or harm, and the
complexities of resolving this issue are no small matter. In an
FCRA context, consumers who experience harm because of
improperly conducted background checks, for example, have
recourse. In this situation, an employer may be the responsible
party, or the background check provider. But outside of the
FCRA context, harms can accrue that are unregulated, which
makes the assignment of responsibility more difficult in some
circumstances.
For example, when a business uses an aggregate or household
credit score to determine eligibility for a financial service
or product, and chooses to decline the consumer for a service
or product, unless the consumer had a way to know about this
declination, they would not be likely to learn about the harm.
In this situation, the creator of the aggregate or household
score, the seller of the score to the institution that used it,
and the institution may possibly have some responsibility, but
this is not yet litigated under the FCRA, and Congress has not
yet clarified the issue of aggregate or modeled credit scores.
Until and unless we have additional clarity, it will be very
difficult to have bright-line assignments of responsibility in
this and other areas.
    Regarding data brokers and unregulated scores generally,
there is a need for more bright-line rules with regard to
responsibilities and duties, including nondiscrimination.
Currently, outside of the State of Vermont and, as of 2019,
California, which have both passed basic data broker
registration laws, the answer to this question is not at all
straightforward, and in large part, it is fair to say it is
undetermined. In most cases, consumers are unlikely
to be able to determine with specificity how their information
was compromised, or what party created the risk. In the case of
consumer data held by data brokers, it would be very difficult
for consumers to know which data brokers held their data, much
less which had breached their data. Specific data broker breach
requirements and other protections would help ameliorate some
of these problems.
Q.4. Do you think Federal law should require companies that
collect and use consumer data to take reasonable steps to
prevent unwanted disclosures of data and not use data to the
detriment of those consumers?
A.4. Yes. There are no reasonable arguments against providing
proper security for consumer data at all stages of its
lifecycle in a business. And there are no arguments against
prohibiting the use of data in a detrimental, discriminatory, or
unfair way. It is essential to provide for fair data uses and
prevention of harm regarding consumer data; without such
provisions, consumer trust will eventually be lost. Abusive
data practices, where data is used in detrimental,
discriminatory, or unfair ways in consumers' lives, are not
sustainable in a digital economy.
------
RESPONSES TO WRITTEN QUESTIONS OF SENATOR CORTEZ MASTO FROM PAM
DIXON
Q.1. Are there firms that you think are utilizing algorithms to
expand access for affordable credit or useful financial
products that are beneficial? If so, which ones?
A.1. Some beneficial examples in this context are found in the
area of ``thin file'' consumer scoring products. These types of
credit scores are well understood in the marketplace. Typically
called ``alternative credit scores,'' thin file credit scores
are almost always brought in as regulated scores under the
FCRA. Alternative credit scores typically use a small
alternative data set to calculate thin file scores. Utility
payments, rent payments, phone bill payments, and other types
of steady payments are used as predictors for credit risk for
people who may not have purchased a home, a car, and may not
have an extensive credit history for a variety of reasons.
Exemplars include the FICO UltraFICO,\1\ and ID Analytics'
use of alternative credit data,\2\ particularly the Credit
Optics Full Spectrum.\3\ These products utilize alternative
data to provide credit score analysis, and at last check, the
companies consider the products to be regulated under the FCRA.
---------------------------------------------------------------------------
\1\ See https://www.fico.com/en/products/ultrafico-score.
\2\ See https://www.idanalytics.com/solutions-services/credit-risk-
solutions/alternative-credit-data/.
\3\ See https://www.idanalytics.com/solutions-services/credit-risk-
solutions/.
---------------------------------------------------------------------------
Thin file or alternative credit scores should not be
confused with aggregate credit scores. Companies building
aggregate credit scores typically do not see these models as
regulated under the FCRA, because these scores apply to
households, not individuals. This is a loophole in the FCRA, as
the FCRA only applies to individuals. Aggregate credit scores
that are created at a household level are not regulated, but
they nevertheless might be applied to individuals by companies
seeking an unregulated predictive score.
Aggregate credit scores can use hundreds, or even more than
a thousand, factors, and can be quite accurate. In short,
aggregate credit scores can act as an unregulated proxy for the
traditional credit scores originally regulated under the FCRA.
This is in contrast to thin file, alternative credit scores,
which are regulated scores that can be beneficial to previously
unscored consumers or consumers with minimal credit histories.
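    For illustration only, the sketch below (in Python) shows the
general shape of a thin file score: steady-payment rates are
weighted and mapped onto a familiar scale, and short histories
are discounted. The field names, weights, and scale here are
hypothetical; they do not reflect UltraFICO, Credit Optics, or
any other actual product.

    # Hypothetical sketch of a thin file score; the weights and the
    # 300-850 mapping are invented for illustration only.
    from dataclasses import dataclass

    @dataclass
    class PaymentHistory:
        utility_on_time_rate: float  # share of utility bills paid on time
        rent_on_time_rate: float     # share of rent payments made on time
        phone_on_time_rate: float    # share of phone bills paid on time
        months_observed: int         # length of the observed history

    def thin_file_score(h: PaymentHistory) -> int:
        """Map steady-payment behavior onto a 300-850 scale."""
        weighted = (0.40 * h.utility_on_time_rate
                    + 0.35 * h.rent_on_time_rate
                    + 0.25 * h.phone_on_time_rate)
        # Discount short histories: less data, less confidence.
        confidence = min(h.months_observed / 24, 1.0)
        return round(300 + 550 * weighted * confidence)

    print(thin_file_score(PaymentHistory(0.95, 1.0, 0.9, 18)))  # 694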
Q.2. Do you believe that people should get to see their
unregulated credit reports and scores just as they do their
regulated scores?
A.2. Yes, people should be able to see their unregulated credit
reports and scores. For example, we should be able to see our
FICO aggregate credit score. We should also be able to see our
Experian neighborhood risk score, as this score is used to
create a variety of metrics about households and those living
in that household. Any score used in matters relating to
eligibility, or used to determine the character, reputation or
creditworthiness of an individual should be available and not
secret.
Q.3. What does it mean for financial markets now that FINRA can
essentially predict and decide, in real time or near real time,
investor behavior? What does it mean for other financial and
technical sectors?
A.3. FINRA is a key exemplar of modern real-time governance. It
didn't begin that way, but the system has evolved in important
ways. We think that FINRA is just the beginning of the ``real-
time governance'' movement; high-volume data analysis and
governance is what a lot of compliance reporting is going to
start looking like in the United States and elsewhere.
As a self-regulatory organization under the Securities
Exchange Act of 1934 ('34 Act), FINRA is authorized to issue
rules under Section 15A(b)(6) of the 1934 Act in order to
``. . . prevent fraudulent and manipulative acts and practices,
to promote just and equitable principles of trade, and, in
general, to protect investors and the public interest,'' and
under Section 15A(b)(9) of the Act.
In the past, FINRA produced periodic summarized reports to
support its mission. This was fine, and entirely appropriate
for a paper-based economy and era. From the 1930s when the
modern U.S. securities law framework was established through to
the present, regulators such as the Securities and Exchange
Commission and SROs such as the New York Stock Exchange and the
National Association of Securities Dealers (whose SRO powers
were eventually transferred to FINRA) had no choice but to rely
on periodic reporting from regulated entities as their primary
source of information. Staff members of regulated entities
spent huge amounts of time boiling down vast quantities of raw
data into highly simplified, abstract form for reporting. Then
staff members of regulators tried to develop an accurate
understanding of the complex reality summarized in the
reporting forms through a combination of analysis of the
reporting forms and selective audits. These paper-based
reporting and regulatory processes were normal and appropriate
and used throughout the American economy and world for most of
the 20th century.
The computerization of American financial markets was
driven in the late 1960s and 1970s by the ``paperwork crunch''
on Wall Street. As trading volumes increased, paper-based
clearing and settlement systems became overloaded, making it
impossible to settle all of 1 day's transactions before the
start of the next trading day. The first response to the
paperwork crunch was to close markets earlier, which was
obviously not a solution that appealed to either financial
firms or their clients.
By the end of the 1970s, clearing and settlement systems
were running on mainframe computers and American banks,
brokerage firms and insurance companies were world leaders in
the computerization of their back-office systems. The
regulatory financial reporting obligations of these firms were
met through a combination of reports generated by mainframe
computer systems and information collected and summarized by
staff members. These reporting and regulatory oversight
processes were based on point-in-time, low-resolution snapshots
of the business operations of regulated entities. Regulators
could see the equivalent of the tip of an iceberg and were
forced to guess the characteristics of the submerged portion of
the iceberg. The executives running regulated entities were in
much the same position.
In his book ``Seeing Like a State,'' Yale political
science Professor James C. Scott articulated the challenges
that modern regulators face when forced to make decisions on
the basis of the kind of highly compressed summaries of complex
realities found in periodic reporting by regulated entities.
The regulator can literally ``see'' only what is presented in
the summary, and on the basis of such summaries, make educated
guesses about where to look more closely for evidence of
violations of law.
Following the Stock Market Crash of 1987, regulators began
working with regulated entities to better understand the
operation of their computer systems and to integrate the
functioning of those computer systems more directly into their
regulatory oversight activities. As regulators gained greater
direct access to the information being generated by the
information systems operated by regulated entities, they
gradually were able to ``see'' something closer to what the
executives of regulated entities could see.
By the 2000s, financial market regulators such as the SEC
and FINRA were developing the capacity to collect and analyze
raw data feeds directly from regulated entities. This brings us
to today, where FINRA is using the availability of increased
technological capacity to acquire real-time transaction data
regarding TRACE-eligible securities (TRACE: Trade Reporting and
Compliance Engine). Instead of receiving periodic reports,
those subscribing to FINRA's TRACE reporting system now have
firehoses of real-time data to manage and analyze.
In the FINRA real-time environment, regulators now have to
develop their own capacity to analyze these data feeds and draw
their own inferences from them, which requires huge investments
in computing capacity and staff with relevant subject matter
expertise. After these systems are fully operational, then in
theory regulators should be able to ``see'' whatever
executives at regulated entities can ``see.'' The starting
point of the dialogue between regulators and regulated entities
can focus on comparing the results of the regulators' analyses
and the regulated entities' analyses of the same raw data
generated by the regulated entities' computer systems.
FINRA's TRACE reporting system was developed specifically
to assist with this process. To meet its primary mission, FINRA
will need to continue to ensure that its searches for
compliance problems, such as concealed shell companies,
achieve maximum benefit from the data volume and velocity
``real time'' affords. ``Real time'' does not automatically
equal ``better'' unless foundational work has been done to
ensure that the data has been properly tagged and organized to
facilitate compliance reporting and response. For example,
compliance alerts in real-time systems are typically based on
some form of trigger. Various kinds of data tags and
identifiers are particularly important to construct properly to
fulfill this task. With proper triggers in place, real-time
data firehoses can be purposefully and reliably analyzed at
scale and at speed in order to create accurate real-time
governance feedback.
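    To illustrate the trigger concept, the sketch below (Python)
raises an alert when one tagged entity's rolling traded notional
exceeds a threshold inside a short window. The window, the
threshold, and the identifier are hypothetical; this is not
FINRA's actual TRACE surveillance logic.

    # Hypothetical trigger over a real-time feed of trade reports.
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    NOTIONAL_TRIGGER = 10_000_000  # alert above this rolling notional

    windows = defaultdict(deque)   # entity tag -> (timestamp, notional)

    def on_trade(entity: str, timestamp: float, notional: float):
        """Ingest one tagged trade report; fire the trigger if needed."""
        window = windows[entity]
        window.append((timestamp, notional))
        # Drop reports that have aged out of the rolling window.
        while window and timestamp - window[0][0] > WINDOW_SECONDS:
            window.popleft()
        rolling = sum(n for _, n in window)
        if rolling > NOTIONAL_TRIGGER:
            print(f"ALERT: {entity} traded {rolling:,.0f} in "
                  f"{WINDOW_SECONDS}s")

    # Consistent entity tags are what make this aggregation possible.
    on_trade("LEI-EXAMPLE-000000001", 0.0, 6_000_000)
    on_trade("LEI-EXAMPLE-000000001", 30.0, 7_000_000)  # fires the alert

Without properly constructed tags, the same entity's trades
would scatter across several keys and the trigger would never
fire; that is the foundational work referred to above.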
The ability of regulators to request real-time data from
regulated entities and to engage in real-time analysis of that
data for evidence of compliance or violations of the law by the
regulated entities represents the beginning of a new era of
``real-time governance.'' In a real-time governance system,
regulators should be able to respond almost as quickly as
regulated entities to evidence of a risk of noncompliance. The
expansion of real-time governance in the United States and
around the world promises a fundamental breakthrough in risk
management: citizens should be able to enjoy the best quality
goods and services and the benefits of rapid technological
innovation while also being provided better protection from
risks.
In order to lay a foundation for continuous improvement of
real-time governance systems, regulators and regulated entities
will need to collaborate to increase the standardization of
data formats. Back in the 1970s, when each financial service
firm was installing its own mainframe computer, it was not
uncommon for each firm to acquire custom-developed, bespoke
software applications. Standards were developed for transaction
data first, so that each firm could send and receive order and
execution information from exchanges and other firms quickly
and accurately, but there was no need to standardize other
parts of the firms' computer systems.
By the 2000s, the result was significant diversity across
firms in the way that some of the information relevant to their
reporting obligations was generated and stored. Limited
standardization of data formats and software architectures
across regulated entities increases the challenge for
regulators moving to real-time governance because of their
need to compare compliance-related behaviors across different
firms with different computer systems.
Lack of standardization of data formats hampered
regulators' ability to respond to the 2008 collapse of Lehman
Brothers and the 2010 Flash Crash. Regulators' efforts to track
down the source of large volumes of computer-generated orders
were impeded by the difficulty of comparing data generated by
different firms. One problem in particular had to do with lack
of standardization in how customers that were ``legal persons''
(e.g., corporations), were identified. The same corporation's
name might be entered into different firm computers differently
due to the use of nonstandard abbreviations or even
typographical errors. The lack of global standards for
identifying common ownership of financial accounts by business
entities quickly and accurately was hampering tax and anti-
money laundering regulatory efforts as well.
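    A toy sketch of the matching problem follows; the
normalization rules are invented for illustration. The point is
that ad hoc cleanup of names is never complete, which is why a
shared identifier was ultimately needed.

    # Toy illustration: one legal person, three renderings.
    import re

    records = ["Acme Holdings, Inc.", "ACME HOLDINGS INC",
               "Acme Hldgs Inc."]

    def normalized(name: str) -> str:
        name = name.upper()
        name = re.sub(r"[^\w\s]", "", name)            # drop punctuation
        name = re.sub(r"\bHLDGS\b", "HOLDINGS", name)  # expand one
                                                       # known abbreviation
        return " ".join(name.split())

    print(records[0] == records[1])                          # False
    print(normalized(records[0]) == normalized(records[2]))  # True

The normalized comparison succeeds only because that one
abbreviation was anticipated; typos and unknown abbreviations
still slip through.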
In 2011, the Depository Trust & Clearing Corporation (DTCC)
and the Society for Worldwide Interbank Financial
Telecommunication (SWIFT) launched a collaborative, global
standard-setting
effort that led to the creation of the ``Global Legal Entity
Identifier'' standard. This standard has been endorsed by the
Financial Stability Board and the G20 and designated as
International Organization for Standardization (ISO) standard
17442. Some jurisdictions outside the United States have begun
mandating the use of LEI numbers in certain financial service
markets in order to increase the effectiveness of regulatory
oversight processes (e.g., EU Markets in Financial Instruments
Directive known as MiFID II).
Any legal entity anywhere in the world can obtain quickly,
easily, and cheaply a globally unique 20-character LEI number
from the LEI issuer of their choice, and be confident that it
will be accepted by regulators and counterparties around the
world for compliance purposes. The LEI Regulatory Oversight
Committee (LEI ROC) and the Global Legal Entity Identifier
Foundation (GLEIF) jointly administer the LEI system. This
includes overseeing a global network of LEI issuers that
compete with each other to issue LEI numbers to entities;
providing the Global LEI Index, an open, searchable database of
LEI numbers; and monitoring emerging technologies and updating
the standard as needed to accommodate them.
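    As a minimal sketch of the mechanics: an ISO 17442 LEI is 20
alphanumeric characters whose final two characters are check
digits verified under the ISO/IEC 7064 MOD 97-10 scheme, the
same family of check used for IBANs. Letters map to the numbers
10 through 35, and the resulting integer must leave a remainder
of 1 when divided by 97.

    # Sketch of LEI format and check-digit validation (ISO 17442).
    def is_valid_lei(lei: str) -> bool:
        lei = lei.strip().upper()
        if len(lei) != 20 or not lei.isalnum():
            return False
        # A=10 ... Z=35; digits unchanged; then one base-10 integer.
        as_digits = "".join(str(int(c, 36)) for c in lei)
        return int(as_digits) % 97 == 1

    # GLEIF's own published LEI passes the check.
    print(is_valid_lei("506700GE1G29325QX363"))  # True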
The LEI ROC and GLEIF provide a clear example of the kind
of transparent, accountable and inclusive governance processes
that are needed to ensure that real-time governance serves the
public and is not captured by industry or leveraged by owners
of proprietary technologies. The LEI ROC and GLEIF operate in
all global markets simultaneously to reduce compliance burdens
on regulated entities, amplify the effectiveness of national
and global regulators' efforts to protect the public, and
remain completely transparent to end users.
But the public, the regulators that represent the public
interest, and private firms cannot enjoy any of those benefits
of real-time governance without a very large, one-time
investment by the private sector in business process
reengineering. That is because all private enterprises today
have some system for identifying themselves to their
counterparties and keeping track of their counterparties that
predates the global legal entity identifier standard. The
problem from a software programming
perspective is similar to the Y2K problem at the end of the
1990s: software programs that only allocated two digits for
storing information about years had to be modified to
accommodate four-digit years in order to ensure that the year
2000 was not interpreted by the software as 1900 instead. In a
similar manner, all business software systems will have to make
a one-time change to adopt the global LEI standard and phase
out whatever other system they were using. Depending on how a
firm's computer
system is organized, this may require undertaking a long, slow,
difficult process to achieve what appears to be a simple and
obvious outcome to anyone not familiar with the challenges of
business process reengineering.
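    The sketch below illustrates that one-time change in
miniature; the legacy identifiers, the mapping table, and the
trade records are all invented for illustration.

    # Toy one-time migration from legacy counterparty IDs to LEIs.
    legacy_to_lei = {
        "CPTY-00017": "506700GE1G29325QX363",  # illustrative mapping
        "CPTY-00042": "529900T8BM49AURSDO55",  # illustrative mapping
    }

    trades = [
        {"trade_id": 1, "counterparty": "CPTY-00017",
         "notional": 1_000_000},
        {"trade_id": 2, "counterparty": "CPTY-00099",
         "notional": 250_000},
    ]

    unmapped = []
    for trade in trades:
        lei = legacy_to_lei.get(trade["counterparty"])
        if lei is None:
            unmapped.append(trade)  # resolve by hand before cutover
        else:
            trade["counterparty"] = lei

    print(f"{len(unmapped)} record(s) need manual review")  # 1

The hard part in practice is not the loop but discovering every
place a legacy identifier is stored and building a complete,
verified mapping table.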
With regard to the ability of FINRA or any other regulator
working with real-time data feeds to fulfill their public
service mission through real-time governance processes,
increasing standardization of data formats is an essential part
of the process of increasing the accuracy of regulators'
ability to predict the behavior of investors, regulated
entities and markets generally. The kinds of predictions that
the use of big data and artificial intelligence makes possible
are statistical inferences about the probability of different
outcomes. The use of data analytics would permit a regulator to
estimate the probability that certain data revealed a violation
of the law.
Using real-time data flows and real-time governance
processes in this way permits regulators to engage in provable,
fact-based, and ``risk-based'' regulation. This would permit
regulators to adjust dynamically, and in real time, their
allocation of scarce enforcement resources to those situations
where they would create the most value for the public. They
could use real-time governance mechanisms to identify those
situations where the regulator believes the probability of a
violation of the law occurring is the highest and the risk of
harm to the public as a result of that violation is the
highest, and concentrate their resources there.
The migration by regulators to real-time governance in
effect levels the playing field with regard to what the
executives of regulated entities know and what regulators know.
In addition, regulators gain deeper insight into the behavior
of markets generally because unlike the executives of regulated
entities who can see in detail only their own firms' internal
operations, regulators will be able to learn from comparing
detailed, accurate information about operations of all
regulated entities.
As regulators give up the 20th century system of regulation
based on information contained in point-in-time, low-resolution
snapshots of the behavior of regulated entities and move to
real-time governance instead, regulators will be able to use
whatever resources they have more effectively, the public will
be better protected and regulated entities will benefit from
greater predictability and consistency of regulatory
enforcement actions.
It is difficult to overstate the potential significance of
the move from 20th century command-and-control bureaucratic
regulatory processes to real-time governance processes not just
in financial services but in every sector of the American
economy and across global markets. In the 19th century,
governments could only act as a ``night watchman state''
because of their limited capacity to regulate the economy. By
the 20th century, the modern regulatory State had come into
being and could act to protect the public from tainted food,
poisonous medicines and lethal workplaces. The Administrative
Procedure Act of 1946 was enacted to ensure that the power of
the modern regulatory State was exercised in a manner
consistent with the rule of law.
The fundamental advances in accountability and
effectiveness ushered in by the APA such as notice and comment
rulemaking cannot meet the challenge of ensuring that
regulatory power exercised through real-time governance
processes also conforms to the rule of law. In order to lay a
statutory foundation for the transparent, accountable and
inclusive exercise of regulatory power through real-time
governance processes, a fundamentally new approach to
regulation is required.
Such a new legislative interface would be congruent with
the APA but would explicitly authorize regulators to leverage
voluntary, consensus standards developed by private standard-
setting organizations that have committed to observing due
process. Public-private collaborations between Federal
regulators and private sector standards-developing
organizations have been taking place for decades within the
framework of Office of Management and Budget Circular A-119,
which governs Federal Participation in the Development and Use
of Voluntary Consensus Standards and in Conformity Assessment
Activities and was most recently updated in 2016. This new
approach to regulatory governance is discussed in more detail
in the information privacy law context in Pam Dixon and Jane
Winn, From Data Protection to Information Governance
(forthcoming 2019) and Jane Winn, The Governance Turn in
Information Privacy Law (July 11, 2019), https://ssrn.com/
abstract=3418286.
Real-time financial sector analysis is no longer a single-
jurisdiction endeavor. It requires multilevel cooperative
efforts. The example of the Global LEI standard demonstrates
that the use of a
legislative interface through which regulators and private
standard-setting organizations can collaborate to achieve real-
time governance that serves the public can work in any context,
not just information privacy law. It also demonstrates that the
transparency,
accountability and inclusiveness of real-time governance can be
supported by cooperative efforts with global standard-setting
organizations as well as American standard setting
organizations. How these cooperative efforts are accomplished
requires careful and methodical decision making and planning--
private organizations and the public sector both need to be
fully committed to ensuring the fundamental fairness of their
own processes. FINRA's system gives us a view into the
implications of the world to come, and the depth of its new
technical and policy requirements.
Q.4. Do you believe that there should be something similar to
the ``legitimate interest'' basis for data processing in the
United States and, if so, how should we think about nonconsent-
based processing for entities that have no consumer
relationship such as data brokers?
A.4. Data processing that is not based on consent is an
important issue to address, because it is going to become front
and center in the predictive world we are moving into. It is
not reasonable to think that individuals will be able to
consent to every bit of processing of their data. That being
said, we still need structures that ensure nondiscrimination
and people-beneficial uses of data. Processing varies in levels
of importance depending on the context and use of the
processing and data, among other factors.
We now have some experience with legitimate interests
processing via the GDPR in Europe. Legitimate interest-based
processing has proven to be a challenging issue to implement,
and the results have been uneven thus far. Because of the
implementation issues with the GDPR, I prefer the idea of
routine uses as outlined conceptually in the Privacy Act of
1974. The United States' routine uses model allows for data
processing within limits, based on the context, but prohibits
other uses outside of the known context and requires
affirmative consent as the uses and data become more sensitive.
One of the questions that immediately arises regarding both
legitimate interest and routine uses is: who gets to decide
what is a legitimate interest, or what is a routine use? This
is an important question in a democratic society, and is one of
the biggest decisions that needs to be determined in a
democratic process. In the Privacy Act, the concept and
structure of routine uses allows for individuals, businesses,
and other entities to have a voice in what those routine uses
look like, but it is the Government that has the ultimate
authority to make bright-line decisions.
The details of deciding upon routine uses can be managed by
utilizing a combination of sectoral legislation to decide the
brightest lines (like the floor for HIPAA) and the addition of
due process voluntary consensus standards that would allow all
stakeholders to have a fair and robust dialogue to create the
more granular rules for what constitutes fair routine uses in
more particularized settings. Voluntary consensus standards are
due process standards, where all stakeholders have a say in
what those ``routine uses'' should look like. This kind of
standards work is in contrast to industry self-regulation,
where only industry has a role in the process and key
stakeholders (such as consumers) might not be included.
Again, in some areas, and applying the routine use idea
broadly, beyond the confines of the Privacy Act, Congress will
need to make the general bright line boundaries for some
``routine uses.'' At a more granular level, multistakeholder
work can set the finer boundary lines, with input from all
stakeholders. Anything that goes beyond a checkbox will involve
a more time-intensive process, but one that is well worth the
effort.
Q.5. How effective are the GDPR's provisions surrounding
profiling and automated decision making, and is that something
we should emulate in the United States?
A.5. AI and machine learning systems require a lot of data, and
they can present a variety of meaningful risks, including
serious potential for bias and inappropriate manipulation.
The approach the GDPR took to automated decision making is
understandable given the risks, yet the approach is also
proving to be problematic. I spent over a year as a member of
the OECD's AI Expert Group (AIGO). The AIGO group was tasked
with providing extensive technical input into the OECD
Principles on AI, which have now been ratified by the United
States and other OECD countries, see: https://www.oecd.org/
going-digital/ai/principles/.
Something that became very apparent throughout the
discussions of AIGO was that the GDPR approach to AI processing
brings many noncompetitive restrictions to data use and
analysis. The OECD final guidelines took a broader approach
than the GDPR, one that respected human values and privacy, and
also innovation and economic growth. It is important that
democratic societies such as the United States stay highly
competitive with other jurisdictions with regard to AI and
Machine Learning. The Belt and Road Initiative (BRI) countries
(https://www.worldbank.org/en/topic/regional-integration/brief/
belt-and-road-initiative) are focused on winning the AI and
Machine Learning race, and this focus on achieving AI dominance
should not be underestimated.
The United States faces an ethical dilemma. That is: do we
handle data as aggressively as nondemocratic jurisdictions do
in order to stay competitive? Or, do we protect privacy and
take potential risks with our ability to compete? Or is there
another way? We cannot take a stance of abusing the privacy,
autonomy, and trust of the American people. And we must also
innovate and lead in new technologies of prediction. After long
consideration, I believe it is imperative that we find the
third way, a way that allows us to retain privacy, autonomy,
and democratic values while still innovating and staying
competitive. This is both worthwhile and possible.
Legislating AI as a broad command-and-control statute is
not possible due to the complexity and variety of AI systems.
We believe that an approach where lawmakers determine a set of
general principles, then implement those principles with fair
standards setting processes using OMB Circular A-119 as a due
process model, will work well for addressing the complex
challenges AI analytics poses at a granular level.
This is an admittedly complex topic, and we do have
forthcoming research on governance of privacy in complex
ecosystems. In the meantime, a paper written by Jane Winn, who
is a law professor in the United States and has taught short
courses in China for many years, articulates some of these
issues (and potential solutions): The Governance Turn in
Information Privacy Law (July 11, 2019), https://ssrn.com/
abstract=3418286 or http://dx.doi.org/10.2139/ssrn.3418286.
Q.6. What are some of the gaps in currently existing law with
respect to how enforcement agencies deal with this multitude of
laws and what should we be thinking about in the Banking
Committee as we prepare to potentially consider broader privacy
legislation drafted by the Commerce Committee?
A.6. There are several meaningful gaps in existing law
regarding enforcement agencies:
A. Too-narrow enforcement authority at the FTC
B. Enforcement gaps between existing sectoral laws
C. Enforcement gaps in new sectors
Regarding the FTC's enforcement authority, this issue has
been well-discussed in Congress. The primary issues are the
limitations of the FTC Act in addressing the full range of
modern privacy problems, and the constraints Magnuson-Moss
places on the FTC's rulemaking power. The
Magnuson-Moss vision of how the FTC should operate is not a
viable position for the FTC to be held to today, particularly
in light of the privacy and security concerns attending the
fast-moving data ecosystem.
Nevertheless, there is a school of thought that the FTC
should not be the Nation's main privacy enforcement authority
due to its constraints. This leads us to the idea of a new
structure. We favor the creation of a Federal oversight board
with responsibility for privacy--for example, a 12-member board
with broad enforcement oversight. An overarching administrative
privacy enforcement council or board would be in a position to
spot issues across sectors and agencies, more readily identify
a broader variety of gaps, and direct resources accordingly.
Regarding enforcement gaps between existing sectoral laws,
we see three pathways to enforcement. First, focused laws to
fill in the gaps, accompanied by clear enforcement authority.
Second, voluntary consensus guidelines at the State and Federal
level with Government oversight, again, directed at the gaps
where there is the most need. Third, we see a role for
certification and other tools to assist with enforcement,
again, with Government oversight.
Regarding enforcement gaps in new sectors, it would make
sense to conduct an analysis to identify any new sectors or
potential sectors that need separate rules. Data brokers may be
such a sector, as may certain kinds of platforms. It is an
understatement to note that any discussion about regulating a
group of businesses would be incredibly contentious on all
sides. Nevertheless, it would still be a good idea to at least
have the discussion, because it is both reasonable and possible
that at some point in the future certain types of businesses
and platforms might be considered a sector unto themselves.
Q.7. How can we ensure the consumer is informed about scoring,
profiling, and other decisions that are made about them in
their daily lives while balancing the need to not put the
entire onus on the consumer?
A.7. Requirements for quality controls such as labeling,
certification, audit and documentation, and bias and accuracy
testing, among other measures, are some of the mitigations that
could be put in place to reduce informational risks without
placing the burden entirely on consumers. Rules that require
affirmative disclosure of meaningful consumer scores are
important, as are rules that allow consumers to request
disclosure of smaller scores. We include below a partial list
developed from our original Scoring of America report:
• There should be no secret consumer scores. Anyone
who develops or uses a consumer score must make the
score name, its purpose, its scale, and the
interpretation of the meaning of the scale public. All
categories of factors used in a consumer score must
also be public, along with the source category of
information used in the score.
• Scores used for meaningful decision making about
consumers should be subject to quality controls,
ideally stipulated in Federal standards.
• The creator of a consumer score should state the
purpose, composition, and uses of a consumer score in a
public way that makes the creator subject to Section 5
of the Federal Trade Commission Act. Section 5
prohibits unfair or deceptive trade practices, and the
FTC can take legal action against those who engage in
unfair or deceptive activities.
• Any consumer who is the subject of a consumer score
should have the right to see his or her score and to
ask for a correction of the score and of the
information used in the score. It is the responsibility
of business to know when they are using a score to make
a decision about a consumer.
• Those who create or use consumer scores must be
able to show that the scores are not and cannot be used
in a way that supports invidious discrimination
prohibited by law.
• Those who create or use scores may only use
information collected by fair and lawful means.
Information used in consumer scores must be
appropriately accurate, complete, and timely for the
purpose.
• Anyone using a consumer score in a way that
adversely affects an individual's employment, credit,
insurance, or any significant marketplace opportunity
must affirmatively inform the individual about the
score, how it is used, how to learn more about the
score, and how to exercise any rights that the
individual has.
• A consumer score creator has a legitimate interest
in the confidentiality of some aspects of its
methodology. However, that interest does not outweigh
requirements to comply with legal standards or with the
need to protect consumer privacy and due process
interests. All relevant interests must be balanced in
ways that are fair to users and subjects of consumer
scoring.
• The Congress and the FTC should continue to examine
consumer scores and most especially should collect and
make public more facts about consumer scoring.
• The FTC should investigate the use of health
information in consumer scoring and issue a report with
appropriate legislative recommendations.
• The FTC should investigate the use of statistical
scoring methods and expand public debate on the
propriety and legality of these methods as applied to
consumers.
• The Consumer Financial Protection Bureau should
examine use of consumer scoring for any eligibility
(including identity verification and authentication)
purpose or any financial purpose. CFPB should cast a
particular eye on risk scoring that evades or appears
to evade the restrictions of the FCRA and on the use
and misuse of fraud scores. If existing lines allow
unfair or discriminatory scoring without effective
consumer rights, the CFPB should change the FCRA
regulations or propose new legislation.
• The CFPB should investigate the selling of consumer
scores to consumers and determine if the scores sold
are in actual use, if the representations to consumers
are accurate, and if the sales should be regulated so
that consumers do not spend money buying worthless
scores or scores that they have no opportunity to
change in a timely or meaningful way.
• Because good predictions require good data, the
CFPB and FTC should examine the quality of data factors
used in scores developed for financial decisioning and
other decisioning, including fraud and identity scores.
In particular, the use of observational social media
data as factors in decisioning or predictive products
should be specifically examined.
• The use of consumer scores by any level of
government, and especially by any agency using scores
for a law enforcement purpose, should only occur after
complete public disclosure, appropriate hearings, and
robust public debate. A government does not have a
commercial interest in scoring methodology, and it
should not use any commercial consumer score that is
not fully transparent or that does not provide
consumers with a full range of Fair Information
Practices.
• Victims of identity theft may be at particular risk
for harm because of inaccurate consumer scores. This is
a deeply under-researched area. The FTC should study
this aspect of consumer scoring and try to identify
others who may be victimized by inaccurate consumer
scoring.
Q.8. Should some types of data, such as biometric information,
even be allowed to be shared with third parties?
A.8. If data--or knowledge derived from that data--is sensitive
enough, it should not be shared with third parties unless there
are specific protective rules and risk mitigations in place.
Some data is too sensitive to simply allow to be freely
shared, either because the data itself is sensitive, or
because, combined with other information, it could lead to
knowledge impacting an individual's ability to make a living or
purchase a home, or other issues related to eligibility under
the FCRA.
Working with data types we know well, consider the Social
Security Number. By the 1980s, the SSN had grown into very
broad use in the United States. As a result, at a time when the
United States was moving from a paper-based world to a digital
world, certain types of crimes--particularly identity theft--
were greatly facilitated by the relative availability of SSNs.
An early trickle of identity theft legislation in the mid-1990s
turned into a torrent of legislation in short order around the
use, storage, and protection of the SSN.
SSNs are still used today, but many beneficial protections
are now in place. Yes, SSNs are still used by third parties,
for example, by credit bureaus. But generally, SSN uses are
much more restricted now. For example, SSNs have been removed
from being printed on Medicare cards and on drivers' licenses.
Data types and their potential uses need to be evaluated
to make a determination about risks related to sharing.
Taking this a step further and discussing knowledge
derived from data, think of the mosaic of information that
outlines an individual's reputation and character such as that
which would be revealed in a comprehensive background check.
This is why the FCRA protections around background checks are
so important. Background checks may be undertaken, but not
without the subject's knowledge, and there is a procedure for
disputing errors. Where safety rails do not exist, then more
risk exists for that data or knowledge.
Regarding the biometric portion of your query, I would like
to respond in some detail. It is an important question.
All biometric data, including genetic data, rises to the
level of high sensitivity. As such, WPF proposes that
biometrics be designated as a technology of very high concern,
and be subjected to meaningful safety guardrails. The United
States is one of the few countries where biometric technologies
have not yet been as pervasively implemented as they have been
in other jurisdictions. But it is very unlikely that the United
States will fully escape the use of biometrics, as seen in
airport biometric entry/exit programs, among other biometrics
programs.
Because of the significant risks inherent in the uses of
the technology, biometrics--including facial recognition--
should be classified as a high-risk technology, and procedural
safety protections that are well-tested and understood in other
high-risk contexts should be adapted for biometrics and put in
place as guardrails.
The guardrails we are proposing are similar to those found
in existing safety regulations in the United States and Europe.
Regulatory Safety Structures that Act as Guardrails for Biometric
Systems (Facial Recognition)
The protections fall into three key areas: pre- and post-
market safety and quality regulations, use controls, and a
consumer complaint mechanism.
Pre- and Post-Market Safety and Quality Regulations:
The following pre- and post-market safety regulations for
biometrics are derived from the existing legislative models of
RoHS, REACH, and the Chemical Safety for the 21st Century Act
(which updates the U.S. Toxic Substances Control Act), as well
as the Fair Credit Reporting Act. The consumer complaint
mechanisms at the CFPB and CDC provide the model for the post-
market consumer complaint reporting.
• Classification: Biometrics would be classified as a
``technology of very high concern.''
• Applicable to full supply chain: The regulations
would apply to the full supply chain and to any entity
that produces, develops, sells, assembles, distributes,
installs, and uses biometric systems.
• ID risks and reporting requirements: Biometric
entities would be required to identify risks in the
technology and document and report those risks to the
applicable Government body.
• Testing requirements: Biometric technologies
available for use would be required to be tested and
evaluated by NIST for accuracy and bias on a regular
basis; at a minimum, this review would be updated
annually.
• Proven safe prior to launch: The technology must be
proven safe and fit for purpose prior to launch, and
must be cleared for market by the appropriate
Government oversight body. For facial recognition, a
nondiscrimination analysis would need to be performed.
• Product labeling: The biometric product would be
labeled for accuracy and for bias. (Facial
recognition.)
• Certification and training requirements would
apply.
• Ongoing monitoring: The full supply chain of
vendors and implementors must agree to ongoing
monitoring and documentation for compliance. Monitoring
can be in real time, or near real time.
Use controls:
Biometric technology is deployed in specific use cases.
Some use cases are not objectionable; however, some use cases
are objectionable and pose threats of discriminatory impact or
other harms.
• Some use cases of biometrics would not be allowed
due to safety considerations, or lack of functionality.
For example, body cameras equipped with real-time
facial recognition are viewed by biometricians and a
majority of law enforcement as a high-risk use case.
This particular use case has both legal and technical
problems.
• Allowed use cases would have significant
definitional controls and procedural requirements. For
example, biometrics used in law enforcement
investigatory settings would be subject to the
procedures set forth at the Federal level. At the State
level, the Bureau of Justice Assistance procedures for
biometrics use, for example, could be required
(https://www.bja.gov/Publications/Face-Recognition-
Policy-Development-Template-508-compliant.pdf).
• Voluntary Consensus Standards could be used in
conjunction with legislation to establish ongoing
multistakeholder evaluation of emerging use cases.
Post-Market Consumer Complaint Reporting:
• Using the adverse event reporting model and the
consumer complaint model, biometric technologies would
have a dedicated post-market monitoring mechanism at
the Federal level.
• Consumers and others would be able to submit
complaints to a central structure.
• As with the structure of the existing Consumer
Financial Protection Bureau (CFPB) consumer complaints
database, complaints would be available for viewing
within a matter of a week, and the complaints would be
available for download and analysis. This data would
provide ongoing insight into problem areas and detailed
implementation feedback.
Key Underlying Safety Statutes
RoHS: EU Directive, also implemented in some U.S. States.
• As of July 2019, all RoHS deadlines are active; the
Directive now applies to any business that sells
electrical or electronic products, equipment,
subassemblies, cables, components, or spare parts
directly to RoHS-covered countries, or that sells to
resellers, distributors, or integrators that in turn
sell products to those countries, if those products
utilize any of the 10 restricted substances.
• Requires products to be cleared for market prior to
launch and meaningful compliance documentation/
recordkeeping from all parties in the supply chain,
regularly updated information, mandatory compliance
labeling.
• In the United States, California, Colorado,
Illinois, Indiana, Minnesota, New Mexico, New York,
Rhode Island, and Wisconsin have enacted RoHS-like and
e-waste regulations.
REACH: EU Regulation
• Applies to essentially every product manufactured,
imported, or sold within the EU.
• REACH regulates chemical substances, particularly
those known as Substances of Very High Concern (SVHC).
Substances considered carcinogenic, mutagenic, toxic
for reproduction, or bioaccumulative fall under SVHC
criteria.
• EU manufacturers and importers are required to
register all substances produced above a set yearly
volume to:
• Identify risks associated with the substances they
produce.
• Demonstrate compliance in mitigating the risks to
ECHA.
• Establish safe use guidelines for their product so
that the use of the substance does not pose a health
threat.
Chemical Safety for the 21st Century Act: United States, Federal
• Requires pre-manufacture notification for new
chemical substances prior to manufacture.
• Where risks are found, requires testing by
manufacturers, importers, and processors.
• Requirements for certification compliance.
• Reporting and record-keeping requirements.
• Requirement that any person who manufactures
(including imports), processes, or distributes in
commerce a chemical substance or mixture, and who
obtains information which reasonably supports the
conclusion that such substance or mixture presents a
substantial risk of injury to health or the
environment, must immediately inform EPA, except where
EPA has been adequately informed of such information.
(The EPA screens all TSCA Section 8(e) submissions.)
Additional Material Supplied for the Record
[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
[all]