[Senate Hearing 116-182]
[From the U.S. Government Publishing Office]


                                         S. Hrg. 116-182


                   DATA OWNERSHIP: EXPLORING IMPLICATIONS
                 FOR DATA PRIVACY RIGHTS AND DATA VALUATION

=======================================================================

                                HEARING

                               BEFORE THE

                              COMMITTEE ON
                   BANKING, HOUSING, AND URBAN AFFAIRS
                          UNITED STATES SENATE

                     ONE HUNDRED SIXTEENTH CONGRESS

                             FIRST SESSION

                                   ON

    EXAMINING THE CONCEPT OF PERSONAL DATA OWNERSHIP, INCLUDING ITS 
   EFFICACY ON ENHANCING INDIVIDUALS' PRIVACY AND CONTROL OVER THEIR 
                          PERSONAL INFORMATION

                               __________

                            OCTOBER 24, 2019

                               __________

  Printed for the use of the Committee on Banking, Housing, and Urban 
                                Affairs
                                


                 Available at: https://www.govinfo.gov/

                                __________

                    U.S. GOVERNMENT PUBLISHING OFFICE                    
40-382 PDF                 WASHINGTON : 2021                     
          
-----------------------------------------------------------------------------------   


            COMMITTEE ON BANKING, HOUSING, AND URBAN AFFAIRS

                      MIKE CRAPO, Idaho, Chairman

RICHARD C. SHELBY, Alabama           SHERROD BROWN, Ohio
PATRICK J. TOOMEY, Pennsylvania      JACK REED, Rhode Island
TIM SCOTT, South Carolina            ROBERT MENENDEZ, New Jersey
BEN SASSE, Nebraska                  JON TESTER, Montana
TOM COTTON, Arkansas                 MARK R. WARNER, Virginia
MIKE ROUNDS, South Dakota            ELIZABETH WARREN, Massachusetts
DAVID PERDUE, Georgia                BRIAN SCHATZ, Hawaii
THOM TILLIS, North Carolina          CHRIS VAN HOLLEN, Maryland
JOHN KENNEDY, Louisiana              CATHERINE CORTEZ MASTO, Nevada
MARTHA McSALLY, Arizona              DOUG JONES, Alabama
JERRY MORAN, Kansas                  TINA SMITH, Minnesota
KEVIN CRAMER, North Dakota           KYRSTEN SINEMA, Arizona

                     Gregg Richard, Staff Director

                Laura Swanson, Democratic Staff Director

                Brandon Beall, Professional Staff Member

               Alexandra Hall, Professional Staff Member

                   Jan Singelmann, Democratic Counsel

           Corey Frayer, Democratic Professional Staff Member

                      Cameron Ricker, Chief Clerk

                      Shelvin Simmons, IT Director

                    Charles J. Moffat, Hearing Clerk

                          Jim Crowell, Editor

                                  (ii)


                            C O N T E N T S

                              ----------                              

                       THURSDAY, OCTOBER 24, 2019

                                                                   Page

Opening statement of Chairman Crapo..............................     1
    Prepared statement...........................................    31

Opening statements, comments, or prepared statements of:
    Senator Brown................................................     2
        Prepared statement.......................................    31

                               WITNESSES

Jeffrey Ritter, Founding Chair, American Bar Association 
  Committee on Cyberspace Law, and External Lecturer, University 
  of Oxford, Department of Computer Science (on research 
  sabbatical)....................................................     4
    Prepared statement...........................................    33
    Responses to written questions of:
        Senator Jones............................................   141
Chad A. Marlow, Senior Advocacy and Policy Counsel, American 
  Civil Liberties Union..........................................     5
    Prepared statement...........................................   104
    Responses to written questions of:
        Senator Menendez.........................................   143
        Senator Warren...........................................   146
        Senator Sinema...........................................   150
        Senator Jones............................................   150
Will Rinehart, Director of Technology and Innovation Policy, 
  American Action Forum..........................................     7
    Prepared statement...........................................   107
    Responses to written questions of:
        Senator Warren...........................................   153
        Senator Jones............................................   154
Michelle Dennedy, Chief Executive Officer, Drumwave, Inc.........     8
    Prepared statement...........................................   114
    Responses to written questions of:
        Senator Menendez.........................................   154
        Senator Jones............................................   155

              Additional Material Supplied for the Record

Letter submitted by the Electronic Privacy Information Center 
  (EPIC).........................................................   156
Letter submitted by The Association of Credit and Collection 
  Professionals..................................................   162
Letter submitted by Consumer Reports.............................   164
Letter submitted by the National Association of Federally-Insured 
  Credit Unions..................................................   168

                                 (iii)

 
DATA OWNERSHIP: EXPLORING IMPLICATIONS FOR DATA PRIVACY RIGHTS AND DATA 
                               VALUATION

                              ----------                              


                       THURSDAY, OCTOBER 24, 2019

                                       U.S. Senate,
          Committee on Banking, Housing, and Urban Affairs,
                                                    Washington, DC.
    The Committee met at 10 a.m. in room SD-538, Dirksen Senate 
Office Building, Hon. Mike Crapo, Chairman of the Committee, 
presiding.

            OPENING STATEMENT OF CHAIRMAN MIKE CRAPO

    Chairman Crapo. This Committee will come to order.
    We would like to welcome today to the Committee four 
witnesses with extensive experience and a range of perspectives 
on issues related to data ownership, valuation, and privacy, 
including Mr. Jeffrey Ritter, Founding Chair of the American 
Bar Association Committee on Cyberspace Law and an external 
lecturer at the University of Oxford--and I guess you are on 
research sabbatical--Mr. Chad Marlow, Senior Advocacy and 
Policy Counsel at the American Civil Liberties Union; Mr. Will 
Rinehart, Director of Technology and Innovation Policy at the 
American Action Forum; and Ms. Michelle Dennedy, Chief 
Executive Officer of DrumWave.
    As a result of an increasingly digital economy, more 
personal information is available to companies than ever 
before. Private companies are collecting, processing, 
analyzing, and sharing considerable data on individuals for all 
kinds of purposes.
    There have been many questions about what personal data is 
being collected, how it is being collected, with whom it is 
being shared, and how it is being used, including in ways that 
affect individuals' financial lives. Given the vast amount of 
personal information flowing through the economy, individuals 
need real control over it.
    This Committee has held a series of data privacy hearings 
exploring possible frameworks for facilitating privacy rights 
to consumers. Nearly all have included references to data as a 
new currency or commodity.
    The next question, then, is, who owns it? There has been 
much debate about the concept of data ownership, the monetary 
value of personal information, and its potential role in data 
privacy. Some have argued that privacy and control over 
information could benefit from applying an explicit property 
right to personal data, similar to owning a home or protecting 
intellectual property. Others
contend that the very nature of data is different from that of 
other tangible assets or goods.
    Still, it is difficult to ignore the concept of data 
ownership that appears in existing data privacy frameworks. For 
example, the European Union's General Data Protection 
Regulation, or GDPR, grants an individual the right to request 
and access personally identifiable information that has been 
collected about them.
    There is an inherent element of ownership in each of these 
rights, and it is necessary to address some of the difficulties 
of ownership when certain rights are exercised, such as whether 
information could pertain to more than one individual or if 
individual ownership applies in the concept of derived data.
    Associated with concepts about data ownership or control is 
the value of personal data being used in the marketplace and 
the opportunities for individuals to benefit from its use.
    Senators Kennedy and Warner have both led on these issues, 
with Senator Kennedy introducing legislation that would grant 
an explicit property right over personal data and Senator 
Warner introducing legislation that would give consumers more 
information about the value of their personal data and how it 
is being used in the economy.
    As the Banking Committee continues exploring ways to give 
individuals real control over their data, it is important to 
learn more about what relationship exists between true data 
ownership and an individual's degree of control over their 
personal information; how a property right would work for 
different types of personal information; how data ownership 
interacts with existing privacy laws, including the Gramm-
Leach-Bliley Act, the Fair Credit Reporting Act, and the GDPR; 
and different ways that companies use personal data, how 
personal data could be reliably valued and what that means for 
privacy.
    I appreciate our witnesses today for offering their 
expertise and sharing their unique range of perspectives on 
these issues.
    Senator Brown.

           OPENING STATEMENT OF SENATOR SHERROD BROWN

    Senator Brown. Thank you, Mr. Chairman, for calling this 
hearing.
    Welcome to all four witnesses. A special welcome to Jeffrey 
Ritter--shall we say, we knew each other 40 years ago. I will 
leave it at that. At the Statehouse in Columbus.
    This Committee has spent some time over the last several 
months discussing Facebook's poorly--no other adverb to 
describe it--poorly thought-out plan to create a global 
currency. The bottom line is we know that Facebook simply 
cannot be trusted with Americans' personal information. It is 
terrible at protecting its users' privacy. It was pretty clear 
the last thing we should do is trust them with America's 
currency.
    It is not just Facebook. Every time corporations in Silicon 
Valley come up with a new business model, the result is the 
same. They get more access to our personal data, spending 
habits, location, the websites we visit, and it means more 
money in their pockets. Everyone else gets hurt.
    So I want to begin this hearing with a simple question. Who 
has the right to control your personal, private information? 
You or Silicon Valley CEOs like Mark Zuckerberg?
    I think we all agree that Americans should have more 
control over their private information, but should we treat 
that private information just like property? Today's witnesses 
will discuss that idea.
    At first glance, it might seem like a simple way to tackle 
the common problems, the complex problems created by data 
collection and machine learning. The promise is that if we just 
treat personal data like property, markets will do the hard 
work of protecting our privacy for us. We all know that is not 
how it will work.
    Instead of making companies responsible for protecting 
their customers' privacy, this idea puts the burden on all of 
us individually and collectively.
    Now imagine if every time you wanted to use Facebook or pay 
for something with an app or login to a Wi-Fi network, you had 
to read even more legal fine print and then check a box saying, 
``OK, I waive my personal right to my data to use this 
service,'' or you had to join some kind of data collective to 
sell your data.
    Working people in this Country have enough to worry about: 
trying to get the kids out the door in the morning, get to work 
on time, make rent, save for college, pay the bills. The idea 
that people should also have to manage their data like a 
landlord manages its tenants is just ludicrous.
    It should be pretty simple. Corporations should not be 
allowed to invade our privacy. We know that today they are.
    But think about all the personal data that is already 
floating around out there. Equifax exposed the personal 
information of more than 150 million Americans: Social Security 
numbers, birthdays, addresses. Capital One exposed the personal 
information of more than 100 million Americans. How can you own 
your data when it is already littered all over the internet?
    Big tech companies do not want to protect your personal 
information. They want to profit off it. Protecting your 
privacy does not make them any money. It costs them money. So 
they are simply not going to do it.
    They want your data. They want to get it for free. They 
want to pay as little as possible if they cannot get it for 
free.
    So it should be no surprise that I am skeptical--I think 
most of us up here are quite skeptical--when I hear of plans 
for America's data to be treated, again, like property.
    If Americans want more control over their private 
information, we have to find a way to prevent corporations from 
mining our data and selling it to each other. Creating a 
supermarket for selling away our privacy does the opposite. 
Treating data as something that can be owned and bought and 
sold does not solve any of these problems, especially when 
undermining our privacy is the business model.
    Mark Zuckerberg and his Silicon Valley buddies want us to 
skip over the part when we have control over our own privacy, 
and they want to jump to the part where giant tech companies 
get to use their market power to squeeze our privacy out of us, 
and it would all be legal. That is not acceptable.
    I appreciate that Chairman Crapo has been working with me 
in a bipartisan way. There is a lot of interest in this 
Committee in doing it right to create real privacy protections. 
Privacy is not partisan. It is a basic right.
    I look forward to continuing our work together, Mike.
    Chairman Crapo. Thank you.
    We will now proceed to the testimony. I have already 
introduced each of our witnesses, and we will proceed in the 
order that I introduced you. I ask you to please pay attention 
to the 5-minute rule, and I ask the same thing of our 
colleagues here.
    Mr. Ritter, please proceed.

   STATEMENT OF JEFFREY RITTER, FOUNDING CHAIR, AMERICAN BAR 
ASSOCIATION COMMITTEE ON CYBERSPACE LAW, AND EXTERNAL LECTURER, 
   UNIVERSITY OF OXFORD, DEPARTMENT OF COMPUTER SCIENCE (ON 
                      RESEARCH SABBATICAL)

    Mr. Ritter. Thank you. Good morning, Chairman Crapo, 
Ranking Member Brown, and Members of the Committee.
I join you today to speak in my role as an active 
contributor for over 30 years to law reforms enabling the 
United States and global electronic commerce, privacy, and 
information security.
    Speaking bluntly, the time is long overdue for this hearing 
and the work of this Senate Committee and the Senate and the 
Congress to develop comprehensive privacy reform. Right now, we 
have basically been relegated to playing catch-up with the 
Europeans and other nations that are embracing the rules they 
have written. We are trying to weave together what we have into 
some type of whole cloth that is new.
    But privacy law reform will surely fail if we do not 
address the issues that have been focused on by the Chair and 
by Senator Brown in their opening remarks. There is a 
fundamental question. It does need to be answered. Who owns 
digital information? There is no question that it is an asset 
of human society in the 21st century, but is it something that 
can be owned?
    In the totality of all digital information, we have, in 
Senator Brown's words, skipped over it. For billions and 
trillions of dollars of development over the last 30 years, no 
one has given an answer.
    For personal information, this question is even more 
important. Yes, it is identifiable, but who truly owns it? Who 
can control it?
    From a perspective of decades working in the international 
scene, trying to advance these rules, I would observe that the 
United States once led the world in writing the rules that made 
electronic contracting, electronic signatures, and electronic 
commerce work, but at this point, we are falling behind.
    So I would suggest that we need to do something new. We 
need to reestablish this Nation's leadership by confronting 
this very hard question and incorporating it into our privacy 
reforms. Clear, explicit rules are needed.
    I may disagree with Senator Brown's opening remarks about 
whether it should be owned or not owned as property, and by 
whom, but we need to clarify these rules so that we can, in 
fact, control the uses and misuses of this information.
    It is only by crafting those rules that we can then enhance 
and enable acquiring, using, transferring, selling, sharing, 
controlling data. We have got to know whose it is so we can 
make someone accountable.
    Every commercial system built on the rule of law for real 
estate, banking, consumer products, industrial products begins 
with a commitment to define and protect the owner of the 
property. Yet across the digital world we now live in, 
particularly for privacy and individual data, while the data 
subject has controls, the fundamental question has not been 
answered. Who owns it?
    This is not a question that is being addressed in isolation 
here in the United States. As summarized in an article that I 
included in our written testimony, Germany, Japan, and the OECD 
are all calling for formal legal rules on ownership, including 
Chancellor Merkel herself. Japan has already published model 
guidelines for structuring data sharing and licensing 
agreements based on ownership principles, not yet translated 
into English, allowing Japanese companies to build commercial 
momentum in being able to engage in these kinds of transactions.
    So I submit, humbly, that failing to address data ownership 
in our privacy reforms will further isolate the United States 
and allow the rules for data and data as property to be written 
by others.
    Now, the solutions for crafting this legal concept are 
already part of Federal law. They were incorporated into the 
laws governing electronic transferable records and in U.N. 
model laws that have recently been approved with substantial 
U.S. input and influence.
    There, the rights of ownership are exercised by 
establishing and maintaining control of the file. Now, under 
this principle, realistically, the first owner of the 
information will be the business entity with whom the data 
subject is engaging--the bank, the hospital, the university. 
After all, they are the ones that have the systems that 
establish the control over the information.
    But I do believe that recognizing ownership should not do 
anything to diminish or remove a data subject's controls. We 
cannot keep the world's systems from engaging with our 
information, but we can regulate it, and making the property 
rights more clear is going to help the process.
    So, in closing, just let me begin by emphasizing the 
opportunity, not the obstacle. Clarifying these rules, 
reconciling controls with ownership, will not only improve a 
data subject's--individual's control of their information but 
will, in fact, foster greater accountability for the misuse of 
that information against the individual's interest.
    Thank you, Mr. Chairman.
    Chairman Crapo. Thank you very much.
    Mr. Marlow.

    STATEMENT OF CHAD A. MARLOW, SENIOR ADVOCACY AND POLICY 
            COUNSEL, AMERICAN CIVIL LIBERTIES UNION

    Mr. Marlow. Chairman Crapo, Ranking Member Brown, and 
Members of the Committee, thank you for the privilege of 
testifying before you today.
    I serve as a Senior Advocacy and Policy Counsel at the 
ACLU, where my principal focus is on issues involving privacy 
and
technology. In that role, I spend a great deal of time with the 
ACLU's 53 affiliates throughout the Nation learning about and 
taking positions on State-level privacy legislation.
    Over the past year, I have encountered numerous State 
efforts to enact data-as-property laws, and today I would like 
to share what I have learned.
    In the 2019 State legislative sessions, data-as-property 
laws were pursued or introduced in 11 States. To illustrate 
those States' experiences, I am going to focus on the effort in 
the State of Oregon.
    In Oregon, where the bill specifically sought to treat 
personal health information as property, the leading talking 
point of the bill was that it was pro-privacy. Such a law, the 
argument went, would give people the right to authorize the 
sale of their data and to receive a portion of the proceeds in 
return. It is notable that what that amount might be was never 
actually discussed.
    The pro-privacy part of the pitch was that persons could 
also elect to not have their data sold. The presentation proved 
persuasive. Forty lawmakers out of 100 in the entire 
legislature signed on to sponsor the bill at the time it was 
introduced.
    But then something happened. Legislators started to learn 
more about what the data-as-property model looks like when it is 
put into practice, and they became concerned that the model was 
not pro-privacy after all.
    The basis for these concerns was threefold. First, in order 
to effectuate the data-as-property model, at the same time a 
patient's personal information was collected and they were 
notified of their privacy rights, the power of the Government 
would be applied to essentially advertise the option to forego 
those rights by selling away one's personal information. 
Lawmakers grew uncomfortable with the sense they were 
facilitating this anti-privacy choice.
    Second, lawmakers became concerned about adopting a model 
where persons with less wealth were likely to end up with less 
privacy. They recognized that Americans who were economically 
secure would find it easy to reject offers to sell their 
private information, but they also knew that might not be the 
case for an elderly person who has a hard time affording their 
prescriptions and rent or that it might be too tempting a sales 
pitch for a family that is struggling to put food on their 
table.
    Lawmakers also started to appreciate how a Government-
endorsed data-as-property law might serve to further expand the 
existing digital divide, where persons enduring socioeconomic 
and regional economic disadvantages, including 
disproportionately persons of color, already have less privacy 
because they are forced to rely on more affordable but less 
privacy protective technology products and services.
    Third, lawmakers were concerned that enacting a data-as-
property model would require the application of a unique 
tracking identifier to all personal information, which they 
were especially wary of, given the model's ability to expand 
beyond the healthcare context. Privacy and the ability to 
remain anonymous might both be casualties of the effort to turn 
data into property.
    In the end, the Oregon State legislature, despite 40 percent 
of its members having originally sponsored the bill, wisely 
abandoned the data-as-property model, and the bill died. 
Ultimately, lawmakers in all 11 States in which a data-as-
property bill was pursued came to the same conclusion, and not 
a single bill passed.
    In the laboratories of democracy, the data-as-property 
experiment is failing to gain traction, but let me end my 
testimony on a positive note from our State legislatures.
    In 2019, two State laws were adopted that made important 
advances in protecting privacy without treating data as 
property. The California Consumer Privacy Act, which among its 
many achievements allows consumers to opt out of their personal 
information being sold, and Maine's Act to protect the privacy 
of online customer information, which takes the superior 
approach of not allowing a person's information to be sold 
without first securing their opt-in permission, these efforts 
should be studied, replicated, perhaps improved upon, and most 
of all respected by Congress by not preempting the privacy 
protections that they have achieved.
    The high value Americans place on their privacy is 
universal, nonpartisan, and wisely enshrined in our Bill of 
Rights. The proper response to the pervasive loss of individual 
privacy is to pass stronger privacy laws, not just to throw up 
our hands and conclude the only issue left to tackle is who 
gets the money.
    Congress has the ability to adopt strong privacy laws 
without relying on models that undermine privacy in the 
process, and I have every confidence that you will.
    Thank you again for the opportunity to testify today.
    Chairman Crapo. Thank you.
    Mr. Rinehart.

    STATEMENT OF WILL RINEHART, DIRECTOR OF TECHNOLOGY AND 
            INNOVATION POLICY, AMERICAN ACTION FORUM

    Mr. Rinehart. Chairman Crapo, Ranking Member Brown, and 
Members of the Committee, thank you again for the opportunity 
to testify before you today on the issue of data property 
rights.
    Like others in privacy, I am skeptical that a data property 
right is actually the best policy mechanism for ensuring 
privacy. My written testimony today actually goes into far more 
detail, but I want to highlight three important points that I 
feel are necessary to point out.
    First of all, a property right to personal data is not 
necessary to secure privacy, and in fact, it could be an 
inefficient mechanism economically.
    Second, valuing data is often difficult because raw or 
personal data is not what is in demand but, in fact, the 
insights that are built on top of it.
    And, third, regardless of the privacy regime that is 
adopted, privacy laws will create an unavoidable cost from 
compliance, which will reverberate throughout the economy.
    Data ownership seems to fit naturally with our common 
experience in relationship with technology, but I think there 
are fundamental reasons to be skeptical. Since my fellow 
panelists focused on some of the legal side of things, I want 
to highlight one area that concerns me.
    It is really unclear if the assignment of data property 
rights will actually align all of the incentives between users 
and firms. In particular, if a broad right to data is 
established, users would be forced to search for innovative 
opportunities. While some see this as a plus, I think, in 
reality, it would actually be a burden. Assigning property 
rights in data will dramatically change who can say no to any 
potential innovation. Users would be forced to become their own 
data entrepreneurs.
    In this world, users would become--rather, users would 
learn what companies already know, which brings me to my second 
point.
    Valuing personal data is often very difficult because what 
is in demand are actually the insights that are built on top of 
that data. In my written testimony, I go into far more detail 
and, in fact, lay out the four basic methods to price data, but 
the takeaway, I think, is pretty abundantly clear. There is no 
perfect way to value data, and it is highly context-dependent.
    I think one story really highlights this tension. So when 
Caesars Entertainment went bankrupt a couple years back, the 
Total Rewards customer program got valued at nearly $1 billion, 
which made it the most valuable asset in the proceeding. Even 
though it was not sold off, the privacy ombudsman's report 
understood that it would actually be a very tough sell because 
of the difficulties incorporating it into another customer 
loyalty program.
    The Total Rewards example, I think, underscores a 
particularly important characteristic of data. It is often 
valued within a firm relationship, but it is often difficult to 
value outside of that relationship.
    For my third and final point, I really want to highlight 
what I think is probably the most important thing to note here, 
that regardless of the particular policy mechanism that is 
chosen for privacy, these laws will create unavoidable costs 
from compliance, which will impact investment opportunities in 
countless industries.
    We have already seen this in Europe where investment from 
venture capital had gone down at least in the short term by 
some 39 percent.
    In the United States, California's CCPA was estimated by 
its own State agencies to cost nearly 1.8 percent of gross 
State product, which is quite massive.
    And the ITIF, a think tank here in the DC area, estimated 
that a Federal privacy law could cost as much as $122 billion 
per year. All this reminds me of one of my favorite authors, 
Seth Godin, who once remarked, ``The art of good decisionmaking 
is looking forward to and celebrating the tradeoffs, not 
pretending that they do not exist.''
    I thank you for your time, and I look forward to questions.
    Chairman Crapo. Thank you.
    Ms. Dennedy.

    STATEMENT OF MICHELLE DENNEDY, CHIEF EXECUTIVE OFFICER, 
                         DRUMWAVE, INC.

    Ms. Dennedy. Chairman Crapo, Ranking Member Brown, and 
Members of the Committee, thank you for having me here today.
    I am in the clean-up position. So I will not go deeply into 
the various analyses in my written testimony. I submitted to 
you, Chapter 13 of the Privacy Engineer's Manifesto, a book 
that was actually published before GPR was actually put into 
law but in the thick of the negotiations, and many of these 
issues were pertinent then as I believe they are now.
    As everyone you have heard here has reflected, I think the 
summation answer is this is hard, and we are not afraid of hard 
things, although Ranking Member Brown might have some cynicism 
around--I think it was ``you or Silicon Valley CEOs.'' I happen to 
be one. I am not like Mark in many different ways, but I assure 
you there is a lot of work that has been going on over the 
years.
    I have spent most of my career first as a patent litigator 
and then as a chief privacy officer, as one of the first five 
in the world, where we decided and sat down in a small room 
together that there must be something better. There must be 
something bigger. There must be a way to functionalize what was 
then the only real framework: the OECD Fair Information 
Practice Principles and the '95 Directive coming out of Europe. So 
we have very much been led in high tech in the belly of the 
beast by the Europeans for decades.
    So what we did is really look at what is the functional 
tool. So just as Virginia Apgar told us once upon a time, when 
infant mortality was high, to stop looking at just the mother 
and to sometimes look at the baby. The procedural change of 
turning a nurse's head and looking at a child to see if it was 
breathing or not changed both child and maternal mortality 
rates.
    We believe at DrumWave that if you look at the data, we 
will have a similar moment.
    If we indeed say we value our data--``I proclaim that you must 
value the data. You must give an accounting, dear tech firms, 
dear hospitals, dear senatorial bodies, of every single piece of
information that is observed''--I will submit that you cannot 
today because the systems are designed to create more systems. 
The systems that we are sitting on top of and running our 
economies on and our cultures and our families and our 
educational institutions are based on software and hardware 
rather than data.
    So the shift is how do we account for and look at data, and 
I go into some length in my written submission about picking a 
model. Whether it is a tangible property right, we have aspects 
of tangible property that are working, but some are not.
    So, for example, let us look at a privacy right that is not 
necessarily a data privacy right. When I go into a public 
bathroom stall, for a brief moment, that stall is mine. I do 
not own that real estate. I do not even buy a ticket. I just go 
into the stall, and the expectation is that I close the stall, 
and it is mine for a moment. The moment I leave, it is no 
longer mine.
    So think about data moving in and out of cells, in 
databases, left and right. We may think that we want to 
understand every single transaction. What you actually want to 
do is find the errant transaction. So we are actually working 
on developing something that we are calling a ``dynamic 
information meaning score,'' which is a combination of looking 
at the machine learning on databases and data systems as they 
do flow today.
    So engineers, such as my father, with whom I wrote this 
book, understand back from the days of UNIVAC and the earliest
days, when you model data, when a real engineer models data, 
they think constantly about all users, the auditors, the 
lawyers, the
regulator, and the user. We have forgotten the user in this 
current economy. To look at the user, you must look at the 
data. To look at how you build systems, you have to think about 
who these users are. There are aspects of data where, if we ask 
the systems, we will find the answers, but I believe the model
looks a little bit more like the constructs around intellectual 
property, where you may have a shared right.
    In fact, the economic value for companies in a social 
networking context is the grand value. So, ironically, if you
wanted to get a payback of what your data was worth, it is 
actually worth less as an individual person the greater the 
dataset. The greater the dataset, the more the social network 
gets from the analytics off the top of it. So if we do a pure 
formula of how much each of us is worth, we will find a 
diminishing result.
    If instead, we look to things like copyright and trademark 
and goodwill and brand, then we might find a blooming type of 
right. So the more I participate online as an individual, I 
actually am increasing the value of that transaction.
    I will leave the rest of my remarks for question and 
answer, and thank you very much for your time and covering this 
important topic.
    Chairman Crapo. Thank you very much.
    Each of you raised very interesting aspects of this issue.
    I would like to start out with you, Mr. Ritter, and focus 
for just a minute on GDPR. The European Union's new privacy 
directive provides individuals with greater control over their 
data, including the right to access, erase, and restrict the 
processing of it. How does the European Union approach--how 
does the concept of data ownership show up in the GDPR?
    Mr. Ritter. Well, thank you, Senator.
    Many of the rights that you are describing are the things 
that we might associate as attributes of ownership to our 
computers, our cars, other things of physical goods that we own 
as consumers, who can use it, who can share it, who can 
possibly sell it on our behalf.
    But the reality is that ownership as a legal concept has 
not been addressed in the GDPR. So we have this awkward 
situation in privacy law where these rights are being assigned 
to or being confirmed by legislation to exist for the 
individuals, but they have no mechanism of attaching those 
rights to the information.
    In all other modern systems that we have in commerce, there 
is a concept of ownership, and I think that has been, as 
Senator Brown said, something that was skipped over, both in 
Europe and here in the United States.
    By establishing ownership as a concept, nothing should 
degrade or reduce the rights of an individual, a data subject 
to be able to have awareness of what their information is 
doing. But what I think the Europeans did in the GDPR revisions 
was to create transparency, create awareness, create
accessibility, all of which are consistent with the kinds of 
things that are being proposed in legislation such as the DATA 
Act by Senator Cortez Masto, and certainly the Own Your Own 
Data Act by Senator Kennedy.
    So the Europeans skipped it as well, and I think that 
actually has created an opportunity for the commercial sector 
to exploit that data without giving the individuals effective 
enforcement. And I think that is what is missing from GDPR. If 
we know who owns it and, therefore, who breaks the rules of 
their custody of that data for the individual, then we have 
stronger enforcement of those rules.
    Chairman Crapo. All right. Thank you very much.
    Mr. Marlow and Mr. Rinehart, I think each of you, if I 
recall correctly, talked about the cost of privacy or, I guess, 
the cost of ownership if we create a system like that. Could 
you each just briefly expand on that a little bit?
    Mr. Marlow. Thank you, Mr. Chairman.
    Let me say, initially, that I think that the concepts of 
cost and the concepts of ownership come in because of the 
familiarity of that model to Americans, right? I own my car. I 
own my television. I own my house. So the idea that because I 
have control over my car, if I own my data, that would give me 
similar levels of control, and so we think about cost as being 
part of that conversation.
    But what I would remind the Committee is we have rights to 
free speech, even though we do not own our speech. We have 
rights to vote, even though we do not own our vote, and we have 
rights to privacy, even though we do not own our privacy and 
our data. So I would encourage people, although it is tempting 
to kind of come back to what is the value, the idea that 
because we see control elements in the marketplace and in 
private property, that if GDPR is trying to copy that level of 
control, it must also copy the marketplace model, including 
assigning costs, I think gets us on the wrong track.
    Chairman Crapo. Thank you.
    Mr. Rinehart?
    Mr. Rinehart. Yes. Thank you for the question.
    This, I think, really fundamentally goes at kind of the 
heart of what consumers want. I mean, I know there are a lot of 
concerns, obviously, with what is happening in Silicon Valley 
and concerns about privacy, but we do know that consumers do 
benefit pretty massively from a lot of these services. And 
there are a lot of reports and surveys and information to that 
effect.
    What I would highlight is that if you really do establish 
a property right in data where you are effectively requiring 
every single person to say yes to some sort of innovative
service, the kind of new innovative services that you would 
want to compete with a Facebook or a Google might not actually 
be able to be created.
    I know this is not exactly the--sometimes it is not exactly 
convincing to individuals that these sorts of things could 
happen, but we have seen these in the past. Especially with, 
for example, the telecommunication companies, they had a very 
similar sort of, kind of opt-in requirement whenever they were 
doing--in the 1990s when they were trying to reach consumers, 
and we do know that that did have an effect on some of the 
telecommunications companies, including U.S. West.
    I would just highlight the fact that when you do require 
individuals to say yes to innovation, it just makes that sort 
of innovative service and products all that more difficult.
    Chairman Crapo. Well, thank you.
    And, Ms. Dennedy, I am out of my 5 minutes, but I am 
probably going to ask you in writing to respond. You had some 
very intriguing issues that you raised, and I want to pursue 
those with you.
    Ms. Dennedy. Absolutely.
    Chairman Crapo. Senator Warner.
    Senator Warner. Thank you, Mr. Chairman. Thank you for 
holding this hearing. This is an issue that I am very 
interested in and engaged in. I agree and disagree with a 
number of the comments that have been made.
    Mr. Ritter, I do think America's failure to lead in this 
area is going to come back and harm us. I think it is a
pattern where we are seeing America start to retreat.
    Mr. Marlow, I think you accurately point out some of the 
tensions of a traditional ownership model. It gave me a lot to 
think about.
    Mr. Rinehart, I think you are a defender of the status quo, 
and frankly, I find your arguments very lacking because I 
actually think the status quo is not going to be continued. I 
find this notion of ``Gosh, it is hard to figure out some of 
these ideas'' is patently absurd.
    I made some money in the telecom business. I remember, 
initially, it was really hard to figure out beyond user base 
what these companies would be worth. Well, the market drove a 
methodology around value, around spectrum, not imperfect. We 
came to that. Marc Benioff, when he lost out on the acquisition 
of LinkedIn, basically said, ``Companies are buying these based 
on the value of the data that is being collected.''
    Ms. Dennedy, I agree there is a complexity to this. It is 
not single data points. It is that combination, but to make an 
argument that you cannot figure this out or it is hard to 
figure it out, it is going to be an unwarranted cost, that is 
just totally spurious. Companies are making acquisitions all 
the time in this space based upon valuations.
    That is why I think one of the areas I would like to start 
with that I think there could be some broad bipartisan 
agreement--I got bipartisan legislation on this called the 
DASHBOARD Act that would say we ought to at least know what 
information is being collected on us in a more granular way. We 
ought to be able to know who that is being offered to on a 
third-party basis, and yes, even, Mr. Rinehart, if it is a 
little bit hard and we will not be a perfect model, but we 
ought to know what it is worth because the premise of these 
companies that have said, ``We are offering you a proposition 
that this is all free,'' there is nothing free about what 
Facebook or Google offer. It is not morally wrong, but they are 
giant sucking sounds of personal data being collected about us, 
being monetized in a model based on advertising. There is 
nothing wrong with that, but there is so much opaqueness.
    And, clearly, the establishment and established companies 
do not want more transparency because we might have--grapple 
with the questions that Mr. Ritter and Mr. Marlow put out so 
strongly, and we might actually--if we had more valuation 
questions, we might say there might be companies that would 
say, ``Well, maybe there is a way to disintermediate between 
the platform and the user,'' because if we know Kennedy's data 
is worth $10 and Tester's is worth $20 and mine is worth $5, 
maybe you might be willing to give some portion of that, maybe 
not even having to get to the ownership question, so that 
somebody could provide a level of service in between to help 
protect you.
    So it is a hard, hard issue, but to say there are not any 
previous models, I think, is wrong. I think the notion that we
cannot sort this out, I would urge--and I know Senator Kennedy 
has got some legislation in this area as well. I think it is 
really, really important that we do the hard work of trying to 
sort this out, particularly whether we are thinking about 
consumer protection, whether we are thinking about the idea of 
true transparency, whether we are thinking about the idea of 
pro-competition.
    Unlike some of my colleagues who want to go straight to 
break-up, I would rather see if we can introduce more 
competition into this model, and we are not going to have more 
competition if we have this level of opaqueness, where the 
large platform companies control all the data at this point, 
are not anxious to share, are not anxious to have more 
transparency, and continue that way.
    So I would like to start--I guess I am a little more 
speechifying than questioning here, but, Mr. Ritter and Ms. 
Dennedy, talk about the idea around digital markets, about the 
notion--without getting to ownership, but just the notion of 
more transparency, value add, value less. Either one of you 
want to go first?
    Ms. Dennedy. Jeffrey knows that I am obsessed with this 
area.
    There is a great deal to be gleaned, earned, and profited 
by through transparency. I am a huge, huge fanatic and fan. In 
one of my prior companies, I did--with the consent of the 
Federal Trade Commission, I did a graphic novel for our privacy 
policy for Intel because I believed the subject matter was too 
complex to read through 16 pages of legalese.
    So we did a cartoon to train our customers. At Cisco, we 
did what I called the ``Lord Ashfields.'' We did subway maps 
that look a lot like the underground in London to map out where 
the data is, where was mine, where was yours, where was third 
parties. Was it perfect? No. But those maps are proliferating.
    Actually, ironically enough just this week, I heard from 
some of the privacy engineers at Facebook that they are 
adopting them. I do not know if that is going to bubble up to 
the top, but I am pleased to hear that the beginnings of that 
innovation are occurring.
    I think transparency allows you to understand where asset--
I go back to the old-fashioned accounting rules and say, ``What 
is an asset?'' It is something based on an activity that can 
potentially provide benefit in the future, and if it stops 
providing future benefit for the consumer or the user, it 
starts to become a liability. That sounds a lot like a market.
    So the more transparent, I believe, that we can be and 
create the systems to look at the metadata that we have 
already, we can start to create those markets.
    Now, I am a believer in this, but I am also early in the 
market. Is this something comprehensive? If you said tomorrow, 
``We want abject transparency,'' it would take some time to 
catch up, but I think we can get there.
    Chairman Crapo. We will probably have to have your remarks, 
your answers, Mr. Ritter, in writing or following.
    Sorry, Senator Warner. We have got to move on.
    Senator Kennedy.
    Senator Kennedy. First, I want to thank all of you for 
coming today. I know you are all busy, and I do not want you to 
construe what I am about to say as a comment on your 
personhood. It is more a comment on the testimony.
    I am not fluent in BS, and I have not the slightest idea 
what you are talking about, except that we have got a problem.
    One of the issues before us is whether data is property or 
not. I did not hear you answer that. If it is not property, it 
is something, and a lot of people are making money off of it.
    Mr. Rinehart, you said it is hard to value this something. 
Well, let me tell you one way to value it. Last year, 
Facebook's revenue was $56 billion. That is nine zeroes. 98.5 
percent of it was revenue from ads that were specifically 
targeted to people based on data.
    Mr. Rinehart. Yes.
    Senator Kennedy. So whether it is property, a property 
right, or a cabbage, it is being monetized, and it is people's 
personal information. Can we agree on that?
    Mr. Rinehart. I would want to add a little bit of 
complexity to that, that----
    Senator Kennedy. Of course, you would.
    Mr. Rinehart. So, I mean, what they are selling is the--
effectively, the attention of individuals, and that to me adds 
some complexity to this question about the value of data as 
compared to, say, the value of attention.
    Senator Kennedy. Mr. Rinehart, I do not mean to be rude, 
but we are Senators, OK? Whatever--this is what I sense, and I 
have nothing against social media. These are wonderful American 
companies, but we are dealing with problems that nobody 
anticipated, at least I did not, and my constituents did not.
    Forget about whether it is a property or not. It is 
something, and it is mine, or at least it originates with me. 
When I go on Facebook and share information, whatever you want 
to call it, it started with me, and now it is on Facebook and 
whoever Facebook chooses to give it to.
    Why can't we have a rule that says, whatever you want to 
call it, my rights follow it and I can license it, share it 
with you and Facebook--I hate to pick on Facebook, but that 
sharing has to be knowing, and it has to be willful. And I have 
to be able to change my mind, and I have to be able to call 
Facebook or click on an icon and say, ``By the way, I want to 
see what you have got on me.'' And consistent with Senator 
Warner's point, I would kind of like to know what it is worth. 
Why cannot we do that?
    Now, I am going to tell you what part of we cannot--reason 
we cannot do that. Part of that is our fault. We have been 
holding hearings on this subject--I do not know--2, 3 years. We 
will probably be doing it again. This is not a reflection on 
this Committee. This is on the Senate. All we do is issue press 
releases, hold hearings, and strut.
    This to me is not complicated. Why cannot we just agree on 
those rules? What is wrong with that?
    Mr. Rinehart. I would not necessarily say that those rules 
are wrong by any means. I mean, CCPA, California's Privacy Act, 
does, in fact, establish a lot of those sorts of rights without 
having to establish a property right. So that to me--I mean, we 
can--and I have been supportive of privacy laws in the past, 
and I think that that is something that is very much needed for 
this country.
    Senator Kennedy. Yeah. But you all keep saying that, and 
then you never propose anything.
    Look, I am about to run out of time here. Let me tell you 
what is going to happen. We have got a problem. I do not think 
anybody anticipated it. If they did, they did not do anything 
about it. At some point, the American people are going to get 
fed up, and then we are going to have to do something. 
Otherwise, we will not get reelected, which is what motivates 
people around here.
    Mr. Marlow. Senator, can I just suggest one thing?
    Senator Kennedy. And you may not like what we do.
    Mr. Marlow. Fair enough, sir, but if I could just suggest 
one thing. If the Senate were to pass very strict privacy laws 
that did not necessarily adopt a data property model, people 
with the right to say you can and cannot use my data would 
still be able to sell their permission. They could still say, 
``For $10, I will give you a yes instead of a no.'' So it is 
possible to have people sell their data without adopting the 
data-as-property model, which carries with it a lot of downsides 
that strong privacy protections--protections that do not 
necessarily impinge on the right to sell data--would not bring 
with them.
    Senator Kennedy. I am way over. I am sorry, Mr. Chairman.
    Chairman Crapo. Thank you.
    Senator Menendez.
    Senator Menendez. Thank you, Mr. Chairman.
    Mr. Marlow, companies such as LexisNexis Risk Solutions 
collect hundreds of nonmedical personal attributes in order to 
help to predict a person's medical costs. The company claims 
that providers can use this data to improve patient health care 
and health outcomes, but I have concerns that this same data 
can be used to discriminate against patients, including 
vulnerable populations such as low-income individuals and the 
elderly.
    What, if any, protections exist for consumers to prevent 
insurance companies from using this data to discriminate 
against customers?
    Mr. Marlow. Thank you, Senator.
    First of all, I want to recognize that you point out a very 
particular challenge. The healthcare data area is something 
that will have to be explored and drilled down to 
independently, regardless of where the Senate comes out on this 
issue, because certainly I think concepts of having people be 
able to study populations and develop healthcare solutions and 
cures to diseases is something that is very, very important.
    I think the challenge is that when data flows into a 
marketplace, flows into the insurance companies, it is very
difficult to be able to parse the algorithms, to understand to 
what extent they are----
    Senator Menendez. So, in essence, you are telling me that 
there is no patient protection at this point?
    Mr. Marlow. Well, I think that there are certain laws that 
would kind of writ large proscribe kind of blatant 
discrimination, but it is the finer discrimination that is hard
to identify, that is hard to get at.
    And so the question of--without, unfortunately, probably 
the answer you want--algorithmic transparency is a very 
challenging one, but it is within that question that an answer 
may or may not be revealed.
    Senator Menendez. Ms. Dennedy, after data companies analyze 
this data and generate a health risk score for a client such as 
a health insurance companies, do consumers have any right to 
see that analysis or score?
    Ms. Dennedy. I think that they should, if they do not 
already. I think this is, part and parcel, of what we call 
``privacy engineering'' and the new field of ethics engineering 
that is emerging, and this is exactly addressing these complex 
issues that you are talking about. When you are actually doing 
system design, the question must be asked: How do you 
interrogate how decisions were made?
    In the same light, when we are talking about ethics 
engineering for machine learning and machine algorithms as 
everyone is kind of calling, colloquially, ``AI'' these days, 
most of it is machine learning. Understanding for patients 
where the algorithm has been applied may be nonsense to that 
patient. However, to the ethics boards of hospitals today and 
to the FDA that approves drugs and gadgets that are part of our
healthcare system, I believe there are structures in place if 
we provide the transparency behind how these scores are 
calculated.
    Senator Menendez. What happens if some of the data is 
wrong?
    Ms. Dennedy. Well, that is exactly the problem.
    Senator Menendez. What recourse do consumers actually have 
to correct inaccurate data that is going to affect their health 
profile and their risk that ultimately is going to affect their 
insurance coverage and other things?
    Ms. Dennedy. So this is part of the beauty of what is 
included in the GDPR that we would love to see in a Federal 
bill, which is the privacy disclosures, the privacy impact 
assessments that must be done before a system is put into place 
to process information or when that system is changed by third 
parties. That information would be in the disclosures of those 
privacy impact assessments that are required under the law in 
Europe and not here.
    Senator Menendez. I see Mr. Ritter is anxious to give a 
comment.
    Before he does, let me throw out what probably will be next 
to my final question. Given the sensitive nature of health 
data, does that data deserve heightened protection and/or 
regulation?
    Do you want to start off, Mr. Ritter?
    Mr. Ritter. Thank you.
    First, I would take exception to the notion that health 
data can be treated apart from the broader questions, both the 
personal information and all information that we gather. We 
need solutions here, and with regard to how we enforce these 
rules, as a Government, we must demand transparency.
    I watched with interest the hearings of this Committee
earlier this week on the audit process and looking at brokers 
and the SEC rules, and that kind of transparency of automated 
awareness, automated surveillance of compliance is going to be 
what we as a
Nation must anticipate. We can no longer just write generic 
rules that tell companies what they should and should not do.
    The consumer benefits from that level of interaction in 
enforcing rules by being able to see that from an automated 
perspective.
    One of the earlier references was to the provision of 
Oregon law about having a unique tracer on a record. That is 
exactly how it is done today, and I actually think that is 
where we are moving in the future.
    We have license plates on automobiles. We have serial 
numbers. We have RFID devices. I put one inside the box of my 
bicycle when it ships across the ocean, so I know exactly where 
my box is, no matter what the airline says. That kind of 
traceability is getting smaller and smaller, and I think that 
is going to be part of how we can enforce the rules: asking 
the machines to execute compliance with those capabilities.
    Senator Menendez. Mr. Chairman, this is a big topic, but I 
think, particularly, we are being naive if we simply assume 
that the collection and analysis of health data will be used 
only to
lower costs and improve health outcomes. We have to take 
seriously the threat that this data could be abused and 
ultimately lead to consumers being taken advantage of, and I 
look forward to working with the Chairman.
    Chairman Crapo. Good point.
    Senator McSally.
    Senator McSally. Thank you, Mr. Chairman, for having this 
hearing. Thanks, everybody, for their testimonies.
    I first need to confess, like I am a privacy zealot. 
Personally, I lead a very boring life, but just on principle, I 
am the kind of person who fills out those forms when you get 
them in the mail about opting out from your credit card 
company. And I do not put location services on my phone. I do 
all sorts of other weird things, and I want to let everybody 
know about it.
    And I always have these conversations, extremely 
frustrating to me that we are in this situation that we are in. 
I have arguments with one of my staff members who like loves 
tailored ads and does not mind giving up all his privacy in 
order to have tailored ads, and I am like, ``I will pick what I 
want to buy. I do not need you gathering my stuff and sending 
me ads.'' It is just such a personally frustrating thing 
for me and for my constituents that I represent.
    And I appreciate it is a complex issue, and what I am 
hearing from you all today is that maybe data as private 
property is not the way to do it. OK, fine. But I guess what I am
struggling with is part of what has happened, I think, 
culturally, is we have got all these new tools that are out 
there that people are able to use for free, and their revenue 
model is to collect your data and sell it. I mean, we all know 
that, and maybe, unwittingly, people did it initially. But now
that is kind of our expectation that we can do a free internet 
search. We can do free social media contacts and communication. 
It is not free because it is taking your information and 
profiting off of that.
    But if there is another model of like, OK, is the market 
not demanding this, I guess, where we have a social media 
platform where you can pay a certain amount and say, ``I want 
to be able to engage, but I want to keep all my stuff private, 
so you cannot sell it,'' what is the value of that, and is 
there even a market for that?
    Mr. Rinehart, on page 7 of your testimony, you said during 
a study, most subjects happily agreed to sell their personal 
information for just 25 cents. Part of this is kind of cultural
on what our expectations are.
    I would also be concerned about the socioeconomic divide. 
So if some of us are willing to pay in order to protect our 
privacy but others cannot, then we have a division going on 
there of just those of the lower income having their data being 
taken and sold, and that is a problem too.
    So if the solution is not data as private property, what is 
it, and does it start with transparency? That is where I am 
sort of landing. Does it start with let us shine a flashlight 
on what is actually going on, let people see and be mortified 
by what information they have that is being collected and 
analyzed, and then maybe that will help drive a different 
market solution or even different companies?
    I mean, I am not one that wants a Government solution that 
is going to stop innovation. We want less regulations and more 
innovation, but in this case, how do we deal with that 
conflict, and what is the best place to start? Is it 
transparency?
    You guys are all saying it is not data as private property. 
Then what is it?
    Mr. Ritter. Yes. Transparency is critical.
    What we are seeing in the way we manage data as property is 
increased detail, increased capability of managing detail, and 
that also enables us to have more rapid response in corporate 
systems, in government systems, and in information security, 
and allows us to see what is happening.
    Transparency is good, and the division between you and your 
staffer with regard to the acceptance of all this is something 
we are just going to live through as they----
    Senator McSally. But then we get to choose. It is all about 
freedom, right?
    Mr. Ritter. It is humankind.
    Senator McSally. Yeah.
    Mr. Ritter. Right? But for the systems to survive and for 
the digital age to not bring us down but actually to be a 
backbone for humankind to move forward, we have to demand that 
transparency in Government requirements. The ones being proposed 
in Senator Kennedy's legislation and some of the other pieces 
that have been drafted by Members of this Committee are 
outstanding first steps in that direction.
    Mr. Marlow. Senator, if I may--and I really appreciate 
being able to discuss with a privacy zealot like myself.
    Senator McSally. Yes.
    Mr. Marlow. But a couple of things. So, one, privacy is not 
about secrecy. It is about choice, right? And so it is 
certainly acceptable within a privacy regime for you and your 
staffer to come out in different places.
    Senator McSally. Exactly.
    Mr. Marlow. But what we want to make sure----
    Senator McSally. But right now, there is no choice for 
people like me.
    Mr. Marlow. Precisely. And we want to make sure not only 
that you have a choice----
    Senator McSally. Yeah.
    Mr. Marlow.----right? But that the choice is well educated, 
informed, transparent, meaningful, and precise. All those 
things, I think, have to enter the equation.
    Another aspect that you brought up that is important to 
privacy is that we want to make sure that consumers like 
yourself and myself cannot be punished by companies----
    Senator McSally. Right.
    Mr. Marlow.----for exercising our privacy choices, or
conversely, that those who give up their privacy are rewarded 
in ways that we are not.
    But I think that the idea of bringing money into that 
equation goes to your final point, which is we do not want to 
establish a system like data-as-property where Government is 
used to put additional weight on the anti-privacy side by 
saying, ``And we are going to make sure you know there is 
money, although we may not tell you how much, but there is 
money at the other side of the equation.''
    So I thank you very much for your observations. There is 
absolutely a path to get there.
    Senator McSally. Well, I know I am out of time, but 
eventually, money is a part of it, whether it is monetizing 
your data or not. There is no such model for Facebook to exist 
if we are all opting out, right? Because they do not--they 
cannot--that is not their business model.
    So somehow--I mean, I know we got to go, but somehow we 
have got to have this conversation about how some of these 
tools can be accessible for people that really can be impacting 
their lives. Forget about the social media aspect, but even in 
medical advances, but also having people know what is going on 
with their information and why it is being used.
    Mr. Marlow. Right. But like your staffer, Senator, not 
everyone is going to opt out. That is kind of a doomsday 
scenario, but that is not going to happen because some people 
do like targeted ads and like to be directed online.
    Senator McSally. All right. Thank you, Mr. Chairman. I 
appreciate it.
    Ms. Dennedy. Can I just have one more quick----
    Chairman Crapo. Real quick.
    Ms. Dennedy. Thank you. Very quickly.
    I think we are focused on transparency at the bottom 
layer. Let us not forget the top. The war that is going on for 
every CPO right now, the war for attention in the budgets, I do 
not see any CPOs sitting on public boards. I do not see the 
Chairman's letter to their shareholders talking about what 
their data scores are. Until we score our data and it comes 
from the top down, it is entirely a battle of one, and it 
depends on who your privacy officer is and how big her hairnet is. 
I do not know what the appropriate senatorial thing is to say 
there.
    Chairman Crapo. Understood.
    Senator McSally. Thank you.
    Chairman Crapo. Senator Tester.
    Senator Tester. I want to thank you, Mr. Chairman. I want 
to thank Ranking Member Brown for having this hearing. The more 
that this is talked about, the more confused I become.
    OK. I mean, I deal with property the same way most people 
deal with property. It is something you own. It is something 
you sell. A bushel of wheat, for example, I am a farmer. I sell 
it. It goes to General Mills. General Mills might sell it to 
somebody else. They might make flour out of it, but I do not 
care. I got my money, and I do not give a damn.
    The problem here with this and why it is such a critically 
important issue and why this hearing is so important is because 
the solution is so complex because everybody has got a 
different perspective.
    Senator McSally said it. She loves her privacy, and not 
secrecy, but privacy.
    So we have got a situation where we have got health 
information that if it gets in the wrong hands, it can have 
incredibly--or even in the right hands, I might add, that can 
have incredibly negative impacts on my health insurance 
premiums; bank information that could cost me to buy a house a 
lot more or a lot less, or health premiums, same thing on the 
health premiums.
    And then we have got a situation where I buy a transmission 
for my combine, and who really gives a damn? Really, I mean, 
what are they going to do? I bought it. It is done. All that 
information can transfer. I am not going to buy another 
transmission because I do not need it.
    So it is really hard to figure out where we are in this and 
what works, but let me ask you this. And I have a great respect 
for everybody that is on this panel, and I mean that. I 
appreciate all of your perspectives because I think they are 
very interesting. But we had a hard time with REAL ID in 
Montana. If we put tracers on information, Mr. Ritter, should 
my constituents be concerned about me voting for a bill that 
has that in it?
    Mr. Ritter. Yes, but not because of the tracers, but 
because the inadequacy of the bill to not deal with the 
consequences of the records those tracers produce. That is 
where the difference can be made.
    Mr. Tester, my original clients 40 years ago were farmers.
    Senator Tester. Yes.
    Mr. Ritter. And so in response to your point, actually when 
you bought the new transmission, that fact is very interesting 
to the company from which you purchased the oil because it 
wanted to know the useful life of the transmission. The 
transmission company wants to know, one, you purchased it based 
on the number of hours of usage.
    Senator Tester. A really good point.
    Mr. Ritter. Number of hours of usage that it created off of 
the tractor.
    Senator Tester. And so the issue becomes should we pass a 
bill that says, basically, you cannot----
    Excuse me. If this is my wife----
    Mr. Ritter. We all understand.
    Senator Tester. It is not.
    You know what? You know what this is? It is a targeted ad.
    Mr. Ritter. It is a scam likely.
    [Laughter.]
    Senator Tester. So you want to talk about cost and money--
do you want to talk about cost and money? I mean, if there is 
one thing that makes my head explode, it is the fact that I got 
people out there who I do not want to do business with that are 
trying to market crap to me that I do not want, and whether it 
is through this telephone or whether it is through the computer 
when I am sitting there trying to read an article and all these 
damn screens keep popping up--and I am not a tech genius to get 
all this stuff off, it is baloney. And that is the real problem 
for me. It may be a different problem for somebody that is in a 
different economic stature, but the problem for me is I am 
getting all this information I do not want. So how do we fix 
it? is the question. How do we really fix it and protect my 
civil liberties for privacy, not stop business? But I would 
tell you that several of you talked about cost. There is a lot 
of cost with doing nothing here too. So we need to do 
something.
    So how do we fix it? Because I got the impression from you, 
Mr. Ritter, that the European Union is doing--and other people, 
by the way. I do not want to pick on you. You are a good guy. 
The European Union is doing some stuff, and Japan is doing some 
stuff, but it is really not complete, or is it?
    Mr. Ritter. One of the things that distinguishes both 
European policy development and Japan is extensive research and 
consensus development before they introduce definitive rules. 
That is something we struggle with here in the United States to 
stay in the----
    Senator Tester. So what did you just say?
    Mr. Ritter. They think about it, and they write their rules 
with design.
    Senator Tester. OK.
    Mr. Ritter. All right? You would not buy a combine----
    Senator Tester. That is exactly what we do here in the U.S. 
Senate.
    [Laughter.]
    Mr. Ritter. A survey came out just about 8 weeks ago on the 
impact of GDPR on the European citizen, and it indicates that 
they have had over 145,000 complaints just in the first year. 
But--and I will take a look at my notes here--45 percent of the 
people that were surveyed in Europe liked the reduction in 
nonresponsive marketing. They were getting better tailoring, 
and they liked that.
    Senator Tester. And how did--excuse me. I am going to make 
it very, very short. How did they stop that reduction? How did 
they make that reduction in marketing happen?
    Mr. Ritter. Many of the rules that the GDPR embraced, which 
also, I think, Senator Cortez Masto's bill embraces, just put 
an accountability on companies for having to disclose the use 
of----
    Senator Tester. And what happens if they do not follow the 
rules? Are there penalties?
    Mr. Ritter. The problem of enforcement is one we also have 
to address.
    Senator Tester. I get back to my original point. If this 
was easy, it would already be done.
    Thank you, Mr. Chairman. Thank you, Ranking Member Brown, 
for your courtesy.
    Chairman Crapo. Thank you.
    By the way, we have a Do Not Call List bill that we are 
working on.
    Senator Tester. Yeah. Well, we passed it, did not we?
    Chairman Crapo. Well, we have a better, more enhanced one.
    [Laughter.]
    Chairman Crapo. Go ahead, Senator Toomey.
    Senator Toomey. Thanks, Mr. Chairman. I think this has been 
a great hearing and a great discussion. I appreciate you doing 
it, and I want to really thank all of our witnesses for 
contributing to thoughtful ideas to a very challenging and 
interesting conversation.
    Some of the points, I think, that are extremely important, 
I think Senator Tester was touching on the fact that probably 
most people have different views about different datasets about 
themselves. I mean, I want much more privacy about my 
healthcare records, for instance, than I do about whether I 
just bought a tractor.
    I am sympathetic to the argument that transparency is 
generally, probably a good thing, and I am sympathetic to the 
idea of consumers having greater ability to exercise some 
choices about what data is released and what is not.
    But one of the things we have not talked a whole lot about 
this morning is the fact that in the model that has organically 
evolved, consumers get compensated. There is actually a lot of 
compensation. We talk about how much revenue Facebook took in. 
It is a staggering number.
    It is hard for me to quantify, but I can tell you I 
perceive significant value every single day when I go and turn 
on Pandora. I can get to listen to any music I want for free as 
long as I want, and very regularly, I will go to any number of 
competing map software and get fantastic direction to any 
destination I want to go to anywhere. And I pay nothing for it.
    I can read newspapers and magazines. The list is endless, 
right, of all of the things, and in fact, in many of these 
spaces--not all--many of them, there is real competition. To 
the extent that there is real competition, you have to assume 
that the data that is being--that is the revenue stream for 
these companies that have these apps, they are competing for 
that. And so they are presumably competing to offer me ever 
more in return for me providing them my data, to the point 
where it should converge on something approximating the value 
of it.
    So I like that value. I love the innovation. I have no idea 
what new things are going to be available next year, but I bet 
there is a bunch of others that I am going to enjoy using.
    So I wonder if anybody--let me start with Mr. Rinehart--
would want to comment on the fact that without property rights, 
which I have not thought enough about to have an opinion on--
consumers are already being compensated every single day for 
the data that they share. What are your thoughts on the level 
of compensation?
    Mr. Rinehart. Yes. I mean, this is something I tried to 
highlight in my testimony is the conflict between the data that 
supports the services and how consumers then value the services 
because those are slightly different things.
    We do know that consumers do value these services pretty 
extensively. I did some quick calculations before the hearings, 
and social media, by most accounts, probably benefits people to 
about $13,000 per year. They use it pretty extensively. I can 
follow up later. But we also know from a whole bunch of other 
surveys that if you were to try to give up, for example, Google 
search, it would cost people something like $18,000 per year to 
be willing to pay to get--or to be compensated to not use 
Google, similarly like $8,000 or around $8,000 for maps, as you 
had mentioned, or mapping services.
    So, yes, consumers do benefit in kind of these implicit 
values or in an implicit way, which does not show up in a lot 
of data, and that is also what makes the valuation of data 
itself, as I mentioned throughout my testimony, a little bit 
more difficult.
    Ms. Dennedy. Can I add one thing? My ears are hurting that 
we are saying it cannot be a property right. I think that data 
is probably some sort of an intellectual property right. It may 
not be a tangible right, but I think exactly as you are talking 
about, data to me is currency.
    So if you think about a penny, a U.S. penny, that alone is 
not very much, and I can tell you I have got a penny and you 
have got a penny. The story that that penny tells when you put 
it together with the rest of my data story--where am I spending 
it? How am I spending? Who am I? If I hand him a dollar, it 
means less than if I have a dollar based on our past history.
    So when we are talking about valuing datasets, let us be 
careful to understand that what you are really talking about is 
the lifetime. Sometimes there is a one-time transaction with 
data, and it looks and feels like a property right or a 
bailment. And other times, it really is ``What is that 
relationship worth?'' And the banking community, in particular, 
understands that better than anyone.
    Mr. Rinehart. Can I just add one small comment? I mean, I 
would mention at least within intellectual property, the 
traditional reason why you establish property rights for 
intellectual property is because there is an underproduction of 
those sorts of services, and what we are talking about here, 
especially with privacy, is very much the opposite of that. 
What we are trying to establish or at least what we are trying 
to talk about is the limitations of disclosure.
    I just find the intellectual property, the typical way of 
talking about it and the typical way of thinking about it 
through property rights as a way to incentivize creation is, in 
fact, the opposite of what we are talking about with privacy, 
which is incentivizing some sort of disclosure, which is just, 
I think, a complicated--it is a more subtle way to think about 
the difference between the two.
    Senator Toomey. Thank you very much.
    Chairman Crapo. Thank you.
    Senator Brown.
    Senator Brown. Thank you, Mr. Chairman. Thanks for your 
flexibility. The Finance Committee was doing a really important 
hearing on opioids, and as Mr. Ritter knows and some of you 
know, it is a terrible problem in Ohio. It is one of the worst 
problems in the country.
    We know the system we have right now does not protect 
Americans' privacy--phones, laptops, TVs, credit cards--turned 
into tools to harvest personal information for a few or maybe 
more than a few to profit.
    Mr. Ritter or Mr. Marlow, when we are talking about a 
property rights model for data, is not that just a way to 
legitimize and expand on the way Facebook and other big tech 
companies spy on every aspect of our lives?
    Mr. Ritter. In the 21st century, Senator, surveillance will 
be part of our life. This is something that we as a society 
must decide how to regulate.
    I do not think that they are spying on our lives in a way 
that is, at first instance, inapposite to our purpose in 
interacting with them. I walk into a bank to open a bank 
account so that I can save my money and pay my bills. I roll 
onto a gurney into a hospital ER for them to save my life, and 
that involves collecting a whole bunch of data and X-rays and 
MRIs and vital information. This is part of how we provide 
services in the 21st century.
    Each of us, to use your metaphor, is a data generator, even 
our devices, and I think that what we have to recognize is that 
we cannot stop the surveillance that the digital age demands 
for operational efficiency. But we can impose appropriate rules 
against the misuse of that information beyond the original 
purpose for which it is collected, and we can better enforce 
those by making clear who stands responsible.
    Senator Brown. Thank you.
    Mr. Marlow, answer that, if you would, in whatever direction 
you want to take, but include in it explaining how a property 
rights model would or could or should provide meaningful 
privacy protections.
    Mr. Marlow. So I would say this. The property rights model 
would provide privacy protections, and the fact that any model 
that says you own this data, you have the right to sell it, the 
corollary would be ``And you have the right not to sell it.''
    The problem with the data property model is you can 
accomplish all those things with privacy legislation, like 
California has, like Maine has, without having to create a data 
property model that will further incentivize people to give up 
their property.
    And I would say, just to put a fine point on something, 
surveillance is not necessarily going to be a part of our lives 
going forward. Attempts to surveil us will be. The extent to 
which surveillance succeeds or fails, the extent to which we 
have privacy or not is in your hands.
    I would be more than happy to come back another time for 
this Committee to talk about the way that our laboratories of 
democracy, our State legislatures are approaching this issue 
and making significant progress.
    So I would not be defeatist here. The only way that privacy 
disappears and surveillance wins the day is if we take an 
attitude that it is inevitable.
    Senator Brown. Let me pursue that again with both of you, 
Mr. Ritter and Mr. Marlow.
    We have been discussing whether or not we should create 
property rights in data. I know that you both are skeptical 
that it is even possible.
    Senator Kennedy compared data to a cabbage. If I sell you a 
cabbage or house or a car or any other property, I have to hand 
it over to the property owner. I sell you data. I still can 
keep a copy of it. How does the difference between data and 
Senator Kennedy's cabbage make it hard to exercise privacy 
rights, the same way you could exercise property rights?
    Start with Mr. Ritter.
    Mr. Ritter. Actually, I think cabbage and my medical record 
are very similar, Senator.
    As detailed in my written testimony, the scientific 
consensus surprises us. In contrast to what Ms. Dennedy said, I 
do not believe information is intangible. It is a tangible 
object. It is just small. And as our technology evolves, we 
can, in fact, manage the rules and the use and the economic 
implications of that use through technical means and through 
rules being applied. We just have to design the rules 
correctly.
    So I actually do think there is a great potential here to 
improve the protection of the individual from misuse and 
increase the enforceability of violations of those rules by 
making more clear, if you will, who is on first and has the 
ownership responsibility to make sure the consumers' interests 
are protected.
    Senator Brown. Mr. Marlow, comments?
    Thank you, Mr. Ritter.
    Mr. Marlow. Data makes terrible coleslaw, unlike cabbage, 
so I would start there.
    [Laughter.]
    Mr. Marlow. But I would say, again, the problem is that it 
feels like we are trying to put a square peg into a round hole. 
The data-as-property model, although it has some facets that 
promote property, also undermines property, like the need to--
if you had data or a cabbage, that could be replicated 
indefinitely and passed around, the need to track it everywhere 
it went, so people could be paid every time they consumed a new 
cabbage or piece of data. And that would require the tracking 
of any information.
    I could post something online that gets linked to my 
computer. Therefore, it provides personal information, and now 
my speech is being tracked all over the internet.
    So, again, I think that if you want to focus on privacy, 
you should focus on privacy protections that do not have such 
obvious downsides that come along with them, and that is the 
data property model.
    Senator Brown. Thank you, Mr. Chairman, for allowing me one 
more question.
    Again, this is for Mr. Ritter. I know you well enough, many 
years ago, to know you care about the wealth gap in this 
Country, and it is getting worse. We cannot continue to have an 
economy, although the Senate does nothing to fix this, that 
divides Americans into the haves and have-nots.
    With that in mind, would a system of data ownership create 
a world in which the rich could afford to keep their data 
private, but those who--Senator Crapo and I have talked about 
the challenge for low-income people and the temptation to sell 
all of those things. Would this mean the rich could afford to 
keep their data private, but too expensive or difficult for 
everyone else?
    Mr. Ritter. People that are underprivileged or financially 
disadvantaged sell their blood to be able to put food on the 
table for their children, and I do not think we can escape the 
likelihood that they would also sell their data if there was an 
economic return that would be useful.
    The bigger problem, of course, is to reduce the inequality, 
and we do that by being more visible of how that money is being 
made.
    For all the comments that have been made about the cost of 
compliance, there are very few advocates that assert the 
objection to the cost that do not knowingly talk about the 
revenue, and when we realize how much money is being made by 
these corporations from the personal information without 
returning that back as part of the ongoing transactional 
relationship with the individual, we are making a mistake and 
increasing the inequality.
    Senator Brown. Could Mr. Marlow answer that too? And then I 
yield.
    Mr. Marlow. I would just quickly say that blood is a 
replenishable resource. When you sell your privacy, it is gone, 
and I 
think that if I were to say to any Member of this Committee, 
for $50, would you sell me your most intimate, private, 
personal information, you would say no. But if you could not 
put food on your kids' table, if they were getting bread and 
water for the third night in a row and I said, ``I will give 
you $50 for your most personal information,'' I would venture a 
guess that you would all say yes. I am not certain that that is 
what we want privacy to look like in this country.
    Chairman Crapo. Thank you.
    Senator Tillis.
    Senator Tillis. Thank you, Mr. Chairman.
    Thank you all. I think you have given us all a lot of good 
information to work on.
    You know, I am somewhere between Senator McSally and being 
a privacy zealot and Senator Toomey who has rightfully pointed 
out all the benefits of using this data responsibly.
    I think that Congress--and I am curious, and I have got a 
couple of questions for you that may go down the line. But, 
one, Mr. Marlow, I hear you talk about the laboratories of 
democracy. I believe in it. I was the Speaker of the House. I 
loved the kind of stuff that I was doing in North Carolina.
    But I am concerned if we do not come up with a Federal 
solution to this problem that deals with data breach, data 
privacy, and data ownership, then we are creating a patchwork 
of laws that is going to make it more difficult for American-
based innovators to really produce the next generation of 
value-added. Is there anyone here who thinks it is a bad idea 
for us to pursue Federal preemption and take on those three 
buckets?
    Mr. Marlow. I do, and the reason I----
    Senator Tillis. That is why I asked you first.
    Mr. Marlow. Thank you, sir.
    [Laughter.]
    Mr. Marlow. So the reason I say that, to be simply 
practical, is this. The idea that the U.S. Congress would pass 
strong privacy laws that would set a reliable privacy floor for 
the entire country is exceptionally appealing.
    The idea that Congress would come in where some States have 
passed even higher privacy protections for their citizens, like 
North Carolina might, and then say we are going to bring that 
privacy ceiling down to the Federal level, I find less 
appealing because I, like you, Senator, believe in our Federal 
model, believe that the States have the ability to--you know, 
you may have North Carolina passing a privacy protection that 
turns out, as we let it exist in the world for 10 years, serves 
to be a guiding light for Congress. But if we squelch it 
through preemption from the very beginning, we will never learn 
the lesson from your State, and so that is why I am very 
hesitant to have Congress creating a ceiling on privacy rather 
than a floor.
    Chairman Crapo. Others down the line?
    Mr. Ritter. Thank you.
    For nearly 30 years, I interacted on behalf of this country 
at the United Nations to write the rules that have enabled 
global electronic commerce to now thrive. There is one 
essential truth of global communication systems--uniformity 
rules.
    Whoever writes the rules first usually wins that game, and 
our failure to be more proactive in addressing the valuation of 
personal information and all data has handicapped our 
competitiveness.
    Having 50 different States is going to be adverse to our 
interests and, in fact, does not allow us to thrive.
    Senator Tillis. I want to move on to some others. I have to 
say that, in full disclosure, at Pricewaterhouse, I was a 
partner in global sourcing, strategic sourcing, and data 
analytics and data privacy and worked on it quite a bit. I 
actually think, first, we have to get Congress better educated 
on the issue so that we really know what we are talking about.
    But I tend to believe we could be the jurisdiction that 
sets an international standard, that we may miss a few nuggets 
in some of the States that are doing good things, but at the 
end of the day, if we do not get this right, then the States 
that are laggards are going to expose their constituents and 
our innovation opportunity may be lost for the United States.
    We have to understand that they are going to be aggregating 
data across the globe, and if we set a standard, we may be 
better off as a result of doing it.
    Look, because I have been in technology all my life and I 
am still trying to teach all these young folks that work for me 
how to use it, when I want to be private, I go into incognito 
mode. When I do not want somebody to know where I am, I turn 
off location services, or I use VPN. When I see a platform that 
I do not want to share my data, I do not. When I see a platform 
that if I do share my data, I may be able to drive down the 
price point of something I am purchasing by 50 percent.
    There is a website out there now where I can go put 
something that I want to buy. It aggregates all of the online 
retailers, and over a period of time, I say when you see a 
price point at this point, let me know and I will buy it.
    When you talk about narrowing the wealth gap, I think there 
is a great opportunity for the people who have the least amount 
of money to benefit from these tools, if we get them right, to 
drive down the cost and drive down the cost to the consumer.
    We now have someone that may be growing up in the trailer 
park that I grew up in, in Tennessee, that can get on a phone 
and drive a price point down. That used to be only available to 
large corporations that did strategic sourcing and had access 
to the tools, but we have really got to educate people on the 
various layers of data that we are talking about here.
    Tom Tillis and all of my personal information, I believe, 
is mine. I may allow someone to take it and use it. I want 
there to be a very clear authorization there, and I want to 
know whoever has it will then ultimately have a responsibility 
for any kind of data breach or misuse of that information. 
Those rules of the road need to be defined.
    But then we have to talk about the abstractions of data, 
where my identity is less important than my demographic, and 
then the aggregation of data that you actually use to gain 
insights and then target. I mean, all of that, all those layers 
of information have different implications in terms of whether 
or not I own it or whether or not my behaviors are just making 
it easier to find people like me that say, ``Hey, you may be 
able to get what you are looking for, for half the case, based 
on your behaviors.''
    We have got to better educate Congress on what layer we 
have to protect and what consequences there are for platforms 
who fail to be good stewards of the data, but then recognize 
that these data models are also what are driving innovation 
that ultimately benefit consumers and I think will ultimately 
benefit patients in health care. That is the sort of stuff we 
have to work on.
    And in Congress, Mr. Chairman, I know right now we have got 
at least three committees that think that they have ball 
control over this issue. One of the things that we have to do 
fairly quickly is figure out how we come together--and this has 
to be very task focused--and have each of these committees come 
together and figure out how to approach this on a more holistic 
basis, or otherwise we are going to keep talking about it and 
not act on it.
    Ms. Dennedy, I think it is remarkable that you wrote a book 
with your dad on data privacy manifesto. I am going to get one. 
I am going to get a copy of that book, but what I would also 
like to get is the graphic novel that you were talking about, 
some of the other things, if they are publicly available.
    Ms. Dennedy. I think Intel has taken it down, but I will 
get you a copy, Senator.
    Senator Tillis. I think that that would be very helpful 
because that is the sort of stuff that we need to get out to 
the individuals so that they understand. You use the one as the 
underground, the map to show, so that you can very quickly say 
that when I opt in or if I opt out, an informed choice in that.
    Ms. Dennedy. Those are public. You can find them on 
trust.cisco.com.
    Senator Tillis. I will go through that, but I think all of 
you made very valid points. What we have got to do is 
synthesize
it and, I think, come up with a data breach, data privacy, data
ownership policy that could potentially set a standard. We can 
be instructed by GDPR. We can be instructed by California. We 
can be instructed by a dozen or so other States that are 
considering it. But I think at the end of the day, this is 
something that Congress is going to have to take on.
    Thank you all.
    Mr. Marlow. Senator, if I could just say one point. You 
actually, I think, perhaps inadvertently made a very strong 
case additionally why we need privacy laws. You mentioned that 
one of the ways that you protect your privacy is by using 
incognito mode. Incognito mode, despite its name, is not 
private. It basically means it is private for your computer and 
other people in your household.
    Senator Tillis. Right.
    Mr. Marlow. But if you search on Google, Google can still 
see where you are going.
    Senator Tillis. Right.
    Mr. Marlow. So that is part of the point. We need to make 
sure not only that the Congress is educated but consumers 
are educated and actually have meaningful rights in the area.
    Senator Tillis. Absolutely. And there is a lot of other--
there needs to be a suite of tools available to the consumer 
that can basically counter everything else that is going on 
after you hit enter. I get that. I think that is a part of the 
discussion that we have to have.
    I just, for one, think that I have been in Congress now for 
4 \1/2\ years. We have been talking about this 4 \1/2\ years. 
We actually need to take action, and we need to do it on a 
multijurisdictional basis. And we will continue to value your 
input.
    Mr. Chair, the only reason I went over is I was the last 
person, so you are the only one I inconvenienced. And I am 
sorry about that.
    [Laughter.]
    Chairman Crapo. That is the only reason I let you keep 
going too.
    Thank you very much. That concludes the questioning.
    Every Senator who left told me this has been a really great 
hearing. The members of the panel are deeply appreciated. We 
did not even get to get as deep as any of us wanted to on a lot 
of what you wanted to say and what we wanted to discuss with 
you, but that will come, and we do have consensus that we need 
to move and move broadly and quickly, but effectively.
    And your testimony today and what you will continue to give 
us in terms of counsel and advice and response to questions 
will help us get that done.
    So, again, thank you all very much. You will probably get 
some more questions from Senators, and to those Senators who 
wish to submit questions for the record, those are due to the 
Committee by Thursday, October 31st. And then we ask that you 
respond to them as quickly as you can when you do receive them.
    With that, again, thank you very much for being here, and 
this Committee is adjourned.
    Mr. Ritter. Thank you, Mr. Chairman.
    Mr. Marlow. Thank you, Chairman.
    [Whereupon, at 11:29 a.m., the hearing was adjourned.]
    [Prepared statements, responses to written questions, and 
additional material supplied for the record follows:]
               PREPARED STATEMENT OF CHAIRMAN MIKE CRAPO
    We welcome to the Committee four witnesses with extensive 
experience and a range of perspectives on issues related to data 
ownership, valuation and privacy, including: Mr. Jeffrey Ritter, 
Founding Chair of the American Bar Association Committee on Cyberspace 
Law, and an external lecturer at the University of Oxford; Mr. Chad 
Marlow, Senior Advocacy and Policy Counsel at the American Civil 
Liberties Union; Mr. Will Rinehart, Director of Technology and 
Innovation Policy at the American Action Forum; and Ms. Michelle 
Dennedy, Chief Executive Officer of DrumWave.
    As a result of an increasingly digital economy, more personal 
information is available to companies than ever before.
    Private companies are collecting, processing, analyzing and sharing 
considerable data on individuals for all kinds of purposes.
    There have been many questions about what personal data is being 
collected, how it is being collected, with whom it is being shared and 
how it is being used, including in ways that affect individuals' 
financial lives.
    Given the vast amount of personal information flowing through the 
economy, individuals need real control over their personal data.
    This Committee has held a series of data privacy hearings exploring 
possible frameworks for facilitating privacy rights to consumers.
    Nearly all have included references to data as a new currency or 
commodity.
    The next question, then, is who owns it? There has been much debate 
about the concept of data ownership, the monetary value of personal 
information and its potential role in data privacy.
    Some have argued that privacy and control over information could 
benefit from applying an explicit property right to personal data, 
similar to owning a home or protecting intellectual property.
    Others contend the very nature of data is different from that of 
other tangible assets or goods.
    Still, it is difficult to ignore the concept of data ownership that 
appears in existing data privacy frameworks.
    For example, the European Union's General Data Protection 
Regulation, or GDPR, grants an individual the right to request and 
access personally identifiable information that has been collected 
about them.
    There is an inherent element of ownership in each of these rights, 
and it is necessary to address some of the difficulties of ownership 
when certain rights are exercised, such as whether information could 
pertain to more than one individual, or whether individual ownership 
applies to the concept of derived data.
    Associated with concepts about data ownership or control is the 
value of personal data being used in the marketplace, and the 
opportunities for individuals to benefit from its use.
    Senators Kennedy and Warner have both led on these issues, with 
Senator Kennedy introducing legislation that would grant an explicit 
property right over personal data, and Senator Warner introducing 
legislation that would give consumers more information about the value 
of their personal data and how it is being used in the economy.
    As the Banking Committee continues exploring ways to give 
individuals real control over their data, it is important to learn more 
about what relationship exists
between true data ownership and individuals' degree of control over 
their personal information; how a property right would work for 
different types of personal information; how data ownership interacts 
with existing privacy laws, including the Gramm-Leach-Bliley Act, the 
Fair Credit Reporting Act and GDPR; and different ways that companies 
use personal data, how personal data could be reliably valued and what 
that means for privacy.
    I appreciate today's witnesses for offering their expertise and 
sharing a range of unique perspectives.
                                 ______
                                 
              PREPARED STATEMENT OF SENATOR SHERROD BROWN
    Thank you to the Chairman for calling this hearing.
    This Committee has spent some time over the last several months 
discussing Facebook's poorly thought out plan to create a global 
currency. And the bottom line is that we know that Facebook can't be 
trusted with Americans' personal information and it is terrible at 
protecting its users' privacy. It was pretty clear the last thing we 
should do is trust them with Americans' hard-earned dollars.
    But it isn't just Facebook. Every time corporations in Silicon 
Valley come up with a new business model, the result is the same--they 
get more access to our personal data, spending habits, location, the 
websites we visit, and it means more money in their pockets. And 
everyone else gets hurt.
    So I want to begin this hearing with a simple question--who has the 
right to control your personal, private information: you or Silicon 
Valley CEOs like Mark Zuckerberg?
    I think we all agree that Americans should have more control over 
their private information.
    But should we treat that private information like property? Today's 
witnesses will discuss that idea.
    At first glance this might seem like a simple way to tackle the 
complex problems created by data collection and machine learning.
    The promise is that if we just treat personal data like property, 
markets will do the hard work of protecting our privacy for us.
    But that's not how it will work.
    Instead of making companies responsible for protecting their 
customers' privacy, this idea puts the burden on all of us.
    Now imagine if every time you wanted to use Facebook, or pay 
for something with an app, or log in to a Wi-Fi network, you had to read 
even more legal fine print, and check a box saying, ``OK, I waive my 
personal right to my data to use this service.'' Or you had to join 
some kind of so-called ``data collective'' to sell your data.
    Working people in this country have enough to worry about--they're 
trying to get the kids out the door and get to work on time; to make 
rent and save for college and pay the bills.
    The idea that people should also have to manage their data like a 
landlord manages its tenants is ludicrous.
    This should be pretty simple--corporations should not be allowed to 
invade our privacy.
    We know that today, they are.
    Just think about all the personal data that's already floating 
around out there. Equifax exposed the personal information of more than 
150 million Americans--Social Security numbers, birthdays, addresses. 
Capital One exposed the personal information of more than 100 million 
Americans.
    How can you own your data, when it's already littered all over the 
internet?
    Big tech companies don't want to protect your personal 
information--they want to profit off it. Protecting your privacy 
doesn't make them any money--it costs them money--so they aren't going 
to do it.
    They want your data, and they want to get it for free, or pay as 
little as possible for it.
    So it should be no surprise that I am skeptical when I hear of 
plans for Americans' data to be treated like property.
    If Americans want more control over their private information, we 
have to find a way to prevent corporations from mining our data and 
selling it to each other. Creating a supermarket for selling away our 
privacy does the opposite.
    Treating data as something that can be owned, bought, and sold 
doesn't solve any of these problems--especially when undermining our 
privacy is the business model.
    Mark Zuckerberg and his Silicon Valley buddies want us to skip over 
the part where we have control over our privacy, and jump to the part 
where giant tech companies get to use their market power to squeeze our 
privacy out of us--and it would all be legal.
    That's unacceptable.
    I appreciate that Chairman Crapo has been working with me in a 
bipartisan way to create real privacy protections. Privacy isn't 
partisan, it's a basic right.
    I look forward to continuing our work together, and to the 
witnesses' testimony.
    Thank you.
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
                  PREPARED STATEMENT OF CHAD A. MARLOW
                   Senior Advocacy and Policy Counsel
                     American Civil Liberties Union
                            October 24, 2019
    Chairman Crapo, Ranking Member Brown, and Members of the Committee 
on Banking, Housing, and Urban Affairs, on behalf of the American Civil 
Liberties Union (ACLU),\1\ I want to thank you for the privilege of 
testifying before your Committee today.
---------------------------------------------------------------------------
    \1\ For nearly 100 years, the ACLU has been our Nation's guardian 
of liberty, working in courts, legislatures, and communities to defend 
and preserve the individual rights and liberties that the Constitution 
and laws of the United States guarantee everyone in this country. With 
more than eight million members, activists, and supporters, the ACLU is 
a nationwide organization that fights tirelessly in all 50 States, 
Puerto Rico and Washington, DC, to preserve American democracy and an 
open Government.
---------------------------------------------------------------------------
    The ACLU is strongly concerned about the data-as-property model and 
how it is being presented to the American public and its lawmakers. 
While the data-as-property model may have merit as a tool for 
redistributing the money that is currently being made off the sale of 
personal information, any claim that it advances privacy is false. To 
the extent Congress is seeking to provide greater privacy protections 
for Americans' personal information, what we need is an affirmative 
consent-based model that provides all individuals the ability to opt-in 
(or not) to the sharing of their personal data. Whether consenting to 
such use results in monetary gain is a separate matter, and does not in 
and of itself advance privacy. We should not countenance misleading 
assertions that the data-as-property model is itself pro-privacy.\2\
---------------------------------------------------------------------------
    \2\ Chad Marlow, Beware the Tech Industry's Latest Privacy Trojan 
Horse, ACLU (Mar. 18, 2019), https://www.aclu.org/blog/privacy-
technology/medical-and-genetic-privacy/beware-tech-industrys-latest-
privacy-trojan.
---------------------------------------------------------------------------
    A central tenet of the data-as-property model is that the 
Government should establish--through regulating and policing a 
universal marketplace of personal data--that individuals are ``owners'' 
of their personal information and, consequently, have a property-based 
right to sell or refuse the sale of their data to third parties. 
However, if the objective is privacy protection, policymakers have 
identified other approaches that more directly facilitate advancements 
in the cause of personal information privacy and do not carry the 
adverse privacy risks associated with the data-as-property approach. 
For example, two State laws passed last year\3\--the ``California 
Consumer Privacy Act,''\4\ which allows consumers to opt-out of their 
personal information being sold, and Maine's ``Act To Protect the 
Privacy of Online Customer Information,''\5\ which takes the superior 
approach of not allowing a person's information to be sold without 
first securing their ``opt in'' permission--made important advances in 
protecting individual privacy, without treating data as property or 
focusing on its monetary value. Rather, they advanced privacy by 
empowering individuals to exercise control over their personal 
information. Indeed, at a time when our existing laws at the Federal 
level and in most States are wholly insufficient to ensure that 
individuals have control over protecting their personal information, 
the data-as-property model simply distracts us from pursuing meaningful 
privacy legislation.
---------------------------------------------------------------------------
    \3\ Francoise Gilbert, Maine Follows California Lead: Prohibits ISP 
Use, Sale, Disclosure of Online Consumer Information Without Prior 
Affirmative Consent, The National Law Review (June 10, 2019), https://
www.natlawreview.com/article/maine-follows-california-lead-prohibits-
isp-use-sale-disclosure-online-consumer.
    \4\ SB-1121, 2017-2018 Leg., (Cal. 2018) also available at https://
leginfo.legislature.ca.gov/faces/
billTextClient.xhtml?bill_id=201720180SB1121.
    \5\ S.P. 275, 2019 Leg., 129th Sess. (Me. 2019) also available at 
http://www.mainelegislature.org/legis/bills/
getPDF.asp?paper=SP0275&item=1&snum=129.
---------------------------------------------------------------------------
    Four aspects of the data-as-property model--which essentially 
mandates the creation of a Government-regulated and policed marketplace 
for personal information--would be especially harmful to privacy and 
free speech:
Creating Conflict at the Time Individuals Might Otherwise Choose To 
        Protect Their Personal Information
    To understand why the data-as-property model is concerning, one 
should start by looking to how it would be effectuated. Namely, at the 
time a person's information is collected--which is when pro-privacy 
laws typically mandate the disclosure of one's data privacy rights--a 
Government mandate would require the simultaneous advertising of the 
individual's ability to surrender their privacy by selling their 
personal information. To make the decision to sell one's data seamless, 
where this model has been pushed by data sales facilitators on the 
State level, the bills further require data sales authorization forms 
be concurrently provided.
    Imagine, as was the focus of a data-as-property bill in Oregon 
earlier this year, how uncomfortable that exchange might be where, in 
the course of ongoing medical treatment, a doctor requests a patient 
provide consent so they can sell the patient's personal information. 
Now further imagine what pressure might be applied where the doctor has 
been incentivized to secure consent by being offered a cut of the sale 
revenue for the data.
    Instead of giving consumers meaningful control over their personal 
information, many of the private sector entrepreneurs who are 
advocating for the data-as-property model want to use the power of the 
Government to mandate that the marketplace for selling data--one they 
will very profitably help to facilitate--is advertised to all persons 
at the time their information is collected. We have seen this as a 
central feature of the data-as-property bills being introduced in 
States, like the previously referenced bill in Oregon,\6\ where as soon 
as the bill was understood to be a privacy Trojan Horse, it was soundly 
rejected. In fact, no data-as-property bill has been adopted in any of 
the States in which they have been pursued or introduced, which 
includes Oregon, Maryland, Hawaii, California, Washington, Montana, 
Arizona, Georgia, New Jersey, Massachusetts, and Pennsylvania.
---------------------------------------------------------------------------
    \6\ S.B. 703, 2019 Leg., 80th Sess. (Or. 2019) https://
olis.leg.state.or.us/liz/2019R1/Downloads/MeasureDocument/SB703/
Introduced.
---------------------------------------------------------------------------
    If anything, when it comes to privacy, what the data-as-property 
model actually does is create a hedge against the growing likelihood 
that Congress and the States will pass tougher privacy laws. 
Specifically, it would ensure that, should stronger privacy protections 
be implemented, the data sales marketplace--which relies upon 
convincing people to relinquish their privacy--will be advertised right 
alongside any required notifications about individuals' new privacy 
rights. As Congress explores how to better protect Americans' privacy, 
it should strongly resist supporting the data-as-property model, which 
would undermine those efforts to directly protect privacy.
Widening of Digital Divide and Disproportionate Harm to the Most 
        Vulnerable Individuals
    The high value Americans place on their privacy is universal \7\ 
and nonpartisan.\8\ It is wisely enshrined in our Bill of Rights.\9\ As 
a result, adopting a model where persons with less wealth are likely to 
end up with less privacy should give lawmakers pause.
---------------------------------------------------------------------------
    \7\ National Science Board, Americans' Attitudes Toward Information 
Privacy in the World of Big Data at 1, also available at https://
nsf.gov/statistics/2018/nsb20181/assets/404/americans-attitudes-toward-
information-privacy-in-the-world-of-big-data.pdf.
    \8\ Carl M. Cannon, Digital Privacy, a Non-Partisan Issue, Real 
Clear Politics (July 23, 2013) https://www.realclearpolitics.com/
articles/2013/07/23/digital_privacy_a_non-partisan_
issue_119332.html.
    \9\ U.S. Const. amend. IV.
---------------------------------------------------------------------------
    Americans who are economically secure will find it easy to reject 
offers to surrender their private information in order to make a few 
extra dollars. But that might not be the case for an elderly person who 
has a hard time affording their prescriptions and rent. It may be too 
tempting a sales pitch for a family that is struggling to put food on 
their table. For persons who live in rural areas, where the cost of 
online access may already be steep, a chance to offset those costs 
while online may feel impossible to turn down. And so they will agree, 
when pressed, to sell their private information for an unquantified 
amount of money.
    As a consequence, a Government-endorsed data-as-property model 
would only serve to further expand this country's existing digital 
divide,\10\ where persons already enduring socioeconomic or regional 
economic disadvantages--including disproportionately, persons of 
color--frequently have little or no choice but to rely on cheaper, non-
encrypted cell phones, free email, and other more affordable but less 
secure tech products. The digital divide is a privacy divide, and the 
data-as-property model would only serve to worsen it.
---------------------------------------------------------------------------
    \10\ Gry Hasselbach and Pernille Tranberg, Privacy is creating a 
new digital divide between the rich and poor, The Daily Dot (Oct. 23, 
2016), https://www.dailydot.com/layer8/online-privacy-data-ethics/.
---------------------------------------------------------------------------
Requirement of a Universal Unique Tracking Identifier for All Persons
    One of the most pernicious practical requirements of any data-as-
property model would be the need to create some form of universal 
unique tracking identifier for all personal information. To track who 
owns personal data, who has sold it, who must pay, and who gets paid, 
each piece of data must be tagged with some form of a universal 
identifier.
    There likely would be no opt-out from a universal unique tracking 
identifier for anyone, even for those who consistently refuse to sell 
their personal information. Why? Because legal compliance is likely to 
not only require companies to identify what data they are permitted to 
sell and resell, but also to identify unlawfully distributed data as to 
which sales permission has been denied.
    The need for a universal unique tracking identifier becomes 
particularly apparent, as well as particularly difficult to implement, 
as the lines blur on who owns what data. What happens when data is sold 
that has
information about multiple parties, like DNA or a group photo? Does 
everyone have to agree and get paid? What happens when some parties 
whose personal information is contained in the data elect to sell it 
and others refuse? Who prevails?
    In the end, whether people choose to sell their personal 
information or not, the effectuation of the data-as-property model, 
including the universal unique tracking identifier it may require be 
attached to all personal data, raises significant privacy concerns.
Harm to Free Speech on the Internet
    The need to track all communicated personal information, in order 
to effectuate and enforce the data-as-property model, will have an 
adverse impact on free speech. For example, every time a person shares 
content on the internet, sends an email or text message over a public 
network or using a free application, or posts a picture of themselves 
or their family or friends on social media, personal information about 
them will be transmitted, either within the communication itself or in 
its accompanying metadata. As a result, under the data-as-property 
model, it will need to be tracked and associated with the person who 
communicated it using a universal unique tracking identifier. Once the 
public becomes aware of this fact--and if the ACLU doesn't warn them, 
one of dozens of other privacy organizations certainly will--the public 
will know it has lost the ability to communicate anonymously.
    This would have an adverse effect on the free exchange of ideas, 
including on the ability to communicate private thoughts, or messages 
intended for a limited audience, or ideas that are either unpopular or 
represent opinions one is exploring but does not necessarily endorse. 
Privacy and free speech frequently go hand in hand, and that is 
certainly the case with the harms presented to them by the data-as-
property model.
                                 ______
                                 
A Better Way: Adopt Meaningful Privacy Legislation
    If Congress wants to pass a law that creates meaningful privacy 
protections for Americans--if Congress wants to pass a law so that 
every time Americans use the internet, or social media, or complete a 
commercial transaction, they do not have their personal information 
gathered and offered up for sale to third parties--it does not need to 
treat data as property to do so. In fact, passing legislation that 
treats data as property carries specific harms that would undermine 
that goal.
    The Government should not be promoting privacy as a resource to be 
bought and sold. A growing number of State constitutions\11\ now 
recognize that privacy is a fundamental right, including the 
constitutions of the home States of this Committee's Members from 
Arizona, Hawaii, Louisiana, Montana, and South Carolina, along with 
many others.
---------------------------------------------------------------------------
    \11\ Privacy Protections in State Constitutions, National 
Conference of State Legislatures (Nov. 7, 2018) http://www.ncsl.org/
research/telecommunications-and-information-technology/privacy-
protections-in-state-constitutions.aspx.
---------------------------------------------------------------------------
    The proper response to the pervasive loss of individual privacy is 
to pass stronger privacy laws,\12\ not just to throw up our hands and 
conclude the only issue left to tackle is who gets the money when 
people's data is sold. Yes, privacy protections for personal 
information are weak in this country, but Congress and the States have 
the ability to strengthen them. And they should. Limiting data 
collection, retention, and further transfers without a person's clear, 
distinct, and informed permission is a strong place to start.
---------------------------------------------------------------------------
    \12\ Consumer Perspectives: Policy Principles for a Federal Data 
Privacy Framework Before the S. Comm. On Commerce, Science, and 
Transportation, 116th Cong. 3 (2019) (statement of Neema Singh Guliani, 
Senior Legislative Counsel, ACLU) also available at https://
www.commerce.senate.gov/services/files/79ABFD7A-8BEB-45B5-806A-
60A3467255DD.
---------------------------------------------------------------------------
    Additionally, companies should be prohibited from denying a good or 
service to someone who chooses to exercise their privacy rights, and 
consumers should have a private right of action to seek compensation 
when their privacy rights are
violated. Most relevant to today's discussion, we should not be looking 
to a data-as-property model, which monetarily incentivizes people to 
give up their privacy, to enhance privacy protections.
    Again, if those who support the data-as-property model want to talk 
about it as a potential way to create a more robust and equitable 
marketplace for the sale of personal data, by all means they should 
make that argument, but they need to stop advancing the false narrative 
that the data-as-property model is pro-privacy.
    Congress has the ability to adopt laws that truly empower Americans 
to better protect their personal information without undermining 
privacy in the process, and I have confidence that you will.
    Thank you again for the opportunity to testify today. I look 
forward to answering your questions.
                                 ______
                                 
                  PREPARED STATEMENT OF WILL RINEHART
              Director of Technology and Innovation Policy
                        American Action Forum *
---------------------------------------------------------------------------
    * The views expressed here are my own and do not represent the 
position of the American Action Forum.
---------------------------------------------------------------------------
                            October 24, 2019
    Chairman Crapo, Ranking Member Brown, and Members of the Committee, 
thank you for the opportunity to testify today regarding data property 
rights. Like many privacy experts, I'm skeptical that data property 
rights are the best policy mechanism for ensuring privacy is secured in 
the digital age. I hope to make three main points today:

    A property right to personal data isn't needed to establish 
        consumer privacy rights, nor would it be economically efficient 
        to establish this kind of property right;

    Valuing personal data is difficult because raw or personal 
        data per se is not what is in demand, but rather the insights 
        that can be gleaned from that data--insights that often depend 
        on the data's environment; and

    Regardless of the particular policy mechanism, privacy laws 
        will create unavoidable costs from compliance, which will 
        impact investment opportunities in countless industries.
The Purposes and Limitations of Propertization
    With Congress again considering Federal privacy legislation, the 
idea of personal data property rights is being explored as one policy 
mechanism for securing privacy.\1\ The very phrase ``personal data'' 
conjures up the notion that individuals own that data and firms are 
merely taking it. Data propertization, which is the creation of 
property rights in law, has been seen as an attractive alternative 
since the 1970s, for two reasons.\2\ First, it would grant individuals 
the ability to sell their personal data, thus allowing them to 
recapture some of its value. Second, propertization would force 
companies to internalize the costs of disclosure, thereby aligning firm 
and user expectations about data collection and use since users would 
be able to bargain over the terms of the deal.\3\
---------------------------------------------------------------------------
    \1\ Michael Gorthaus, ``Andrew Yang proposes that your digital data 
be considered personal property,'' available at: https://
www.fastcompany.com/90411540/andrew-yang-proposes-that-your-digital-
data-be-considered-personal-property.
    \2\ Alan Westin, Information Technology in a Democracy.
    \3\ Pamela Samuelson, ``Privacy As Intellectual Property,'' 
available at: https://people.ischool.berkeley.edu/pam/papers/
privasip_draft.pdf.
---------------------------------------------------------------------------
    There are reasons to be skeptical that assigning property rights in 
data will have unalloyed benefits. For one, assigning property rights 
to data is a contortion of the normal reasoning that underpins 
intellectual property (IP) rights such as copyright and patents. 
Information, which is embodied in copyrights and patents as well as 
user data, can be easily reproduced (i.e., is nonrivalrous), and it is 
difficult to prevent nonpaying consumers from accessing it (non-
excludable). Property rights incentivize information creation, since 
those rights give the holder the ability legally to exclude others, 
thus making the information rivalrous. Yet, the problem faced in 
privacy is of the opposite kind--the purpose is to limit information 
disclosure.
    Second, it is unclear if the assignment of data property rights 
will align incentives between users and firms. While it is the case 
that information disclosure can either be beneficial or detrimental, 
users cannot know beforehand if the use of their data will necessarily 
lead to better products.\4\ Data propertization would only exacerbate 
this problem, forcing users to search for the best value for their 
data. In other words, data property rights would make users data 
entrepreneurs. Searching for innovative opportunities is costly, and 
thus one could imagine that users will likely hire an intermediary to 
do this task--which is the job of platforms and other data providers 
presently.
---------------------------------------------------------------------------
    \4\ Alessandro Acquisti, Curtis R. Taylor & Liad Wagman, ``The 
Economics of Privacy,'' available at: https://papers.ssrn.com/sol3/
papers.cfm?abstract_id=2580411.
---------------------------------------------------------------------------
    As is detailed in an appendix to this paper, the Hart-Grossman-
Moore model of property helps to flesh out the idea. This model can 
help to determine where it is most efficient to allocate property 
rights. When one party's investment in the data does not boost the 
total value that much, then it is better for the other party to have 
control of the assets. In the parlance of economics, the party with 
higher marginal returns from investment should be given the rights of 
control, which is why platforms, and not users, spend so much time and 
effort to understand what is happening on the platform. Assigning data 
property rights to users will likely be inefficient because it will 
change the investment decision veto point.
    Third, and most important for this Committee, real world 
implementation will prove tricky because of the interconnected nature 
of information. The vast majority of data generated in the last decade 
comes from user interactions with online platforms. If Google didn't 
exist, there would be no search data. If Facebook didn't exist, there 
wouldn't be social graph data. To understand the challenge of 
implementing data property rights, it is helpful to recognize how three 
classes of data interact in online platforms.\5\ Volunteered data is 
data that is both innate to an individual's profile, such as age and 
gender, and information they share, such as pictures, videos, news 
articles, and commentary. Observed data comes as a result of user 
interactions with the volunteered data; it is this class of data that 
platforms tend to collect in data centers. Last, inferred data is the 
information that comes from analysis of the first two classes, which 
explains how groups of individuals are interacting with different sets 
of digital objects. At the very least, then, data is a co-created 
asset, with the users providing volunteered data and the platform 
assembling observed data to create inferred data. Creating data 
property rights will likely necessitate that only one party has rights, 
which has been a sticking point for previous efforts.
---------------------------------------------------------------------------
    \5\ World Economic Forum, ``Personal Data: The Emergence of a New 
Asset Class,'' available at: http://www3.weforum.org/docs/
WEF_ITTC_PersonalDataNewAsset_Report_2011.pdf.
---------------------------------------------------------------------------
    As the German government discovered when trying to implement data 
property for connected cars, determining the owner isn't simple. Does 
the car company own the property right, or might it be the driver, or 
even the rider?\6\ Privacy scholar Robert Gellman demonstrated this 
problem would also beleaguer health care. For example, information 
about a child's health could simultaneously belong to the patient, the 
patient's family, the school, the pharmacy, the supermarket, the 
pediatrician, the drug manufacturer, social media platforms, 
advertising companies, or internet service providers.\7\ Questions of 
fuzzy ownership continually plague IP and would similarly afflict data 
property.
---------------------------------------------------------------------------
    \6\ Lothar Determann, ``No One Owns Data,'' available at: https://
papers.ssrn.com/sol3/papers.cfm?abstract_id=3123957.
    \7\ Robert Gellman, ``Health Information Privacy Beyond HIPAA: A 
2018 Environmental Scan of Major Trends and Challenges,'' available at: 
https://ncvhs.hhs.gov/wp-content/uploads/2018/05/NCVHS-Beyond-
HIPAA_Report-Final-02-08-18.pdf.
---------------------------------------------------------------------------
    Further, and even more practically, a property right in data isn't 
needed to establish consumer privacy rights. For evidence of this fact, 
one need only consult the current laws in the United States. 
The Gramm-Leach-Bliley Act (GLBA), the Fair Credit Reporting Act 
(FCRA), the Children's Online Privacy Protection Act (COPPA), and the 
California Consumer Privacy Act (CCPA), just to name a few, all protect 
privacy without creating property rights. As Stanford Law Professor 
Lothar Determann has said quite bluntly, ``no one owns data'' because 
data are already ``subject to a complex landscape of access rights and 
restrictions.''\8\ Privacy regulation already defines certain kinds of 
entitlements to control and contract upon data. Adding a superordinate 
property right on top of these existing restrictions would make the 
entire enterprise all the more complicated and undermine current 
efforts to grant consumers control. If data property rights were 
implemented, for example, would an individual be able to limit critical 
information from being shared with credit rating agencies?
---------------------------------------------------------------------------
    \8\ See footnote 6.
---------------------------------------------------------------------------
    Determann isn't the only scholar of privacy who opposes 
propertization efforts. Technologist Larry Downes has been critical of 
the idea and instead prefers the current licensing model since it 
``recognizes that most information with economic value is the 
collaborative creation of multiple sources, including individuals and 
service providers.''\9\ Law Professor Julie Cohen has argued against 
privacy as property as well since it doesn't uphold the values of 
autonomy and participation that are so central to privacy.\10\ European 
law professor Bart Schermer agreed with Cohen when the issue was raised 
in 2015 as an alternative to the European Union's (EU) General Data 
Protection Regulation (GDPR), saying that ``Reducing the discussion 
about privacy and personal data to a discussion about ownership 
oversimplifies the discussion about privacy in the information society 
and may lead to sub-optimal results when it comes to regulating the use 
of personal data.''\11\ But Dr. Mark MacCarthy of Georgetown University 
said it best, laying out the world of data property rights as ``a 
privacy nightmare rather than a privacy paradise.''\12\ As much as 
there is disagreement in privacy advocacy and scholarship, there is 
consistent agreement that propertizing data has serious limits.
---------------------------------------------------------------------------
    \9\ Larry Downes, ``A Rational Response to the Privacy `Crisis,' '' 
available at: https://papers.ssrn.com/sol3/
papers.cfm?abstract_id=2200208.
    \10\ Julie E. Cohen, ``Examined Lives: Informational Privacy and 
the Subject as Object,'' available at: https://
scholarship.law.georgetown.edu/cgi/
viewcontent.cgi?article=1819&context=fac
pub.
    \11\ Bart Schermer, ``Privacy and property: do you really own your 
personal data?'' available at: https://leidenlawblog.nl/articles/
privacy-and-property-do-you-really-own-your-personal-data.
    \12\ Mark MacCarthy, ``Privacy Is Not A Property Right In Personal 
Information,'' available at: https://www.forbes.com/sites/
washingtonbytes/2018/11/02/privacy-is-not-a-property-right-in-personal-
information/.
---------------------------------------------------------------------------
    Finally, pricing data, which is one stated goal of data property 
rights, will have deleterious effects on privacy expectations. As Jason 
Aaron Gabisch and George R. Milne reported in the Journal of Consumer 
Marketing, ``The findings show that receiving compensation, especially 
when it is a monetary reward, reduces consumer expectations for privacy 
protection.''\13\
---------------------------------------------------------------------------
    \13\ Jason Aaron Gabisch & George R. Milne, ``The impact of 
compensation on information ownership and privacy control,'' available 
at: https://www.emerald.com/insight/content/doi/10.1108/JCM-10-2013-
0737/full/html.
---------------------------------------------------------------------------
Valuing Data
    Four methods can be employed to value intangibles such as data: 
income-based methods, market prices, cost-based methods, and shadow 
prices.
    Most popular data valuations are accomplished through income 
derivations, often by simply dividing the total market capitalization 
or revenue of a firm by the total number of users. For those in 
finance, this method seems most logical since it is akin to an estimate 
of future cash-flows. In a Wired article, for example, Antonio Garcia 
Martinez placed an upper bound of $112 on the value of data for users 
in the United States, citing Facebook's 2018 annual report.\14\ 
Similarly, when Microsoft bought LinkedIn, reports suggested that it 
was buying monthly active users at a rate of $260 per user.\15\ 
Stanford Law Professor A. Douglas Melamed argued before the Senate 
Judiciary Committee that the upper-bound value on data should at least 
be cognizant of the acquisition cost for advertisements--putting the 
total value per user at around $16.\16\
cognizant of the acquisition cost for advertisements--putting the total 
value per user at around $16.\16\
---------------------------------------------------------------------------
    \14\ Antonio Garcia-Martinez, ``No, Data Is Not the New Oil,'' 
available at: https://www.
wired.com/story/no-data-is-not-the-new-oil/.
    \15\ James E. Short & Steve Todd, ``What's Your Data Worth?'' 
available at: https://sloan
review.mit.edu/article/whats-your-data-worth/.
    \16\ A. Douglas Melamed, ``Prepared Statement,'' available at: 
https://www.judiciary.
senate.gov/download/melamed-testimony.
---------------------------------------------------------------------------
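The naive income-based calculation described above can be sketched in a 
few lines. The revenue and user figures below are illustrative round 
numbers, not figures from the cited filings:

```python
# Naive income-based valuation: divide a platform's total revenue (or
# market capitalization) by its user count. The inputs here are
# hypothetical, chosen only to illustrate the arithmetic.
annual_revenue = 55_000_000_000        # illustrative annual revenue, USD
monthly_active_users = 2_300_000_000   # illustrative MAU count

value_per_user = annual_revenue / monthly_active_users
print(round(value_per_user, 2))        # prints 23.91
```

As the text notes, this is at best an average, not a price: it says 
nothing about what any one user's data is worth at the margin.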
    Still, these income-based valuations aren't exact estimates because 
they do not capture the marginal revenue a user's data generates, 
which is where the price would be set. As noted before, inferred 
data is the key for platform operators, as it drives advertising 
decisions and helps determine what content is presented to users. Thus, 
the ultimate value of a user's data would combine the value of that 
user's data to increase all their friend's demand for content and the 
value of that user's data to contribute to increases in advertising 
demand. Calculating marginal income valuations in this manner is 
difficult, but Shapley values have been shown as a viable method 
theoretically.\17\, \18\ Still, it remains unclear if firms would be 
able to implement this method on their platform.\19\ Needless to say, 
income-based valuations are difficult.
---------------------------------------------------------------------------
    \17\ Amirata Ghorbani & James Zou, ``Data Shapley: Equitable 
Valuation of Data for Machine Learning,'' available at: https://
arxiv.org/abs/1904.02868.
    \18\ Eric Bax, ``Computing a Data Dividend,'' available at: 
https://arxiv.org/pdf/1905.01805.pdf.
    \19\ While Bax has shown that Shapley values can be implemented in 
polynomial time, it is unclear if Shapley values that exhibit demand 
interdependencies could be implemented in polynomial time as well.
---------------------------------------------------------------------------
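As a rough illustration of the Shapley approach cited above, the sketch 
below computes exact Shapley values for a toy three-user ``platform'' 
whose value function includes interdependencies between friends. The 
revenue function and dollar figures are invented for the example; real 
platforms would need the approximation methods discussed in the papers:

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over every ordering of the players."""
    totals = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = set()
        base = value(coalition)
        for p in order:
            coalition.add(p)
            current = value(coalition)
            totals[p] += current - base
            base = current
    return {p: totals[p] / len(orderings) for p in players}

# Toy value function: each user's data is worth $1.00 of ad revenue
# on its own, and each pair of connected friends adds $0.50 of
# inferred-data value (the interdependency discussed in the text).
friendships = [{"a", "b"}, {"b", "c"}]

def revenue(coalition):
    pairs = sum(1 for f in friendships if f <= coalition)
    return len(coalition) * 1.00 + 0.50 * pairs

values = shapley_values(["a", "b", "c"], revenue)
print(values)  # user "b", with two friendships, earns the largest share
```

The exact computation enumerates every ordering of users, which is why 
polynomial-time approximations matter for platforms with millions of 
users.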
    Second, market prices are another method of valuing data, and they 
tend to place the lowest premium on data. For example:

    Vice recently reported that Departments of Motor Vehicles 
        across the United States have been selling individual records 
        for as little as one cent each;\20\
---------------------------------------------------------------------------
    \20\ Joseph Cox, ``DMVs Are Selling Your Data to Private 
Investigators,'' available at: https://www.vice.com/en_us/article/
43kxzq/dmvs-selling-data-private-investigators-making-millions-of-
dollars?utm_campaign=sharebutton.
---------------------------------------------------------------------------
    Wired editor Gregory Barber sold his location data, Apple 
        Health data, and Facebook data, and all he got was a paltry 
        $0.003 for everything together;\21\
---------------------------------------------------------------------------
    \21\ Gregory Barber, ``I Sold My Data For Crypto, Here's How Much I 
Made,'' available at: https://www.wired.com/story/i-sold-my-data-for-
crypto/.
---------------------------------------------------------------------------
    After a breach at Facebook, Facebook logins were selling on 
        the dark web for $2.60 per user;\22\
---------------------------------------------------------------------------
    \22\ Dan Hall, ``Hackers selling Facebook logins on the dark web 
for $2,'' available at: https://nypost.com/2018/10/01/hackers-are-
selling-facebook-logins-on-the-dark-web-for-2/.
---------------------------------------------------------------------------
    Advertisers typically pay $0.005 for a complete profile of 
        an individual;\23\
---------------------------------------------------------------------------
    \23\ Frank Pasquale, ``The Dark Market for Personal Data,'' 
available at: https://www.
nytimes.com/2014/10/17/opinion/the-dark-market-for-personal-data.html.
---------------------------------------------------------------------------
    General information about a person, such as their age, 
        gender, and location is worth a mere $0.0005 per person, or 
        $0.50 per 1,000 people;\24\
---------------------------------------------------------------------------
    \24\ Financial Times, ``Financial worth of data comes in at under a 
penny a piece,'' available at: https://www.ft.com/content/3cb056c6-
d343-11e2-b3ff-00144feab7de.
---------------------------------------------------------------------------
    Auto buyers are worth about $0.0021 per person, or $2.11 
        for every 1,000 people;\25\ and
---------------------------------------------------------------------------
    \25\ Ibid.
---------------------------------------------------------------------------
    For $0.26 per person, buyers can access lists of people 
        with specific health conditions or taking certain 
        prescriptions.\26\
---------------------------------------------------------------------------
    \26\ Ibid.

    In reviewing these estimates, The Financial Times noted that ``the 
sum total for most individuals often is less than a dollar.'' It is 
worth noting that sub-$1 payments have been unprofitable for firms to 
process due to the fixed technical costs for developing the backend 
architecture and hardware, storage costs for transaction integrity and 
legal purposes, computational costs for processing payments, 
communication costs for information transfer, and administrative 
costs.\27\
---------------------------------------------------------------------------
    \27\ Ioannis Papaefstathiou, ``Evaluation of Micropayment 
Transaction Costs,'' available at: http://web.csulb.edu/journals/jecr/
issues/20042/Paper3.pdf.
---------------------------------------------------------------------------
    As with any market, it is important to pay attention to the 
difference between the clearing price and the asking price. The 
bankruptcy proceedings of Caesars Entertainment's operating 
subsidiary offer a telling example of this problem. As the 
assets were being priced in the selloff, the Total Rewards customer 
loyalty program got valued at nearly $1 billion, making it ``the most 
valuable asset in the bitter bankruptcy feud at Caesars Entertainment 
Corp.''\28\ But the ombudsman's report understood that it would be a 
tough sell because of the difficulties in incorporating it into another 
company's loyalty program. Although it was Caesars' asset with the 
highest valuation, its real value to an outside party was an open 
question.
---------------------------------------------------------------------------
    \28\ Kate O'Keeffe, ``Real Prize in Caesars Fight: Data on 
Players,'' available at: https://www.wsj.com/articles/in-caesars-fight-
data-on-players-is-real-prize-1426800166.
---------------------------------------------------------------------------
    The Total Rewards example underscores an important characteristic 
of data: It is often valued within a relationship but is difficult to 
value outside of it. Within economics, there is a term for this 
phenomenon, as economist Benjamin Klein explained: ``Specific assets 
are assets that have a significantly higher value within a particular 
transacting relationship than outside the relationship.''\29\ Asset 
specificity helps to explain why there isn't an auction market for 
personal data. It isn't the raw data that is in demand, but the 
insights that can be gleaned from that data.
---------------------------------------------------------------------------
    \29\ Benjamin Klein, ``Asset specificity and holdups,'' available 
at: http://masonlec.org/site/files/2012/05/WrightBaye_klein-b-asset-
specificity-and-holdups.pdf.
---------------------------------------------------------------------------
    Third, data might be valued using cost-based methods, but this 
method also has shortcomings. Proxying the value of data by summing the 
salaries of data analysts and the costs of data centers will likely 
underestimate the value of data. Data is an intermediate product for 
other business processes. In practice, cost-based methods would 
probably look like Shapley values anyway.
    Last, data can be valued through shadow prices.\30\ For those items 
that are rarely exchanged in a market, prices are often difficult to 
calculate, so other methods are used to appraise what is known as the 
shadow price. For example, a lake's value might be determined by the 
total amount of time in lost wages and money spent by recreational 
users to get there. Similarly, the value of social media data might be 
calculated by tallying the wages forgone in using the site. A 
conservative estimate from 2016 suggests that users spend about fifty 
minutes a day on Facebook properties.\31\ Since the average hourly 
wage is about $28, this calculation implies that people value the 
site at roughly $8,516 per year.\32\ A 
study of 2016 data using similar methods found that American 
adults consumed 437 billion hours of content on ad-supported media, 
worth at least $7.1 trillion in terms of foregone wages.\33\
---------------------------------------------------------------------------
    \30\ Anthony E. Boardman, David H. Greenberg, Aidan R. Vining, & 
David L. Weimer, Cost Benefits Analysis Concepts and Practice.
    \31\ James B. Stewart, ``Facebook Has 50 Minutes of Your Time Each 
Day. It Wants More.'' available at: https://www.nytimes.com/2016/05/06/
business/facebook-bends-the-rules-of-audience-engagement-to-its-
advantage.html.
    \32\ Bureau of Labor Statistics, ``Average hourly and weekly 
earnings of all employees on private nonfarm payrolls by industry 
sector, seasonally adjusted,'' available at: https://www.bls.gov/
news.release/empsit.t19.htm.
    \33\ David S. Evans, ``The Economics of Attention Markets,'' 
available at: https://www.competitionpolicyinternational.com/the-
economics-of-attention-markets/.
---------------------------------------------------------------------------
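The forgone-wage arithmetic above is simple to reproduce; the inputs 
are the figures cited in the text (fifty minutes per day, roughly $28 
per hour):

```python
# Shadow price of social media time, valued at the average hourly wage.
minutes_per_day = 50        # reported daily time on Facebook properties
hourly_wage = 28.0          # approximate average hourly earnings, USD

hours_per_year = minutes_per_day * 365 / 60
annual_shadow_price = hours_per_year * hourly_wage
print(int(annual_shadow_price))  # prints 8516
```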
    Shadow prices can also be calculated through surveys, which is 
where this method gets particularly controversial. Depending on how the 
question is worded, users' willingness to pay for privacy can be wildly 
variable. Trade association NetChoice worked with Zogby Analytics to 
find that only 16 percent of people are willing to pay any price for 
online platform service.\34\ Strahilevitz and Kugler found that 65 
percent of email users, even though they knew their email service scans 
emails to serve ads, wouldn't pay for an alternative.\35\ As one 
seminal study noted, ``most subjects happily accepted to sell their 
personal information even for just 25 cents.''\36\ Using differentiated 
smartphone apps, economists were able to estimate that consumers were 
willing to pay a one-time fee of $2.28 to conceal their browser 
history, $4.05 to conceal their list of contacts, $1.19 to conceal 
their location, $1.75 to conceal their phone's identification number, 
and $3.58 to conceal the contents of their text messages. The average 
consumer was also willing to pay $2.12 to eliminate advertising.\37\
---------------------------------------------------------------------------
    \34\ NetChoice, ``American Consumers Reject Backlash Against 
Tech,'' available at: https://netchoice.org/american-consumers-reject-
backlash-against-tech/.
    \35\ Lior Strahilevitz & Matthew B. Kugler, ``Is Privacy Policy 
Language Irrelevant to Consumers?'' available at: https://
papers.ssrn.com/sol3/papers.cfm?abstract_id=2838449.
    \36\ Jens Grossklags & Alessandro Acquisti, ``When 25 Cents is too 
much: An Experiment on Willingness-To-Sell and Willingness-To-Protect 
Personal Information,'' available at: https://www.econinfosec.org/
archive/weis2007/papers/66.pdf.
    \37\ Scott J. Savage & Donald M. Waldman, ``The Value of Online 
Privacy,'' available at: https://static1.squarespace.com/static/
571681753c44d835a440c8b5/t/5735f456b654f9749a4af
d62/1463153751356/The_value_of_online_privacy.pdf.
---------------------------------------------------------------------------
    In all, there is no single way to estimate the value of data, and 
none of the available methods is particularly easy to implement.
The Impact of New Privacy Laws
    Regardless of the path that is taken, new privacy laws will have 
both direct and indirect impacts on the economy, best seen in the wake 
of the GDPR and estimates from the CCPA. First, privacy regulation will 
force firms to retool data processes, known as refactoring, to comply 
with new demands. This refactoring is generally a one-time fixed cost 
that raises costs for all information-using entities. Second, the 
regime will add risk compliance costs, causing companies to staff up to 
ensure compliance. Finally, privacy laws change the investment dynamics 
of the affected industries, as the market shifts to account for the 
newly expected returns.
    Currently, the retooling costs and risk compliance costs are going 
hand in hand, so it is difficult to determine the costs of each. Still, 
they are substantial. A McDermott-Ponemon survey on GDPR preparedness 
found that almost two-thirds of all companies say the regulation will 
``significantly change'' their informational workflows. According to 
this survey, the average budget for getting to compliance tops $13 
million. The International Association of Privacy Professionals 
estimated that GDPR will cost Fortune 500 companies around $7.8 
billion, and these won't be one-time costs since ``Global 500 companies 
will be hiring on average five full-time privacy employees and filling 
five other roles with staff members handling compliance rules.'' A PwC 
survey on the rule change in Europe found that 88 percent of companies 
surveyed spent more than $1 million on GDPR preparations, and 40 
percent more than $10 million.
    Refactoring and compliance costs are adding up for CCPA as well. 
California's standardized regulatory impact assessment (SRIA) for CCPA 
calculated the total costs at $55 billion, which is nearly 1.8 percent 
of the total gross State product.\38\ The range of affected firms is 
massive. On the bottom end of the estimate, 15,643 businesses could 
feel an impact. On the top end, 570,066 companies will have to come 
into compliance with the law. Most alarming, the authors conclude that 
``economic impact of the regulations on these businesses located 
outside of California [that serve California consumers] is beyond the 
scope of the SRIA and therefore not estimated.'' If something akin to 
the California law were applied to the United States, the Information 
Technology and Innovation Foundation estimated the cost at $122 billion 
per year.\39\
---------------------------------------------------------------------------
    \38\ Berkeley Economic Advising and Research, ``Standardized 
Regulatory Impact Assessment: California Consumer Privacy Act of 2018 
Regulations,'' available at: http://www.dof.ca.gov/Forecasting/
Economics/Major_Regulations/Major_Regulations_Table/documents/CCPA_
Regulations-SRIA-DOF.pdf.
    \39\ Alan McQuinn & Daniel Castro, ``The Costs of an Unnecessarily 
Stringent Federal Data Privacy Law,'' available at: https://itif.org/
sites/default/files/2019-cost-data-privacy-law.pdf.
---------------------------------------------------------------------------
    Finally, privacy laws will surely change the investment and market 
dynamics in countless industries. When the EU adopted the e-Privacy 
Directive in 2002, Goldfarb and Tucker found that advertising became 
far less effective, which reverberated throughout the ecosystem as 
venture capital investment in online news, online advertising, and 
cloud computing dropped by between 58 and 75 percent. In Chile, for 
example, credit bureaus were forced to stop reporting defaults in 2012, 
which was found to reduce borrowing costs for most of the poorer 
defaulters but raise them for nondefaulters. Overall, the law led to a 
3.5 percent decrease in lending and reduced aggregate welfare. Early 
percent decrease in lending and reduced aggregate welfare. Early 
research on the GDPR has also found drops in investment. Though the EU 
venture market is much smaller than that of the United States, EU 
venture funding decreased by 39 percent, while the total number of 
deals dropped by 17 percent.\40\
---------------------------------------------------------------------------
    \40\ Jian Jia, Ginger Jin & Liad Wagman, ``The short-run effects of 
GDPR on technology venture investment,'' available at: https://
voxeu.org/article/short-run-effects-gdpr-technology-venture-investment.
---------------------------------------------------------------------------
Conclusion
    The dilemma for this Committee and others within Congress is hardly 
enviable. America's privacy pandect is complex, complicating the 
task of creating new laws to enhance consumer privacy. While there is 
much disagreement in the privacy community, there is widespread 
agreement that data property rights are an unwieldy way of doing 
things. There should be no delusions, however, about the impacts. There 
will be serious costs involved with any new law. As Seth Godin once 
remarked, ``The art of good decisionmaking is looking forward to and 
celebrating the tradeoffs, not pretending they don't exist.'' That is 
sage advice for any privacy legislation.
Technical Appendix
    One way to understand this bargain is through the Grossman-Hart-
Moore model, which considers a relationship between two risk-neutral 
parties, a buyer and a seller, or B and S. For this exercise, let's 
assume that the buyer of the data, B, is the platform, and the seller 
of the data, S, is the user, and again let's just work with the 
singular transaction. As such, the platform buys data, which is an 
intermediate good, from the users to create a final output. The value 
of the final good is V(e), which is contingent on e, a variable for the 
investment into the process by the platform. Similarly, the cost of the 
intermediate good is C(i), which is contingent on the investment, i, in 
the process conducted by the user.
    There are two periods. In the first period, each party undertakes 
some kind of investment and in the second period, they decide to trade 
at a specific price, p. If they don't end up trading, they can turn to 
others and do so. A key assumption of this model is that the 
investments in the first time period are not contractible.
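The equations for this setup do not reproduce in this format, so the 
following is a reconstruction of the standard presentation under the 
usual simplifying assumptions (50/50 Nash bargaining, zero outside 
options); it is a sketch, not the witness's exact notation:

```latex
% First best: investments chosen to maximize total surplus.
\max_{e,\,i}\; V(e) - C(i) - e - i
\;\Longrightarrow\; V'(e^{*}) = 1, \qquad -C'(i^{*}) = 1.

% Under 50/50 Nash bargaining, each party keeps only half the
% return on its own investment:
\text{Platform: } \max_{e}\; \tfrac{1}{2}\,[\,V(e) - C(i)\,] - e
\;\Longrightarrow\; V'(\hat{e}) = 2,
\qquad
\text{User: } \max_{i}\; \tfrac{1}{2}\,[\,V(e) - C(i)\,] - i
\;\Longrightarrow\; -C'(\hat{i}) = 2.

% With diminishing returns (V concave, -C' decreasing), this yields
% \hat{e} < e^{*} and \hat{i} < i^{*}: both parties underinvest.
```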


    Because the parties will have to bargain over how to split the 
total surplus, each will get half of the benefits from their 
investment. See Aghion and Holden (2011) for further details on the 
Nash bargaining.\41\ Thus, each party will underinvest relative to the 
first best.
---------------------------------------------------------------------------
    \41\ Philippe Aghion & Richard Holden, ``Incomplete Contracts and 
the Theory of the Firm: What Have We Learned over the Past 25 Years?'' 
available at: https://www.aeaweb.org/articles?id=10.1257/jep.25.2.181.
---------------------------------------------------------------------------
    If the parties instead have vertically integrated, the result is 
slightly different. If, say, B controls the total gains from the 
production processes, then B will invest at their first best level 
while S will underinvest. Similarly, if S owns the total gains, then 
S will invest at their first best, while B will underinvest.
    This model yields some interesting insights. It is important to 
note that, like the rest of the literature in this space, the 
investment elasticities are key. Since S, the users, have extremely 
inelastic investment decisions (that is, their investments do not 
change much with the possibility of B appropriating the gains), B 
should own the total gains.
    This makes sense in the case of platforms. The investment that 
matters the most lies in the inference data of the platform. Users have 
indeed tried to sell their own ``investment,'' but these transactions 
don't yield much. Moreover, the relative investments speak to why data 
ownership efforts are likely to fail. Since the marginal returns for 
any user S are much higher when a platform B controls the asset, as 
compared to when users simply ``own their data,'' independent 
ownership is likely to lead to inefficient outcomes for all sides.
                PREPARED STATEMENT OF MICHELLE DENNEDY *
---------------------------------------------------------------------------
    * Statement is an excerpt from The Privacy Engineer's Manifesto. 
Michelle Dennedy, Jonathan Fox, & Thomas R. Finneran (2014). Apress.
---------------------------------------------------------------------------
                Chief Executive Officer, Drumwave, Inc.
                            October 24, 2019
[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]


 RESPONSES TO WRITTEN QUESTIONS OF SENATOR JONES FROM JEFFREY 
                             RITTER

Discrimination
Q.1. In the Banking Committee, we often discuss discrimination 
involving loans and housing. As technology helps companies 
become more sophisticated, it is easier to put in place 
discriminatory policies. Are there protections currently in 
place that prevent data discrimination by race, gender, or 
religion? If not, what methods should Congress consider to 
decrease discrimination within data?

A.1. Discrimination of any nature, whether by machine or by 
human conduct, is the result of two actions. First, an 
individual or class is assigned a classification. It does not 
matter if that assignment is accurate; what matters is that the 
classification is paired or linked to the individual or class. 
Second, rules are constructed, and applied, which differentiate 
between classes in the allocation or availability of benefits 
or the imposition of sanctions.
    When business is conducted through machines, both of these 
actions require specific inputs. A classification scheme must 
be composed, and the rules must be authored to expressly rely 
upon the classification scheme. Both the scheme and the rules 
must be inputted into the machine in order for any application 
or process to execute consistently with the scheme and the 
rules.
    To prevent discrimination, I suggest the key is to prohibit 
the use of classification schemes and, in turn, prevent those 
schemes from being used to associate a classification with an 
individual or class. It is not sufficient to prohibit 
discrimination; Congress must enable regulators to 
inspect the operating systems of those companies and financial 
institutions within their purview and affirmatively confirm the 
absence of the classification schemes or their connections to 
individuals or classes.
    Of course, if any institution wishes to establish and 
administer discriminatory policies, and not be caught doing so, 
then either the scheme or the rules can be cleverly designed. 
As just one example, rather than discriminate explicitly based 
on race, ZIP Codes or housing locations were historically 
introduced as a classification scheme that was not expressly 
racial, but still advanced the intended policies of those 
seeking to discriminate.
    Therefore, diligence will be required from the regulators 
to also evaluate the relevant rules. Unfortunately, 
discriminatory rules will often be embedded into decision 
algorithms that require competent analysis to both recognize 
and sanction inappropriate rules. Therefore, Congress must 
authorize suitable funding to recruit, train, and support 
competent professionals capable of conducting the required 
analysis.
    Whether we like it or not, effective nondiscrimination 
regulatory frameworks in the 21st century will require 
increased transparency and real-time availability of the 
operating data of regulated entities to regulators. While much 
progress has been made, particularly in SEC-regulated areas, 
toward those outcomes, Congress must recognize that 
nondiscrimination regulatory frameworks will not be effective 
without increased transparency and real-time data availability.
    Given the global nature of competition, the United States 
must recognize that limiting governmental oversight of 
financial institutions will impair their trustworthiness in 
larger markets, handicapping both their strength in entering 
new non-U.S. markets and their ability to attract and retain 
customers within the United States, who increasingly find the 
stronger privacy-based oversights under which foreign 
institutions operate to be more appealing.
Data Privacy
Q.2. So much of data privacy is having the choice to share 
information. For example, many people find targeted ads to be 
disturbing, while others find they serve as a helpful reminder. 
How should policymakers consider different preferences as they 
write legislation on personal data privacy and how it is used?
    Many services require information from one site to be 
shared with another in order for the consumer to have access to 
the website's services. Sometimes this information sharing is 
helpful and makes the website more user friendly, but sometimes 
the data shared does not have any obvious benefits to the 
consumer.

A.2. In my oral statement and written testimony, I advocated 
for the principle that we must answer the question: who owns 
personal data? During the hearing, we did not address the 
many technology innovations which are advancing (and, most 
notably, almost without exception, outside the United States) 
that allow individuals greater exercise of control over their 
data. ``Control'' is the digital equivalent of ``possession'' 
(as in ``possession is \9/10\th of the law''), and by enabling 
individuals to gain control of their data, these technologies 
align individuals with the essential basis for asserting 
ownership. In doing so, the individual can then better 
assert and exert their preferences on the use of their personal 
information.
    To date, individuals have enabled the collection, use, and 
sharing of their information without much protest. But the real 
deficiency has been the absence of the technologies that allow 
the individual to exercise their control.
    In many instances, such as a patient arriving at a 
hospital in an ambulance, data ownership is not relevant to 
securing the appropriate medical care. But in commercial 
engagements, such as those involving data sharing between 
different companies to gain access, having the technologies to 
exercise control will be vital and appropriate.
    Rather than attempting to regulate specific preferences 
that consumers may or may not assert, I urge Congress to put in 
place the regulatory foundation for enabling consumers to own 
their information and, in turn, exercise the appropriate 
controls on how that data can be used.

Q.3. I am concerned that Congress will enact data privacy 
legislation but then websites will deny access to consumers for 
simply not approving sharing their data. Should consumers be 
denied access for not approving data sharing?

A.3. This is an entirely appropriate concern but one for which 
I strongly believe there is no basis.
    In the 21st century, data is becoming a different type of 
currency. In virtually every transaction, whether commercial or 
consumer-oriented, the ``buyer'' and ``seller'' are negotiating 
to establish equivalent value for what each is offering to the 
other. Since data is a new kind of property, it has become part 
of that valuation discussion. So, moving forward, any 
transaction involves calculating the values for goods, services 
(such as access to a website), money, and data.
    If a business conditions access on a consent to data 
sharing, that is just one variable that the consumer can 
consider. The great thing about the internet is its capacity to 
foster competitive alternatives. While we view the big tech 
companies as big, we overlook how well competitive alternatives 
(such as Alibaba in China) developed. So, if consumers have 
options on how to access web-based services, where a competitor 
may offer different ``terms'' for data sharing, that is a 
tremendous, positive outcome.
    Indeed, I would encourage companies that wish to condition 
access on data sharing to do so, in order to foster the 
environment where competition can arise.
    From the consumer side, as anticipated in the proposed 
California regulations, it is also possible to imagine that 
additional incentives might be offered to secure the consent of 
a consumer to share their data--such as discounts, coupons, or 
added services. Wouldn't it be great if consumers had choices 
in which their ``price-shopping'' among competitors might also 
include comparing the relevant values being offered to enable 
data sharing?
    If the overall value of access is ``worth'' data sharing, 
that is entirely acceptable in my opinion. But it is critical 
to make the affirmative decision to share data part of the 
negotiation--unlike today's environment. Alternatively, though 
I do not support this option, legislation and regulation could 
require any website that conditions access on data sharing to 
offer a more limited alternative that does not require data 
sharing.
                                ------                                


RESPONSES TO WRITTEN QUESTIONS OF SENATOR MENENDEZ FROM CHAD A. 
                             MARLOW

Q.1. In 2018, pharmaceutical company GlaxoSmithKline announced 
a partnership with genetic testing kit company 23andMe. The 
companies touted the move as a step toward future scientific 
breakthroughs and cures. But critics cautioned that the 
companies would just use this data for marketing and might put 
customers' genetic data at risk.

Q.1.a. Should consumers be concerned that pharmaceutical 
companies have access to their personal genetic data?

A.1.a. Yes. Personal genetic data contains highly sensitive 
information about the person from whom it was gathered.

Q.1.b. How can we ensure that pharmaceutical companies are 
using this data for the benefit of society and population 
health, for example to discover new treatments?

A.1.b. The sharing of genetic data with third parties, such as 
a pharmaceutical company, should be limited to only that which 
is essential to complete the transaction or other purpose for 
which the data was originally provided. Further, where such 
data is shared by necessity, it should be mandated that the 
data cannot be used or shared by the third-party recipient for 
any purpose beyond the essential one for which it was provided, 
and that it should not be retained any longer than is needed to 
complete the task for which it was provided, regardless of 
whether the party in possession of the data is the original 
recipient of the genetic material/data or a third-party 
recipient.
    Most importantly, even where the use of the genetic data 
may be for a societally beneficial purpose, a well-designed 
privacy law must empower people who provide their genetic 
material to decide what their personally identifiable genetic 
material, and the data derived therefrom, may and may not be 
used for. They must also be empowered to demand their genetic 
material be destroyed, and the data derived therefrom be 
erased.
    Providing a meaningful privacy right with respect to 
genetic material and the data derived therefrom goes beyond 
just seeking individual consent to share it. The consent 
mandated by law must be fully informed, discretely requested 
and provided, and narrowly tailored, so that the granting of 
permission to use private genetic material/data for one purpose 
is not broadly construed to allow many additional uses that 
have not, in the mind of the person sharing the genetic 
material, been agreed upon. To that end, for example, 
requesting permission in the body of a multi-page customer 
agreement or dense package insert is not sufficient and should 
be prohibited. Assuming permission has been granted because a 
consumer did not object (``opt-out permission'') is also not 
adequately protective of privacy. Consumers should have to 
affirmatively and clearly give specific permission for their 
genetic material/data to be used for a purpose beyond that for 
which it was provided (``opt-in permission'').
    A final point bears making here. The recent disturbing 
revelation that Google, in a partnership with Ascension, the 
Nation's second largest health system, has been gathering and 
sharing the personal health information of tens of millions of 
patients\1\ highlights the problem here. In defense of the 
program--called ``Project Nightingale''--Tariq Shaukat, the 
President of Industry Products and Solutions for Google Cloud, 
stated that the goal of the program was to ``help healthcare 
organizations like Ascension improve the healthcare experience 
and outcomes.''\2\ Even if this stated goal is accurate, it 
raises two critical problems. First, whether pursuing a 
positive societal goal is worth surrendering deeply personal 
health information is a decision that should be made by 
individual patients, not by the companies seeking to collect 
and use their private health information. Federal laws need to 
be adopted and revised to ensure the right of individuals to 
decide if and how their genetic and other health information is 
collected and used is clear, unequivocal, and not placed at 
risk by unintended loopholes. Second, even if the patients' 
health information is collected by Google for a benevolent 
public health purpose, there is no certainty that data will not 
also be used for other purposes that have little or no 
connection to public health. Federal privacy laws need to be 
adopted that protect personal privacy through real, enforceable 
limits on when and under what conditions personal data can be 
collected, retained, and shared, and a private right of action 
must be provided to help individuals enforce those rules.
---------------------------------------------------------------------------
    \1\ https://www.wired.com/story/google-is-slurping-up-health-
dataand-it-looks-totally-legal/.
    \2\ https://cloud.google.com/blog/topics/inside-google-cloud/our-
partnership-with-ascension.
---------------------------------------------------------------------------

Q.1.c. If Congress were to pass legislation allowing 
individuals to sell their own data, how should we think about 
the implications of allowing an individual to sell their 
genetic data?

A.1.c. Persons already have the right to sell their personal 
data, including their genetic data, so Congress does not need 
to pass legislation to provide that right. Congress should not 
pass any data-as-property laws that have the effect of 
encouraging or persuading people to forgo their privacy and 
sell their data, as doing so would undermine existing and 
future privacy laws, especially among poorer Americans for whom 
it is very difficult to say no to additional income, even if 
the amount promised is uncertain and likely to be small.

Q.1.d. What rights should one individual have to share private 
health information that describes not only themselves, but also 
their family members?

A.1.d. While genetic information contains sensitive information 
about the person providing it, as well as their family members, 
individuals have the right to share their own personal genetic 
information.

Q.2. Equifax has repeatedly shown that it is not a proper 
caretaker for consumer information. In the most recent example, 
Equifax was found to use the word ``admin'' as both password 
and username for a portal that contained sensitive information.

Q.2.a. Do consumers have any recourse against companies like 
Equifax, companies that repeatedly place sensitive consumer 
financial information at risk?

A.2.a. While some State data breach and data security laws may 
provide some consumer recourse rights in this area, there is no 
broad Federal law that provides such recourse rights to all 
Americans. In cases where the handling or protection of 
personal data is negligent, common law tools may provide 
recourse. This question highlights why a comprehensive Federal 
privacy law should include a robust private right of action.

Q.2.b. How would a property rights in data regime change this 
situation?

A.2.b. While a strong Federal privacy law could help here, a 
property rights in data law would, if anything, undermine 
individual privacy. Passing an unnecessary Federal data-as-
property law would have the effect of encouraging people to 
sell their data, rather than to protect it. That would feed 
into, rather than reduce, the risk presented by Equifax-type 
companies, especially for poorer Americans who will find it 
more difficult to say no to selling away their private 
information.

Q.3.a. It is important to recognize that consumer data outlives 
the relationship with the institution that collects the data.
    What happens to a consumer's data after a consumer 
terminates their relationship with an institution collecting 
their data? Does the company delete the consumer's data? Does 
it encrypt the data?

A.3.a. There are no Federal laws that create universally 
applicable rules governing consumer data collection, retention, 
deletion, or security. Each of these constitutes a major gap in 
privacy protections that urgently needs to be addressed in a 
Federal data privacy law. A strong Federal data privacy law 
should include a ``right of erasure,'' which is the right of 
individuals to demand the personal data they furnished to a 
company be deleted when they terminate their relationship with 
the company or at any other time upon their request.

Q.3.b. Is there any uniform requirement that mandates 
institutions treat consumer data a certain way once a consumer 
decides to no longer conduct business with an institution?

A.3.b. There are no Federal laws that create universally 
applicable rules governing consumer data retention, deletion, 
or security once a commercial relationship ends or is 
terminated. Each of these constitutes a major gap in privacy 
protections that urgently needs to be addressed in a Federal 
data privacy law.

Q.3.c. If the data collecting company is breached after the 
consumer has terminated their relationship, is the consumer's 
data still vulnerable?

A.3.c. Yes.

Q.3.d. To ensure consumers' data is protected, should consumers 
be allowed to request their personally identifiable information 
be made nonpersonally identifiable, after the consumer ends 
their business relationship?

A.3.d. Yes, but Federal law should also give them the option to 
request their data be deleted.
                                ------                                


 RESPONSES TO WRITTEN QUESTIONS OF SENATOR WARREN FROM CHAD A. 
                             MARLOW

Q.1. In your written testimony, you expressed concerns 
regarding the data-as-property model. Specifically, you 
mentioned that the data-as-property model creates a ``hedge'' 
against potential future privacy laws enacted at the State and 
Federal level. Can you explain further how a data-as-property 
model could interact with current and potential future privacy 
laws?

A.1. Presently, with the exception of Federal laws governing 
financial, health, and children's data, and a few strong State 
laws, there are very few barriers to prevent private, personal 
data from flowing from individuals to data collectors to 
innumerable third parties. This has allowed the marketplace for 
personal data to flourish at the expense of Americans' personal 
privacy.
    A strong Federal data privacy law, and strong State data 
privacy laws, would interrupt this flow. Such laws, where 
adopted, are likely to end the corporate practice of collecting 
and sharing individuals' private, personal information without 
their knowledge and meaningful consent. To that end, such laws 
may and should empower individuals to decide what personal data 
of theirs may be collected and what may be shared. They may and 
should empower individuals to demand the use of their data be 
limited to the purpose for which it was collected. They may and 
should empower individuals to demand their data be deleted 
after the purpose for which it was collected is completed, or 
upon a deletion request by the consumer. They may and should 
empower individuals to make pro-privacy choices without being 
punished or otherwise disadvantaged compared to those who do 
not. And they may and should empower individuals to directly 
sue those who violate their data privacy rights. Not all 
individuals will take advantage of these privacy protections, 
but many will. The result will be, in the absence of some 
countervailing force, that the availability of personal data in 
the marketplace, and the profits that can be made therefrom, 
will be reduced.
    The data-as-property model is a hedge against stronger 
privacy laws because it seeks to use the levers of Government 
power to place that countervailing force directly in front of 
consumers at the time they would be contemplating exercising 
their newly bestowed data privacy rights. Specifically, 
consumers will be reminded of their right to sell their data 
and, more importantly, of the availability of companies that 
will facilitate that sale and their receipt of a ``royalty 
payment'' for doing so. While the amount individuals will 
receive for selling their personal information will not be 
stated, as it will be unknown at the time sales permission is 
sought, for Americans who are struggling to pay their bills or 
put food on their tables, the opportunity to earn any extra 
money--no matter how little and uncertain it may be--may be 
impossible to refuse.
    And so, even if tougher Federal and State privacy laws are 
passed, the ability to offer people financial incentives to not 
exercise those new rights will serve as an important hedge 
against those laws and as an effective way to undermine them, 
especially when it comes to the most financially needy 
Americans. These harms are a significant reason why data-as-
property bills were rejected by every one of the 11 State 
legislatures that considered them in 2019.

Q.2. How could a company's incentives change under a data-as-
property model with respect to the services they offer 
consumers?

    Would companies be likely to change their business 
         models in certain circumstances to target consumers of 
         different income levels?

    Would these potential responses conflict with 
         recent State and Federal efforts regarding privacy? If 
         so, how?

A.2. It is difficult to imagine all the ways in which companies 
might adjust their business models under a data-as-property 
regime. Could we see startup companies and existing tech giants 
racing to serve as the ``transaction agents'' for personal data 
sales so they can capture the substantial revenues to be made 
therefrom? Perhaps. Could we see companies paying more money 
for the personal data of wealthy individuals compared to 
poorer individuals because the former would be less likely to 
consent to selling their data for trivial sums of money? 
Perhaps. Could we see companies investing huge sums of money 
in advertising campaigns to encourage individuals to sell 
their personal information? Perhaps. Could we see a system 
emerge where corporate data re-sellers and purchasers make 
huge profits off the sale of personal data, but very little 
trickles down to the individuals who actually sell their 
private, personal information? Perhaps.
    In the end, the question to ask isn't about if the data-as-
property model would conflict with recent Federal and State 
privacy efforts, but rather if it would undermine them. To 
that, for the reasons discussed in my answer to your first 
question, the answer is an unequivocal yes.

Q.3.a. What are the potential tracking requirements that would 
need to be put in place with a data-as-property model?

A.3.a. The data-as-property model is based on the premise that 
people should get paid when their data is sold and re-sold. To 
do that, an elaborate tracking and monitoring system would need 
to be deployed. That system would have two major components. 
First, it would require some sort of unique, universal tracking 
number be attached to all personal data. This is needed to 
track data as it moves through the virtual world so sellers can 
ascertain if permission to sell the data has been granted, if 
any limitations have been placed on its sale, and so the person 
who originally sold the data can get paid. It is possible that 
all data will need to be tagged with a tracking number, so 
potential sellers can determine if sales permission has been 
granted or denied, and so they know who to request permission 
from where it has not been granted. The use of these tracking 
numbers as a unique identifier would have serious negative 
impacts on data privacy and online anonymity. Second, it would 
require a comprehensive system of data monitoring to ensure 
that payments that are due are actually made. Other than 
proceeding through a weak honor system--because it is easy to 
copy data and elude tracking--it is hard to imagine how it 
will be possible to ensure payments are made and how to avoid 
the development of a black market for commission-free personal 
data. Undoubtedly, companies that track data and facilitate 
``royalty'' payments will charge fees for their services that 
may leave little compensation left for those who sell their 
personal information.

Q.3.b. How would such a model function in the absence of those 
requirements?

A.3.b. It could not.

Q.4. You have advocated for Congress passing strong privacy 
laws in lieu of a data-as-property model. What would you 
consider to be the key elements of a strong privacy law?

A.4. At a minimum, a strong Federal privacy law should:

  1) Place limits on how personal information can be 
        collected, used, and retained. Legislation must include 
        real protections that consider the modern reality of 
        how people's personal information is collected, 
        retained, and used. The law should limit the purposes 
        for which consumer data can be used,
        require purging of data after permissible uses have 
        been completed, prevent coercive conditioning of services on 
        waiving privacy rights, and limit so-called ``pay for 
        privacy'' schemes. A well-designed Federal privacy law 
        would empower people to choose what degree of privacy 
        they want for themselves. To make this right 
        meaningful, it must go beyond just seeking individual 
        consent. The consent mandated by law must be fully 
        informed, discretely requested and provided, and 
        narrowly tailored, so that the granting of permission 
        to transfer one's data to one third party for a 
        specific purpose is not broadly construed to allow many 
        additional transfers and uses that have not, in the 
        mind of the person sharing the data, been agreed upon. 
        To that end, for example, requesting permission in the 
        body of a multi-page user agreement is not sufficient 
        and should be prohibited. Assuming permission has been 
        granted because a consumer did not object (``opt-out 
        permission'') is also not adequately protective of 
        privacy. Consumers should have to affirmatively and 
        clearly give specific permission for their data to be 
        used for any purposes beyond that for which it was 
        provided (``opt-in permission'').

  2) Not prevent States from putting in place stronger 
        consumer protections or taking enforcement action. Any 
        Federal privacy standards should be a floor--not a 
        ceiling--for consumer protections. The ACLU strongly 
        opposes legislation that would, as some industry groups 
        have urged, preempt stronger State laws. Such an 
        approach would put existing consumer protections, many 
        of which are State-led, on the chopping block and 
        prevent additional consumer privacy protections from 
        ever seeing the light of day. We also oppose efforts to 
        limit the ability of State Attorneys General or other 
        regulators to sue, fine, or take other actions against 
        companies that violate their laws.

  3) Contain strong enforcement mechanisms, including a 
        private right of action. Federal privacy legislation 
        will mean little without robust enforcement. Thus, any 
        legislation should grant greater resources and 
        enforcement capabilities to the FTC and permit State 
        and local authorities to fully enforce Federal law. To 
        fill the inevitable Government enforcement gaps, 
        however, the ACLU urges Congress to ensure that Federal 
        legislation also grants consumers the right to sue 
        companies for privacy violations.

  4) Guard against discrimination in the digital ecosystem. 
        Existing Federal laws prohibit discrimination in the 
        credit, employment, and housing context. Any Federal 
        privacy legislation should ensure such prohibitions 
        apply fully in the digital
        ecosystem and are robustly enforced. In addition, we 
        urge Congress to strengthen existing laws to guard 
        against unfair discrimination, including in cases where 
        it may stem from algorithmic bias.
                                ------                                


  RESPONSE TO WRITTEN QUESTION OF SENATOR SINEMA FROM CHAD A. 
                             MARLOW

Q.1. Opponents of the data-as-property model argue the user is 
not the sole creator of data and therefore does not deserve 
sole ownership. This argument is underpinned by the belief that 
data is a joint-creation of the user and platform because the 
platform creates an environment and technology to process the 
data. Most Arizonans are not aware when data brokers collect 
their data and typically find out when they search their own 
names and find information about themselves, accurate or 
otherwise. How would data brokers' practices be treated under a 
data-as-property model? Would this model provide recourse to 
Arizonans who wish to address inaccuracies in the personal 
information data brokers choose to sell?

A.1. To your first question, data-as-property laws would allow 
consumers to choose to have their data sold to third parties, 
such as data brokers, or to not have it sold; however, strong 
Federal data privacy laws could establish this same right 
without advancing a profit-driven system that influences and 
encourages persons to sell their data--an approach that would 
have a particularly deleterious effect on persons with greater 
financial needs.
    A well-designed Federal privacy law would give people 
control over their information and empower people to choose 
what degree of privacy they want for themselves. In the 
scenario presented here, that would mean empowering people to 
choose whether or not their private information is sold or 
shared by the original collector with third parties such as 
data brokers. To make this right meaningful, it must go beyond 
the broken ``inform and consent'' model that currently 
dominates the technology sector. The consent mandated by law 
must be fully informed, discretely requested and provided, and 
narrowly tailored, so that the granting of permission to 
transfer one's data to one third party for a specific purpose 
is not broadly construed to allow many additional transfers and 
uses that have not, in the mind of the person sharing the data, 
been agreed upon. To that end, for example, requesting 
permission in the body of a multi-page user agreement is not 
sufficient and should be prohibited. Assuming permission has 
been granted because a consumer did not object (``opt-out 
permission'') is also not adequately protective of privacy. 
Consumers should have to affirmatively and clearly give 
specific permission for their data to be used for any purposes 
beyond that for which it was provided (``opt-in permission'').
    To your second question, the data-as-property model 
contains no right for consumers to request the correction of 
inaccurate information about them, but such a right could be 
included in a comprehensive Federal privacy law.
                                ------                                


 RESPONSES TO WRITTEN QUESTIONS OF SENATOR JONES FROM CHAD A. 
                             MARLOW

Discrimination
Q.1. In the Banking Committee, we often discuss discrimination 
involving loans and housing. As technology helps companies 
become more sophisticated, it is easier to put in place 
discriminatory policies. Are there protections currently in 
place that prevent data discrimination by race, gender, or 
religion? If not, what methods should Congress consider to 
decrease discrimination within data?

A.1. Existing Federal laws prohibit discrimination in the 
credit, employment, and housing context. However, our existing 
infrastructure is insufficient to safeguard against 
discrimination in the digital world for several reasons.
    First, many online providers have been slow to fully comply 
with Federal antidiscrimination laws--and in many cases 
plaintiffs face challenges in getting the information necessary 
to raise discrimination claims. For example, Facebook recently 
settled a lawsuit brought by ACLU and other civil rights 
organizations amid allegations that it discriminated on the 
basis of gender and age in targeting ads for housing and 
employment.\1\ The lawsuit followed
repeated failures by the company to fully respond to studies 
demonstrating that the platform improperly permitted ad 
targeting based on prohibited characteristics, like race, or 
proxies for such characteristics. The company is also now the 
subject of charges brought by the Department of Housing and 
Urban Development (HUD), which include similar allegations.\2\
---------------------------------------------------------------------------
    \1\ ACLU, Facebook Agrees to Sweeping Reforms to Curb 
Discriminatory Ad Targeting Practices (Mar. 19, 2019), https://
www.aclu.org/news/facebook-agrees-sweeping-reforms-curb-discriminatory-
ad-targeting-practices.
    \2\ Complaint of Discrimination Against Facebook, FHEO No. 01-18-
032308, https://www.hud.gov/sites/dfiles/Main/documents/
HUD_v_Facebook.pdf.
---------------------------------------------------------------------------
    Second, there have been efforts to weaken existing laws in 
ways that would make it more difficult to address algorithmic 
discrimination. For example, HUD recently proposed amending the 
existing Disparate Impact Rule, codified at 24 C.F.R. Sec. 
100.500, to make it more difficult for plaintiffs to raise 
discrimination claims based on disparate impact. Among other 
things, the Proposed Rule would allow a defendant to avoid 
liability for using an algorithmic model that 
disproportionately excludes members of protected classes if the 
defendant can prove one of three defenses, any of which will 
operate as a complete defense, with no opportunity for a 
plaintiff to prove the existence of less discriminatory 
alternatives to achieve any legitimate business objectives.
    Third, our existing laws need to be expanded to address 
digital discrimination that occurs outside the housing, credit, 
and employment context. For example, commercial advertisers 
should not be permitted to offer different prices, services, or 
opportunities to individuals, or to exclude them from receiving 
ads offering certain commercial benefits, based on 
characteristics like their gender or race. And regulators and 
consumers should be given information and tools to address 
algorithms or machine learning models that disparately impact 
individuals on the basis of protected characteristics.
    Federal law must be strengthened to address these 
challenges. First, Federal privacy law should make clear that 
existing antidiscrimination laws apply fully in the online 
ecosystem, including in online marketing and advertising. 
Federal agencies that enforce these laws, like HUD, the EEOC, 
and the Consumer Financial Protection Bureau, should be fully 
resourced and given the technical capabilities to vigorously 
enforce the law in the context of these new forms of digital 
discrimination. In addition, companies should be required to 
audit their data processing practices for bias and privacy 
risks, and such audits should be made available to regulators 
and disclosed publicly, with redactions if necessary to protect 
proprietary information. Finally, researchers should be 
permitted to independently audit platforms for bias, and 
Congress should not permit enforcement of terms of service that 
interfere with such testing.
Data Privacy
Q.2.a. So much of data privacy is having the choice to share 
information. For example, many people find targeted ads 
disturbing, while others find that they serve as a helpful reminder. How 
should policymakers consider different preferences as they 
write legislation on personal data privacy and how it is used?

A.2.a. Privacy is not a condition to be imposed upon 
individuals, but a right to be meaningfully provided. That 
means a well-designed privacy law would empower people to 
choose what degree of privacy they want for themselves. In the 
scenario presented here, that would mean empowering people to 
choose if they want to allow their personal data to be 
collected and used to provide them with targeted ads or if they 
do not.
    But providing a meaningful right goes beyond just seeking 
individual consent. The consent mandated by law must be fully 
informed, discretely requested and provided, and narrowly 
tailored, so that the granting of permission to use private 
information for one purpose is not broadly construed to allow 
many additional uses that have not, in the mind of the person 
sharing the data, been agreed upon. To that end, for example, 
requesting permission in the body of a multi-page user 
agreement is not sufficient and should be prohibited. Assuming 
permission has been granted because a consumer did not object 
(``opt-out permission'') is also not adequately protective of 
privacy. Consumers should have to affirmatively and clearly 
give specific permission for their data to be used for any 
purposes beyond that for which it was provided (``opt-in 
permission'').
    In addition, the law should be made unambiguously clear 
that the data cannot be used to discriminate against users. At 
a minimum, with respect to targeted advertising, discriminatory 
practices that are prohibited in the physical world should be 
equally prohibited on the internet.

Q.2.b. Many services require information from one site to be 
shared to another in order for the consumer to have access to 
the website's services. Sometimes this information sharing is 
helpful and makes the website more user friendly but sometimes 
the data shared does not have any obvious benefits to the 
consumer.

A.2.b. The vast majority of data sharing among websites takes 
place for the purpose of advertiser-driven consumer tracking--
something polls find Americans remain deeply uncomfortable with 
but feel helpless to stop. Where the sharing of data with third 
parties is genuinely necessary for providing a service 
consumers find useful, such sharing should be limited only to 
data that is essential to complete the transaction or other 
purpose for which it was
originally provided. Further, where such data is shared by 
necessity with a third party, it should be mandated that the 
data cannot be used or re-shared by the third-party recipient 
for any purpose beyond the essential one for which it was 
provided. Beyond that, should Federal law put strong opt-in 
privacy protections in place, data should be sharable beyond 
the purpose for which it was originally provided if--and only 
if--the user-provider has given clear, well-informed, and 
discrete opt-in consent.

Q.3. I am concerned that Congress will enact data privacy 
legislation but then websites will deny access to consumers 
simply for not approving the sharing of their data. Should consumers be 
denied access for not approving data sharing?

A.3. This is an important concern. Privacy rights will be of 
little value if individuals who choose to exercise them can be 
punished or denied benefits for doing so. Federal law should 
protect those who elect to exercise their privacy rights by 
prohibiting companies from denying service, providing worse 
service, or charging higher prices to those who exercise their 
privacy rights, as well as prohibiting them from providing 
better service, lower prices, or other benefits to those who do 
not exercise their privacy rights.
                                ------                                


  RESPONSES TO WRITTEN QUESTIONS OF SENATOR WARREN FROM WILL 
                            RINEHART

Q.1. In your written testimony, you mentioned that creating a 
property right for data could make it more difficult for 
consumers to control their data. Can you provide further detail 
regarding how a data-as-property model could interact with 
privacy protections in current laws, such as the Fair Credit 
Reporting Act?

A.1. Response not received in time for publication.

Q.2. Your written testimony also discusses the different 
methodologies to value data.

   • How could information asymmetry between companies 
        and consumers impact the valuation of data under the 
        different methods you described?

   • Under any of the models you mentioned, do you 
        believe that customers will have the ability to 
        determine the value of their data before a given 
        transaction?

A.2. Response not received in time for publication.

Q.3. How could a company's incentives change under a data-as-
property model with respect to the services they offer 
consumers?

   • Would companies be likely to change their business 
        models in certain circumstances to target consumers of 
        different income levels?

   • Would these potential responses conflict with 
        recent State and Federal efforts regarding privacy? If 
        so, how?

A.3. Response not received in time for publication.
                                ------                                


   RESPONSES TO WRITTEN QUESTIONS OF SENATOR JONES FROM WILL 
                            RINEHART

Discrimination
Q.1. In the Banking Committee, we often discuss discrimination 
involving loans and housing. As technology helps companies 
become more sophisticated, it becomes easier to put 
discriminatory policies in place. Are there protections currently in 
place that prevent data discrimination by race, gender, or 
religion? If not, what methods should Congress consider to 
decrease discrimination within data?

A.1. Response not received in time for publication.
Data Privacy
Q.2. So much of data privacy is having the choice to share 
information. For example, many people find targeted ads 
disturbing, while others find that they serve as a helpful reminder. How 
should policymakers consider different preferences as they 
write legislation on personal data privacy and how it is used?
    Many services require information from one site to be 
shared to another in order for the consumer to have access to 
the website's services. Sometimes this information sharing is 
helpful and makes the website more user friendly but sometimes 
the data shared does not have any obvious benefits to the 
consumer.

A.2. Response not received in time for publication.

Q.3. I am concerned that Congress will enact data privacy 
legislation but then websites will deny access to consumers 
simply for not approving the sharing of their data. Should consumers be 
denied access for not approving data sharing?

A.3. Response not received in time for publication.
                                ------                                


    RESPONSES TO WRITTEN QUESTIONS OF SENATOR MENENDEZ FROM 
                        MICHELLE DENNEDY

Q.1. It is important to recognize that consumer data outlives 
the relationship with the institution that collects the data.

Q.1.a. What happens to a consumer's data after a consumer 
terminates their relationship with an institution collecting 
their data? Does the company delete the consumer's data? Does 
it encrypt the data?

Q.1.b. Is there any uniform requirement that mandates 
institutions treat consumer data a certain way once a consumer 
decides to no longer conduct business with an institution?

Q.1.c. If the data collecting company is breached after the 
consumer has terminated their relationship, is the consumer's 
data still vulnerable?

Q.1.d. To ensure consumers' data is protected, should consumers 
be allowed to request their personally identifiable information 
be made nonpersonally identifiable, after the consumer ends 
their business relationship?

A.1.a.-d. Response not received in time for publication.
                                ------                                


 RESPONSES TO WRITTEN QUESTIONS OF SENATOR JONES FROM MICHELLE 
                            DENNEDY

Discrimination
Q.1. In the Banking Committee, we often discuss discrimination 
involving loans and housing. As technology helps companies 
become more sophisticated, it becomes easier to put 
discriminatory policies in place. Are there protections currently in 
place that prevent data discrimination by race, gender, or 
religion? If not, what methods should Congress consider to 
decrease discrimination within data?

A.1. Response not received in time for publication.
Data Privacy
Q.2. So much of data privacy is having the choice to share 
information. For example, many people find targeted ads 
disturbing, while others find that they serve as a helpful reminder. How 
should policymakers consider different preferences as they 
write legislation on personal data privacy and how it is used?
    Many services require information from one site to be 
shared to another in order for the consumer to have access to 
the website's services. Sometimes this information sharing is 
helpful and makes the website more user friendly but sometimes 
the data shared does not have any obvious benefits to the 
consumer.

A.2. Response not received in time for publication.

Q.3. I am concerned that Congress will enact data privacy 
legislation but then websites will deny access to consumers 
simply for not approving the sharing of their data. Should consumers be 
denied access for not approving data sharing?

A.3. Response not received in time for publication.

              Additional Material Supplied for the Record
[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

                            [all]