[House Hearing, 116 Congress]
[From the U.S. Government Publishing Office]


           PROTECTING CONSUMER PRIVACY IN THE ERA OF BIG DATA

=======================================================================

                                HEARING

                               BEFORE THE

            SUBCOMMITTEE ON CONSUMER PROTECTION AND COMMERCE

                                 OF THE

                    COMMITTEE ON ENERGY AND COMMERCE
                        HOUSE OF REPRESENTATIVES

                     ONE HUNDRED SIXTEENTH CONGRESS

                             FIRST SESSION

                               __________

                           FEBRUARY 26, 2019

                               __________

                            Serial No. 116-7
                            
          
                       
      Printed for the use of the Committee on Energy and Commerce
                   govinfo.gov/committee/house-energy
                         energycommerce.house.gov
                   
                                __________
                               

                    U.S. GOVERNMENT PUBLISHING OFFICE                    
36-508 PDF                  WASHINGTON : 2020                     
          
--------------------------------------------------------------------------------------
                   
                   
                   
                        
                    COMMITTEE ON ENERGY AND COMMERCE

                     FRANK PALLONE, Jr., New Jersey
                                 Chairman
BOBBY L. RUSH, Illinois              GREG WALDEN, Oregon
ANNA G. ESHOO, California              Ranking Member
ELIOT L. ENGEL, New York             FRED UPTON, Michigan
DIANA DeGETTE, Colorado              JOHN SHIMKUS, Illinois
MIKE DOYLE, Pennsylvania             MICHAEL C. BURGESS, Texas
JAN SCHAKOWSKY, Illinois             STEVE SCALISE, Louisiana
G. K. BUTTERFIELD, North Carolina    ROBERT E. LATTA, Ohio
DORIS O. MATSUI, California          CATHY McMORRIS RODGERS, Washington
KATHY CASTOR, Florida                BRETT GUTHRIE, Kentucky
JOHN P. SARBANES, Maryland           PETE OLSON, Texas
JERRY McNERNEY, California           DAVID B. McKINLEY, West Virginia
PETER WELCH, Vermont                 ADAM KINZINGER, Illinois
BEN RAY LUJAN, New Mexico            H. MORGAN GRIFFITH, Virginia
PAUL TONKO, New York                 GUS M. BILIRAKIS, Florida
YVETTE D. CLARKE, New York, Vice     BILL JOHNSON, Ohio
    Chair                            BILLY LONG, Missouri
DAVID LOEBSACK, Iowa                 LARRY BUCSHON, Indiana
KURT SCHRADER, Oregon                BILL FLORES, Texas
JOSEPH P. KENNEDY III,               SUSAN W. BROOKS, Indiana
    Massachusetts                    MARKWAYNE MULLIN, Oklahoma
TONY CARDENAS, California            RICHARD HUDSON, North Carolina
RAUL RUIZ, California                TIM WALBERG, Michigan
SCOTT H. PETERS, California          EARL L. ``BUDDY'' CARTER, Georgia
DEBBIE DINGELL, Michigan             JEFF DUNCAN, South Carolina
MARC A. VEASEY, Texas                GREG GIANFORTE, Montana
ANN M. KUSTER, New Hampshire
ROBIN L. KELLY, Illinois
NANETTE DIAZ BARRAGAN, California
A. DONALD McEACHIN, Virginia
LISA BLUNT ROCHESTER, Delaware
DARREN SOTO, Florida
TOM O'HALLERAN, Arizona
                                 ------                                

                           Professional Staff

                   JEFFREY C. CARROLL, Staff Director
                TIFFANY GUARASCIO, Deputy Staff Director
                MIKE BLOOMQUIST, Minority Staff Director
            Subcommittee on Consumer Protection and Commerce

                        JAN SCHAKOWSKY, Illinois
                                Chairwoman
KATHY CASTOR, Florida                CATHY McMORRIS RODGERS, Washington
MARC A. VEASEY, Texas                  Ranking Member
ROBIN L. KELLY, Illinois             FRED UPTON, Michigan
TOM O'HALLERAN, Arizona              MICHAEL C. BURGESS, Texas
BEN RAY LUJAN, New Mexico            ROBERT E. LATTA, Ohio
TONY CARDENAS, California, Vice      BRETT GUTHRIE, Kentucky
    Chair                            LARRY BUCSHON, Indiana
LISA BLUNT ROCHESTER, Delaware       RICHARD HUDSON, North Carolina
DARREN SOTO, Florida                 EARL L. ``BUDDY'' CARTER, Georgia
BOBBY L. RUSH, Illinois              GREG GIANFORTE, Montana
DORIS O. MATSUI, California          GREG WALDEN, Oregon (ex officio)
JERRY McNERNEY, California
DEBBIE DINGELL, Michigan
FRANK PALLONE, Jr., New Jersey (ex 
    officio)
                             C O N T E N T S

                              ----------                              
                                                                   Page
Hon. Jan Schakowsky, a Representative in Congress from the State 
  of Illinois, opening statement.................................     3
    Prepared statement...........................................     4
Hon. Cathy McMorris Rodgers, a Representative in Congress from 
  the State of Washington, opening statement.....................     5
    Prepared statement...........................................     7
Hon. Frank Pallone, Jr., a Representative in Congress from the 
  State of New Jersey, opening statement.........................     8
    Prepared statement...........................................    10
Hon. Greg Walden, a Representative in Congress from the State of 
  Oregon, opening statement......................................    11
    Prepared statement...........................................    12
Hon. Anna G. Eshoo, a Representative in Congress from the State 
  of California, prepared statement..............................   101

                               Witnesses

Brandi Collins-Dexter, Senior Campaign Director, Color of Change.    14
    Prepared statement \1\.......................................    16
    Answers to submitted questions...............................   230
Roslyn Layton, Ph.D., Visiting Scholar, American Enterprise 
  Institute......................................................    21
    Prepared statement...........................................    23
    Answers to submitted questions...............................   232
Denise E. Zheng, Vice President, Technology and Innovation, 
  Business Roundtable............................................    34
    Prepared statement...........................................    36
    Answers to submitted questions...............................   254
David F. Grimaldi, Jr., Executive Vice President, Public Policy, 
  Interactive Advertising Bureau.................................    39
    Prepared statement...........................................    41
    Answers to submitted questions...............................   255
Nuala O'Connor, President and Chief Executive Officer, Center for 
  Democracy & Technology.........................................    52
    Prepared statement...........................................    54
    Answers to submitted questions...............................   258

                           Submitted Material

Article of January 15, 2019, ``2019 Data Privacy Wish List: 
  Moving From Compliance To Concern,'' by Ameesh Divatia, 
  Forbes.com, submitted by Mr. Lujan.............................   103
Statement of the Berkeley Media Studies Group, et al., ``The Time 
  is Now: A Framework for Comprehensive Privacy Protection and 
  Digital Rights in the United States,'' submitted by Ms. 
  Schakowsky.....................................................   105

----------

\1\ Ms. Collins-Dexter's entire statement, including supplemental 
material that does not appear in the printed edition, has been retained 
in committee files and also is available at https://docs.house.gov/
meetings/IF/IF17/20190226/108942/HHRG-116-IF17-Wstate-Collins-DexterB-
20190226.pdf.
Letter of February 26, 2019, from Brent Gardner, Chief Government 
  Affairs Officer, Americans for Prosperity, to Ms. Schakowsky, 
  submitted by Ms. Schakowsky....................................   107
Letter of February 25, 2019, from Edward J. Black, President and 
  Chief Executive Officer, Computer & Communications Industry 
  Association, to Ms. Schakowsky and Mrs. Rodgers, submitted by 
  Ms. Schakowsky.................................................   108
Letter of February 13, 2019, from Access Humboldt, et al., to 
  U.S. Senator Roger Wicker, et al., submitted by Ms. Schakowsky.   115
Letter of February 25, 2019, from American Hotel & Lodging 
  Association, et al., to Mr. Pallone, et al., submitted by Ms. 
  Schakowsky.....................................................   119
Letter of February 25, 2019, from Gary Shapiro, President and 
  Chief Executive Officer, Consumer Technology Association, to 
  Mr. Pallone, et al., submitted by Ms. Schakowsky...............   122
Comments of November 9, 2018, submitted by Engine to the 
  Department of Commerce, Docket Number 180821780-878-01, 
  submitted by Ms. Schakowsky....................................   124
Letter of February 25, 2019, from Evan Engstrom, Executive 
  Director, Engine, to Ms. Schakowsky, et al., submitted by Ms. 
  Schakowsky.....................................................   134
Statement of the American Bankers Association, February 26, 2019, 
  submitted by Ms. Schakowsky....................................   135
Letter of February 26, 2019, from David French, Senior Vice 
  President, Government Relations, National Retail Federation, to 
  Mr. Pallone, et al., submitted by Ms. Schakowsky...............   144
Letter of November 9, 2018, from David French, Senior Vice 
  President, Government Relations, National Retail Federation, to 
  David J. Redl, Assistant Secretary for Communications and 
  Information, National Telecommunications and Information 
  Administration, Department of Commerce, submitted by Ms. 
  Schakowsky.....................................................   152
Letter of February 26, 2019, from Scott Talbott, Senior Vice 
  President of Government Affairs, Electronic Transactions 
  Association, to Ms. Schakowsky and Mrs. Rodgers, submitted by 
  Ms. Schakowsky.................................................   166
Letter of February 26, 2019, from Jon Leibowitz, Co-Chair, 21st 
  Century Privacy Coalition, to Mr. Pallone, et al., submitted by 
  Ms. Schakowsky.................................................   170
Letter of February 26, 2019, from Mark Neeb, Chief Executive 
  Officer, Association of Credit and Collection Professionals, to 
  Ms. Schakowsky and Mrs. Rodgers, submitted by Ms. Schakowsky...   173
Letter of February 25, 2019, from Will Rinehart, Director of 
  Technology and Innovation Policy, American Action Forum, to Ms. 
  Schakowsky and Mrs. Rodgers, submitted by Mrs. Rodgers.........   175
Letter of February 25, 2019, from Thomas A. Schatz, President, 
  Council for Citizens Against Government Waste, to Mr. Pallone, 
  et al., submitted by Mrs. Rodgers..............................   190
Letter of February 26, 2019, from the Coalition for a Secure and 
  Transparent Internet to Ms. Schakowsky and Mrs. Rodgers, 
  submitted by Mrs. Rodgers......................................   193
Letter of February 26, 2019, from Charles Duan, Technology and 
  Innovation Policy Director, R Street Institute, et al., to Ms. 
  Schakowsky and Mrs. Rodgers, submitted by Mrs. Rodgers.........   195
Letter of February 25, 2019, from Tim Day, Senior Vice President, 
  U.S. Chamber of Commerce, to Ms. Schakowsky and Mrs. Rodgers, 
  submitted by Mrs. Rodgers......................................   198
Letter of February 25, 2019, from Katie McAuliffe, Executive 
  Director, Digital Liberty, to subcommittee members, submitted 
  by Mrs. Rodgers................................................   204
Letter of February 25, 2019, from Michael Beckerman, President 
  and Chief Executive Officer, Internet Association, to Ms. 
  Schakowsky and Mrs. Rodgers, submitted by Mrs. Rodgers.........   206
Excerpt from Report of the Attorney General's Cyber Digital Task 
  Force, Department of Justice, submitted by Mr. Latta...........   212
Statement by Google, undated, submitted by Mrs. Rodgers..........   216
Letter of February 26, 2019, from Jimi Grande, Senior Vice 
  President, Government Affairs, National Association of Mutual 
  Insurance Companies, to Mr. Pallone, et al., submitted by Mrs. 
  Rodgers........................................................   228
Letter of February 26, 2019, from Rob Atkinson, President, 
  Information Technology and Innovation Foundation, et al., to 
  Mr. Pallone and Mr. Walden, submitted by Mrs. Rodgers \2\

----------

\2\ The letter has been retained in committee files and also is 
available at https://docs.house.gov/meetings/IF/IF17/20190226/108942/
HHRG-116-IF17-20190226-SD024.pdf.

 
           PROTECTING CONSUMER PRIVACY IN THE ERA OF BIG DATA

                              ----------                              


                       TUESDAY, FEBRUARY 26, 2019

                  House of Representatives,
  Subcommittee on Consumer Protection and Commerce,
                          Committee on Energy and Commerce,
                                                    Washington, DC.
    The subcommittee met, pursuant to call, at 10:01 a.m., in 
the John D. Dingell Room 2123, Rayburn House Office Building, 
Hon. Jan Schakowsky (chair of the subcommittee) presiding.
    Members present: Representatives Schakowsky, Castor, 
Veasey, Kelly, O'Halleran, Lujan, Cardenas, Blunt Rochester, 
Soto, Rush, Matsui, McNerney, Dingell, Pallone (ex officio), 
Rodgers (subcommittee ranking member), Upton, Burgess, Latta, 
Guthrie, Bucshon, Hudson, Carter, Gianforte, and Walden (ex 
officio).
    Also present: Representatives Eshoo and Clarke.
    Staff present: Jeffrey C. Carroll, Staff Director; 
Elizabeth Ertel, Office Manager; Evan Gilbert, Press Assistant; 
Lisa Goldman, Counsel; Waverly Gordon, Deputy Chief Counsel; 
Tiffany Guarascio, Deputy Staff Director; Alex Hoehn-Saric, 
Chief Counsel, Communications and Technology; Zach Kahan, 
Outreach and Member Service Coordinator; Dan Miller, Policy 
Analyst; Joe Orlando, Staff Assistant; Kaitlyn Peel, Digital 
Director; Tim Robinson, Chief Counsel; Chloe Rodriguez, Policy 
Analyst; Mike Bloomquist, Minority Staff Director; Adam 
Buckalew, Minority Director of Coalitions and Deputy Chief 
Counsel, Health; Jordan Davis, Minority Senior Advisor; Melissa 
Froelich, Minority Chief Counsel, Consumer Protection and 
Commerce; Peter Kielty, Minority General Counsel; Bijan 
Koohmaraie, Minority Counsel, Consumer Protection and Commerce; 
Ryan Long, Minority Deputy Staff Director; Brannon Rains, 
Minority Staff Assistant; and Greg Zerzan, Minority Counsel, 
Consumer Protection and Commerce.
    Ms. Schakowsky. The Subcommittee on Consumer Protection and 
Commerce will now be called to order.
    So I am going to begin with a few comments that are off the 
clock and then invite our ranking member to do the same. I am 
going to say good morning and thank you all for joining us 
today. And before we officially start the hearing, I would like 
to welcome you to the first Consumer Protection and Commerce 
Subcommittee of the 116th Congress.
    Consumer protection has long been my passion and what first 
drew me to public life. I like to call our subcommittee the 
Nation's legislative helpline because we field consumer 
complaints. The subcommittee's jurisdiction is vast in scope, 
ranging from the safety of cars to consumer product defects to 
consumer fraud, both online and offline.
    In the past, when Democrats controlled the House, this 
subcommittee was responsible for making pools and children's 
products safer, increasing the fuel efficiency of cars, and 
making sure that agencies aggressively protected consumers over 
corporate interests. Under my leadership this subcommittee will 
be extremely active and push companies and the administration 
to put consumers first.
    I look forward to working with Ranking Member McMorris 
Rodgers. I believe there are so many issues on which we will be 
able to work together in a bipartisan way. I would also like to 
welcome several new Democratic Members, Representative Marc 
Veasey from Texas--let's see, where I am looking the wrong way, 
OK--and Robin Kelly from Illinois, my home State; Tom 
O'Halleran from Arizona; Lisa Blunt Rochester from Delaware; 
and Darren Soto from Florida, are all new to the Energy and 
Commerce Committee and they also were smart enough to pick this 
best subcommittee at a very exciting time.
    I also welcome back many familiar faces and appreciate your 
continued commitment to consumer protection issues. And I would 
like to thank Tony Cardenas for serving as my vice chair of the 
subcommittee and he will provide the subcommittee with 
invaluable leadership.
    And, finally, I would like to recognize the return of my 
friend Debbie Dingell. Over the past 2 weeks we have mourned 
the passing of her husband, John Dingell, who was so important 
to this committee over the years and a friend to so many. 
Debbie has been a stalwart, but I know it has been a difficult 
time.
    Debbie, you have all of our sympathy and support from the 
entire subcommittee. And with the indulgence of my ranking 
member, just to let Debbie say a few words.
    Debbie?
    Mrs. Dingell. I just want to thank you and all of my 
colleagues. John Dingell loved this committee. He thought the 
work that they did was very important, and I hear him in my ear 
going, ``Woman, get on,'' and hearing him in the ears of 
everybody, ``Work together for the American people.'' Thank 
you.
    Ms. Schakowsky. I have been reminded that Darren Soto's 
birthday is today? Oh, yesterday. OK, never mind.
    OK. So Ranking Member McMorris Rodgers, would you like to 
take a couple of minutes to welcome your new Members as well?
    Mrs. Rodgers. Thank you. Thank you, Madam Chair and to all 
the members of the committee. Welcome to the committee, and I 
too want to extend my heartfelt thoughts and prayers to Debbie 
and so appreciate her friendship, her leadership on this 
committee, and I would join in saying let's work together. As 
John Dingell would challenge us, let's work together for the 
American people. And it is great to have you back, Debbie.
    To the new members of the committee, I would like to 
recognize some of the newest Members on our side of the aisle: 
Mr. Hudson from North Carolina--he will be here shortly--Mr. 
Carter from Georgia, Mr. Gianforte from Montana, and I also 
have the privilege of having former chairmen on this side of 
the aisle, Bob Latta and Burgess as well as full committee 
chairmen on this subcommittee.
    I look forward to working with you, Madam Chair, on putting 
consumers first while ensuring that we continue to celebrate 
the innovation and all that it has meant to the American way of 
life and improving our quality of life. As Americans we have 
led the world in technology and innovation, and I look forward 
to the many issues that are before this committee and working 
to find that bipartisan ground wherever possible. Thank you.
    Ms. Schakowsky. Let's shake on that.
    Mrs. Rodgers. All right.
    Ms. Schakowsky. All right. So I yield myself 5 minutes now 
for an opening statement.

 OPENING STATEMENT OF HON. JAN SCHAKOWSKY, A REPRESENTATIVE IN 
              CONGRESS FROM THE STATE OF ILLINOIS

    And as I said earlier, our subcommittee is the Nation's 
legislative helpline, and our first hearing, ``Protecting 
Consumer Privacy in the Era of Big Data,'' couldn't be more 
timely because the phone at the end of the helpline is 
definitely ringing off the hook.
    According to a recent survey, over 80 percent of U.S. 
adults were not very confident in the security of personal 
information held by social media, retail, and travel companies, 
and 67 percent wanted the government to act to 
protect them. There is good reason for consumer suspicion. 
Modern technology has made the collection, analysis, sharing, 
and the sale of data both easy and profitable.
    Personal information is mined from Americans with little 
regard for the consequences. In the last week alone, we learned 
that Facebook exposed individuals' private health information 
that consumers thought was protected in closed groups, and that 
Facebook also collected data from third-party app developers on 
issues as personal as women's menstrual cycles and cancer 
treatments. People seeking 
solace may instead find increased insurance rates as a result 
of the disclosure of that information.
    But Facebook isn't alone. We have seen the data collection 
industry transform from a nascent industry most Americans 
haven't heard of to an economic powerhouse gobbling up every 
piece of consumer data it can both online and offline. While 
many companies claim to provide notice and choice to consumers, 
the truth is that they provide little reason for believing we 
are protected.
    Who has the time to wade through the dozens of privacy 
policies that impact them? How many people think about being 
tracked through their phones or by the overhead lights in the 
store? And often, the only choice that we have to avoid data 
collection is not to go to the store or to use the app. Reports 
of the abuse of personal information undoubtedly give Americans 
the creeps.
    But this data isn't being collected to give you the creeps. 
It is being done to control markets and make a profit. Without 
a comprehensive, Federal privacy law the burden has fallen 
completely on consumers to protect themselves and this has to 
end. Without a doubt, there are legitimate and beneficial 
reasons for companies to use personal information, but data 
collection must come with 
responsibilities. There should be limits on the collection of 
consumers' data and on the use and sharing of their personal 
information.
    My goal is to develop strong, sensible legislation that 
provides meaningful protection for consumers while promoting 
competitive markets and restoring America's faith in business 
and government. Rules alone though are not enough. We also need 
aggressive enforcement. Unfortunately, in recent years the 
Federal Trade Commission's enforcement action have done little 
to curb the worst behavior in data collection and data 
security.
    Any legislation must give Federal regulators the tools to 
take effective action to protect consumers. It is important to 
equip regulators and enforcers with the tools and funding 
necessary to protect privacy, but it is also critical to make 
sure that requests for more tools and privacy are not used as 
an excuse for inaction. We must understand why the FTC hasn't 
used its existing suite of tools to the fullest extent, such as 
its section 5 authority to ban unfair methods of competition or 
its ability to enforce violations of consent decrees.
    So I welcome our witnesses today to learn about how we 
should achieve these goals given the breadth of the issue. This 
will be the first of several hearings. Others will allow us to 
focus on specific issues of concern to the public.
    [The prepared statement of Ms. Schakowsky follows:]

               Prepared statement of Hon. Jan Schakowsky

    Good morning and thank you all for joining us today. Before 
we start the hearing, I'd like to welcome you to the first 
Consumer Protection and Commerce Subcommittee of the 116th 
Congress. Consumer protection is my passion, and what first 
drew me to public life. I like to call our subcommittee the 
Nation's legislative helpline, because we field consumer 
complaints.
    The subcommittee's jurisdiction is vast in scope, ranging 
from the safety of cars to consumer product defects to consumer 
fraud--both online and offline. In the past when Democrats 
controlled the House, this subcommittee was responsible for 
making pools and children's products safer, increasing the fuel 
efficiency of cars, and making sure agencies aggressively 
protected consumers over corporate interests.
    Under my leadership, this subcommittee will be extremely 
active and push companies and the administration to put 
consumers first.
    I look forward to working with Ranking Member McMorris 
Rodgers. I believe there are many issues on which we will be 
able to work together.
    As I said earlier, our subcommittee is the Nation's 
legislative helpline, and our first hearing, ``Protecting 
Consumer Privacy in the Era of Big Data,'' couldn't be more 
timely because the phone at the helpline is ringing off the 
hook. According to a recent SAS survey, over 80 percent of U.S. 
adults were not very confident in the security of personal 
information held by social media, retail, and travel companies 
and 67 percent wanted the government to act to protect them.
    There is good reason for consumers' suspicion. Modern 
technology has made the collection, analysis, sharing, and sale 
of data both easy and profitable. Personal information is mined 
from Americans with little regard for the consequences.
    In the last week alone, we learned that Facebook exposed 
individuals' private health information they thought was 
protected in closed groups, and collected data from third-party 
app developers on issues as personal as women's menstrual 
cycles and cancer treatments. People seeking solace may instead 
find increased insurance rates as a result of the disclosure of 
that information.
    But Facebook isn't alone. We have seen the data collection 
industry transform from a nascent industry most Americans 
haven't heard of to an economic powerhouse gobbling up every 
piece of consumer data it can--both online and offline.
    While many companies claim to provide notice and choice to 
consumers, the truth is this provides little real protection. 
Who has the time to wade through the dozens of privacy policies 
that impact them daily? How many people think about being 
tracked through their phones or by the overhead lights in a 
store? And often the only ``choice'' they have to avoid data 
collection is not to go to the store or use an app.
    Reports of the abuse of personal information undoubtedly 
give Americans the creeps. But this data isn't being collected 
to give you the creeps. It's being done to control markets and 
make a profit.
    Without a comprehensive Federal privacy law, the burden has 
fallen completely on consumers to protect themselves. This must 
end.
    Without a doubt, there are legitimate and beneficial 
reasons for companies to use personal information, but data 
collection must come with responsibilities. There should be 
limits on the collection of consumers' data and on the use and 
sharing of their personal information. My goal is to develop 
strong, sensible legislation that provides meaningful 
protections for consumers while promoting competitive markets 
and restoring Americans' faith in business and government.
    Rules alone are not enough. We also need aggressive 
enforcers. Unfortunately, in recent years, the Federal Trade 
Commission's (FTC) enforcement actions have done little to curb 
the worst behavior in data collection and data security. Any 
legislation must give Federal regulators the tools to take 
effective action to protect consumers. It is important to equip 
regulators and enforcers with the tools and funding necessary 
to protect privacy, but it is also critical to make sure that 
requests for more tools and privacy are not used as an excuse 
for inaction. We must understand why the FTC hasn't used its 
existing suite of tools to the fullest extent, such as its 
Section 5 authority to ban ``unfair methods of competition'' or 
its ability to enforce violations of consent decrees.
    I welcome our witnesses today to learn how we should 
achieve these goals. Given the breadth of this issue, this will 
be the first of several hearings. Others will allow us to focus 
on specific issues of concern to the public.
    At the same time, I want to work with my colleagues on both 
sides of the aisle on drafting privacy legislation. I have 
talked to a number of you about your priorities, and I want 
them to be reflected in what gets reported from this 
subcommittee.
    I look forward to working with each of you on this 
important issue.
    I now yield to Ranking Member Cathy McMorris Rodgers for 5 
minutes.

    Ms. Schakowsky. So I look forward to working with all of 
you on both sides of the aisle, and I now yield to Ranking 
Member Cathy McMorris Rodgers for 5 minutes.

      OPENING STATEMENT OF HON. CATHY McMORRIS RODGERS, A 
    REPRESENTATIVE IN CONGRESS FROM THE STATE OF WASHINGTON

    Mrs. Rodgers. Thank you, Madam Chair. I would like to thank 
you for organizing this first hearing of the Congress on 
privacy and security. It really builds on important work that 
was done in the past by Chairman Walden and Latta in the last 
Congress and then Chairman Upton and Burgess in the 114th 
Congress. I am hopeful that we can find a bipartisan path to 
move forward on a single American approach to privacy, one that 
is going to protect consumers and individual privacy, one that 
ensures that consumers continue to benefit from the amazing 
technology and innovation that has happened in recent years.
    This morning I would like to lay out four principles as we 
approach this effort, principles that support free markets, 
consumer choice, innovation, and small businesses, the backbone 
of our economy. We often celebrate small businesses in America.
    Principle number 1, one national standard. The Constitution 
was crafted around the concept that one national marketplace 
would make America stronger in certain areas. It also 
recognizes the importance of intellectual property rights, free 
expression, and the rights of ``We the People'' to be protected 
from the power of government.
    The internet knows no borders. It has revolutionized our 
Nation's economy by seamlessly connecting businesses and people 
across the country. Online, a small business in Spokane, 
Washington can as easily reach customers in Illinois and New 
Jersey as in Eastern Washington. Distance is no longer a 
barrier. The internet economy is interstate commerce and 
subject to Federal jurisdiction.
    There is a strong groundswell of support for a Federal 
privacy law that sets a national standard. Many recognize the 
burdens multiple State laws would create, but what would it 
mean for someone in Washington State who buys something online 
from a small business in Oregon to ship to their family in 
Idaho? This is a regulatory minefield that will force 
businesses to raise prices on their customers. Setting one 
national standard makes common sense and is the right approach 
to give people certainty.
    Principle number 2, transparency and accountability. 
Companies must also be more transparent when explaining their 
practices. For example, we learned last week that Google 
included a microphone in their Nest device but failed to 
disclose it, and Facebook is collecting very personal health 
information from apps, the Chair mentioned that. Transparency 
is critical. When unfair or deceptive practices are identified, 
there should be enforcement and there should be consequences 
strong enough to improve behavior.
    Principle number 3, improving data security. Another area 
important to this debate is data security. Perfect security 
doesn't exist online, and companies are bombarded by hackers 
every second of every day. Certain data is more valuable on the 
black market, which is why Social Security Numbers, credit card 
data, and log-in credentials are always major targets for 
criminals. One goal must be to improve people's awareness of, 
one, how their information is being collected and used and, 
two, how companies are protecting it and how people can protect 
themselves.
    Our focus should be on incentivizing innovative security 
solutions and certainty for companies who take reasonable steps 
to protect data. Otherwise, we risk prescriptive regulations 
that cannot be updated to keep up with the bad actors' newest 
tactics.
    Principle number 4, small businesses. We must not lose 
sight of small- and medium-sized businesses and how heavy-
handed laws and regulations can hurt them. Established, bigger 
companies can navigate a complex and burdensome privacy regime, 
but millions of dollars in compliance costs aren't doable for 
startups and small businesses. We have already seen this in 
Europe, where GDPR has actually helped increase the market 
share of the largest tech companies while forcing 
smaller companies offline with millions of dollars in 
compliance costs.
    These startups and small businesses could be innovating the 
next major breakthrough in self-driving technology, health 
care, customer service, and so many other areas. To keep 
America as the world's leading innovator we cannot afford to 
hold them back. Heavy-handed and overly cautious regulations 
for all data will stop innovation that makes our roads safer, 
health care more accessible, and customer service experiences 
better.
    I am glad our teams were able to work together on today's 
hearing. This is a good step forward in finding a bipartisan 
solution for these critical issues. And as we move forward, I 
am sure there are going to be more hearings in the future to 
allow more small business owners, startups, and entrepreneurs 
to join this conversation.
    I believe we have a unique opportunity here for a 
bipartisan solution that sets clear rules for the road on data 
privacy. In its best use data has made it possible for grocery 
aisles to be organized based on how people shop. But we need to 
explore data privacy and security with forward-looking 
solutions, and I look forward to hearing from the witnesses and 
being a part of this discussion today.
    Thank you very much, Madam Chair.
    [The prepared statement of Mrs. Rodgers follows:]

           Prepared statement of Hon. Cathy McMorris Rodgers

    Good morning and welcome to our first Consumer Protection 
and Commerce Subcommittee hearing. I would like to congratulate 
Chair Schakowsky.
    I would also like to recognize the newest Members of the 
Subcommittee, Mr. Hudson from North Carolina, Mr. Carter from 
Georgia, and Mr. Gianforte from Montana. I look forward to 
working with all of the Members this Congress. Our jurisdiction 
includes vast portions of the economy and I look forward to 
working with you on bipartisan solutions that improve the lives 
of all Americans. I also would like to thank the Chair for 
organizing this first hearing of the Congress on privacy and 
security. This hearing builds on the good work of Chairmen 
Walden and Latta in the last Congress, and Chairmen Upton and 
Burgess in the 114th Congress. While there have been issues 
achieving bipartisan consensus in the past, I'm encouraged that 
we can find a bipartisan path forward on a single American 
approach to privacy--one that supports free markets, consumer 
choice, innovation and small businesses--the backbone of our 
economy.
    Principle #1: One National Standard
    The Constitution was crafted around the concept that one 
national marketplace would make America stronger in certain 
areas. It also recognizes the importance of intellectual 
property rights, free expression, and the rights of ``We, the 
People'' to be protected from the power of the government. The 
Internet knows no borders. It has revolutionized our nation's 
economy by seamlessly connecting businesses and people across 
the country.
    Online, a small business in Spokane can just as easily 
reach customers in Illinois and New Jersey. Distance is no 
longer a barrier. The Internet economy is interstate commerce 
and subject to Federal jurisdiction. There is a strong 
groundswell of support for a Federal privacy law that sets a 
national standard. Many recognize the burdens a patchwork of 
State laws would create. What would it mean for someone in 
Washington State who buys something online from a small 
business in Oregon to ship to their family in Idaho? This is a 
regulatory minefield that will force businesses to raise prices 
on their customers. Setting one national standard is common 
sense and it's the right approach to give people certainty.
    Principle #2: Transparency and Accountability
    Companies must also be more transparent when explaining 
their practices. For example, we learned last week that Google 
included a microphone in their Nest device but failed to 
disclose it and Facebook is collecting very personal health 
information from apps. Transparency is critical. When unfair or 
deceptive practices are identified there should be enforcement 
and there should be consequences strong enough to improve 
behavior.
    Principle #3: Improving Data Security
    Another area important to this debate is data security. 
Perfect security doesn't exist online, and companies are 
bombarded by hackers every second of every day. Certain data is 
more valuable on the black market, which is why social security 
numbers, credit card data, and login credentials are always 
major targets for criminals. Our goal must be to improve 
people's awareness of: one, how their information is being 
collected and used; two, how companies are protecting it; and 
three, how people can protect it themselves.
    Our focus should be on incentivizing innovative security 
solutions and certainty for companies who take reasonable steps 
to protect data. Otherwise, we risk prescriptive regulations 
that cannot be updated to keep up with the bad actors' newest 
tactics.
    Principle #4: Small Businesses
    Finally, we must not lose sight of small and medium-sized 
businesses and how heavy-handed laws and regulations can hurt 
them. Established bigger companies can navigate a complex and 
burdensome privacy regime. But millions of dollars in 
compliance costs aren't doable for startups and small 
businesses. We have already seen this in Europe, where GDPR has 
actually helped increase the market shares of the largest tech 
companies while forcing smaller companies offline with millions 
of dollars in compliance costs.
    These startups and small businesses could be innovating the 
next major breakthrough in self-driving technology, health 
care, customer service, and more. To keep America as the 
world's leading innovator, we cannot afford to hold them back.
    Heavy-handed and overly cautious regulations for all data 
will stop innovation that makes our roads safer, health care 
more accessible, and customer service experiences better. I'm 
glad our teams were able to work together on today's hearing. 
This is a good step forward to finding a bipartisan solution 
for these critical issues. As we move forward, I hope we make 
sure there's enough time before the next hearings to allow 
small business owners, startups, and entrepreneurs to join the 
conversation.
    We have a unique opportunity here for a bipartisan solution 
that sets clear rules for the road on data privacy in America. 
In its best use, data has made it possible for grocery store 
aisles to be organized based on how people shop. By exchanging 
our data with email providers, we receive free email and photo 
storage. Ridesharing services analyze traffic patterns and real 
time data on accidents to get us home safer and faster. These 
are just some examples of how data in aggregate has saved us 
time and money, kept us safe, and improved our lives.
    As we continue to explore data privacy and security, we 
must find a forward-thinking solution that fosters innovation 
and protects consumers from bad data practices that have caused 
people harm or create real risks. By achieving both, America 
will maintain our robust internet economy and continue to be 
the best place in the world to innovate.
    Thank you again to all of the witnesses for being here 
today and I look forward to your testimony. I yield back.

    Ms. Schakowsky. Thank you. The gentlelady yields back and 
now the Chair recognizes Mr. Pallone, chairman of the full 
committee, for 5 minutes for his opening statement.

OPENING STATEMENT OF HON. FRANK PALLONE, Jr., A REPRESENTATIVE 
            IN CONGRESS FROM THE STATE OF NEW JERSEY

    Mr. Pallone. Thank you. I also wanted to welcome back 
Debbie Dingell. Debbie has shown tremendous strength and 
courage during the past few weeks, and you were missed, Debbie, 
and we are glad you are back today. So I just wanted to say 
that.
    Welcome to the first hearing of the Consumer Protection and 
Commerce Subcommittee. We renamed the subcommittee to emphasize 
the importance of putting consumers first, and that is the lens 
through which I view the important issue of consumer privacy. 
How do we empower consumers and impose reasonable limits on 
companies that collect and use our own personal information?
    In the past we have talked about major data breaches and 
scandals involving the misuse and unauthorized sharing of 
people's data and we have talked about the potential for 
emerging technologies to be used in unintended and potentially 
harmful ways. But privacy isn't just about major incidents or 
predictions of the future, it is an everyday issue constantly 
affecting our lives and the lives of our children.
    Almost every company that we interact with and even many we 
don't are conducting surveillance of us. When we visit a single 
website, many companies are tracking our actions on that site, 
what we click on, how long we are on each page, even our mouse 
movements and that is true for each of the dozens of sites most 
of us visit every day.
    When we go out, our location is tracked on our phones. 
Video surveillance at stores, on the street, and in doctors' 
offices records what we do and who we are with. The purchases 
we make 
are recorded by the stores through store loyalty programs and 
by the credit cards we use to make those purchases. And 
companies use that information to sort and commodify us too.
    Inferences are drawn and we are labeled as a Democrat or 
Republican, white or Latino, gay or straight, a pregnant teen, 
a grieving parent, a cancer survivor, and so many more, and this is 
all done without our knowledge. And then our personal 
information and related inferences are being shared and sold 
many times over. Companies may share our information with 
business partners and affiliates that we have never heard of. 
Our data also may be sold to data brokers who collect massive 
amounts of data about all of us and then sell that off to 
anyone who is willing to pay for it.
    The scope of it all is really mind-boggling. Without a 
doubt there are positive uses of data. Companies need personal 
information to deliver a package or charge for a service. Some 
data is used for research and development of new products and 
improving services and sometimes it is used for fraud 
prevention or cybersecurity purposes and some of it is used for 
scientific research to find new treatments for medical 
conditions.
    But in some cases data use results in discrimination, 
differential pricing, and even physical harm. Low-income 
consumers may get charged more for products online because they 
live far away from competitive retailers. Health insurance 
companies could charge higher rates based on your food 
purchases or info from your fitness trackers. A victim of 
domestic violence may even have real-time location tracking 
information sold to their attacker. And these are simply 
unacceptable uses of people's data.
    Yet for the most part, here in the U.S. no rules apply to 
how companies collect and use our information. Many companies 
draft privacy policies that provide few protections and are 
often unread. One study calculated that it would take 76 years 
to read all the privacy policies for every website the average 
consumer visits every year.
    And even if you could read and understand these privacy 
policies, often your only choice is to accept the terms or not 
use the service. In a lot of situations that is not an option. 
Consider when you need to pay for parking at a meter or use a 
website for work. You don't really have that choice. So we can 
no longer rely on a notice and consent system built on 
unrealistic and unfair foundations. As the chairwoman said, we 
need to look forward towards comprehensive privacy legislation, 
legislation that shifts the burden off consumers and puts 
reasonable responsibility on those profiting from the 
collection and use of our data.
    Consumer privacy isn't new to this committee; we have been 
talking about it for years, yet nothing has been done to 
address the problem, and this hearing is the beginning of a 
long overdue conversation. It is time that we move past the old 
model that protects the companies using the data and not the 
people. So I look forward to hearing from our witnesses today 
on how we can work together to accomplish this. I plan to work 
with my colleagues on both sides of the aisle to craft strong, 
comprehensive privacy legislation that puts consumers first.
    And I just want to thank you, Chairman Schakowsky, when you 
said that, you know, what this committee is all about is 
putting consumers first, and I think that having this hearing 
as you are today on the privacy issue is a strong indication 
that that is exactly what we intend to do. Thank you again.
    [The prepared statement of Mr. Pallone follows:]

             Prepared statement of Hon. Frank Pallone, Jr.

    Welcome to the first hearing of the Consumer Protection and 
Commerce Subcommittee. We renamed the subcommittee to emphasize 
the importance of putting consumers first. And that is the lens 
through which I view the important issue of consumer privacy--
how do we empower consumers and impose reasonable limits on 
companies that collect and use our personal information?
    In the past, we've talked about major data breaches and 
scandals involving the misuse and unauthorized sharing of 
people's data. And we've talked about the potential for 
emerging technologies to be used in unintended and potentially 
harmful ways. But privacy isn't just about major incidents or 
predictions of the future. It's an everyday issue, constantly 
affecting our lives and the lives of our children.
    Almost every company that we interact with, and even many 
we don't, are conducting surveillance of us. When we visit a 
single website, many companies are tracking our actions on that 
site--what we click on, how long we are on each page, even our 
mouse movements. And that's true for each of the dozens of 
sites most of us visit every day.
    When we go out, our location is tracked on our phones. 
Video surveillance at stores, on the street, and in doctors' 
offices records what we do and who we are with. The purchases we 
make are recorded by the stores we buy from, through store 
loyalty programs, and by the credit cards we use to make those 
purchases.
    Companies use that information to sort and commodify us, 
too. Inferences are drawn and we are labelled as gay or 
straight, Democrat or Republican, white or Latino, a pregnant 
teen, a grieving parent, a cancer survivor, and so much more. 
All without our knowledge.
    Plus, our personal information and related inferences are 
being shared and sold many times over. Companies may share our 
information with business partners and affiliates, which may be 
strangers to you. Our data also may be sold to data brokers, 
who collect massive amounts of data about all of us, and then 
sell that off to anyone willing to pay for it. The scope of it 
all is mind-boggling.
    Without a doubt, there are positive uses of data. Companies 
need personal information to deliver a package or charge for a 
service. Some data is used for research and development of new 
products and improving services. Sometimes it's used for fraud 
prevention or cybersecurity purposes. And some is used for 
scientific research to find new treatments for medical 
conditions.
    But in some cases, data use results in discrimination, 
differential pricing, and even physical harm. Low-income 
consumers may get charged more for products online because they 
live far away from competitive retailers. Health insurance 
companies could charge higher rates based on your food 
purchases or information from your fitness tracker. A victim of 
domestic violence may even have real-time location tracking 
information sold to their attacker.
    Yet, for the most part, in the U.S., no rules apply to how 
companies collect and use our information. Many companies draft 
privacy policies that provide few protections and are often 
unread. One study calculated that it would take 76 years to 
read all of the privacy policies for every website the average 
consumer visits each year. And even if you could read and 
understand each privacy policy, often your only choice is to 
accept the terms or not use the service. And when you need to 
pay for parking at a meter or use a website for work, you don't 
really have that choice at all. We can no longer rely on a 
``notice and consent'' system built on such unrealistic and 
unfair foundations.
    As Chair Schakowsky said, we need to look toward 
comprehensive privacy legislation--legislation that shifts the 
burdens off consumers and puts reasonable responsibility on 
those profiting from the collection and use of our data.
    As I said, consumer privacy isn't new to this committee. 
We've been talking about it for years. And yet, nothing has 
been done to address the problems. But times have changed. We 
are not going to fail consumers anymore.
    This hearing is the beginning of that conversation. We need to 
move past the old model that protects the companies using the 
data, not the people. I look forward to hearing from our 
witnesses today on how we can do this. And I plan to work with 
my colleagues on both sides of the aisle to craft strong, 
comprehensive privacy legislation that puts consumers first.

    Ms. Schakowsky. I thank the gentleman. The gentleman yields 
back and now the Chair recognizes Mr. Walden, ranking member of 
the full committee, for 5 minutes for his opening statement.

  OPENING STATEMENT OF HON. GREG WALDEN, A REPRESENTATIVE IN 
               CONGRESS FROM THE STATE OF OREGON

    Mr. Walden. Well, good morning and welcome to our Members 
and our witnesses and congratulations to both Representative 
Rodgers as the new lead Republican and to Representative Jan 
Schakowsky as the new chair of the Consumer Protection and 
Commerce Subcommittee. I know we are off to a good start this 
morning.
    We have a lot of important issues to work on in this 
subcommittee and I am hopeful we can continue the bipartisan 
achievements out of this subcommittee from Chair Schakowsky and 
Representative Latta's SELF DRIVE Act to legislation focused on 
the Internet of Things and the oversight of the FTC, CPSC, and 
NHTSA. I hope we can continue working together for the benefit 
of the American consumer.
    I would also like to thank Chairs Pallone and Schakowsky 
for picking up on the privacy and security issues as the topic 
of the first hearing for this subcommittee. From the disrupter 
series of hearings that we held in the last Congress to the 
first congressional hearings with major tech companies' CEOs, 
this committee has been on the forefront of getting answers for 
our constituents.
    The debate over privacy, it is not new. From the first 
Kodak camera to caller ID, major privacy debates ensued when 
new innovations were introduced. But there are new challenges 
when it comes to privacy, and we have heard some of that today 
from our Members. Privacy means different things to different 
people, which makes this debate even more challenging in the 
age of Instagram and YouTube.
    I believe it is important that we work together toward a 
bipartisan Federal privacy bill that, one, improves 
transparency, accountability, and security for consumers; that, 
two, protects innovation and small businesses; and, three, sets 
one national standard. Now the first issue, which some like to 
frame as incredibly divisive, falls under the most basic 
principle underpinning our jurisdiction, and that is the term 
``interstate commerce.''
    A Federal privacy bill needs to be just that, one that sets 
the national standard for commercial collection use and sharing 
of personal data in the best interest of consumers. The Supreme 
Court has recently reaffirmed the principles of the commerce 
clause. State laws cannot discriminate against interstate 
commerce. They cannot impose undue burdens on interstate 
commerce and should take into consideration the small 
businesses startups and others who engage in commerce across 
State lines.
    There are many policy areas where it makes sense for States 
to innovate. However, the internet does not stop at a State 
line and neither should innovative privacy and security 
solutions. Your privacy and security should not change 
depending on where you live in the United States. One State 
should not set the standards for the rest of the country.
    We can improve the security and privacy of consumers' data 
without adding to the confusion or harming small businesses and 
entrepreneurs, so Congress should thoughtfully consider what 
various States are proposing so we can deliver that certainty 
and do so with a national standard. We can learn from 
California and we can learn from Washington and a growing 
number of other States who have drafted their own legislation 
reinforcing why we should begin with an agreement that a 
Federal privacy bill sets one national standard.
    Now a truly American approach to privacy and security can 
give consumers better control by supporting innovative 
solutions without massively expanding the regulatory State. We 
should avoid creating a system that floods people's inboxes 
with privacy policies that frankly they do not read, or click 
through notices that even make simple tasks very frustrating. 
We can and should, however, learn from previous efforts here at 
home and abroad.
    So transparency and accountability are critical to move 
forward and measurably improve consumers' ability to choose 
between services they want to use. People need to receive a 
clearer understanding of exactly how their data are used by the 
digital services with whom they interact. The FTC has announced 
their investigations into both Equifax and Facebook. The outcome 
of their work will help Congress evaluate the effectiveness of 
laws currently on the books and the enforcement tools utilized 
to hold companies accountable. We can write bill after bill and 
the FTC can publish rule after rule, but if we do not have 
effective enforcement, they are just rules on paper.
    So I believe we have a unique opportunity to address some 
of the most complex privacy and security questions of the day 
and I look forward to working with my colleagues across the 
aisle on setting a national framework and getting this debate 
moving forward toward a bipartisan national solution. With 
that, Madam Chair, I yield back.
    [The prepared statement of Mr. Walden follows:]

                 Prepared statement of Hon. Greg Walden

    Good morning. Welcome to our Members and witnesses.
    Congratulations to both Representative Rodgers as the new 
lead Republican, and to Representative Schakowsky as the new 
chair for the Consumer Protection and Commerce Subcommittee.
    We have a lot of important issues to work on in this 
subcommittee, and I am hopeful we can continue the bipartisan 
achievements out of this subcommittee. From Chair Schakowsky 
and Rep. Latta's SELF DRIVE Act, to legislation focused on the 
Internet of Things, and oversight of the FTC, CPSC, and 
NHTSA, I hope we can continue working together for the benefit 
of the American consumer.
    I would like to thank Chairs Pallone and Schakowsky for 
picking up the privacy and security issues as the topic of the 
first hearing for the subcommittee. From the Disrupter Series 
of hearings, to the first congressional hearings with major 
tech company CEOs, this committee has been on the forefront of 
getting answers for our constituents.
    The debate over privacy is not new. From the first Kodak 
camera to caller-ID, major privacy debates ensued when those 
innovations were introduced. But there are new challenges when it comes to 
privacy. Privacy means different things to different people, 
which makes this debate even more challenging in the age of 
Instagram and YouTube stars comfortably sharing their most 
private moments in real time.
    I believe it is important that we work together toward a 
bipartisan Federal privacy bill that: improves transparency, 
accountability, and security for consumers; protects innovation 
and small businesses; and sets one national standard.
    The first issue, which some like to frame as incredibly 
divisive, falls under the most basic principle underpinning our 
jurisdiction: interstate commerce. A Federal privacy bill needs 
to be just that: one that sets the national standard for 
commercial collection, use, and sharing of personal data in the 
best interest of consumers.
    The Supreme Court has recently reaffirmed the basic 
principles of the Commerce Clause: State laws cannot 
discriminate against interstate commerce, they cannot impose 
undue burdens on interstate commerce, and should take into 
consideration the small businesses, startups, and others who 
engage in commerce across State lines.
    There are many policy areas where it makes sense for States 
to innovate; however, the internet does not stop at State lines 
and neither should innovative privacy and security solutions. 
Your privacy and security should not change depending on where 
you are in the United States. One State should not set the 
standards for the rest of the country. We can improve the 
security and privacy of consumers' data without adding to the 
confusion or harming small businesses and entrepreneurs--so 
Congress should thoughtfully consider what various States are 
proposing so we deliver that certainty with a national 
standard.
    We can learn from California, Washington, and a growing 
number of other States who have drafted their own legislation--
reinforcing why we should begin with an agreement that a 
Federal privacy bill sets one national standard.
    A truly American approach to privacy and security can give 
consumers better control by supporting innovative solutions 
without massively expanding the regulatory state. We should 
avoid creating a system that floods people's inboxes with 
privacy policies they do not read or click-through notices that 
make even simple tasks frustrating. We can, and should, learn 
from previous efforts here at home and abroad.
    Transparency and accountability are critical to move 
forward and measurably improve consumers' ability to choose 
between services they want to use. People need to receive a 
clearer understanding of exactly how their data are used by the 
digital services with whom they interact.
    The FTC has announced their investigations into both 
Equifax and Facebook. The outcome of their work will help 
Congress evaluate the effectiveness of laws currently on the 
books, and the enforcement tools utilized to hold companies 
accountable. We can write bill after bill, and the FTC could 
publish rule after rule, but if we do not have effective 
enforcement, they are just words on paper.
    I believe we have a unique opportunity to address some of 
the most complex privacy and security questions of our day.
    I look forward to working with my colleagues across the 
aisle on setting the framework for this debate and moving 
forward towards a bipartisan national solution.
    Thank you and I yield back.

    Ms. Schakowsky. Thank you. The gentleman yields back. And 
the Chair would like to remind Members that pursuant to 
committee rules, all Members' written opening statements shall 
be made part of the record.
    And now I would like to introduce our witnesses for today's 
hearing and thank you all for coming. We have Ms. Brandi 
Collins-Dexter, senior campaign director, Media, Democracy and 
Economic Justice, at Color of Change; Dr. Roslyn Layton, 
visiting scholar at the American Enterprise Institute; Ms. 
Denise Zheng--is that correct, ``Zhong''? OK--vice president, 
technology and innovation, Business Roundtable; Dr. Dave 
Grimaldi, executive vice president for public policy, IAB; and 
Dr. Nuala O'Connor, president and CEO at the Center for 
Democracy & Technology.
    And let's begin then with Ms. Collins-Dexter.

STATEMENTS OF BRANDI COLLINS-DEXTER, SENIOR CAMPAIGN DIRECTOR, 
   COLOR OF CHANGE; ROSLYN LAYTON, PH.D., VISITING SCHOLAR, 
AMERICAN ENTERPRISE INSTITUTE; DENISE E. ZHENG, VICE PRESIDENT, 
   TECHNOLOGY AND INNOVATION, BUSINESS ROUNDTABLE; DAVID F. 
    GRIMALDI, Jr., EXECUTIVE VICE PRESIDENT, PUBLIC POLICY, 
 INTERACTIVE ADVERTISING BUREAU; AND NUALA O'CONNOR, PRESIDENT 
 AND CHIEF EXECUTIVE OFFICER, CENTER FOR DEMOCRACY & TECHNOLOGY

               STATEMENT OF BRANDI COLLINS-DEXTER

    Ms. Collins-Dexter. Good morning Madam Chair, Ranking 
Member Rodgers, Committee Chairman Pallone, Committee Ranking 
Member Walden, and members of the subcommittee. My name is 
Brandi Collins-Dexter, and I am a senior campaign director at 
Color of Change, the largest online civil rights organization 
in the United States with more than 1.5 million members who use 
technology to fight for change.
    In the wild, wild West of the digital economy, 
discriminatory marketing practices are so lucrative that entire 
industries have sprung up to discriminate for dollars. One 
company called Ethnic Technologies--subtle, I know--developed 
software that predicts an individual's ethnic origin based on 
data points easily purchased from ISPs and then sells that 
data, which has been turned into a predictive algorithm, to any 
company that wants to target groups or services to a particular 
ethnic group. Part of what we are seeing now is bad online 
behavior that circumvents civil rights laws.
    Google and Facebook have both had numerous complaints filed 
against them for allowing discriminatory housing and employment 
ads. State commission reports found that voter suppression ads 
were specifically targeted towards black Americans on social 
media during the 2016 Presidential election and that social 
media companies made misleading or evasive claims about those 
efforts.
    Additionally, low-income communities are targeted by 
predatory payday loan companies that make billions of dollars 
in interest and fees on the back of struggling families. We 
have seen online price gouging and digital redlining where 
corporations like Staples have used geotracking and personal 
data to charge customers higher prices for products based on 
their geography. Some data brokers even lump consumers into 
categories like, quote unquote, ``getting by'' and ``compulsive 
online gamblers.'' One company has even used a category called ``Speedy 
Dinero,'' described as, quote, ``Hispanic communities in need 
of fast cash receptive to subprime credit offers.''
    Last week, as was mentioned, Facebook was caught obtaining 
sensitive personal information submitted to entirely separate 
mobile apps using software that immediately shares data with 
social networks for ad targeting. I mean, literally, my iPad 
knows more about me than my husband and he is an ex-journalist 
who is very nosy. Even information that feels innocuous can 
become a proxy for a protected class. And as for sensitive 
information: right now, corporations are able to easily combine 
information about you that they have purchased and create a 
profile of your vulnerabilities.
    Earlier this month, Color of Change joined with advocacy 
groups to urge Congress to put civil and human rights at the 
center of the privacy fight. Our letter states in part, ``Civil 
rights protections have existed in brick and mortar commerce 
for decades. Platforms and other online services should not be 
permitted to use consumer data to discriminate against 
protected classes or deny them opportunities in commerce, 
housing, and employment, or full participation in our 
democracy.''
    There are many bills out there. Some we think are weak, and 
some, like language we have seen from Senator Cortez Masto, show 
a great deal of promise. But ultimately we would like to see
bipartisan legislation written through an antidiscrimination 
lens that prevents manipulative or exclusionary marketing 
practices that exacerbate poverty. It should offer a baseline 
that does not preempt innovative State policy and it must 
contain enforcement mechanisms and not rely on self-regulation.
    Some say privacy is the currency you pay to engage in our 
digital ecosystem. We should not have to make that choice. Our 
communities need to trust that when we go online we can count 
on our privacy and the safety of our information for ourselves 
and our children. This shouldn't be a game of political 
football. Eighty percent of Americans support making it illegal 
for companies to sell or share their personal information. At 
least 80 percent of us believe that we should have control over 
how companies use our information.
    Privacy, in its most aspirational sense, is not merely 
about the freedom and ability to close your digital curtain, so 
to speak. Instead, we should consider privacy and
digital rights for all a necessary framework crucial for 
ensuring that our human, civil, and constitutional rights are 
not confined to our offline lives, but are also protected 
online where so much of our daily life occurs. I would even say 
that if we fail in the mission to ensure our rights online are 
protected, we stand to render many of our offline rights 
meaningless.
    Thank you again for having me here today, and I look 
forward to your thoughts.
    [The prepared statement of Ms. Collins-Dexter follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Schakowsky. Thank you. I meant to mention that each of 
you has 5 minutes, and I appreciate you, Ms. Collins-Dexter, 
for sticking to that. The lights that will go on initially will 
be green, and then the light will turn yellow when you have 1 
minute remaining, and then red means you need to stop.
    And so, Dr. Layton, you are recognized for 5 minutes.

                   STATEMENT OF ROSLYN LAYTON

    Dr. Layton. Good morning. Thank you, Chair Schakowsky, Ms. 
McMorris Rodgers, and members of the committee. It is an honor 
to be here, and I am heartened by your bipartisanship.
    Today I represent only myself and my research. I have lived 
in the European Union for the last decade, and I work at a 
European university where I make international internet policy 
comparisons. As the mother of three Danish-American children, I 
am legitimately interested in policy that makes Europe a better 
place.
    The academic literature shows that online trust is a 
function of institutions, business practices, technologies, and 
users' knowledge. But unfortunately the EU rejected this 
formula for its data protection policy. My hope is that 
Congress will avoid the mistakes of the GDPR and ultimately 
leapfrog Europe with a better framework based upon privacy-
enhancing technologies, a strong Federal standard, and consumer 
education.
    To analyze a policy like the GDPR we must evaluate its 
real-world effects. Since its implementation, Google, Facebook, 
and Amazon have increased their market share in the EU. This is 
a perverse outcome for a policy promised to level the playing 
field. Today, only 20 percent of EU companies are online. There 
is little to no data that shows that small and medium-sized 
enterprises are gaining as a result of the GDPR.
    The data shows a consistent lag in the small to medium-
sized business segment, particularly in their ability to 
modernize their websites and market outside their own EU 
country. Now this
outcome isn't necessarily surprising. As the Nobel Prize-
winning economist George Stigler observed 40 years ago, 
regulation is acquired by industry and operated for its 
benefit. A number of 
large companies have come out in support of the GDPR. That 
doesn't surprise me either, because it cements their 
market position. They don't need permissionless innovation 
anymore, but they don't have a problem depriving startups of 
the same freedom.
    Now to comply with the GDPR today, an average firm of 500 
employees will spend about $3 million. And thousands of U.S. 
firms have decided that this is not worthwhile, including the 
Chicago Tribune, which is no longer visible in the European 
Union. There are over 1,000 American news media that no longer 
reach Europeans. This is also concerning because the EU is the 
destination of two-thirds of America's digital goods and 
services.
    Now the GDPR might be justified if it created greater trust 
in the digital ecosystem, but there is no such evidence. After 
a decade of these kinds of data protection regulations in the 
EU, in which users endure intrusive pop-ups and disclosures on 
every digital site they visit, Europeans report no greater 
sense of trust online. More than half of the survey respondents 
in the UK alone say that they feel no better since the GDPR 
took effect and it has not helped them to understand how their 
data is used.
    I am skeptical of both the GDPR and the CCPA in California 
with their laundry list of requirements, 45 in Europe and 77 in 
California. These are not scientifically tested and there is no 
rational policy process to vet their efficacy. Now imagine what 
would happen if we held government to the same standards. 
Australia tried a ``when in doubt, opt out'' policy, and half a 
million people left the national healthcare record program. It 
crashed their healthcare system.
    We have another reason to be skeptical of the claims of the 
EU being morally superior with their GDPR. Their networks are 
not secure because they are built with equipment by dubious 
Chinese equipment makers. Your data protection standard means 
little if the Chinese Government can hack your data through 
back doors.
    In any event, Europe's attempt to create a common market 
for data reflects something that was actually part of the 
founding of our country: our national standard of interstate 
commerce, which has been discussed. I support such a 
national standard for sensitive data, consistently applied 
across enterprises. To leapfrog the Europeans on data 
protection we need to review the empirical research that the 
Europeans ignored, namely how privacy-enhancing technologies 
and user knowledge will promote online trust.
    The answer is not to copy the EU, but to build world-class, 
scientifically superior, privacy-enhancing technologies here in 
the United States. Congress should incentivize the development 
of such technologies through grants and competitions and 
provide safe harbors for their research, development, and 
practice. There is no consumer protection without consumer 
education and we should support people to acquire their digital 
competence so they make informed decisions about the products 
they use.
    In closing, please do not fall prey to the European 
regulatory fallacy, which substitutes the bureaucratization of 
data for a natural right of privacy. Increasing the 
number of agencies and bureaucrats who govern our data does not 
increase our privacy. It reduces our freedom, makes enterprise 
more expensive, and deters innovation. Thank you for your 
leadership. I welcome your questions.
    [The prepared statement of Dr. Layton follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Schakowsky. Thank you.
    Ms. Zheng, you are recognized for 5 minutes.

                  STATEMENT OF DENISE E. ZHENG

    Ms. Zheng. Thank you, Chairwoman Schakowsky, Ranking Member 
McMorris Rodgers, members of the subcommittee, thank you for 
the opportunity to testify on behalf of the Business 
Roundtable.
    Business Roundtable represents more than 200 CEOs of the 
largest American companies that operate in nearly every corner 
of the economy including technology, telecommunications, 
retail, banking, health, manufacturing, automotive, and many 
other industries. Our companies touch virtually every American 
consumer. They process $16 trillion in global consumer payments 
each year and service roughly 40 million utilities customers 
across the country.
    They fly more than 250 million passengers to their 
destinations each year and provide wireless communications and 
internet services to more than 160 million consumers. They 
sponsor nearly 70 million medical insurance memberships and 
deliver more than 42 million packages every single day. Data 
privacy is a major priority for the Business Roundtable, 
especially for companies that rely on data and digital platforms 
to deliver products and services to consumers and to conduct 
day-to-day business operations.
    That is why CEOs from across industry sectors have come 
together to call for a Federal privacy law that provides 
consistent consumer privacy protections, promotes 
accountability, and fosters innovation and competitiveness. We 
strongly support giving consumers control over how their 
personally identifiable information is collected, used, and 
shared.
    At the same time, it is important to remember the value of 
data in our economy as well as the enormous benefits that data-
driven services provide to our consumers. Data enables 
companies to deliver more relevant and valuable user experiences 
to consumers. It allows companies to detect and prevent fraud 
on user accounts and to combat cybersecurity attacks. It 
creates greater productivity and cost savings, from 
manufacturing to transportation and logistics, and it leads to 
breakthroughs in health and medical research.
    Innovation thrives in stable policy environments where new 
ideas can be explored and flourish within a well-understood 
legal and regulatory framework. So in December, Business 
Roundtable released a proposal for privacy legislation. Our 
proposal is the product of extensive deliberation with the 
chief privacy officers of our companies and approval from CEOs 
across industry sectors.
    We believe that privacy legislation must prioritize four 
important objectives. First and foremost, it should champion 
consumer privacy and promote accountability. Legislation should 
include strong protections for personal data that enhance 
consumer trust and demonstrate U.S. leadership as a champion 
for privacy.
    Second is fostering innovation and competitiveness 
especially in a dynamic and evolving technology landscape. 
Legislation should be technology-neutral and allow 
organizations to adopt privacy protections that are appropriate 
to the specific risks such as the sensitivity of the data.
    Third, it should harmonize privacy protections. Congress 
should enact a comprehensive, national law that ensures 
consistent protections and avoids a State-by-State approach 
that leads to disjointed consumer protections, degraded user 
experience, and barriers to investment and innovation.
    And fourth, legislation should promote consumer privacy 
regimes that are interoperable on a global basis and it should 
seek to bridge differences between the U.S. and foreign privacy 
regimes.
    At the heart of the Business Roundtable proposal is a set 
of core individual rights that we believe consumers should have 
over their data, including transparency. Consumers deserve to 
have clear and concise understanding of the personal data that 
a company collects, the purposes for which that data is used, 
and whether and for what purposes personal data is disclosed to 
third parties.
    Control: consumers should have meaningful control over 
their data based upon the sensitivity of the information, 
including the ability to control whether that data is sold to 
third parties. Consumers should also have the right to access 
and correct inaccuracies in the personal data about them, and 
they should have the right to delete personal data.
    A Federal privacy law should be comprehensive and apply a 
consistent, uniform framework to the collection, use, and 
sharing of data across industry sectors. It should also 
recognize that there are situations that do justify exceptions 
such as cases of public health and safety, or to prevent fraud 
and provide cybersecurity, or when certain data is necessary to 
deliver a product or a service that the consumer requested, or 
to ensure First Amendment rights and to protect the rights of 
other individuals.
    Establishing and protecting these consumer rights also 
requires effective, consistent, and coordinated enforcement to 
provide accountability and protect consumer rights. Absent 
action from Congress, we will be subject not only to a growing, 
confusing set of State government requirements, but also to 
different data protection laws from governments in Europe, 
countries like Brazil, and elsewhere. Make no mistake, 
consumers deserve meaningful, understandable, and consistent 
privacy rights regardless of where they live and where their 
data may be located.
    I thank the subcommittee for its leadership in holding this 
hearing and for encouraging a dialogue and I look forward to 
the questions. Thank you.
    [The prepared statement of Ms. Zheng follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Schakowsky. Thank you.
    Mr. Grimaldi, you are now recognized for 5 minutes.

              STATEMENT OF DAVID F. GRIMALDI, Jr.

    Mr. Grimaldi. Thank you, Chairman Schakowsky, Ranking 
Member McMorris Rodgers, and members of the committee. I 
appreciate the opportunity to testify here today. I am Dave 
Grimaldi, executive vice president for Public Policy at the 
Interactive Advertising Bureau which was founded in 1996 and 
headquartered in New York City. We represent over 650 leading 
media and technology companies that are responsible for 
selling, delivering, and optimizing digital advertising or 
marketing campaigns.
    Today the U.S. economy is increasingly fueled by the free 
flow of data. One driving force in this ecosystem is data-
driven advertising. Advertising has helped power the growth of 
the internet for decades by delivering innovative tools and 
services for consumers and businesses to connect and 
communicate. Data-driven advertising also allows consumers to 
access these resources at little to no cost to them and it has 
created an environment where small publishers and start-up 
companies can enter the marketplace to compete against the 
internet's largest players.
    As a result of this advertising-based model, U.S. 
businesses of all sizes have been able to grow online and 
deliver widespread consumer and economic benefits. According to 
a 2017 study, in 2016 the U.S. ad-supported internet created 
10.4 million jobs and added $1.1 trillion to the U.S. economy.
    The study, designed to provide a comprehensive review of 
the entire internet economy and answer questions about its 
size, what comprises it, and the economic and social benefits 
Americans derive from it, revealed key findings that analyze 
the economic importance as well as the social benefits of the 
internet. And, indeed, as the Federal Trade Commission noted in 
its recent comments to the National Telecommunications and 
Information Administration, if a subscription-based model 
replaced the ad-based model, many consumers would not be able 
to afford access to or would be reluctant to utilize all of the 
information, products, and services they rely on today and that 
could become available in the future.
    The time is right for the creation of a new paradigm for 
data privacy in the United States. And IAB, working with 
Congress and based on our members' successful experience 
creating privacy programs that consumers understand and use, 
can achieve a new Federal approach that, instead of bombarding 
consumers with notices and choices, comprehensively describes 
clear, workable, and consistent standards that consumers, 
businesses, and law enforcers can rely upon. Without a 
consistent Federal privacy standard, a patchwork of State 
privacy laws will create consumer confusion, present 
substantial challenges for businesses trying to comply with 
these laws, and fail to meet consumers' expectations about 
their digital privacy.
    We ask Congress to standardize privacy protections across 
the country by passing legislation that provides important 
protections for consumers while allowing digital innovation to 
continue to flourish. We caution Congress not to rely on the 
framework set forth in Europe's General Data Protection Regulation 
or California's Consumer Privacy Act as examples of the ways in 
which a national privacy standard should function.
    Far from being a desirable model, the GDPR shows how overly 
restrictive frameworks can be harmful to competition and 
consumers alike. Less than a year into GDPR's applicability the 
negative effects of its approach have already become clear. The 
GDPR has led directly to consumers losing access to online 
resources with more than 1,000 U.S.-based publishers blocking 
European consumers from access to online material, in part 
because of the inability to profitably run advertising.
    To that unfortunate end, as was pointed out before, I would 
note that the Chicago Tribune, including its Pulitzer Prize-
winning stories on government corruption, faulty government 
regulation, et cetera, is no longer accessible in Europe due to 
GDPR. Additionally, the San Fernando Sun newspaper, which has 
been open since 1904, is no longer accessible, and The Holland 
Sentinel, founded in 1896, can no longer be seen in Europe.
    Small businesses and startups also saw the negative impact 
of GDPR with many choosing to exit the market. Consent banners 
and pop-up notices have been notably ineffective at curbing 
irresponsible data practices or truly furthering consumer 
awareness and choice. The CCPA follows in the footsteps of GDPR 
and could harm consumers by impeding their access to expected 
tools, content, and services, and revealing their personal 
information to unintended recipients due to lack of clarity in 
the law.
    To achieve these goals, IAB asks Congress to support a new 
paradigm that would follow certain basic principles. First, in 
contrast to many existing privacy regimes, a new law should 
impose clear prohibitions on a range of harmful and 
unreasonable data collection and use practices specifically 
identified in the law. Consumers will then be protected from 
such practices without the need for any action on their part.
    Second, a new law should distinguish between data practices 
that pose a threat to consumers and those that do not, rather 
than taking a broad-brush approach to all data collection and 
use. And finally, the law should incentivize strong and 
enforceable compliance and self-regulatory programs and thus 
increase compliance by creating a rigorous safe harbor process.
    IAB asks for Congress' support in developing such a 
framework. We look forward to partnering with you to enhance 
consumer privacy and thank you for your time today and I 
welcome your questions.
    [The prepared statement of Mr. Grimaldi follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Schakowsky. Thank you.
    And, Ms. O'Connor, you are recognized for 5 minutes.

                  STATEMENT OF NUALA O'CONNOR

    Ms. O'Connor. Chairwoman Schakowsky, Ranking Member 
McMorris Rodgers, members of the subcommittee, thank you for 
the opportunity to testify today. My colleagues and I at the 
Center for Democracy & Technology are tremendously excited 
about the prospect of Federal privacy legislation. We 
appreciate your leadership in taking on this challenging issue.
    Privacy and data over the last several decades have become 
full of jargon and overly complex, so I have one basic 
message today and that is notice and choice are no longer a 
choice. Any privacy legislation that merely cements the current 
status quo of the notice and consent model for personal data is 
a missed opportunity.
    Let me take a moment to demonstrate why that status quo is 
not working for individual consumers and companies. If I could 
respectfully request the Members and their staff to take out 
their phones--some of you already have them out, I hear them 
ringing--and take a look at the home page. Open it up with 
whatever you use to open up your phone. Mine is my fingerprint 
and it is not working. Now look at your home page. How many 
apps do you have? I have 262 apps on my phone. I had 261 until 
Saturday night when the kids said, ``Mom, we want Chipotle for 
dinner,'' and I had to download again the Postmates app, so now 
it is 262. The average person has around 80, according to 
current research. You can call me an overachiever or just a 
working mom.
    But for each of these 80 or so applications you have 
already given the company behind it your consent to use your 
personal data and likely in a variety of ways. For some of 
those apps you are sharing your location data, others your 
financial data, your credit card numbers, some of your apps 
have information about your physical activity, your health, and 
other intimate information even in real time.
    Regardless of the types of data, you have received 80 
notices and 80 different consents have already been given. Do 
you remember the personal data you agreed to consent to give 
and do you remember the purposes for which you shared it? Do 
you have a good understanding of how the companies behind those 
apps and devices are going to use that information 6 weeks from 
now, 6 months or 6 years from now?
    Now let's assume for the sake of this demonstration that 
each of those 80 companies has even just a modest number of 
information-sharing agreements with third parties. Back in 
2015, which is the ancient times of the internet, the average 
smart phone app was already automatically sharing data with at 
least three different third-party companies. You don't 
know those companies, you don't have a direct relationship with 
them, and now they have your personal information because you 
were given notice and you consented. And that means the average 
smart phone user has given consent for their data to be used by 
at least 240 different entities.
    That doesn't reflect how information is already being 
shared by the companies with vendors, corporate affiliates, 
business partners--in reality, the number is likely much higher 
and that is just what is on your phone. That 240 number doesn't 
account for your other devices, the devices in your daily life 
in your house, in your car, your other online accounts, data 
initially collected in the non-digital world, loyalty programs, 
cameras, paper surveys, and public records. Does that feel like 
you have control over your personal information? But you gave 
your consent at some point.
    Clearly, it is time for a change. Some will say that the 
way to fix this problem is just make more privacy policies, 
more notices, make them clearer so consumers can better 
understand those decisions. More checkboxes will provide the 
appearance of choice, but not real options for consumers. 
Pursuing legislation like this just doubles down on our current 
system of notice and choice and further burdens already busy 
consumers.
    There is fundamentally no meaningful way for people to make 
informed, timely decisions about the many different data 
collectors and processors with whom we interact every day. 
Instead, the goal should be to define our digital civil rights. 
What reasonable behavior can we expect from companies that hold 
our data? What rights do we have that are so precious they 
cannot be signed away?
     The Center for Democracy & Technology has drafted 
comprehensive legislation that is already available and has 
been shared with your staffs. I am happy to answer questions 
about it today. But most importantly, our bill and any 
meaningful privacy legislation must first prohibit unfair data 
practices, particularly the repurposing or secondary use of 
sensitive data with carefully scoped exceptions.
    Two, prevent data-driven discrimination and civil rights 
abuses. Three, provide robust and rigorous enforcement. 
Reasonable data security practices and individual control 
rights, such as the right to access, correct, and delete your 
data are obviously essential. Enacting clear comprehensive 
rules will facilitate trust and cement America's economic and 
ethical leadership on technology.
    Now is the time for real change. You have the opportunity 
to shape a new paradigm for data use and you have the support 
of the majority of Americans to do so. Thank you.
    [The prepared statement of Ms. O'Connor follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Schakowsky. Thank you.
    So we have now concluded our opening statements and we now 
will move to Member questions. Each Member will have 5 minutes 
to ask questions of our witnesses and I will start by 
recognizing myself for 5 minutes.
    So this is a stack of, really, just some of the privacy 
policies of the websites, apps, stores, and other services I 
interacted with just yesterday and actually not all of 
yesterday. I haven't read them all. And I check the weather on 
my phone so I have a privacy policy for that app. I flew into 
town yesterday. I have the privacy policy for the airline and 
for the online travel.
    In order to get onto the plane I had to go to my phone. I used 
the app to book the flight. I went to the drugstore and used my 
loyalty card so I have that privacy policy. I checked the news 
online so I have a few privacy policies of a few of the 
newspaper sites that I visited. I watched TV. I went online. I 
used my cell phone. I have a privacy policy for my cable 
provider, my internet service provider, my cell phone 
manufacturer and the operating system, and that is still just 
some of them.
    And at no point did I have the option to proceed without 
agreeing to the terms. And frankly I think, like most 
consumers, because I 
am anxious to actually get the job done, I agree. I agree. So 
this stack does not include each of their service providers or 
affiliates or the data broker that gets my information from 
them or a third-party advertiser, advertising company, or 
analytics company, or whoever else is lurking, unseen and 
unheard and unknown to me.
    By the way, a lot of these policies are pretty vague about 
what they do with my data and who they share it with or sell it 
to. This is the limitation of the notice and consent system 
that we use right now. A person should not need to have an 
advanced law degree to avoid being taken advantage of. We need 
to find solutions that take the burden off the consumer and put 
some responsibilities on those who want our data.
    So, Ms. Collins-Dexter, can you talk a little bit about 
some of the ways that our data is being used by consumers and 
then, Ms. O'Connor, if you could follow up.
    Ms. Collins-Dexter. Some of the ways in which our data is 
being used by consumers?
    Ms. Schakowsky. We are talking about--oh no, being--I am 
sorry--how it is being used by companies. I am sorry.
    Ms. Collins-Dexter. Yes, it is being used in all sorts of 
ways. And I think to your point earlier, I think even 
if we know our data is being used in a number of ways, even if 
we--black folks, I think a report was released last week that 
said black people are actually more likely to read the fine 
print before they sign onto things on the internet and have 
long believed that their information and data was being sold, 
and yet that hasn't made us particularly safer. We have still 
had to experience all sorts of ways in which our data is being 
used against us.
    Even data points that feel innocuous can be used as sort of 
proxies for protected class. I offered some examples in the 
document that I shared with you. But another example comes from 
the insurance industry in the realm of car insurance, for 
example. Auto insurance telematics devices collect what would 
be considered, quote unquote, non-sensitive data such as 
vehicle speed, the time of day someone is driving, the miles 
driven, the rates of acceleration and braking.
    Those devices aren't collecting what we would consider 
sensitive data such as location and driver's identity, and yet 
that information is being used to charge people higher 
rates for insurance. And it happens that the people most likely 
to be driving at night, most likely to be braking, all of these 
things, are usually working-class, lower-income people.
    Ms. Schakowsky. If I could interrupt, and we will get more 
of that. But I want to see if Ms. O'Connor wants to add at 
least one thing to this.
    Ms. Collins-Dexter. Sure.
    Ms. O'Connor. Thank you so much.
    There is a primary purpose for data. When you give your 
data to a company to deliver that yellow sweater they need to 
know your name and address. That makes sense. There are 
secondary purposes in terms of business processing and 
activities that might be legitimate. But in our draft 
legislation we feel that secondary use of sensitive data should 
be restricted. Take, for example, the fingerprint I was using 
to open my phone: I want to be able to open my phone with it, 
but I don't want that sensitive biometric data used for a 
secondary purpose by that company or by other companies.
    So we would say there is a higher level of sensitivity 
around biometric data. Intimate or immutable information about 
you deserves a higher level of care. And also there 
is sharing: obviously, your data going from a first 
party to an entirely separate third party in the transaction 
would lead to concern, and those parties should be bound by 
the promises that first party made.
    Ms. Schakowsky. Thank you. And now let me recognize our 
ranking member, Cathy McMorris Rodgers.
    Mrs. Rodgers. Thank you, Madam Chair. I appreciate again 
everyone being here, and I do believe that there is bipartisan 
support to move forward so that we can ensure strong protection 
of personal data that will ensure that we are improving upon 
consumer trust and demonstrating U.S. leadership in privacy and 
innovation.
    I am concerned about the patchwork of privacy and security 
laws that I see coming at the State level. And we are moving 
forward in Washington State, there is a debate going on as well 
as other States that are taking action that I believe are going 
to lead to higher cost and impact on consumers. It is actually 
going to increase their prices and reduce the options that 
consumers have.
    I would like to start with Dr. Layton and just ask the 
question, do you think that it is important for one Federal 
privacy law to set that national standard and, if so, just 
explain some more why.
    Dr. Layton. Thank you for the question. I was heartened to 
hear our panelists and our representatives agree that we do 
need a comprehensive Federal standard.
    Because California is such a large economy, if it can go 
forward with its particular rules it can dictate the rules for 
the rest of America. We have talked a lot about rights here on 
this panel. All Americans have rights, and it isn't fair that 
one State gets to dictate for everyone else. We should 
certainly look at California and learn from them, but it is, as 
I understand, a law that came together in 1 week and that was 
their choice about how they did it. So I certainly agree that 
we need a national standard.
    Mrs. Rodgers. I would like to ask Mr. Grimaldi and Ms. 
Zheng if you also would address this question and if your 
members agree with the one national standard.
    Mr. Grimaldi. Thank you, Congresswoman, we do. But make no 
mistake, we are very much in favor of the concepts of 
transparency, accountability, and choice, which are the 
bedrocks of the CCPA and the reason that Californians came 
together to rally behind the law and its merits.
    But to echo what Dr. Layton said, that patchwork could have 
incredibly negative effects on the American internet economy, 
because it will force compliance costs not just on California 
companies but on all companies in America. It will favor the 
larger providers, which can pay those compliance costs, 
retrofit their systems, and get ready to field what will 
likely be a barrage of lawsuits; and, quite honestly, it will 
mean fewer users, and fewer advertising dollars, once 
enforcement of the CCPA goes into effect in January.
    And that is not indicative of a good privacy policy, one 
that preserves for consumers what they currently enjoy: their 
content, their news, their video, and everything else.
    Ms. Zheng. I also completely agree. Thank you for that 
question, Ranking Member McMorris Rodgers.
    I think from the Business Roundtable perspective a national 
consumer privacy law should not mean that consumers get less 
protections than currently exist, but if we set the standard at 
an appropriate level it can mean that every American across 
this country has protections that they don't currently have. So 
when we developed our proposal we looked at the California law. 
We looked at GDPR. We looked at other State proposals and FTC 
authority and tried to take the best practices of each of these 
individual laws in developing our proposal.
    Mrs. Rodgers. Great. And just as a follow-up, I think as we 
move forward we need to be very concerned about protecting 
individuals' privacy while also ensuring that we are not 
becoming too regulatory. If the regulations are too complex, 
the largest actors can pay those costs, but it will make it 
harder for our startups and our innovators to get into the 
marketplace.
    Dr. Layton, would you just address what you have seen with 
GDPR to date as far as the impact on businesses or innovators?
    Dr. Layton. Yes. Well, in the case of the European Union, 
you have a data protection authority in each State and you have 
a super regulator overseeing that. And when this came into 
play there was no training and no funding to help the 
particular agencies get up to speed. They are not all 
equipped with the same set of skills. Some regulators may have 
worked there their whole lives, others may be new. They have 
a different set of expertise, and each country had its own 
particular rules. As to the question of how they manage this 
going forward, even the framers of the GDPR themselves said it 
will be 2 years before we have a judgment, because of the 
actual process and how long it takes.
    So in the minds of the Europeans this was also seen as an 
important way to empower government; they are looking to place 
people in jobs. They expected 75,000 more bureaucrats would be 
working in these particular jobs to look over privacy and so 
on. It reflects what is going on in the EU today, which is a 
desperation. There are many people dissatisfied with the 
European Union. You probably know about Brexit. And this is a 
way that the EU is trying to respond, to demonstrate to 
constituents that the EU can do something, whereas in the 
U.S. we might say, well, let's make it better or innovate----
    Ms. Schakowsky. If you could wrap up.
    Dr. Layton. Yes. So that was my point. Thank you.
    Mrs. Rodgers. Thank you. I will yield back.
    My time has expired.
    Ms. Schakowsky. Now the gentlelady from Florida, Kathy 
Castor.
    Ms. Castor. Thank you. You know, Americans are increasingly 
fed up with the violation of their privacy by online companies. 
There is just simply a massive amount of data being collected 
on each and every person. And then when that data is misused 
without their permission, or there is a breach of their 
financial data or their health data, it is really outrageous 
that we have let it get this far. And I think American 
consumers understand that this needs to be fixed.
    So I want to thank Chairwoman Schakowsky for calling this 
hearing, and I look forward to working with her and the other 
Members on this committee to adopt strong privacy protections 
for American families and consumers.
    Ms. O'Connor, help us assess the current state of 
Americans' online privacy protections. Let me know if you agree 
or disagree with these statements. Currently there is no 
general Federal law that requires online companies to have 
privacy policies or protect our privacy. Is that correct or not 
correct?
    Ms. O'Connor. That is correct.
    Ms. Castor. And there is no general Federal law that 
requires an online company to secure our personal information 
or notify a customer if his or her personal information has 
been stolen. Is that correct?
    Ms. O'Connor. That is correct.
    Ms. Castor. And the only way the Federal Trade Commission 
is able to examine companies that violate our privacy is 
through Section 5, unfair or deceptive acts or practices 
authority, which basically means that companies can do whatever 
they want with our data as long as they don't lie about what 
they are doing. Is that right?
    Ms. O'Connor. That is correct.
    Ms. Castor. So is it accurate to say that a bad online 
actor can collect all sorts of very personal information such 
as your location, your birthday, your messages, your biometric 
data, your Social Security Number, political leanings without 
your permission and sell it to the highest bidder as long as 
they don't lie about what they are doing?
    Ms. O'Connor. That is pretty accurate.
    Ms. Castor. Well, that is outrageous. And I think that is 
why American consumers now have--there has been an awakening to 
what has been happening. They understand this now and they are 
demanding strong privacy protections.
    One of the areas that concerns me the most, Ms. Collins, is 
the data that is collected on children. There is a bedrock 
Federal law, the Children's Online Privacy Protection Act, that 
is supposed to protect kids from data being gathered on them 
and being targeted, but it was signed into law over 20 years 
ago. And think about how much the internet has changed in 20 
years, the apps that are available to kids, the toys that talk 
to them and gather data.
    Do you agree that COPPA needs to be updated as well?
    Ms. Collins-Dexter. Yes, I do. Can I expand on that a 
little more?
    Ms. Castor. Please. I noticed in your testimony you cited a 
Cal Berkeley study that identified how many apps targeted at 
kids are probably gathering their data. Could you go into that 
in greater detail?
    Ms. Collins-Dexter. Yes. So, generally, COPPA is the only 
Federal internet privacy law on the books, and beyond that I 
think it is a solid blueprint for what comprehensive privacy 
legislation could look like, with an opt-in model and placing 
obligations on companies for adequate disclosure. But as you 
point out, it is 20 years old and, like the Civil Rights Act, 
it does not account for the digital economy we are immersed in 
today.
    So as I mention, a Cal Berkeley study found that thousands 
upon thousands of children's apps currently available on Google 
Play violate COPPA. The fact that the market is flooded with 
data collection apps and devices targeted at kids like Echo 
Dot, CloudPets, Furby Connect, and others should alarm us. More 
than one-third of U.S. homes have a smart toy. And so it is 
really important for us to think through the implications of 
that as we look to modernize that legislation.
    Ms. Castor. Because we kind of have an understanding now 
that online companies are building profiles on all of us with 
huge amounts of data. But they are doing this to our kids now, 
notwithstanding the fact that we have a Federal law that 
supposedly says you can't do this. Is that right?
    Ms. Collins-Dexter. That is correct.
    Ms. Castor. Ms. O'Connor, I don't think the average 
American parent understands that the apps and the toys that are 
provided, you know, for their kids to have fun and play games 
are creating these shadow profiles. Is that accurate?
    Ms. O'Connor. I work in technology and I have many, many 
children and I feel overwhelmed with the choices and the lack 
of transparency about not just their online environment, but as 
you point out correctly the devices in our daily lives, even 
the toys and what they can and cannot collect. And it doesn't 
necessarily matter whether the data is identifiable by name if 
it is targeting children based on their habits, preferences, 
and choices in ways that could close their world view as 
opposed to opening it up, which is what we would hope the 
internet would do.
    Ms. Castor. Thank you very much. I yield back.
    Ms. Schakowsky. I now recognize the ranking member of the 
full committee, Mr. Walden, for 5 minutes.
    I am sorry? Oh, I am sorry. Was that wrong?
    OK, let me recognize Mr. Upton for 5 minutes.
    Mr. Upton. Thank you, Madam Chair. It is a delight to be 
here. I know that Mr. Walden is at the other hearing. I think 
he intends to come back.
    Ms. Zheng, I think that we all recognize that the elephant 
in the room is whether we will have a system with 40 or 50 
State standards or one national standard. What is the 
perception among the companies that you represent at the 
Business Roundtable of how they would have to deal with maybe 
as many as 30 or 40 different standards, figuring that a 
number of States might join up and team up with others? What 
is the reaction to that? It goes along with what Ms.----
    Ms. Zheng. Yes, we strongly believe that a fragmented 
regulatory environment, where we pursue a State-by-State 
regulatory approach to privacy, makes for very inconsistent 
consumer protections. It also creates massive barriers to 
investment and innovation for companies that have to operate in 
all of these different States. It is simply unworkable.
    And so that is why we think it is necessary to have a 
single national Federal privacy law that preempts State laws. 
And I think the assumption that preemption weakens existing 
privacy protections is a false assumption. You know, we 
strongly believe that a Federal consumer privacy law should be 
strong and should provide additional protections for consumers 
that are consistent across every State in the country.
    As I think, you know, folks here mentioned earlier, 
devices, data, people, they constantly move across borders, 
across States. A State-by-State approach just simply doesn't 
work for this type of domain. And, in fact, even when you look 
at California's own privacy law, there is a rather strong 
preemption clause in the California law that preempts city, 
county, and municipality laws within the State of California, 
likely for the exact same reason that a Federal privacy law should 
preempt State laws.
    Mr. Upton. And are you aware, is anyone tracking what the 
other 49 States might be doing?
    Ms. Zheng. We are. I think a lot of folks on this panel are 
as well.
    Mr. Upton. Yes. And are any of those States getting close 
to something like California has done? I know it is a new 
legislative year for many States, but----
    Ms. Zheng. There are a number of----
    Mr. Upton [continuing]. What are your thoughts on where 
other States may be?
    Ms. Zheng. Yes. I think there are roughly about 30 
different State legislative proposals related to privacy. Many 
of them take very different approaches or regulate certain 
types of sectors. Some of them are more 
general. Some of them may be focused on specific types of 
information that are personal. But what it demonstrates is that 
there is a ton of interest within the States and they are not 
taking a coherent, consistent approach.
    Mr. Upton. And what are your thoughts--do you think that any 
of these States will actually do anything yet this calendar 
year or not? I know that it is early.
    Ms. Zheng. It is hard to say, but I think it is highly, 
highly likely that a number of States will pass privacy laws 
this year.
    Mr. Upton. I know I don't have a lot of time left as I ask 
my last question, but I thought that Mr. Grimaldi had some very 
good comments in his testimony about four different parts to 
achieve the goals. One, to have clear prohibitions on a range 
of harmful, unreasonable data collection; two, is that the new 
laws should distinguish between data practices that pose a 
threat to consumers and those that don't; three, that the law 
should incentivize a strong and enforceable compliance and 
self-regulatory programs; and, finally, that it should reduce 
consumer and business confusion by preempting the growing 
patchwork of State privacy laws.
    As it relates to the first three, knowing where I think you 
all are on part four, where are you in terms of your thoughts 
on those first three principles? And maybe we can just go down 
the line, and we will start with Ms. Collins-Dexter as to 
whether she thinks that is a good idea or not, briefly, 
knowing that I have a minute left.
    Ms. Collins-Dexter. Could you repeat that one more time? 
Apologies. I was like taking furious notes.
    Mr. Upton. So Mr. Grimaldi had four points, of which I 
would like to focus on the first three. One, to have clear 
prohibitions on a range of harmful and unreasonable data 
collection and use practices specifically identified by the 
law; these are goals for 
legislation. Two, that the new laws should distinguish between 
data practices that pose a threat to consumers and those that 
don't. And third, that the law should incentivize a strong and 
enforceable compliance in self-regulatory programs.
    So I guess now we just have to go to yes or no with 20 
seconds left.
    Ms. Collins-Dexter. Yes.
    Mr. Upton. Dr. Layton?
    Dr. Layton. Yes.
    Mr. Upton. Ms. Zheng?
    Ms. Zheng. Yes.
    Mr. Upton. And Ms. O'Connor?
    Ms. O'Connor. Yes.
    Mr. Upton. OK.
    Ms. O'Connor. Self-regulation alone is not going to be 
enough. That was revolutionary in 1999, but it is no longer 
sufficient to protect consumers today.
    Mr. Upton. My time has expired. Thank you.
    Ms. Schakowsky. I now recognize Mr. Veasey for 5 minutes.
    Mr. Veasey. Thank you, Madam Chair. Earlier, something in 
Ms. Collins-Dexter's testimony really concerned me and hit 
home for me: when she was talking about how poor people are 
being targeted by some of this marketing and these privacy 
issues that we are having. And for a lot of the people that 
fall within that category, it is going to be very important 
that these services remain, quote unquote, free, whatever free 
is. And of course we know that nothing is really free.
    And what is so troubling about that is that in our society 
obviously we live in an economy that is based on profit and 
gain. What is the sweet spot? I would like to know maybe from 
Ms. Zheng or Mr. Grimaldi from a business standpoint what is 
the sweet spot? How can you still provide these services for 
free for the constituents that I represent and the people that 
Ms. Collins-Dexter was talking about? How do you preserve 
their ability to access these services without paying 
additional fees, while keeping the market research and the 
other things that go along with these services being free? How 
do you combine all of that? Is there a real sweet spot in all 
of this?
    Ms. Zheng. So I think--thank you for that question, 
Congressman. It is a really important issue and I am glad that 
you raised it and I am glad that Ms. Collins-Dexter raised it. 
It is complex. It requires additional attention. There are 
significant technical, legal, and ethical considerations as 
well. Companies should not be using personal data about 
consumers to make discriminatory decisions in the areas of 
employment, housing, lending, insurance, or the provision of 
services.
    But defining that line between using an algorithm to 
discriminate against consumers and using it to target, for 
example, ads in Spanish to Spanish-speaking consumers is 
challenging. So we need to be mindful of these legitimate 
uses of certain demographic information that enable products 
and services to be better tailored to a consumer.
    But we recognize that this is a really important issue, as 
is the differential pricing issue that you raised. Although we 
have significant concerns with the particular approach taken 
in the California law, we welcome the opportunity to work with 
the committee on this issue and consider different proposals. 
Thank you.
    Mr. Veasey. For the areas where these companies are trying 
to obviously maximize their return on investment where they 
need control groups and run tests, can that still happen, Mr. 
Grimaldi, with more consumer protection? And obviously the 
consumer protection is definitely needed. I think you can 
listen to just a few minutes of today's testimony and realize 
that.
    Mr. Grimaldi. Correct, Congressman Veasey. Associating 
myself with Denise's comments, we need to break apart any 
discriminatory practices from good practices. And you mentioned 
the value exchange that goes on behind consumers transacting 
their business on the internet, and Chairman Schakowsky went 
through a long list of what she has done in just the last 48 
hours: going to a store, taking a flight, et cetera. Those are 
useful practices that people come to accept. However, that 
information cannot be gamed for reasons of eligibility, of 
discrimination, of price discrimination. Our industry is 
absolutely against that.
    There is a self-regulatory code that our companies adhere 
to in the Digital Advertising Alliance, a body that we stood 
up. But stipulating to what Ms. O'Connor has said about self-
regulation, the reason that we are here is that we need help 
apart from self-regulation. We are here to partner with 
Congress to say it is past time, we are overdue for a national 
framework that speaks to these issues.
    But yes, there are good uses. There are harmful uses. That 
is what we need to break apart and distinguish.
    Mr. Veasey. Madam Chair, I yield back. Thank you.
    Ms. Schakowsky. I now recognize the ranking member of the 
full committee, Mr. Walden.
    Mr. Walden. Thank you, Madam Chair. And as you know we have 
another hearing going on upstairs, so I'm having to bounce back 
and forth.
    In the United States we currently enjoy an environment that 
allows small to medium-sized companies to grow, raise money, 
and compete, in large part because they do not have to come 
to the government to get their business plans approved and 
because we have successfully legislated based on well-defined 
risks and harms.
    Dr. Layton, if data sharing and privacy is regulated 
differently by individual States in the U.S., what will that do 
to the American marketplace?
    Dr. Layton. So assuming this could survive a court 
challenge, because I think it would violate the commerce 
clause as we discussed, I don't see how it is possible. If you 
are a retailer in Maine and you have to send your products to 
50 different States, you have to have 50 different ways to do 
it. I don't see why you would start that business. I think you 
would move to another industry.
    Mr. Walden. So how has GDPR impacted Google's market share 
in the EU?
    Dr. Layton. It has increased since it came into effect.
    Mr. Walden. And I think that is what we are showing right 
here on the slide that nobody could read from afar, I am sure. 
Maybe we can put it on the big screen and take me off, which 
would be a pleasant experience for everybody. But I don't have 
a copy of that here at my desk.
    [Slide.]
    Mr. Walden. But I think what you are seeing here is that 
small innovators are actually leaving this space, right? And 
investment in small entrepreneurs is going down in Europe and 
going up in the United States since GDPR was put in place. Is 
that accurate?
    Dr. Layton. Yes. What this particular graph is highlighting 
is the analytics competitors. Google Analytics is running on a 
lot of websites, and depending on the company they may have 
multiple competitors to Google Analytics; retailers have a 
different set, and so on.
    So essentially some media companies, some larger firms, 
are kicking off the smaller competitors, which means that 
those trackers have not been firing. That is what this is 
measuring.
    Mr. Walden. Yes. My understanding is that shortly after 
GDPR was implemented, Google's market share increased by 
almost a full percent while smaller ad tech firms suffered 
losses of anywhere from 18 percent to almost 32 percent. GDPR 
has proven to be anticompetitive and makes it more difficult 
for small businesses to compete; that is just one example of 
the negative impact. Now there may be other things going on 
affecting these 
numbers, I will stipulate to that. But clearly GDPR has had an 
effect.
    Mr. Grimaldi, since GDPR has been in effect, academic 
research shows that investments in startup companies in the EU 
have dropped by an aggregate of 40 percent, 4-0. Compare that 
to the United States, where in 2018 investments in startups 
neared $100 billion, the highest since the dot-com boom. 
Protecting consumers includes protecting them from a 
marketplace devoid of choices, where they are forced to use 
certain products or services.
    What should an American approach to data privacy look 
like that does not hamper small business and investment?
    Mr. Grimaldi. Thank you, Chairman. You are correct. We are 
seeing that falloff in Europe. I listed some newspapers at the 
beginning that are not currently operating in Europe, and it 
is not because they are not complying with the law and it is 
not because they were at fault. It is because they just can't 
afford that kind of a pivot to reconstruct their services, 
which could be at great legal risk.
    This is one of the many things that we are seeing with the 
CCPA that is going to be a major deterrent, if not a killing 
blow, to American companies that can't deal with the 
labyrinthine construct of new regulations in California, or in 
other States, that might force them to take down their online 
advertising funding regime for fear that they could be 
susceptible to a major lawsuit because they did not classify 
or categorize data in a way that could be returned to 
consumers.
    These companies don't currently have those structures in 
place, and now something that again I stipulate was correct in 
its founding--transparency, choice, accountability--is 
potentially going to force companies to say we just can't 
afford to retrofit all of our systems to be able to collect 
that much data, and even if we do, there is a litigation risk 
that we wouldn't be able to swallow.
    Mr. Walden. Could you put that litigation risk in common 
person's terms? What are we talking about here if you are a 
small business online?
    Mr. Grimaldi. Correct. Under CCPA some of the provisions--
and we are active as I think many in this room are in dealing 
with the California Attorney General's Office, former 
Congressman Xavier Becerra being that Attorney General. He is 
taking a look at the current law and promulgating it to be 
enforced in January. The litigation risk could mean that if a 
consumer requests their data from a company, if a consumer 
reaches out and says, ``What do you have on me and how is it 
shared,'' a company has to be able to provide that in a certain 
time frame. And if it doesn't, it is in violation of the law. 
That litigation risk you can compound into the thousands or 
hundreds of thousands of requests that will multiply into the 
millions and billions of dollars. And that is something that 
smaller companies would not be able to deal with.
    Mr. Walden. My time has expired. I thank all of our 
witnesses for enlightening us on this issue. Thank you.
    Ms. Schakowsky. And now I yield to the chairman of the full 
committee, Mr. Pallone.
    Mr. Pallone. Thank you, Madam Chair. I wanted to build on 
your questions. Some uses of our data are certainly concerning. 
This committee has explored many of them, Cambridge Analytica's 
use of people's data to manipulate their political opinions and 
influence their votes, for example. And we had hearings with 
Equifax, Facebook, and Twitter.
    But those hearings can't begin to reveal just how little 
we all know about who is collecting our data or what they are 
actually collecting. And I think many of us have this vague 
idea that 
everyone is collecting everything and that there is nothing we 
can do about it, but in my opinion that is not acceptable 
because some data maybe just shouldn't be collected at all.
    So in that vein I wanted to ask Ms. O'Connor, data 
collection has become extremely profitable leading some 
companies to collect every bit of data they can, but is there a 
line that shouldn't be crossed? Should there be some limits on 
actual collection?
    Ms. O'Connor. It would be our position that yes, at least 
as to the most sensitive information there should be very clear 
notices and awareness on the part of the consumer, again the 
example I used of my fingerprint in my phone being collected 
for one purpose, not being used for any other. When I use a map 
app they obviously need to know my location. I do not want that 
location sold or transferred.
    Are there types of data that shouldn't be collected at all? 
In our bill, in our proposal we look very seriously at issues 
of precise geolocation, biometric information, children's data, 
content of communications, and health information as deserving 
higher sensitivity and higher protections.
    Mr. Pallone. All right. Let me ask Ms. Collins-Dexter, how 
do you think we should be--well, how should we be thinking 
about limits on collection and what about limits on sharing, 
sharing with or selling to third parties?
    Ms. Collins-Dexter. I echo Ms. O'Connor. I think we should 
be looking at all of this right now. Companies have a financial 
incentive to collect as much information as they can and store 
it forever with no obligation not to do that. I think we have 
to have meaningful data minimization requirements. I think we 
have to definitely look at the various ways in which 
information is often used as a proxy for race.
    So, for example, we know that Facebook and a lot of big 
tech companies actually don't explicitly collect race data. 
However, many things around geolocation and daily habits make 
it possible to put together a data profile from which people 
are able to ascertain race, and that is used for predatory 
marketing practices.
    And so we have to be able to parse through all of that 
information and keep a constant eye on impact, which I think 
should be at the core of any legislation that we are looking 
at.
    Mr. Pallone. Thank you.
    Ms. O'Connor, what about limits on sharing with or selling 
to third parties?
    Ms. O'Connor. Absolutely. We put those in two separate 
buckets. First, limits on sharing, again for the most highly 
sensitive of the categories I mentioned, particularly things 
that are immutable or most intimate about you. On selling, or 
sharing with third parties, we would also put limitations: the 
third parties would have to be bound by whatever promises the 
first party made about that data.
    So absolutely, we would look very hard and limit secondary 
use and third-party sharing.
    Mr. Pallone. Thank you. I just wanted to ask about limits 
on sharing people's information with affiliates, because we 
know that many corporations own multiple affiliated companies 
that the average person would not connect with them: YouTube, 
Android, and DoubleClick are all owned by Google, and Jet.com 
and Sam's Club are both owned by Walmart. Data collectors who 
say they don't sell data to third parties may still want to 
share that data with their affiliates.
    So let me ask Ms. Collins-Dexter, should there be limits on 
sharing people's information with these corporate affiliates?
    Ms. Collins-Dexter. Yes, absolutely. We should definitely 
be looking at how these third party companies are operating as 
we saw with Facebook last week and as we continue to see with, 
as you all have mentioned, Cambridge Analytica and others. You 
have these third-party data mining companies that aren't 
regulated, aren't looked at. They are gathering data, scraping 
it, selling it to companies for predatory marketing purposes, 
selling it to law enforcement without our consent. And because 
we don't even know that these companies are looming in the 
background, it further limits our choice or ability to say 
no.
    Mr. Pallone. And just quickly, Mr. Grimaldi, behavioral 
advertising needs data to target the most appropriate 
audiences. How would limitations on collection and retention 
affect your member companies? Are there limits 
that can be established through legislation that provide 
reasonable protections to consumers that your member companies 
would accept?
    Mr. Grimaldi. Sure, thank you. We currently have a very 
robust self-regulatory program that gives consumers 
transparency into their online behavioral advertising and the 
ability to click through the ad via an icon in the upper right 
corner of every ad, served over a trillion times per month, 
that takes you to a page that says, why am I seeing this ad 
and how can I stop seeing it?
    There is tremendous uptake in terms of people going through 
that icon, to the tune of about 70 to 80 million unique 
impressions. So we offer that control. One of the messages 
today before you is that, as much as we are trying to educate 
consumers on that, there is still a need for a Federal program 
that can help us distinguish what kind of advertising is 
working, what is considered harmful, and what consumers need 
to know.
    Again before they click on something it could be something 
that is very much tailored to what they are looking for, an ad 
that speaks to them. We have much research that shows that 
consumers prefer targeted behavioral advertising rather than 
generic advertising, but we want to make sure consumers have 
those controls so that they can stop seeing those ads and again 
that could be enshrined.
    Mr. Pallone. Thank you.
    Ms. Schakowsky. And now I yield to Mr. Latta, the former 
chair of this subcommittee and my friend.
    Mr. Latta. Well, thank you very much. If I could ask just a 
quick point of personal privilege and congratulate the Chair on 
assuming the gavel. So congratulations, it is a great 
subcommittee.
    And Madam Chair, before I begin I would also like unanimous 
consent to enter into the record excerpts from the WHOIS report 
from the Department of Justice Attorney General's cybersecurity 
task force.
    Ms. Schakowsky. Sorry. Without objection, so ordered.
    [The information appears at the conclusion of the hearing.]
    Mr. Latta. Thank you, Madam Chair, if I could reclaim about 
30 seconds there.
    Last Congress, the Energy and Commerce Committee held 
nearly a dozen hearings discussing privacy and security issues. 
That includes much publicized hearings where we heard from the 
CEOs of Facebook and Twitter about how the companies collect, 
safeguard, and use data. From those hearings it was clear that 
while these companies provide a service that Americans like, 
consumers aren't always clear about what happens with their 
personal information.
    With the California law slated to take effect at the 
beginning of next year, time is of the essence. In divided 
government it is not always easy to tackle the tough problems, 
but I believe the time is right to work together on a Federal 
data privacy solution. Both consumer groups and business 
organizations have come onboard in calling for a national 
standard. We all agree that consumers should have transparency 
and accountability and that we want to ensure that the United 
States stays the prime location for innovation and technology.
    Dr. Layton, if I could ask you, I have been hearing from 
many groups regarding the loss of access to information about 
domain name registration or the WHOIS data and the role it 
plays in protecting consumers. Would you explain how WHOIS 
increases online transparency so that consumers may have a 
better understanding of who they are interacting with online?
    Dr. Layton. Right. So the WHOIS database, for just lack of 
a better way, would be a sort of address book for the internet, 
who is registered, who owns what particular domain.
    Mr. Latta. And following up, would you want to comment on 
how the GDPR is creating challenges to accessing that data?
    Dr. Layton. Absolutely. One of the key problems is that 
because of the GDPR's ability to retract information, domain 
name registrants are masking their identity. This is making it 
very difficult for law enforcement to identify perpetrators of 
crimes. It is also an issue if you need to contact someone 
about intellectual property, for example.
    So there are many concerns with this and this reflects, you 
know, our historical view of privacy of prioritizing the right 
to know. We believe that the public has a right to know about 
these things.
    Mr. Latta. Well, could you go into a little more depth 
about on how, you know, that information helps in identifying 
those bad actors and those criminals that are out there and 
that law enforcement needs to be able to find those individuals 
and bad actors?
    Dr. Layton. Right. Well, in just the same way that you 
would look at a phone book to see a certain address and who 
lives at that address, that is a key function of law 
enforcement. So if you take that away for the internet 
globally, for law enforcement everywhere, it is a serious 
problem.
    Mr. Latta. And if you could list your top three concerns 
for the GDPR and also the CCPA which is the California law?
    Dr. Layton. Sure. Well, I would say the first concern from 
the U.S. perspective would be First Amendment free speech 
concerns that the level of government requirements is so high 
that it reduces expression. That would be number one. I would 
certainly say safety would be number two with regard to just 
what you described. You have other issues with people who have 
committed crimes in the European Union who are asking that 
their records be erased or removed that have committed murders, 
child molestation, and so on. That is a serious problem.
    And I would say thirdly, there is a sort of dumbing down of 
consumers, creating a false sense of security that somehow 
regulators have the answer on what to do; it doesn't allow 
consumers to take responsibility when they go online. And I 
would add number four, which is that I think you are freezing 
technologies in place and you don't let them evolve.
    So, for example, the EU will require using certain kinds of 
data protection technologies, but we can actually make them 
better. So if you require a company to do technology A today, I 
can invent technology B tomorrow and I am not allowed to 
upgrade to it. So that is a major problem as well.
    Mr. Latta. All right, I appreciate it very much and I yield 
back the balance of my time.
    Mr. O'Halleran [presiding]. Next will be Mr. Lujan, New 
Mexico.
    Mr. Lujan. Thank you very much, Mr. Chairman, for this 
important hearing. Let me jump into this.
In 2000, the FTC recommended that Congress enact consumer 
internet privacy legislation. That was 19 years ago. This 
subcommittee held a hearing after the Equifax breach in 2017. 
We had Mark Zuckerberg before the full committee in April 2018. 
The 115th and previous Congresses failed to pass meaningful 
privacy protections even though there were commitments made to 
the American people.
    So as we jump into this, Ms. O'Connor, an entire economy 
based on data has been built, but we didn't stop to consider 
the risks and potential downsides; companies collecting data 
have put consumers at risk.
    Mr. Grimaldi, in your testimony you say that the law should 
incentivize strong and enforceable compliance and self-
regulatory programs by creating a safe harbor process, but I am 
concerned that incentives won't be enough. We need some 
accountability. So one of the ideas that we have is to require 
companies to conduct risk assessments: if you want to process 
data for consumer-related uses, you need to assess the 
foreseeable risks of such uses.
    So, Ms. O'Connor, yes or no, should we require risk 
assessments so companies factor the risk and potential harms in 
their decision making?
    Ms. O'Connor. Certainly the concept of risk assessments or 
privacy impact assessments has been around since even before 
those FTC hearings, which I attended in the year 2000 and 
before, and certainly that is part of a robust privacy program. 
But we do want to be mindful of the burden on small businesses 
and make sure that the legislation that is comprehensive is 
elegant and efficient. It is simple. It is streamlined and easy 
for a small, a medium, and a large company to know what the 
rules are and to abide by them.
    So while I am certainly in favor of and I have implemented 
a number of PIAs or risk assessments in my time in the 
government and in the private sector, I want to make sure that 
the law is simple and clear for consumers and for companies.
    Mr. Lujan. So assuming the same disclaimer holds true for 
the next question, yes or no, should we require a privacy 
protection officer at companies that collect large amounts of 
data who would be responsible for training staff, conducting 
audits, working with authorities, and advocating for privacy 
with the entity?
    Ms. O'Connor. Yes.
    Mr. Lujan. There is a great editorial that was authored in 
Forbes, January 15th, 2019, titled ``2019 Data Privacy Wish 
List: Moving From Compliance To Concern.'' I would ask 
unanimous consent to submit it into the record.
    Ms. Schakowsky [presiding]. Without objection.
    [The information appears at the conclusion of the hearing.]
    Mr. Lujan. In it, one of the points made is about moving 
from privacy compliance to concern and care; that ``rather a 
philosophy that treats data with extreme care and with 
prevention of data breaches in mind'' is something that 
companies should be doing. So that is where I am thoughtful, 
from an incentive perspective, about what we must be doing 
going forward.
    Ms. Collins-Dexter, you highlighted in your testimony some 
important aspects here. And I am concerned about implications 
for access to housing, lending, digital redlining, and voter 
suppression as we talked about information that is shared that 
is sensitive. Would you agree that this is a problem?
    Ms. Collins-Dexter. Yes. I absolutely do.
    Mr. Lujan. Have companies responded when it has been 
brought to their attention that their products or services are 
having discriminatory effects?
    Ms. Collins-Dexter. On the whole, no, they have not. We 
have sat at the table. Part of our model is a corporate 
accountability model, which requires direct engagement in 
negotiation. We have sat with many companies, Facebook 
included, for many years and had a lot of discussions with 
them. And for every policy they develop we tend to find days, 
weeks, or months later that the problem is really much larger 
than what was initially indicated. And so self-regulation has 
not proven to be a viable option.
    Mr. Lujan. So with that being said, have the responses from 
industry been adequate in this space?
    Ms. Collins-Dexter. Have the responses from the industry?
    Mr. Lujan. Been adequate?
    Ms. Collins-Dexter. No.
    Mr. Lujan. Are there changes companies have made 
voluntarily that should be made into law? And we can get into 
the details, just yes or no.
    Ms. Collins-Dexter. Yes.
    Mr. Lujan. So we would be happy to work with you in that 
space.
    Mr. Grimaldi, the IAB represents over 650 media and 
technology companies that together account for 86 percent of 
online advertising in the U.S. You heard the quote that I 
referenced from this editorial. Are these companies looking to 
protect my privacy when they are making business decisions?
    Mr. Grimaldi. Congressman, they are. They are, without a 
doubt. One of the reasons we are here today is to ask 
government to fill in those holes that we can't fill in. Should 
there be mandatory components of a privacy policy that do not 
let a user accidentally click something to give consent? Are 
there other pieces where we could work with you on 
strengthening what we already have put in the market for 
consumer controls?
    Mr. Lujan. Let me ask a question as my time expires and I 
will be happy to submit that to the record so we can get a 
response. Would you agree that companies need to shift to a 
philosophy that treats data with extreme care with prevention 
of data breaches in mind?
    Mr. Grimaldi. I think what needs to be defined are those 
unreasonable and reasonable uses of data. Again many on the 
committee have said we use data, we give our data to certain 
apps or to certain programs to help us every day. Is that data 
being used for those purposes? Are there harmful uses of data? 
I think the absolute answer is yes. Are there guardrails we can 
put around it, more self-regulation, more partnership, yes.
    Mr. Lujan. Madam Chair, just as my time has expired and I 
thank you for the latitude here, it just seems that we wouldn't 
be here today if, in fact, there was an effort to concern and 
care versus just compliance. And I think that is what we are 
looking for is how can we work on this collectively and 
together such that we get to that point. So I appreciate that 
time. Thank you, Madam Chair.
    Ms. Schakowsky. I recognize for 5 minutes Congressman 
Bucshon.
    Mr. Bucshon. Thank you, Madam Chairwoman.
    I was a healthcare provider before, and health information 
is some of the most sensitive information that is out there and 
it is also some of the most valuable. So I hope that whatever 
we do here in Congress specifically addresses health 
information because it is really critical and important.
    As you may have heard, last week it was revealed that 
Google's Nest Guard home security device had a microphone 
inside the device that consumers did not know about and it was 
not disclosed. As I have discussed in prior hearings on data 
privacy including with Mr. Zuckerberg, I am concerned about the 
inappropriate collection of audio data. And it seems that 
everyone denies that that happens, but I think everyone knows 
that it probably does.
    So Ms. Zheng, can you expand on how the right to privacy 
would play into this type of practice and how we would deal 
with that?
    Ms. Zheng. Thank you for that question, Congressman. When 
it comes to audio data if it is personally identifiable 
information or personal information and falls within the scope 
of a privacy, you know, a new privacy bill, I certainly believe 
that transparency, control, access, the right to correct it, 
the right to delete it, should be rights the consumer should 
have including for audio data.
    Mr. Bucshon. That is going to be important, because if we 
cover things that you actually type on the internet but our 
privacy rules don't cover the case where you are talking, your 
phone picks it up and sends a keyword to someone, and they 
advertise based on that, then we are missing the boat. I want 
to prevent collection of data without consumers' knowledge, and 
audio data would be there.
    And, Dr. Layton, do current laws cover this type of 
omission from Google about a microphone? And second, if we 
decide to grant additional authority to the FTC, would you have 
any suggestions on how the FTC may play a role on addressing 
intrusive data collection policies including audio data without 
harming innovation?
    Dr. Layton. Thank you, Congressman. I think it is excellent 
that you raised the point that when you use various devices in 
your home, Alexa and so on, you are having conversations with 
your family members. And I think law enforcement has actually 
used some of that data in some cases, and with good purposes, 
actually. In terms of the Federal Trade Commission, they 
are engaged in this process now. I don't know if audio is a 
specific part of their inquiry. I would have to get back to you 
on that.
    Mr. Bucshon. OK.
    Dr. Layton. I can't recall at this moment. But I don't see 
from a technical perspective why audio would be different 
because it would be recorded as the same data. Even though you 
are speaking it, it would be transcribed into a data file, so.
    Mr. Bucshon. OK. The other thing I want to quickly say, and 
then I have a question for Mr. Grimaldi, is that we also need 
to address hardware as part of this. Not just an app but 
hardware, because location data is really important. There was 
a local news outlet here in town that turned off a phone and 
did everything they could except take the battery out. They 
went all over the city of DC and then went back and plugged it 
in, and all the metadata on everywhere they had been was 
recorded, and as soon as they turned that phone on it all went 
out to the internet. So hopefully anything we do on privacy 
also includes hardware, not just apps, not just software. That 
would be important.
    So, Mr. Grimaldi, in your testimony you highlight that 
data-driven advertising has helped power the growth of the 
internet by delivering innovative tools and services to 
consumers. Many constituents including myself, and I am going 
along the audio theme here, have concerns about how 
conversations when not directly using an app, device, or other 
electronic device appear in a later online ad based on keywords 
in the conversation. Can you help me understand how this is 
happening?
    Mr. Grimaldi. Sure. I think it is important to understand 
the difference between personal data and synonymized data. That 
is, if in your conversation you were using words that were 
flagged, not ``Congressman Bucshon'' but an individual who is 
into hunting or automotive, cars, you name it, sports, that 
data could be tagged for you and used to serve you better 
targeted ads.
    Mr. Bucshon. Can I just interrupt for a second? So I was 
having a conversation with my communications director, this 
happened about a month ago, talking about a certain subject and 
the next day he got ads on his computer specifically about that 
particular subject. We happened to be talking about tennis 
because he is a tennis instructor, but nonetheless. So 
continue.
    Mr. Grimaldi. Right. And without intimate knowledge of how 
that hardware is constructed, if I were to take that as an 
example of just your web browsing those sorts of things could 
be flagged in order to serve you ads that are not generic, that 
are more tailored to your interests and done in a way that 
again the word ``synonymized,'' meaning you are put into a 
category rather than your name, your address, your Social 
Security Number, but just your likes and dislikes. And then 
that enters a marketplace behind the web where that information 
is used to serve you better ads without linking you personally 
to your information, your intimate information. It is another 
piece of that reasonable and unreasonable construct we are 
talking about.
    Mr. Bucshon. OK. My time has expired, but I want to make 
sure that whatever we do here in this committee it includes 
audio data and also considers location data based on hardware 
within a device. Thank you very much. I yield back.
    Ms. Schakowsky. I recognize Congresswoman Rochester.
    Ms. Blunt Rochester. Thank you, Madam Chairwoman. And thank 
you so much for setting the tone of this hearing and this is a 
vitally important topic for Delawareans but also for our 
Nation, and I want to thank the panel as well.
    You know, more and more of our daily activities involve 
the use of the internet. Many of us pay our bills, shop, play 
games, and keep in contact with friends and relatives through 
websites or online applications. However, with all of these 
activities taking place online, websites are amassing more and 
more personal information. This presents serious privacy 
concerns.
    Large-scale data breaches are becoming more common and 
consumers have a right to know what is being collected, how it 
is being used, and should be notified when a breach has 
occurred. Most of you on the panel today have discussed the 
need to give consumers more control over their own information 
and over how it should be collected and used.
    And I want to drill down just a little bit deeper on that 
and ask Ms. Zheng: the Business Roundtable's privacy framework 
promotes the idea of giving consumers the right to access, and 
to correct inaccuracies in, the information collected about 
them. So can you talk a little bit about what you mean by 
information collected about them? Does that refer just to the 
data points collected, or does it also include any inferences 
made based on that data?
    Ms. Zheng. Congresswoman, that is a good question, and it 
is a very specific and detailed question that, to be honest 
with you, we still need to discuss within our membership. Right 
now, as we drafted our proposal, our framework, the right to 
access, correct, and delete your data does apply to your actual 
personal data. But to answer your further question, I would 
need to follow up with you.
    Ms. Blunt Rochester. And I am going to ask a few other 
people questions around this as well. I mean I think a lot of 
us are familiar with, you know, the story of the individual at 
Target who got the coupons, came to the father's house for a 
pregnant teen, and again it was inferences.
    And so I want to ask Ms. Collins-Dexter, what are your 
thoughts on access and correction and should consumers be able 
to see and correct inaccurate inferences made about them? And I 
want to start with you.
    Ms. Collins-Dexter. Yes, absolutely. We think that people 
should, similar to a credit report, have an opportunity to 
challenge and correct information. One of the things that we 
have seen with some of our work around voting records and the 
purges that have happened across the country is that there is a 
lot of data collected, and inaccurate or misspelled names have 
allowed voters to be purged from files across the country.
    I think, you know, as we think about all of the various 
data points and all of the mistakes that happen, again we are 
finding the people that tend to be most impacted are low-income 
communities of people of color, people who aren't able to 
actively challenge and correct the record on themselves. So I 
would say it is extremely important on a number of different 
fronts that we are allowed to do that and any privacy 
legislation should allow for that.
    Ms. Blunt Rochester. Thank you.
    And, Mr. Grimaldi, you didn't really talk about consumers' 
right to access and correct information collected in your 
testimony, but how do you think giving those rights to 
consumers would affect your member companies?
    Mr. Grimaldi. Thanks, Congresswoman. To echo what some of 
my co-panelists have said, consumers have a right to delete 
their data and I think there are things to explore with those 
rights. There are obviously fraud, misuse, other components 
that could negatively affect either a consumer's online 
experience or their just life experience, and we are seeing 
that contemplated in Europe and we are seeing that contemplated 
in California. There are problems though I would point out that 
could come about when consumers request their data to be 
deleted and the authentication of those consumers requesting 
it.
    One of the major pitfalls that we are currently working on 
with the California law is if somebody could have their data 
deleted, how do they authenticate themselves to make sure it is 
them? If somebody can request their data, how do we know it is 
them and it is not somebody stalking them or somebody meaning 
to do them harm. Those are really important questions.
    Ms. Blunt Rochester. You know, I want to close out my 
comment by saying that this is so important because I think a 
lot of people do feel that it is a fait accompli, that this is 
the world we now live in. And that is really the role of 
Congress: to ensure consumer protection, going back to what our 
chairwoman said. Thank you so much. My time has expired.
    Ms. Schakowsky. I now recognize for 5 minutes Congressman 
Carter.
    Mr. Carter. Thank you very much, Madam Chair, and thank 
you, all of you for being here. This is an extremely important 
subject and we want to do the right thing, so that is why we 
got you here. You are the experts. You are the ones we want to 
learn from and hopefully build upon.
    Dr. Layton, I want to start with you. First of all, 
earlier, one of my colleagues mentioned the WHOIS database. Can 
you explain that very briefly what that is exactly?
    Dr. Layton. Well, I just call it the address book for the 
internet; those who register the names have to disclose who 
they are.
    Mr. Carter. Well, it is clear through your testimony as 
well as your background that you have a good grasp of the GDPR 
and the impact that it has had. It is my understanding that 
ICANN, the governing agency over WHOIS, has actually run into 
problems with this and has actually said that it is not going 
to be collecting that data anymore?
    Dr. Layton. So, no. For quite a long time, at least a year, 
they have been trying to work with officials in the European 
Union to highlight the problems and to find a resolution. And 
pressure from, you know, extreme privacy advocates in the 
European Union is not letting them come to a resolution. So as 
I understand it today, and I don't have the most up-to-date 
information, I think there is an impasse right now because it 
is not resolved. So the information is not available.
    Mr. Carter. Well, this is the kind of thing that we want to 
learn from. I mean we don't want to make the same kind of 
mistake that obviously they have made and because it is my 
understanding that WHOIS data is very important particularly to 
law enforcement. Has that been your experience?
    Dr. Layton. Yes, absolutely. It is a major issue for law 
enforcement, intellectual property rights holders, and, you 
know, people in the public who may need to do research and so 
on. I think the lesson learned here is, as we have heard 
before, that the road to hell is paved with good intentions. I 
think everyone had good intentions and they overreached. They 
went too far. They didn't have a process to test the various 
provisions. Everybody got to tack on what they thought made 
sense, and then they just brought it over the finish line and 
we have to live with it.
    Mr. Carter. What do you think we could learn from that? I 
mean how could we make it better?
    Dr. Layton. Well, at least one of the things I would say in 
terms of how we are ahead in this respect is that in the United 
States we have a transparent policy process. When you submit 
anything to the Federal Trade Commission as part of what they 
are doing, you have to disclose your name and who you are, just 
as in the hearing you are conducting today.
    The policy process now in the EU, because of this rule, 
means you can mask your identity. So you can submit into a 
regulatory hearing without saying your name or who you are, for 
privacy reasons. So what I would encourage Congress to do is 
keep with our tradition of the public's right to know, to 
continue in this vein as you are holding these hearings today, 
and to look at where it hasn't worked and not make the same 
mistakes.
    Mr. Carter. Let me move on. Earlier we talked about market 
share particularly as some of the companies have grown in 
market share and at the expense of others as a result of the 
GDPR. What is the primary reason for the change in market share 
for some of these companies?
    Dr. Layton. Well, in many respects it is because a number 
of firms have exited the market. They have decided they are no 
longer going to operate, so in many respects the advertising 
market has shrunk, in the sense that there are fewer properties 
on which to conduct advertising; that would be one thing. The 
other issue is that when those smaller players leave, it just 
means that people visit the larger players more.
    Mr. Carter. Has this had an impact, and obviously it has, 
on the export to Europe of various content and digital goods?
    Dr. Layton. Right. Well, for me, when I am sitting in my 
office in Copenhagen and I try to go to the Chicago Tribune, I 
cannot open it. I just see a white page that says, ``Sorry, we 
are not delivering our content.'' And, you know, that is 
unfortunate for me, because I can't see the information. It is 
too bad for the advertiser, who can't put the advertisement on 
the page. It is sad for the 1 million Americans who live in the 
EU.
    Mr. Carter. I was about to say it obviously has an impact 
on them, and they are not able to get the information.
    Dr. Layton. Right. But I think Mr. Grimaldi pointed it out 
very well, and I think his testimony makes it very clear: it is 
not that they don't want to do it, but it costs too much money 
and there is regulatory uncertainty. The legal risk is so high 
because this rule is so new; we don't know how it will be 
interpreted. And it is a whole value chain: all of the partners 
who might be working with the Chicago Tribune or whomever may 
also be liable. So they don't want to take the risk.
    Mr. Carter. Well, again, I want to thank all of you for 
being here. I think there are important lessons that we can 
learn from the experiences of the European Union as well as 
from what we are trying to do in California. Obviously what we 
don't need is 50 different sets of rules; we need one set of 
rules here in America.
    And hopefully, and I have always said I don't want to 
stifle innovation so that is one thing I hope we keep in mind 
in this committee as we move forward. Thank you, Madam Chair, 
and I yield back.
    Ms. Schakowsky. Thank you. And now I welcome the vice chair 
of this committee, Mr. Cardenas.
    Mr. Cardenas. Thank you very much, Madam Chair, and thank 
you for holding this very important matter before the public. 
And to the ranking member as well, thank you.
    Ms. O'Connor, would you like to shed maybe a little bit of 
light on the dialogue that we just witnessed over the last 3 or 
4 minutes about the EU and maybe the mistakes they made and 
things that we could learn and the cross reference between 
innovation and privacy?
    Ms. O'Connor. Thank you so much, sir. I think it is fairly 
certain that we in the United States will pass a United States 
law that reflects our values and our cultural traditions and 
our unique opportunity here as the birthplace of Silicon 
Valley. But I think there are also our shared values, values of 
respect and dignity, values of customer trust that our 
companies, our U.S.-bred companies can certainly adhere to.
    I think privacy and security are a form of corporate social 
responsibility in the digital age and are essential to doing 
business in a thriving U.S. economy and around the world. Yes, 
it is important to get to a Federal standard, but it is 
important that that standard be strong and be understandable by 
small, medium, and large enterprises in the United States and, 
most importantly, be one that customers can trust, that 
consumers and citizens of this country can have certainty that 
their information is being treated fairly, that they are not 
being discriminated against, and that they understand the 
consequences of the bargains that they strike with companies.
    Mr. Cardenas. Well, one thing that I enjoy the most is 
being able to go back to my district and I am blessed that my 
two grandchildren live in my district, so I can drive 5 
minutes, jump on the carpet and roll around with them and play 
with them and know that when they grab a toy--like my 6-month-
old, she is at that age where everything goes in her mouth--
know that consumer protection is something that we take for 
granted in this country. We didn't do that back in the day 
maybe decades ago, but at least today I know that there is a 
99.999 percent chance that that toy is not going to hurt my 
little granddaughter.
    Speaking of children, under the CCPA businesses are 
supposed to provide an opt-in mechanism for children 16 and 
under to allow companies to sell their personal information as 
defined by the CCPA. How do they know whether the children are 
16 and under, under any system?
    Ms. O'Connor. Well, that is such a great point because it 
requires more authentication and more knowledge in order to 
know who your consumer is. I think you have identified one of 
the very compelling gaps in our coverage right now, the above 
COPPA but below majority age group in our country. I have 
several of those people living in my house right now and they 
are a challenging age on the internet to say the least. And it 
certainly bears consideration of what we should do going 
forward to consider whether COPPA is working adequately and 
what to do with that in-between age group.
    Mr. Cardenas. What is the mechanism to get parental consent 
for children under 13?
    Ms. O'Connor. It is somewhat complicated and requires 
several steps of the parent self-authenticating and providing 
phone numbers or email addresses or the like. I seem to do this 
every single day on my computer for my youngest child. But it 
still is fraught with some peril that the child may be 
providing inaccurate information or that the data may be used 
in a way that is unanticipated by the parent or the child.
    Mr. Cardenas. Under the Federal law COPPA companies must 
obtain parental consent before collecting personal information 
online from children under the age of 13. How do companies 
verify parental consent and how does the FTC enforce this?
    Ms. O'Connor. The parent often has to respond to an email 
verifying that they are the parent or that they have 
authorization. The FTC has taken some cases and I think there 
is concern in the marketplace about whether the enforcement 
mechanisms have really fully grasped the complexity of the 
issue both in the online world and as you point out in the 
Internet of Things world.
    Mr. Cardenas. What seems to be the logic or the history on 
the difference between a 12-year-old and a 13-year-old, and why 
is that the cutoff point?
    Ms. O'Connor. I am sorry. I can't speak to the legislative 
history on why that number. It certainly is one that bears a 
relevance in a number of cultural traditions. But I think we 
all know that one 13-year-old is not the same as another in 
many households and there is a large age group between again 13 
and 18 that we should be thinking about as well.
    Mr. Cardenas. How do we expect a 13-year-old to wade 
through this without parental consent, or somebody, an adult, 
helping them?
    Ms. O'Connor. I totally agree. I think kids, teenagers, and 
grownups in this country deserve greater supports and 
protections around their personal data online and off.
    Mr. Cardenas. I think it would be naive for us to believe 
that there isn't a motivation out there, with the largest 
corporations in the world getting more dominant and larger, for 
them to look at our children as consumers. If you look at the 
consumer power of a teenager and a 20-some-year-old and a 30-
some-year-old, et cetera, there is tremendous motivation for 
individuals to abuse the information of our children. And I 
think it is important that--thank you
for the confidence that you gave that you believe that Congress 
is actually going to pass something. I hope that we do. Thank 
you for that confidence. I yield back.
    Ms. Schakowsky. And now I yield 5 minutes to Mr. Gianforte.
    Mr. Gianforte. Thank you. And, first, I would like to thank 
the chairwoman and ranking member for welcoming me to this 
committee. Thank you. I look forward to serving and I am 
encouraged by the conversation today. I think there is some 
good bipartisan common ground here to find solutions.
    The internet has removed geographic barriers in many of 
our rural areas that previously prevented small companies in 
towns from competing globally. Concerns about data misuse are 
warranted, but creating an overburdensome regulatory 
environment would have devastating effects for this coming new 
prosperity we are seeing in rural America.
    I think we all agree and we have heard it in the testimony 
today that consumer data must be secured and that we need more 
transparency and accountability in all of our practices and we 
need a national standard. Our job is to find a balance between 
overly prescriptive laws like GDPR and a patchwork of 50 
different laws in different States. Trying to comply with 
either would devastate small businesses, as we have heard in 
the testimony today, while increasing market share for some of 
the largest companies we see, and this is what has caused the 
concern.
    The burdensome top down approach taken by GDPR can stifle 
innovation and lead to less information simply because it is 
too costly to comply. It is imperative then we adopt one 
national standard and that clearly defines the responsibilities 
of consumers and businesses and I think we have unanimity on 
the panel today, so I appreciate that. Consumer concerns over 
their data can be attributed back to a lack of transparency and 
misunderstanding of how their information is being collected 
and used. Bad actors should be punished. We have seen many of 
them pursued by the FTC and also through the loss of consumer 
confidence.
    The market tends to enter in here. In our internet business 
my wife and I started in our home, over 15 years it grew to one 
of the top 100 websites in the world. We had about 8 million 
consumers a day and we were entrusted with the data for nearly 
2,000 organizations around the world. Protecting customer data 
was paramount in our business. We knew that the safety of our 
customers' data which we protected in the cloud was the key to 
continued viability of our business. The stakes and the 
consequences could not have been higher. We had to protect our 
customer data or face going out of business. It is difficult to 
regulate a dynamic industry and hastily rushing to draft 
legislation could have more unintended consequences than 
solutions. We have seen that in GDPR and in the California 
regs. As debate over consumer protection continues we should 
pursue one national standard that increases transparency and 
accountability while protecting small business and innovation.
    I have a couple of questions. Dr. Layton, with all of this 
in mind and in light of the light regulatory touch we have 
taken in the U.S., historically, can you please discuss what 
you believe are the best way to guard against entrenching 
larger companies and disadvantaging smaller business?
    Dr. Layton. Well, in two words, permissionless innovation. 
I mean, I think that that has been one of the most important 
things about our economy, was that we allowed companies to try. 
Just as you, yourself, you didn't have to--I doubt that you 
went to Washington and said, ``May I try this website?'' and 
you just got going.
    Mr. Gianforte. Yes. OK, thank you.
    And, Mr. Grimaldi, we heard from Ms. O'Connor and her 
litany of 260 applications--very impressive--and the 
intractability of complying with them all. And in your 
testimony I thought it was very helpful you recommended moving 
from these disclosures and checkboxes to prohibited practices. 
Can you give us a couple of examples of prohibited practices 
that you would put on that list if we were to draft legislation 
with that approach?
    Mr. Grimaldi. Sure. Thank you, Congressman. I think Ms. 
Collins-Dexter has an unbelievable list in her testimony. 
Eligibility, improper targeting because of eligibility, and 
discrimination, the use of sensitive information which would 
need to be defined, we have spoken a lot about it today that 
consumers don't anticipate and would never want to share and 
would never want to be used. I would say even if it is 
anonymized and not linked to their personal data, along the 
lines of healthcare providers or addresses, et cetera. I think 
that is all important.
    Mr. Gianforte. Do we need to differentiate between the 
types of data that is being collected and how would you suggest 
we do that?
    Mr. Grimaldi. Absolutely. I think that is--again, Europe 
should not dictate what our national law should be. I don't 
think one State should either. I think this body and the Senate 
is the best representation of what consumer sentiment is around 
these issues. My industry needs trust or else we don't have 
people logging on to our websites, we don't have people 
clicking on our ads. The whole internet economy is built on 
that. These are the things, these are the important 
conversations.
    Mr. Gianforte. OK, thank you. I want to thank the panel for 
your testimony today. It is very helpful. And with that I yield 
back.
    Ms. Schakowsky. And now a belated happy birthday, and I 
call on Mr. Soto for 5 minutes.
    Mr. Soto. Thank you, Madam Chairwoman. I believe most 
Americans have a basic understanding that their personal data 
is being used, but there are certain expectations of privacy 
that I think are reasonable for users to be able to have 
throughout the United States that their personal data be kept 
secure and not be stolen in a cyber breach, that their health 
data be protected so that it couldn't just be acquired without 
their permission, or that we avoid a society where government 
monitors all of our data in some Big Brother-type of situation 
that we are seeing now in China and in Russia.
    You know, we have heard some complaints about States 
getting involved in this and the Supreme Court has gotten 
involved in it, which I will get into in a second. Really, the 
internet is a part of interstate commerce, but it is this 
committee's lack of action in legislating that has created this 
vacuum for States to act.
    First, I want to just point out that the Supreme Court has 
already stated we have some right to privacy for our personal 
data.
In the recent Carpenter v. United States case, they at least 
applied the Fourth Amendment to say that government cannot get 
personal data from our cell phones without a warrant and I 
wouldn't be surprised by a 5-4 majority or more that that is 
extended to other rights. So the Supreme Court is already 
acting. States have already stepped up.
    There has been a lot of talk, first, about a duty of care. 
That has mostly been in the purview of academia, but it is 
something that we ought to consider: cybersecurity protections, 
proper use of data consistent with disclosures, and handling 
requests and complaints for use of data. A second big issue we 
saw Delaware tackle by requiring privacy policies to be 
conspicuously available on websites. I don't think that is much 
to ask since we have that for a lot of contracts.
    And then, thirdly, is really sort of the big question on 
privacy in general. California passed the Consumer Privacy Act 
of 2018 where there is a right to request businesses to 
disclose data collected, right to request businesses delete 
personal information, and then a right to opt-out without being 
discriminated against. And I think that is the multitrillion-
dollar question in the room today and that is where I want to 
start by asking our panel.
    Starting with Ms. O'Connor, do you think that you should be 
able to opt out of these sites' ability to collect data without 
being discriminated against, basically denied use of service?
    Ms. O'Connor. Certainly. And as I mentioned before, there 
is a primary purpose and a primary data collection for the 
transaction. So to send me the book or the yellow sweater you 
have to know my address, but I do think individual consumers 
deserve more, not only agency but control over their data and 
the data lifecycle to access, correct, and delete data if they 
want to as well.
    Mr. Soto. Thank you for your input.
    And, Ms. Collins-Dexter, do you think you should be able to 
opt out without discrimination?
    Ms. Collins-Dexter. Yes. I think opt-in forces--well, 
rather, I think when you set an opt-in framework it forces 
companies to make the case for why data is needed for desired 
use and why consumers should consent to that. I think, however, 
even in an opt-in framework, I think as we have heard examples 
over the day, companies will do all sorts of tricky things to 
get consumers to consent to things that they want to do.
    And so I think legislation has to really move beyond a 
choice framework and really focus on prohibiting harmful use of 
data, establishing baseline norms and obligations such as data 
minimization and purpose limitation.
    Mr. Soto. Thank you.
    And turning to innovation on this aspect, Ms. Zheng, do you 
think it would be a viable alternative that people can charge a 
user fee should they want to opt out of data collection? Would 
that still embrace the kind of innovation that you have been 
talking about?
    Ms. Zheng. Thank you for that question. I think if the 
companies choose to do that or choose to adopt that approach 
that would make sense, but I am not sure that mandating it in 
statute would make any sense. It would certainly hurt 
innovation.
    Mr. Soto. And, Mr. Grimaldi, on this sort of choice should 
you be able to opt out without discrimination or would it be 
appropriate to potentially charge the user fee in the 
alternative or deny a service altogether?
    Mr. Grimaldi. Thanks, Congressman Soto, a couple things. We 
see that not in terms of shopping data or data for other uses, 
but in terms of the value exchange: if you want to access a 
certain subscription website and view their content, you have 
to pay a fee. That is that value exchange.
    To your question of should you be able to opt out and not 
receive those services, I think that is another thing that 
needs serious contemplation, because I don't think a one-size-
fits-all approach would work here, just in terms of that being 
a defined right and the massive disruption that could cause to 
websites large and small--Google, Amazon, a small yogurt shop. If 
you opt out of giving your data, can those companies survive? 
Are they monetizing it in a way that a consumer knows about 
that, has that policy in their face, or the opt-out mechanism 
in their face? We supply that, as I mentioned earlier, via a 
large multistakeholder regime.
    So there are tools out there. Could they be stronger? I 
think that is a great question.
    Mr. Soto. Thanks. My time has expired.
    Ms. Schakowsky. Now I am happy to yield to Congresswoman 
Matsui.
    Ms. Matsui. Thank you very much, Madam Chair. And I want to 
thank the panel for being here today. This has been a very 
enlightening discussion. And I just want to make a comment 
about the elephant in the room, although I don't really regard 
it that way. As you can tell I am from California and there has 
been a lot of comment about the California law.
    But may I just say about California there has not been much 
action on the Federal front, we all know that. And California 
being California with its myriad of businesses both big and 
small and its diversity, we have rural areas, urban areas, and 
suburban areas and it is not something that--we are not a small 
State, we have a myriad of opinions. And we are also a very 
innovative State, the home of many of the large companies that 
actually testified last spring.
    So I will just tell you this. I know Mr. Grimaldi said he 
is already working with the State of California, and I think 
that is really very important. But I must also say it is 
something to be considered that California is a State that is 
large enough to really be able to enact a law but 
also to bring in many of the stakeholders too. So that is my 
piece on California.
    I want to talk about advertising. Advertising supported 
models generate revenue through user provided data. Many 
platforms have broad statements that claim what is yours is 
yours, you own your content. I appreciate that. But I want to 
understand more about that. To me that means users ought to 
have some say about if, how, and when it is used.
    But online platforms have an evolving set of rules for how 
partners can interact with the user content and how the 
platform may modify or adapt this content as it is distributed. 
The hearings this committee has held demonstrate that the real 
crux of the issue is how content is used and modified to 
develop assumptions and inferences about users to better target 
ads to the individual.
    I want to ask, how should a Federal privacy law ensure 
consumers have a meaningful say about how their data is used, 
even when that data is modified or used to develop inferences, 
supplemented by additional data or otherwise? And I will start 
with you, Ms. O'Connor.
    Ms. O'Connor. Thank you so much for that question. We would 
believe that there should be limitations on the secondary use 
of data that you have provided for a particular service and 
obviously transparency around the operations of the company and 
their intended use. I think your question gets to the heart of 
the matter, which is that individuals do not want to be 
discriminated against online or offline and they want to know how 
decisions that are being made about them are affecting their 
daily lives.
    So we would absolutely want to look at issues of 
discrimination again in the online-offline world based on the 
data that is collected and allow the individual greater agency 
and control over that data.
    Ms. Matsui. Thank you.
    Now it has been noted that advertising is less concerned 
with identifying the individual, per se, than with the activity 
of the users to predict and infer consumer behavior. But I 
wonder if that is becoming a distinction without a difference. 
Even when user content isn't associated with that user's name, 
precise information can be and is gathered through metadata 
associated with messages or tweets. For instance, online 
platforms often generate geospatial metadata by parsing 
messages for location names of interest, including nicknames. 
This metadata could then be associated with other publicly 
available social media data to re-identify individuals.
    Ms. O'Connor or Mr. Grimaldi, so even though advertising 
itself may not be concerned with identifying the individual, in 
the context of a Federal privacy law, how do we ensure data 
is not being used by others to do so?
    Mr. Grimaldi, first.
    Mr. Grimaldi. Sure. Thank you, Ms. Matsui. And I think that 
those are very important questions that a potential, new, 
strong oversight regime would contemplate. A number of folks 
have mentioned the Federal Trade Commission. They have brought 
500 cases or more on issues around these types. And while they 
are incredibly capable and very strong, they don't have the 
resources right now, I think, that would allow them to play a 
role in a massive part of the American economy.
    So I think that that is up for discussion as to whether or 
not a new paradigm, the one that we are contemplating could 
bring new oversight and new enforcement and that is part of 
what we are discussing now. A moment ago I think it was Mr. 
Soto or Mr. Cardenas mentioned the jurisprudence in the past 
around these issues. And I think it would--I was a staffer on 
this committee long after the 1996 act was passed and there was 
much discussion about why that was never updated, why there was 
never momentum behind the effort to update it. And I think 
it is because getting in the way of innovation and getting in 
the way of consumers enjoying what they want and the services 
they are provided is a sticky thing. But in terms of more 
oversight and new powers to protect consumers, I think we are 
at a place right now where we need to seriously think about 
that and make it happen.
    Ms. Matsui. OK, thank you. I am out of time. I yield back.
    Ms. Schakowsky. And next, also from California, Congressman 
McNerney.
    Mr. McNerney. There are a lot of us from California. Thank 
you.
    Ms. Matsui. Big State.
    Mr. McNerney. Thank you. I want to thank the witnesses for 
your perspectives on this. It is an important subject and it is 
complicated. It is not something you can get your hands around 
easily, so thank you very much.
    My first question goes to all the witnesses and please just 
answer yes or no. Is it important that any law that we draft be 
able to adapt to technological innovation and advancements over 
time? Starting with Ms. Collins.
    Ms. Collins-Dexter. Yes.
    Dr. Layton. Yes.
    Ms. Zheng. Absolutely, yes.
    Mr. Grimaldi. Yes.
    Ms. O'Connor. Yes.
    Mr. McNerney. Unanimous. Well, that makes my point.
    In order for comprehensive privacy laws created by this 
slow-moving Congress to meet the current challenges and to be 
able to adapt to new circumstances, I believe it is critical 
that we give the FTC APA rulemaking authority for privacy and 
data security. I have called for this over time and I expect to 
see that in our policy.
    My next question will go to Ms. Collins-Dexter. When 
Facebook CEO testified before this committee I asked him if I 
could download all of my data that Facebook had and he said an 
unqualified yes. And then later in the hearing after being 
advised by his staff that that wasn't correct he corrected his 
statement. Now, Ms. Collins-Dexter, if a CEO of a major company 
that deals in data, that is their business, isn't sure what 
data they make available to its users, can we have any 
confidence at all that these companies will actually make their 
data available to users when requested?
    Ms. Collins-Dexter. No, we can't.
    Mr. McNerney. Well, good. And clearly it is important that 
the comprehensive data privacy legislation grant consumers the 
right to access their data and to correct it if it is wrong.
    You are not raising your hand to make a statement, I don't 
think.
    Dr. Layton. No, I agree.
    Mr. McNerney. Thank you.
    Again Ms. Collins-Dexter, can you explain the risks that 
location tracking poses for low-income Americans like so many 
of my constituents?
    Ms. Collins-Dexter. Yes. I also, if I may, want to sort of 
take us back again. I think there has been a lot of 
conversation around patchwork legislation. And while there are 
certainly issues with GDPR, there are improvements to be made 
to the California legislation.
    I think one thing that came up in the testimony with Mark 
Zuckerberg that we should identify as really part of the issue 
here is tech monopolies and how they are consolidating power. 
And so I really think that it is important for us to keep that 
in mind even as we are looking at the ways in which they are 
collecting innocuous data points such as geolocation in order 
to ascertain things around race and income, and use that as an 
opportunity to serve predatory payday advertising, junk food 
marketing, and all sorts of harmful advertising targeted at 
communities in different locations.
    Mr. McNerney. Thanks for that comment. Well, I think it is 
important that we limit the use of location data and 
that is something that I will be working with Members 
across the aisle on.
    Again, Ms. Collins-Dexter, in your written testimony you 
mention that algorithms work as a kind of black box to drive 
exclusionary practices and that we need to ensure fairness in 
automated decisions. What do you think are 
some of the challenges that companies face in this today?
    Ms. Collins-Dexter. Yes. I think part of what we are 
looking at or thinking about is this proposition of kind of 
garbage in, garbage out, right. And so I think there are a lot 
of presumptions that algorithms can't be biased or that tech is 
neutral. And what we find is that a long history of systemic 
inequities is actually being fed in through data points and 
then replicating models of discrimination free from 
accountability.
    And so I think, you know, one of the things that we want to 
look at is the algorithmic distribution of advertisements 
related explicitly to education, employment, and housing 
opportunities, the algorithmic distribution of political 
advertisements and communications, and algorithmic 
determinations of product prices and same-day shipping. These 
are examples of some of the areas where I think we need to see 
more intelligence and information.
    Mr. McNerney. Thank you.
    Finally, Ms. O'Connor, I am worried about data security as 
well as data privacy. Would you agree with that?
    Ms. O'Connor. Yes, sir.
    Mr. McNerney. What is the relationship between privacy and 
security?
    Ms. O'Connor. They are inextricably linked. They are two 
sides of the same coin. In our draft proposal we copy some of 
Congresswoman Schakowsky's language about thresholds and best 
practices and it is an essential part of a privacy program for 
any company large or small.
    Mr. McNerney. Thank you. And I just want to say I was 
shocked by your earlier statement, Ms. Collins-Dexter, that 
discriminatory technology to identify ethnicity is lucrative. 
In other words, it is a lucrative technology used nefariously. 
Thank you. I yield back.
    Ms. Schakowsky. And now Mr. O'Halleran for 5 minutes, you 
are recognized.
    Mr. O'Halleran. Thank you, Madam Chair. And I thank the 
witnesses who are appearing before us today.
    You know, I am all for a national policy, but it has to be 
balanced. And it has to be balanced for the good of the people 
of America and their privacy. We have to recognize that not 
only are these changing times, but the speed at which 
technology is changing has to be taken into account. I am a 
former investigator and I have to tell you, I would love to be 
an investigator in these times, because information that used 
to take me maybe a month to get I could now get in minutes.
    So we have to be very concerned about these issues. And 
this is a national dialogue on how to enhance the data privacy 
of consumers. This is a debate that it is important not only to 
the people in my district in Arizona, but the American people. 
I have to kind of thank California and thank Europe for 
pushing us. Do I necessarily agree with what they want to do? 
No. But do I think it has allowed us to be pushed in the right 
direction in a timely fashion? Yes, though we should have done 
this much sooner.
    As members of this committee across the aisle, we must take 
seriously our duty to closely examine how to ensure consumer 
privacy remains protected in today's increasingly connected 
global economy.
    Ms. Zheng, as you know my rural district in Arizona is home 
to many small businesses who constantly strive to compete in a 
modernizing economy and internet ecosystem. Under current law, 
the Federal Trade Commission serves as the primary enforcer for 
internet privacy as prescribed by the FTC Act. Taking into 
consideration the FTC's mandate to combat unfair and deceptive 
trade practices against consumers, 
what privacy framework do you see as striking the right balance 
between protecting the rights of consumers and helping ensure 
regulatory certainty for small businesses?
    Ms. Zheng. Thank you for that question, Congressman. I 
would note that in a number of laws as well as legislative 
proposals, lawmakers have contemplated an exception for small 
or medium-sized businesses. I assume that is something that 
this body will also contemplate. You know, as the Business 
Roundtable we do represent large American companies, but many 
of our companies do business with small companies as their 
clients or as their suppliers so we certainly care about the 
well-being of the small business community.
    I think, you know, there are different types of thresholds 
you could look to in considering a possible small business 
exception including potentially the number of records held or 
the annual revenue. But I am not certain that the Business 
Roundtable is really the best organization to pontificate on 
what specifically that threshold ought to be.
    Mr. O'Halleran. And probably the reason for my question is 
because I want to see that there is a protection for businesses 
across the entire spectrum, not just for those with large 
business concerns.
    Ms. O'Connor, in your testimony you state that existing 
privacy regimes rely too heavily on the concept of notice and 
consent which you state place an untenable burden on consumers. 
As we all know, consumers often overlook the extremely dense 
language--here I am--in user agreements and simply accept in 
order to use internet applications and services.
    Under any new consumer privacy statute how could privacy 
notices be simplified for consumers, whether they are 
technological experts or novices, to better and more 
meaningfully understand how their information is being stored, 
used, and, if applicable, shared after accepting privacy 
agreements? And I will say I believe the chairwoman was correct 
in her stack, it is probably a much bigger stack. And we have 
to design something that works for the American people. Please.
    Ms. O'Connor. Thank you, sir. That is exactly right. The 
number of hours and the number of words we would all have to 
read on a daily or weekly or monthly basis to stay up-to-date 
on the choices we are making online and off about how our data 
flows are staggering and overwhelming to any busy consumer. I 
think there should be things that are in bounds, again for the 
furtherance of the transaction, so the primary purpose of the 
deal.
    There should be things that are simply out of bounds like 
taking biometrics for purposes that are far afield from the 
primary purpose of the transaction, and then you could limit 
notices to that middle ground of things that are less clear but 
that consumers might want that are related to the transactions 
that they have at hand or their relationship with the company. 
They definitely need to be shorter, clearer, and more to the 
point. But notice and choice alone do not get us where we need 
to go.
    Mr. O'Halleran. Thank you, and I yield. Thank you, Madam 
Chair.
    Ms. Schakowsky. Now I am happy to yield to my colleague 
from Illinois, Mr. Rush.
    Mr. Rush. I certainly want to thank you, Madam Chair, and I 
want to thank all the witnesses who have appeared before this 
subcommittee today. I chaired this subcommittee back in 2007. I 
introduced a data bill back in 2007, and we are still here 
today discussing data and data security and a data bill. And I 
hope that under the current chairman we are able to finally 
come up with a bipartisan bill that will pass Congress and that 
the President will sign. I certainly look 
forward to it and I have been pretty patient about it.
    I reintroduced my data protection, data privacy bill, H.R. 
1282, that had one provision that dealt with this specter of 
data brokers. And I just wanted to know am I off-base, Ms. 
Collins? Am I off-base trying to rein in this specter of data 
brokers? How big is that problem and as it relates to 
protection of consumers' data?
    Ms. Collins-Dexter. Yes. I think that you are right to be 
concerned. I think there is like so much work we have to do. I 
think one of the things that I tried to articulate in my 
comments, which I think is super important, is that 50 years 
ago as a country we made a sort of social, legislative, and 
legal contract that certain things would no longer be accepted 
in our society. Kids being turned away from 
Woolworth's counter was not acceptable. People hanging signs 
that said no Jews, dogs or blacks allowed were no longer 
acceptable. And we didn't throw our hands up at that time and 
say don't go to that restaurant, right. We took an ethical and 
moral stance.
    And not just that, it was about knowing that if we could 
compete globally and thrive economically we had to ensure that 
we had more taxpaying members of our community, more people 
able to have opportunity and be economically mobile. And so 
part of what we are looking at with this like privacy 
legislation is basically looking at stopping Jim Crow online. 
It is simply about looking at our online activities and 
ensuring that those same laws that we created 50 years ago to 
prevent discrimination apply to what we do online.
    Mr. Rush. Thank you.
    Ms. O'Connor, what should we do to regulate data brokers?
    Ms. O'Connor. Thank you, sir. And I think underpinning so 
many of the questions today is the issue of opaque or 
surreptitious surveillance or data collection. And that is the 
position again, and I just want to associate myself with Ms. 
Collins-Dexter because she is so right that these are issues of 
fairness, of transparency, of accountability, and of equality 
for all Americans.
    Data brokers really came up because of the Fair Housing 
Act, the Equal Credit Opportunity Act, and the fundamentals of 
providing fair credit to all Americans. They served a purpose 
at that time. Right now the opaque and surreptitious behind-the-
scenes data collection by third parties that Americans do not 
understand is fundamentally untenable going forward.
    And I think the CEO of one of those companies is actually 
directly across the hall right now, so maybe we could go ask 
him some of these questions. But they do serve a purpose. And 
to the previous comments, we need reform, we need transparency, 
and we need greater control and accountability over these third 
parties.
    Mr. Rush. In your testimony you discuss how the CDT's draft 
legislation--and I quote--``would direct the FTC to promulgate 
rules addressing unfair advertising practices, particularly 
those that result in unlawful discrimination in violation of 
civil rights law.'' Describe for this committee what these 
rules should look like.
    Ms. O'Connor. There are good laws on the books as we all 
know about unfair discrimination and what that looks like in 
the offline world. However, intimate and immutable and real-
time decisions can be made about us in the online world even 
prior to knowing who we are based on inferences, based on 
patterns of surfing and habits. We would simply want to make 
sure that each individual's world view is not prescribed and 
limited by judgments that are made about them by companies that 
they are not aware of. That a child in one part of the country 
is not seeing ads for educational opportunities or a grownup is 
not seeing credit opportunities that another person is being 
served based on judgments companies are making about them 
without their knowledge.
    Mr. Rush. Thank you, Madam Chair. I yield back.
    Ms. Schakowsky. Now it is my pleasure--last but not least--
to call on Representative Kelly, also from Illinois.
    Ms. Kelly. Madam Chair, Illinois is holding it down for you 
or with you. Thank you, Madam Chair, for holding this hearing 
today.
    As we have heard, repeated news stories about breaches and 
data collection malpractice have shown that it is time for 
Federal privacy legislation. As the founder of the Tech 
Accountability Caucus, I want to follow up on the discussion of 
use limitations.
    Ms. O'Connor, in your testimony you discuss two buckets of 
use limitations, the first of which you refer to as unfair data 
practices. The CDT draft legislation prohibits secondary uses 
of certain sensitive data like biometric information and health 
information. Can you clarify something for me? Other than the 
specific exceptions listed, is it your intention in the draft 
that these seven unfair categories are just not permitted?
    Ms. O'Connor. That is correct, ma'am, that the secondary 
use of those categories of data would not be permitted. Each 
individual would have to enter into a separate contract or 
agreement for a separate service or a separate device.
    Ms. Kelly. I know we talked during this hearing about 
opting in and all of that, but a company cannot even seek opt-
in consent for those uses; is that correct?
    Ms. O'Connor. It would have to be an entirely separate 
transaction. That is right.
    Ms. Kelly. OK. How did you decide the types of data that 
necessitated the extra protections?
    Ms. O'Connor. The Center for Democracy & Technology has 
worked on this over the last several years, and we have stood 
in favor of omnibus Federal privacy legislation for the entire 
25 years of CDT's existence. But we have re-energized this 
debate internally, worked with academics across this country 
and really around the world, with business partners and other 
advocates in civil society, and looked at the research and the 
consumer polling in this area, and that is where we ended up 
with the list that we created.
    Ms. Kelly. OK, thank you. And to the panel, are there 
certain types of data that shouldn't be collected or used at 
all? We can just run down from Ms. Collins-Dexter.
    Ms. Collins-Dexter. Yes, I think there are certain pieces 
of personal identifying data, geolocation, things like that, 
that should not be collected and kept in use.
    Ms. Kelly. Dr. Layton? Just your opinion, are there any 
types of data that shouldn't be used at all or collected?
    Ms. Zheng. Thank you, Congresswoman, for that question. I 
think the question deserves a little bit of nuance. What we are 
talking about here is whether there is data that deserves an 
opt-in consent standard, and I think the answer to that is 
likely yes. For example, with precise geolocation data, the 
FTC's current guidance is that you acquire opt-in consent.
    What the Business Roundtable proposal recognizes is that 
there are sensitive categories of data that do absolutely 
deserve heightened protections and obligations including 
potentially opt-in consent.
    Ms. Kelly. Thank you.
    Mr. Grimaldi. Congresswoman, I would chime in by saying 
that in order for the entire online ecosystem to work there has 
to be data to render a website, to provide services, et cetera. 
And so in addition to the prohibited pieces that we have heard 
today that we all agree on, we should ask how we expand that 
list to include other things in the marketplace that, as my co-
panelists have mentioned, are getting such blowback or are on 
their face too personal, too off limits, to be used by our 
companies or by other companies. I think that is important. And 
we need to make sure that the value consumers are getting from 
their online experience can still be reaped even as we expand 
that list, and we would love to work with you on that.
    Dr. Layton. Congresswoman, I just wanted to come back. I 
didn't want to take a position on this because I actually know 
of important health and academic studies whose data, under 
today's circumstances under the GDPR, could not be collected. 
But data that was collected in the past is being used today to 
reach very important conclusions on health questions. So I just 
want to add a note of caution. I understand that we have these 
concerns, but we don't necessarily know how the data may be 
available in the future.
    So I would tend to fall on the side of where we can 
identify that it is sensitive and have a higher standard, but 
not necessarily to outlaw it altogether. I am just concerned 
about the future because I have seen these studies that, you 
know, going forward we won't be able to do these important 
health outcome studies in the EU.
    Ms. Kelly. OK, thank you. Anything else? I will yield back 
the balance of my time. Thank you.
    Ms. Schakowsky. So, in closing, first let me request 
unanimous consent to enter the following documents into the 
record: 1) Public Citizen Framework for Privacy and Digital 
Rights for All; 2) a letter from the Americans for Prosperity; 
3) a letter from Computer and Communications Industry 
Association; 4) a letter from the ACLU and 42 other civil 
rights organizations; 5) a letter from Main Street Association; 
6) a letter from Consumer Technology Association; 7) Engine 
consumer privacy comments; 8) letter from Engine; 9) a letter 
from American Bankers Association; 10) the NRF letter; 11) NRF 
comments; 12) Electronic Transactions Association letter; 13) 
21st Century Privacy Coalition letter; 14) ACA International 
letter; 15) Representative Eshoo's opening statement for the 
record. You can see the broad spread of interest.
    I want to thank our ranking member, the staff that worked 
so hard on all of this, thank you, and especially our witnesses 
for your participation today in this very first hearing of the 
session dealing with this issue of data privacy which is 
clearly going to go forward. I encourage you to also keep in 
touch as we move forward. We welcome your input.
    I remind Members that pursuant to committee rules they have 
10 business days to submit additional questions for the record 
to be answered by the witnesses who have appeared. I ask each 
witness to respond promptly to any such requests that you may 
receive.
    Oh, there is more. OK. So we will have a letter from the 
American Action Forum to put in the record, a letter from the 
Council for Citizens Against Government Waste, a letter from 
consumer tech--oh, I see--a letter from the Coalition for 
Secure Transparent Internet, a letter from R Street Institute, 
a letter from United States Chamber of Commerce, a letter from 
Digital Liberty, a letter from the Internet Association, DOJ 
Cyber Digital Task Force, a letter from Google.
    Is that it? There is more? OK, a lot of interest. OK. 
Still, I had the Public Citizen, I think. But Public Citizen 
Framework for Privacy and Digital Rights for All, the 
Electronic Transaction Association letter, the letter from the 
National Association of Mutual Insurance Companies, a letter 
from Information Technology and Innovation Foundation, and 
along with the others I ask unanimous consent to put these in 
the record. So ordered.
    [The information appears at the conclusion of the 
hearing.]\1\
---------------------------------------------------------------------------
    \1\ The Information Technology and Innovation Foundation letter has 
been retained in committee files and also is available at https://
docs.house.gov/Committee/Calendar/ByEvent.aspx?EventID=108942.
---------------------------------------------------------------------------
    Ms. Schakowsky. And now, I think, at this time the 
subcommittee is adjourned.
    [Whereupon, at 12:51 p.m., the subcommittee was adjourned.]
    [Material submitted for inclusion in the record follows:]

                Prepared statement of Hon. Anna G. Eshoo

    I thank Chairwoman Jan Schakowsky for holding today's 
hearing and for allowing me to waive on to the Subcommittee on 
Consumer Protection and Commerce for this hearing.
    Three important events set the table for our debate about 
online privacy. In March 2018, we learned that Cambridge 
Analytica abused Facebook data to harm our democracy. In May 
2018, the European Union's General Data Protection Regulation 
went into effect. And in June 2018, then-Governor Jerry Brown 
signed into law the California Consumer Privacy Act. These 
three events have created the context within which I'm hopeful 
that Congress may be able to pass privacy legislation to 
protect all Americans. We should keep the lessons of each of 
these events in mind as we debate any privacy legislation.
    I have long called for protecting users' privacy online, 
and I reiterate my commitment to ensuring Congress passes 
strong and enforceable privacy legislation. However, not all 
privacy proposals are equal. Strengthening disclosures and 
simply expanding our ``notice and consent'' regime would be 
woefully insufficient for protecting users' privacy. We must 
shift the burden of privacy away from consumers who do not--and 
could not possibly--read hundreds of privacy policies that each 
run thousands of words long. A Federal law should require that 
companies minimize collection of personal data, give users 
access to and control of their data, eliminate problematic 
types of third-party data exchange, and institute safeguards to 
secure user data.
    Further, too many people are calling for preemption when we 
haven't even agreed on the contours of what the law should 
include. As Congress debates national privacy standards, it 
should take care not to undermine California's groundbreaking 
privacy law. Instead, Congress should pass baseline privacy 
protections that bring the same--or stronger--safeguards to all 
Americans.
    I represent much of Silicon Valley, and yes that includes 
some of the large tech companies that are at the center of the 
problems privacy legislation aims to solve. I also represent a 
thriving startup ecosystem. In my district, Y Combinator, the 
most successful startup accelerator in the world, has funded 
nearly 2,000 startups since 2005. These startups should be seen 
as part of the solution. Congress should consider proposals, 
such as data portability, that support privacy by encouraging 
competition.
    Nearly every stakeholder is calling for a Federal privacy 
law. I'm hopeful that now is the time we will be able to pass 
something that truly protects Americans online.

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
