[Senate Hearing 112-857]
[From the U.S. Government Publishing Office]


                                                        S. Hrg. 112-857
 
 PROTECTING MOBILE PRIVACY: YOUR SMARTPHONES, TABLETS, CELL PHONES AND 
                              YOUR PRIVACY 

=======================================================================

                                HEARING

                               before the

                        SUBCOMMITTEE ON PRIVACY,
                         TECHNOLOGY AND THE LAW

                                 of the

                       COMMITTEE ON THE JUDICIARY
                          UNITED STATES SENATE

                      ONE HUNDRED TWELFTH CONGRESS

                             FIRST SESSION

                               ----------                              

                              MAY 10, 2011

                               ----------                              

                          Serial No. J-112-19

                               ----------                              

         Printed for the use of the Committee on the Judiciary


                         U.S. GOVERNMENT PRINTING OFFICE 

86-775 PDF                       WASHINGTON : 2011 

  For sale by the Superintendent of Documents, U.S. Government Printing 
   Office Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800; 
        DC area (202) 512-1800 Fax: (202) 512-2104 Mail: Stop IDCC, 
                          Washington, DC 20402-0001 



                       COMMITTEE ON THE JUDICIARY

                  PATRICK J. LEAHY, Vermont, Chairman
HERB KOHL, Wisconsin                 CHUCK GRASSLEY, Iowa
DIANNE FEINSTEIN, California         ORRIN G. HATCH, Utah
CHUCK SCHUMER, New York              JON KYL, Arizona
DICK DURBIN, Illinois                JEFF SESSIONS, Alabama
SHELDON WHITEHOUSE, Rhode Island     LINDSEY GRAHAM, South Carolina
AMY KLOBUCHAR, Minnesota             JOHN CORNYN, Texas
AL FRANKEN, Minnesota                MICHAEL S. LEE, Utah
CHRISTOPHER A. COONS, Delaware       TOM COBURN, Oklahoma
RICHARD BLUMENTHAL, Connecticut
            Bruce A. Cohen, Chief Counsel and Staff Director
        Kolan Davis, Republican Chief Counsel and Staff Director
                                 ------                                

            Subcommittee on Privacy, Technology and the Law

                    AL FRANKEN, Minnesota, Chairman
CHUCK SCHUMER, New York              TOM COBURN, Oklahoma
SHELDON WHITEHOUSE, Rhode Island     ORRIN G. HATCH, Utah
RICHARD BLUMENTHAL, Connecticut      LINDSEY GRAHAM, South Carolina
                Alvaro Bedoya, Democratic Chief Counsel
               Elizabeth Hays, Republican General Counsel


                            C O N T E N T S

                              ----------                              

                    STATEMENTS OF COMMITTEE MEMBERS

                                                                   Page

Witness List.....................................................    49
Franken, Hon. Al, a U.S. Senator from the State of Minnesota.....     1
Leahy, Hon. Patrick J., a U.S. Senator from the State of Vermont.     1
    prepared statement...........................................    51
Coburn, Hon. Tom, a U.S. Senator from the State of Oklahoma.......     5

                               WITNESSES

Rich, Jessica, Deputy Director, Bureau of Consumer Protection, 
  Federal Trade Commission, Washington, DC.......................     6
    prepared statement...........................................    54
Weinstein, Jason, Deputy Assistant Attorney General, Criminal 
  Division, U.S. Department of Justice, Washington, DC...........     8
    prepared statement...........................................    66
Soltani, Ashkan, Independent Privacy Researcher and Consultant, 
  Washington, DC.................................................    21
    prepared statement...........................................    99
Brookman, Justin, Director, Project on Consumer Privacy, Center 
  for Democracy and Technology, Washington, DC...................    23
    prepared statement...........................................    80
Tribble, Guy ``Bud,'' M.D., Ph.D., Vice President of Software, 
  Technology, Apple Inc., Cupertino, California..................    25
    prepared statement...........................................   112
Davidson, Alan, Director of Public Policy, Google Inc., 
  Washington, DC.................................................    27
    prepared statement...........................................    90
Zuck, Jonathan, President, The Association for Competitive 
  Technology, Washington, DC.....................................    28
    prepared statement...........................................   125

 QUESTIONS FROM HON. AL FRANKEN, HON. RICHARD BLUMENTHAL, AND HON. TOM 
                                 COBURN

Questions from Hon. Al Franken to Alan Davidson and Guy ``Bud'' 
  Tribble........................................................   143
Questions from Hon. Richard Blumenthal to Justin Brookman, Alan 
  Davidson, Ashkan Soltani, and Guy ``Bud'' Tribble..............   146
Questions from Hon. Tom Coburn to Alan Davidson and Guy ``Bud'' 
  Tribble........................................................   156

                         QUESTIONS AND ANSWERS

Responses of Justin Brookman to questions submitted by Senator 
  Blumenthal.....................................................   158
Responses of Alan Davidson to questions submitted by Senators 
  Blumenthal, Coburn and Franken.................................   163
Responses of Ashkan Soltani to questions submitted by Senator 
  Blumenthal.....................................................   180
Responses of Jessica Rich to questions submitted by Senator 
  Coburn.........................................................   182
Responses of Guy ``Bud'' Tribble to questions submitted by 
  Senators Franken, Coburn and Blumenthal........................   185

                MISCELLANEOUS SUBMISSIONS FOR THE RECORD

Baker, James A., Associate Deputy Attorney General, Department of 
  Justice, Washington, DC, statement.............................   210
Franken, Hon. Al, a U.S. Senator from the State of Minnesota, and 
  Hon. Richard Blumenthal, a U.S. Senator from the State of 
  Connecticut, joint letter (April 12, 2011).....................   223
Franken, Hon. Al, a U.S. Senator from the State of Minnesota: 
  Letter to Mr. Steve Jobs (Apple; April 20, 2011)...............   224
American Civil Liberties Union (ACLU), Laura W. Murphy, Director, 
  Washington Legislative Office; Christopher Calabrese, 
  Legislative Counsel, Washington Legislative Office and 
  Catherine Crump, Staff Attorney, Speech, Privacy and Technology 
  Project, Washington, DC, statement.............................   226
Additional Documents from Hon. Al Franken, Incorporated by 
  Reference into the Record......................................   237
arstechnica.com, Chris Foresman, article: ``Android phones keep 
  location cache, too, but it's harder to access,'' May 17, 2011.   238
Apple App Store Review Guidelines for Hon. Al Franken............   240
Apple's July 12, 2010, letter to the Hon. Edward J. Markey and 
  the Hon. Joe Barton from Bruce Sewell, General Counsel and 
  Senior Vice President of Legal and Government Affairs..........   257
Apple's May 6, 2011, letter to the Hon. Al Franken from Bruce 
  Sewell, General Counsel and Senior Vice President of Legal and 
  Government Affairs.............................................   270
cnet.com, Declan McCullagh, April 22, 2011, article: ``Android 
  data tied to users? Some say yes''.............................   281
Department of Justice, Prosecuting Computer Crimes, Michael 
  Battle, Director, EOUSA and Michael W. Bailie, Director, OLE, 
  reports........................................................   285
Department of Justice, Cybercrime Manual.........................   310
Bureau of Justice Statistics Special Report: ``Stalking 
  Victimization in the United States,'' by Katrina Baum, Ph.D.; 
  Shannan Catalano, Ph.D.; and Michael Rand, January 2009........   334
zwillgenblog.com, April 27, 2011 article: ``Are Smartphones 
  Making Stakeouts a Thing of the Past?''........................   350
Wall Street Journal, WSJ.com, April 5, 2011, article: ``Mobile-
  App Makers Face U.S. Privacy Investigation''...................   352
National Center for Victims of Crime, Mai Fernandez, Executive 
  Director, Washington, DC, statement............................   355
Wall Street Journal, WSJ.com, November 19, 2010, article: 
  ``Insurers Test Data Profiles to Identify Risky Clients''......   377
IPhone Software Agreement........................................   382
Wall Street Journal, WSJ.com, April 25, 2011, article: ``IPhone 
  Stored Location in Test Even if Disabled''.....................   387
Oreilly.com, April 27, 2011, article: ``Got an iPhone or 3G iPad? 
  Apple Is Recording Your Moves''................................   389
Department of Justice, Office of Legislative Affairs, Lanny A. 
  Breuer, Assistant Attorney General, Washington, DC, May 9, 
  2011, letter to Hon. Al Franken................................   391
Levinson, Alex; article posted April 21, 2011 at wordpress.com: 
  ``3 Major Issues with the Latest iPhone Tracking `Discovery'''.   394
Pcmag.com, April 27, 2011, article: ``Most Mobile Apps Lack 
  Privacy Policies: Study''......................................   399
National Network to End Domestic Violence with the Minnesota 
  Coalition for Battered Women, Washington, DC, statement........   401
nielsen.com, April 21, 2011, article: ``Privacy Please! U.S. 
  Smartphone App Users Concerned with Privacy When It Comes to 
  Location''.....................................................   408
Wall Street Journal, WSJ.com, December 19, 2010, article: ``How 
  One App Sees Location Without Asking''.........................   413
Wall Street Journal, WSJ.com, August 3, 2010, article: ``Stalkers 
  Exploit Cellphone GPS''........................................   415
Washington Post, Washingtonpost.com, May 8, 2011, article: 
  ``Parting with Privacy with a Quick Click''....................   421
Google; patent application publication by Youssef, et al.........   425
Wired.com, April 25, 2011, article: ``iPhone's Location-Data 
  Collection Can't Be Turned Off''...............................   486
Wall Street Journal, WSJ.com, April 26, 2011, article ``The 
  Unique ID Android Uses in Collecting Location''................   489
Wall Street Journal, WSJ.com, April 22, 2011, article: ``Apple, 
  Google Collect User Data''.....................................   490
Wall Street Journal, WSJ.com, December 17, 2010, article: ``Your 
  Apps Are Watching You''........................................   494

                 ADDITIONAL SUBMISSIONS FOR THE RECORD

Submissions for the record not printed due to voluminous nature, 
  previously printed by an agency of the Federal Government, or 
  other criteria determined by the Committee, list:..............   500

http://info.publicintelligence.net/GoogleWiFiSpy.pdf.............   500


 PROTECTING MOBILE PRIVACY: YOUR SMARTPHONES, TABLETS, CELL PHONES AND 
                              YOUR PRIVACY

                              ----------                              


                         TUESDAY, MAY 10, 2011

                                       U.S. Senate,
          Subcommittee on Privacy, Technology, and the Law,
                                Committee on the Judiciary,
                                                    Washington, DC.
    The Subcommittee met, pursuant to notice, at 10:08 a.m., in 
Room SD-226, Dirksen Senate Office Building, Hon. Al Franken, 
Chairman of the Subcommittee, presiding.
    Present: Senators Franken, Leahy, Schumer, Whitehouse, 
Blumenthal, and Coburn.

 OPENING STATEMENT OF HON. AL FRANKEN, A U.S. SENATOR FROM THE 
                       STATE OF MINNESOTA

    Senator Franken. This hearing will come to order, and it is 
my pleasure to welcome all of you to the first hearing of the 
Senate Judiciary Subcommittee on Privacy, Technology, and the 
Law. I am sorry that everyone was not able to get into the 
room, into the hearing room, but we are streaming live on C-
SPAN, thankfully, and we thank C-SPAN for that.
    I would like to turn it over to Chairman Leahy and thank 
you, sir, for creating this Subcommittee and giving me the 
opportunity to lead it.
    The Chairman has a long track record on protecting privacy, 
and I am honored to join him in this effort.
    Mr. Chairman.

  STATEMENT OF HON. PATRICK J. LEAHY, A U.S. SENATOR FROM THE 
                        STATE OF VERMONT

    Chairman Leahy. Well, thank you, Senator Franken, and I 
want to commend you for holding what is a very timely hearing 
on the privacy implications of smartphones and other mobile 
applications.
    This is actually the first hearing for the new Subcommittee 
on Privacy, Technology, and the Law, and so I thank Senator 
Franken for his dedicated leadership on consumer privacy issues 
as Chairman of the Subcommittee. And I thank Dr. Coburn for his 
commitment to such issues, too, and I appreciate the both of 
them working together on this.
    Throughout the three decades I have been in the Senate, I 
have worked to safeguard the privacy rights of all Americans. 
Ensuring that our Federal privacy laws accomplish this goal--
while at the same time addressing the needs of both law 
enforcement and America's vital technology industry--has been 
one of my highest priorities as Chairman of the Senate 
Judiciary Committee. That is why I decided to establish this 
new Privacy Subcommittee and was delighted when Senator Franken 
said he would be willing to chair it. It is also why I am 
working to update the Electronic Communications Privacy Act--
ECPA.
    Now, the digital age can do some wonderful, wonderful 
things for all of us, but at the same time, American consumers 
and businesses face threats to privacy like no time before. 
With the explosion of new technologies, such as social 
networking sites, smartphones, and other mobile applications, 
there are, of course, many new benefits to consumers. But there 
are also many new risks to their privacy.
    Like many Americans, and certainly in Vermont where we 
cherish our privacy, I am deeply concerned about the recent 
reports that the Apple iPhone, Google Android phone, and other 
mobile applications may be collecting, storing, and tracking 
user location data without the user's consent. I am also 
concerned about reports that this sensitive location 
information may be maintained in an unencrypted format, making 
the information vulnerable to cyber thieves and other 
criminals.
    In an interview this morning, I heard somebody from the 
industry speaking about how this can be a very valuable thing 
to them, being able to sell information to various industries 
for advertising purposes and the amount of money they may make 
on that. Of course, they are charging the consumer for the use 
of the phones, and they will then make money from that. When I 
raised that point, they said they can make them aware of 
products that might be in the location they go. I said, 
``Great, we all love to get a whole lot more unsolicited ads.'' 
So it is more of a one-way street, I think.
    A recent survey commissioned by the privacy firm TRUSTe 
found that 38 percent of American smartphone users surveyed 
identified privacy as their No. 1 concern with using mobile 
applications.
    And they have good reason to be concerned. The collection, 
the use, and the storage of location and other sensitive 
personal information has serious implications regarding the 
privacy rights and personal safety of American consumers.
    This hearing provides a good opportunity for us to talk 
about this and examine these pressing privacy issues and to 
learn more about it. I am pleased that representatives from the 
Department of Justice and the Federal Trade Commission are here 
to discuss the administration's views on the privacy 
implications. I am also pleased that representatives from 
Google and Apple will address the privacy implications of their 
smartphones, their tablets, and other mobile applications.
    And I welcome the bipartisan support on the Committee for 
examining these important consumer privacy issues, and I look 
forward to a productive discussion.
    Again, Senator Franken and Senator Coburn, I thank you both 
for holding this hearing.
    Senator Franken. Well, thank you again, Mr. Chairman, for 
this opportunity. I really want to just express my pleasure in 
working with the Ranking Member of this Committee, Senator 
Coburn, and thank you for your friendship and for working on 
these critical issues.
    Now, before we turn to the business of today's hearing, I 
want to take a moment to explain what I think the Subcommittee 
is about and where we are headed. To me, this Subcommittee is 
about addressing a fundamental shift that we have seen in the 
past 40 or 50 years in who has our information and what they 
are doing with it.
    When I was growing up, when people talked about protecting 
their privacy, they talked about protecting it from the 
Government. They talked about unreasonable searches and 
seizures, about keeping the Government out of our families, out 
of our bedrooms. They talked about ``is the Government trying 
to keep tabs on the books I read and the rallies I attend.''
    We still have to protect ourselves from Government abuses, 
and that is a big part of the digital privacy debate. But now 
we also have relationships with large corporations that are 
obtaining and storing increasingly large amounts of our 
information. And we have seen the growth of this whole other 
sphere of private entities whose entire purpose is to collect 
and aggregate information about each of us.
    While we are familiar with some of these entities, the 
average person is not remotely aware of most of them. I bet 
that two months ago if you stopped a hundred people on the 
street and asked them, ``Have you ever heard of Epsilon? '' one 
hundred of them would have said no. I certainly had not. But 
suddenly, when people started getting emails in their box 
telling them, ``Your information has been compromised,'' you 
bet they wanted to know who Epsilon was.
    Now, do not get me wrong. The existence of this business 
model is not a bad thing. In fact, it is usually a great thing. 
I love that I can use Google Maps--for free, no less--and the 
same for the app on my iPad that tells me the weather. But I 
think there is a balance we need to strike, and this means we 
are beginning to change the way we think about privacy to 
account for the massive shift of our personal information into 
the hands of the private sector, because the Fourth Amendment 
does not apply to corporations; the Freedom of Information Act 
does not apply to Silicon Valley. And while businesses may do a 
lot of things better than the Government, our Government is at 
least, by definition, directly accountable to the American 
people.
    Let me put it this way: If it came out that the DMV was 
creating a detailed file on every single trip you had taken in 
the past year, do you think they could go one whole week 
without answering a single question from a reporter?
    Now, this is not a new trend, and I am hardly the first 
person to notice it. Twenty-five years ago, a Senator named 
Patrick Leahy wrote and passed a law called the Electronic 
Communications Privacy Act, which talked a lot about government 
but which also contained commercial disclosure provisions. In 
1996, Congress passed a law protecting the privacy of medical 
records. In 1998, we passed a law protecting children's 
privacy, and in 1999, we passed a law protecting financial 
records. So we have some protections here and there, but we are 
not even close to protecting all of the information that we 
need to.
    I believe that consumers have a fundamental right to know 
what data is being collected about them. I also believe they 
have a right to decide whether they want to share that 
information and with whom they want to share it and when. I 
think we have those rights for all of our personal information.
    My goal for this Subcommittee is to help Members understand 
the benefits and privacy implications of new technology, to 
educate the public, to raise awareness, and, if necessary, to 
legislate and make sure that our privacy protections are 
keeping up with our technology.
    Now, today in this hearing we are looking at a specific 
kind of really sensitive information that I do not think we are 
doing enough to protect, and that is data from mobile devices: 
smartphones, tablets, and cell phones. This technology gives us 
incredible benefits. Let me say that. Let me repeat that. This 
technology gives us incredible benefits. It allows parents to 
see their kids and wish them good night even when they are 
halfway around the world. It allows a lost driver to get 
directions, and it allows emergency responders to locate a 
crash victim in a matter of seconds.
    But the same information that allows those responders to 
locate us when we are in trouble is not necessarily information 
all of us want to share all the time with the entire world. And 
yet reports suggest that the information on our mobile devices 
is not being protected in the way that it should be.
    In December, an investigation by the Wall Street Journal 
into 101 popular apps for iPhone and Android smartphones found 
that 47 of those apps transmitted the smartphones' location to 
third-party companies, and that most of them did this without 
their user's consent.
    Three weeks ago, security researchers discovered that 
iPhones and iPads running Apple's latest operating system were 
gathering information about users' locations up to a hundred 
times a day and storing that information on the phone or tablet 
and copying it to every computer that the device is synced to.
    Soon after that, the American public also learned that both 
iPhones and Android phones were automatically collecting 
certain location information from users' phones and sending it 
back to Apple and Google, even when people were not using 
locating applications.
    In each of these cases, most users had no idea what was 
happening, and in many of these cases, once users learned about 
it, they had no way to stop it. These breaches of privacy can 
have real consequences for real people.
    A Justice Department report based on 2006 data shows that 
each year over 26,000 adults are stalked through the use of GPS 
devices, including GPS devices on mobile phones. That is from 
2006 when there were a third as many smartphones as there are 
today. And when I sent a letter to Apple to ask the company 
about its logging of users' locations, the first group to reach 
out to my office was the Minnesota Coalition for Battered 
Women. They asked, ``How can we help? Because we see case after 
case where a stalker or an abusive spouse has used the 
technology on mobile phones to stalk or harass their victims.''
    But it is not just stalking. I think today's hearing will 
show that there is a range of harms that can come from privacy 
breaches, and there is also the simple fact that Americans want 
stronger protections for this information.
    But as I have started to look into these issues in greater 
depth, I have realized that our Federal laws do far too little 
to protect this information. Prosecutors bringing cases under 
the Federal anti-hacking law often rely on breaches of privacy 
policy to make their case, but many mobile apps do not have 
privacy policies, and some policies are so long and complicated 
that they are almost universally dismissed before being read.
    In fact, once the maker of a mobile app, a company like 
Apple or Google or even your wireless company, gets your 
location information, in many cases under current Federal law 
these companies are free to disclose your location information 
and other sensitive information to almost anyone they please 
without letting you know. And then the companies they share 
your information with can share and sell it to yet others--
again, without letting you know.
    This is a problem. It is a serious problem. And I think 
that is something the American people should be aware of, and I 
think it is a problem we should be looking at.
    Before I turn it over to the distinguished Ranking Member, 
I just wanted to be clear that the answer to this problem is 
not ending location-based services. No one up here wants to 
stop Apple or Google from producing their products or doing the 
incredible things that you do. And I thank you for testifying. 
You guys are brilliant. When people think of the word 
``brilliant,'' they think of the people that founded and run 
your companies. No. What today is about is trying to find a 
balance between all of those wonderful benefits and the 
public's right to privacy. And I, for one, think that is 
doable.
    Now I will turn the floor over to my friend, the Ranking 
Member, Senator Coburn, for his opening remarks.
    [The prepared statement of Senator Leahy appears as a 
submission for the record.]

STATEMENT OF HON. TOM COBURN, A U.S. SENATOR FROM THE STATE OF 
                            OKLAHOMA

    Senator Coburn. Thank you, Mr. Chairman. I will be brief. I 
just wanted you to know, that weather app that you have on your 
phone sends me the location of all the meetings you attend, so 
just be forewarned.
    Senator Franken. That makes me very frightened.
    [Laughter.]
    Senator Coburn. I will thank our witnesses for being here 
today, both our government witnesses and our outside witnesses. 
Transparency in what we do in government and outside of 
government, when it is not fiduciary and when it is not 
proprietary, is important for the American people, as is the 
issue of privacy. And rather than making the decision on what 
needs to change, I think we need a whole lot more information 
and knowledge in terms of those of us on the legislative side 
before we come to conclusions about what should be or needs to 
be done.
    So I am looking forward to our witnesses' testimony, and 
with that, I will shorten this up and rather would hear from 
our witnesses rather than to continue to propound from the 
dais.
    Senator Franken. Thank you. I think we will begin our first 
panel now, and I want to introduce them.
    We have Jessica Rich. She is Deputy Director of the Bureau 
of Consumer Protection at the Federal Trade Commission. She has 
served as an Assistant Director in the Federal Trade 
Commission's Bureau of Consumer Protection since 1998, first in 
the Division of Financial Practices and now in the Division of 
Privacy and Identity Protection. She previously served as legal 
adviser to the Director of the Bureau of Consumer Protection. 
She received her law degree from New York University and her 
undergraduate degree from Harvard University.
    Jason Weinstein is the Deputy Assistant Attorney General 
for the Criminal Division of the U.S. Department of Justice. 
Before joining the Criminal Division, Mr. Weinstein served as 
the Chief of the Violent Crimes Section in the U.S. Attorney's 
Office for the District of Maryland. He was also an Assistant 
U.S. Attorney in the U.S. Attorney's Office for the Southern 
District of New York. Mr. Weinstein attended Princeton 
University and George Washington University Law School, and I 
understand that your wife is very pregnant and that you may 
have to leave during your testimony or during Ms. Rich's 
testimony, and as Chairman, that will be fine if you have to 
leave.
    [Laughter.]
    Senator Franken. Ms. Rich.

STATEMENT OF JESSICA RICH, DEPUTY DIRECTOR, BUREAU OF CONSUMER 
      PROTECTION, FEDERAL TRADE COMMISSION, WASHINGTON, DC

    Ms. Rich. Chairman Franken, Ranking Member Coburn, Chairman 
Leahy, and Members of the Subcommittee--let me turn on the 
microphone. That would help.
    Senator Franken. Yes.
    Ms. Rich. I am Jessica Rich, Deputy Director of the Federal 
Trade Commission's Bureau of Consumer Protection. I appreciate 
this opportunity to present the Commission's testimony on 
mobile privacy.
    The FTC is the Nation's consumer protection agency, and 
privacy has been an important component of our mission for 40 
years. During this time, the Commission has employed a variety 
of strategies to protect consumer privacy, including law 
enforcement, regulation, outreach to consumers and businesses, 
and policy initiatives. Just as we have protected consumer 
privacy in the brick-and-mortar marketplace, on the phones, on 
email, on mail, and on the Internet, we are committed to 
protecting privacy in the rapidly growing mobile arena.
    To ensure the Commission staff has the technical and 
practical ability to engage in law enforcement and inform 
policy development in the mobile space, the Commission has 
hired technologists to work as FTC staff. The agency also has 
created a mobile lab with numerous smartphone devices on 
various platforms and carriers as well as software and other 
equipment to collect and preserve evidence. In addition, 
Commission staff have explored the key mobile consumer 
protection issues through workshops and reports.
    What is clear from our work in this area is that the rapid 
growth of mobile products and services creates many 
opportunities for consumers, but also raises serious privacy 
concerns. These concerns stem from the always-on, always-with-
you personal nature of mobile devices; the invisible collection 
and sharing of data with multiple parties; the ability to track 
consumers, including children and teens, to their precise 
location; and the difficulty of providing meaningful 
disclosures and choices about data collection on the small 
screen.
    Law enforcement is, of course, critical to our consumer 
protection mission. The FTC's primary law enforcement tool, the 
FTC Act, prohibits unfair or deceptive practices. This law 
applies regardless of whether a company is marketing offline, 
through your desktop or telephone, or using a mobile device.
    In the Commission's testimony, we described four recent FTC 
cases brought under the FTC Act that address practices in the 
mobile arena. Two of these cases against two of the largest 
players in the mobile ecosystem, Google and Twitter, highlight 
the FTC's efforts to challenge deceptive claims that undermine 
consumers' choices about how their information is shared with 
third parties.
    In Google, the Commission alleged that the company deceived 
consumers by using information collected from Gmail users to 
generate and populate a new social network, Google Buzz. The 
Commission's proposed settlement contains strong injunctive 
relief, including independent audits of Google's privacy 
policies and procedures lasting 20 years, that protects the 
privacy of all Google customers, including mobile users.
    In Twitter, the Commission charged that serious lapses in 
the company's data security allowed hackers to take over 
Twitter's accounts and gain access to users' private tweets as 
well as their non-public mobile phone numbers. As in Google, 
the Commission's order protects data that Twitter collects 
through mobile devices and requires independent audits of 
Twitter's practices in this case for 10 years. If either 
company violates its order, the Commission may obtain civil 
penalties of $16,000 per violation.
    Similarly, in our ongoing Phil Flora litigation, the 
Commission obtained a temporary restraining order against a 
defendant who allegedly sent five million unsolicited text 
messages to the mobile phones of U.S. consumers. And in the 
Reverb case, the Commission alleged that a public relations 
company planted deceptive endorsements of gaming applications 
in the iTunes mobile app store.
    The Commission's public law enforcement presence in the 
mobile arena is still at a relatively early stage, but we are 
moving forward rapidly and devoting resources to keep pace with 
developing technologies. Commission staff have a number of 
mobile investigations in the pipeline, including investigations 
related to children's privacy on mobile devices. I anticipate 
that many of these investigations will be completed in the next 
few months, and any complaints or public statements will be 
posted on our website, FTC.gov.
    I want to emphasize that while the mobile arena presents 
new methods of data collection and new technologies, many of 
the privacy concerns build on those the FTC has been dealing 
with for 40 years. At bottom, it is all about ensuring that 
consumers understand and can control data collection and 
sharing and that their data does not fall into the wrong hands. 
The FTC has the authority, experience, and strong commitment to 
tackle these issues.
    In closing, the Commission is committed to protecting 
consumer privacy in the mobile sphere through law enforcement 
and by working with industry and consumer groups to develop 
workable solutions that protect consumers while allowing 
innovation. I am happy to answer any questions.
    [The prepared statement of Ms. Rich appears as a submission 
for the record.]
    Senator Franken. Thank you, Ms. Rich.
    Mr. Weinstein.

    STATEMENT OF JASON WEINSTEIN, DEPUTY ASSISTANT ATTORNEY 
    GENERAL, CRIMINAL DIVISION, U.S. DEPARTMENT OF JUSTICE, 
                         WASHINGTON, DC

    Mr. Weinstein. Thank you, Mr. Chairman. I have asked the 
baby to stay put until after about 11:30, which will probably 
be the last time it ever listens to anything I say.
    Good morning, Chairman Franken, Ranking Member Coburn, and 
Members of the Subcommittee, and I thank you for the 
opportunity to be here today.
    Over the last decade, we have witnessed an explosion of 
mobile computing technology. From laptops and cell phones to 
tablets and smartphones, Americans are using more mobile 
computing devices, more extensively, than ever before. We can 
now bank and shop and conduct business and socialize remotely 
with our friends and loved ones instantly almost anywhere. And 
now more than ever, the world is almost literally at our 
fingertips.
    But in ways that we do not often think about, what we say 
and write and do with these mobile devices can be open to the 
world. And as the use of mobile devices continues to grow, 
these devices are increasingly tempting targets for identity 
thieves and other criminals.
    So as these devices increase our connectivity, our 
productivity, and our efficiency, they also pose potential 
threats to our safety and our privacy, and those threats fall 
into at least three very different categories.
    The first category is the threats posed by cyber criminals, 
identity thieves, cyber stalkers, and other criminals who seek 
to misuse the information that is stored in or generated by our 
mobile devices to facilitate their crimes. From around the 
corner or around the globe, skilled hackers work every single 
day to access the computer systems and the mobile devices of 
government agencies, universities, banks, merchants, and credit 
card companies to steal large volumes of personal information, 
to steal intellectual property, and to perpetrate large-scale 
data breaches that leave tens of millions of Americans at risk 
of identity theft.
    In addition, some of these cyber criminals seek to infect 
the computers in our homes and our businesses with malicious 
code to make them part of a botnet, a network of compromised 
computers under the remote command and control of a criminal or 
a foreign adversary who can capture every keystroke, every 
mouse click, every password, credit card number, and email that 
we send.
    Smartphones and tablets are, in a very real sense, mobile 
computers, and the line between mobile devices and personal 
computers is shrinking every day. So these devices provide yet 
another computing platform for cyber criminals to target for 
botnets and infection by malicious code.
    Unfortunately, Americans who are using infected computers 
and mobile devices are suffering from an extensive, pervasive 
invasion of their privacy at the hands of these criminals 
almost every single time they turn on their computers. One of 
the Department of Justice's core missions is protecting the 
privacy of Americans and prosecuting the criminals who threaten 
and violate that privacy. Through the dedication and skill of 
our prosecutors and our agents, we have had a number of major 
enforcement successes, including most recently the operation in 
Connecticut to successfully disrupt the Coreflood botnet, which 
was believed to have infected over two million computers 
worldwide.
    As mobile devices become more prevalent and as they store 
more and more personal information about their users, we should 
expect that they will be increasingly targeted by criminals. It 
is critical, therefore, that law enforcement has the necessary 
tools to investigate and to prosecute those crimes, which are 
crimes against the privacy of all Americans.
    The second category of threats to our privacy comes from 
the collection and disclosure of location information and other 
personal information by the providers themselves, including app 
providers. These situations may or may not be appropriate for 
criminal investigation and prosecution. It all depends on the 
circumstances. Some may best be addressed through regulatory 
action. And as we evaluate these matters, we must carefully 
consider the clarity and the scope of privacy policies and 
other user agreements that govern the relationship between 
providers and their customers.
    The third category of threats comes from criminals who use 
mobile devices to facilitate all sorts of their own crimes, 
from traditional cyber crimes like identity theft to violent 
crimes like kidnapping and murder. As technology evolves, it is 
critical that law enforcement be able to keep pace. Law 
enforcement must be able to get the data it needs to 
investigate and prosecute these crimes successfully and to 
identify the perpetrators--what we used to call ``putting 
fingers at the keyboard,'' and which I guess we should now call 
``putting fingers on the touchpad.''
    This kind of identification is already a challenge in cases 
involving more traditional computers where data critical to 
investigations of cyber criminals and child predators and 
terrorists and other malicious actors has too often been 
deleted by providers before law enforcement can obtain it 
through a lawful process. That challenge is even greater in 
cases involving mobile devices. Although we increasingly 
encounter suspects who use their smartphones and tablets just 
as they would a computer, many wireless providers do not 
maintain the records necessary to trace an IP address back to a 
suspect's smartphone. Those records are an absolutely necessary 
link in the investigative chain that leads to the 
identification of a particular suspect.
    I thank you for the opportunity, Mr. Chairman, to discuss 
some of the challenges the Department sees on the horizon as 
Americans' use of smartphones and tablets continues to grow and 
how the Department works every day to protect the privacy of 
users of computers and mobile devices. We look forward at the 
Department of Justice to continuing to work with the Congress 
as it considers these issues, and I would be pleased to answer 
your questions.
    [The prepared statement of Mr. Weinstein appears as a 
submission for the record.]
    Senator Franken. Thank you. Thank you both.
    Ms. Rich, in the FTC's December 2010 Consumer Privacy 
Report, the Commission states that certain kinds of information 
are so sensitive that before any of this data is collected, 
used, or shared, companies should seek ``express affirmative 
consent'' from a customer. You identify four categories of data 
that are this sensitive: information about children, financial 
information, medical information, and precise geolocation data.
    First of all, why does the FTC think that before a company 
gets or shares your location information, they should go out of 
their way to get your consent?
    Ms. Rich. We identified those four categories because 
misuse of that kind of data can have real consequences for 
consumers. So in the case of location data, as you and your 
colleagues mentioned, if it falls into the wrong hands, it can 
be used for stalking. Teens and 
children have a lot of mobile devices, and so we are often 
talking about teen and children information and their location.
    Location can tell you not just where a person is at a 
particular time. If it is collected over time, you can also 
know what church somebody has gone to, what political meeting 
they have gone to, when and where they walk to and from school. 
So that is sensitive data that requires special protection.
    Senator Franken. Thank you.
    Mr. Weinstein, let me ask you a related question. When I 
use my smartphone, a lot of people can and do get a hold of my 
location: my wireless company, companies like Apple and Google, 
as well as the mobile apps that I have on my phone. My 
understanding, Mr. Weinstein, is that in a variety of cases 
under current Federal law, each of those entities may be free 
to disclose my location to almost anyone that they please 
without my knowing it and without my consent. Is that right?
    Mr. Weinstein. That is right, Mr. Chairman. The statute you 
made reference to, ECPA, which Chairman Leahy wrote 25 years 
ago, does apply in those instances in which it covers the 
provider--and whether it covers a provider is a separate 
question. It places a great deal of restrictions on the ability 
of providers to share that information with the Government, but 
virtually no legal restriction on providers' ability to share 
it with other third parties.
    There may be specific types of restrictions if you are 
talking about data other than location, like health care data, 
that may be covered by other particular privacy laws. But if 
you are talking about location data, then there is no legal 
restriction.
    If the company is not covered by ECPA, that is, it is not 
considered to be an electronic communications service provider 
or a provider of remote computing service, then there is no 
restriction at all. The company is free to share it with 
whoever they want.
    Senator Franken. Mr. Weinstein, one of the defining 
features of the mobile market is that you have a lot of 
different entities--app developers, advertisers, companies like 
Apple and Google--that are amassing large amounts of 
information about users.
    Outside of any assurances that they make to their customers 
or the requirements of financial records laws, do the companies 
in this sphere have to meet certain data security standards? In 
other words, what is to prevent them from getting hacked?
    Mr. Weinstein. I am not aware, Mr. Chairman, of any legal 
requirement that a company that is in possession of your 
personal data--whether we are talking about location data or 
financial data or other data about what you do online--secure 
that data in any particular way. My 
understanding is that that is essentially a decision made by 
the company based on its own business practices and its 
assessment of risk.
    This relates to one of the arguments that you often hear 
when we talk about data retention, because there is also no 
requirement that the company retain data for any particular 
length of time, and that often impacts our ability to 
investigate and solve crimes, including crimes that threaten 
privacy. And when we talk to industry and when we talk to 
privacy groups about the need for data retention for some 
reasonable period of time to make sure that law enforcement 
could get the data it needs to protect privacy, what you often 
hear is that if companies are required by law to store that 
data for some length of time, it will put them at greater risk 
of being hacked. And it is an open question, certainly one for 
the Congress to consider, whether, if there were to be a 
requirement for data retention, it would also be appropriate to 
impose some requirement that the data be secured in some way to 
reduce that risk.
    Senator Franken. Thank you, Mr. Weinstein.
    Before I turn to the Ranking Member, I want to introduce a 
few key pieces of testimony into the record.
    First, I want to introduce joint testimony from the 
Minnesota Coalition for Battered Women and the National Network 
to End Domestic Violence, as well as testimony from the 
National Center for Victims of Crime. This testimony lays out 
how law enforcement can use this technology to find stalkers. 
It also cites cases of two Minnesota women who were both 
stalked by their partners through their smartphones. These are 
extreme cases, but I think there is no clearer statement of how 
this technology presents clear benefits but also very clear 
privacy threats, and of how we need to be very careful in this 
space.
    [The prepared statement appears as a submission for the 
record.]
    Senator Franken. Now I would like to turn it over to the 
Ranking Member, Senator Coburn.
    Senator Coburn.
    Senator Coburn. Thank you, Mr. Chairman.
    One comment I would make--I hope after you all testify that 
you will hang around and listen to the second panel. What I 
find is that in Congress a lot of the time we talk past each 
other, and an outside observer watching us talk past each other 
can actually learn something. And I would hope 
that when we hear both sides of this today, it will actually 
accentuate the ability to solve the problems that are in front 
of us.
    I want to thank you for your testimony. I have a question 
directed to both of you, and I would like for you to just 
individually answer it.
    Both of you have demonstrated that under certain laws that 
we have on the books today you can do a lot in terms of 
addressing these privacy issues. My question for you is: In 
your opinion, what else do you need in terms of statute to 
actually facilitate your ability to protect the privacy of 
individuals in this country without diminishing the benefits 
that we are seeing from this technology?
    Ms. Rich. The Commission has not taken a position at this 
point on legislation in this area; however, in the report that 
Senator Franken referred to, we did discuss some key 
protections we think should be applied across industry, 
including in mobile, that we believe would protect privacy 
while also allowing innovation to continue. First, companies 
should have privacy by design, meaning at the very early stages 
of developing their products and services, they need to give 
privacy serious thought so that they develop those products and 
services in a way that maximizes the safety of consumer data. 
That means not collecting more data than is needed, not 
retaining it for longer than is needed, providing security for 
it, and making sure it is accurate. Those things, if 
implemented early, can be done in a way that still permits 
innovation and still permits the business to function.
    Senator Coburn. Can you do that through regulation now? Can 
you make those demands through regulation?
    Ms. Rich. We have used Section 5 of the FTC Act, which 
prohibits unfair or deceptive practices, to bring enforcement 
against companies that do not do those things under certain 
circumstances.
    The second piece is streamlined, easy-to-use choice for 
consumers. Streamlining choice and making it easy for consumers 
would be particularly important on mobile devices where we 
either do not see privacy policies, as was mentioned in the 
Wall Street Journal article, or when we do, it may take a 
hundred clicks to get through the terms of service to find 
them.
    So, we have encouraged the use of icons and other ways to 
make it easier for consumers to exercise choice about things 
like sharing data with third parties.
    Senator Coburn. Like writing in plain English instead of 
lawyerese?
    Ms. Rich. Yes. And then the third piece is, of course, 
greater transparency overall, which means if you do have 
privacy policies, they should be written in a simple way so 
they are easy to compare. Also, potentially a consumer should 
be able to access the data that companies have on them.
    We believe, if implemented, these protections would achieve 
much greater protection for consumers while also allowing 
innovation.
    Senator Coburn. So the question I would have for you is: Do 
you have the ability to implement that now under the FTC 
guidelines?
    Ms. Rich. Some of the policies can be implemented under the 
FTC Act, but some of them are forward-looking policy goals.
    Senator Coburn. Would you mind submitting to the Committee 
which are which so that it can guide us in addressing where we 
think we might need to go?
    Ms. Rich. Yes, we will.
    Senator Coburn. Thank you.
    [The information referred to appears as a submission for 
the record.]
    Senator Coburn. Mr. Weinstein.
    Mr. Weinstein. Senator Coburn, there are four or five 
things that the Justice Department thinks Congress should 
consider in terms of legal changes, but most of them are not 
particular to mobile devices. A few of them are. And the reason 
they are not all specific to mobile devices is that, to put it 
in perspective, the threats you see in terms of cyber crime 
committed on mobile devices are really just new variations on 
old problems. You know, when 
someone puts malware on your computer because they attach it to 
an email, that is a threat to your computer. If someone uses an 
Android app as a delivery system for their malware, that is 
old-school cyber crime committed with new-school technology. 
And so what we need to protect privacy is the same thing we 
need to be able to fight cyber crime generally.
    That being said, number one, there are a number of further 
fixes to 1030, to the Computer Fraud and Abuse Act, even beyond 
those that were contained in the Identity Theft Enforcement and 
Restitution Act in 2008 that we believe are appropriate and 
would strengthen penalties and strengthen deterrence and make 
sure that there are more significant consequences for cyber 
crime. Those we anticipate will be part 
of the cyber security package which I told Senator Whitehouse a 
month ago was imminent, and now it is imminent measured in 
terms of days instead of weeks.
    The second relates to cyber stalking. The cyber stalking 
statute requires currently that the victim and the defendant 
actually be in different States, and that significantly hampers 
our ability to use that statute since, as you know, cyber 
stalkers are people who harass, whether through cyber or other 
means, and are frequently right down the street, not 
necessarily across the State line.
    The third is data retention. Although we do not have a 
specific proposal, we think there is undoubtedly a reasonable 
period of time for which Congress can require providers to 
retain data that would allow us to solve crimes against 
privacy, a period that properly balances the needs of law 
enforcement, the needs of privacy, and the needs of industry.
    The fourth is data breach reporting. You know, every week 
we see a new article in the newspaper about another significant 
data breach, whether it is Sony or Epsilon or RSA, and it 
highlights the fact that, although there are a number of State 
laws, there is no comprehensive Federal legal requirement for 
data breach reporting either to customers or to law 
enforcement.
    The fifth, which is mobile device specific, is the one I 
alluded to in my oral remarks, and that is that among the data 
that is not even maintained, let alone retained, is data that 
would allow us to trace back an IP address to the smartphone 
that was using it at the time that a criminal conversation or 
other criminal conduct occurred.
    The last piece--and then I will stop--is not a particular 
proposal but just something we encourage Congress to consider 
because it relates to privacy generally. As I alluded to a few 
minutes ago, there are significant legal restrictions on a 
provider's ability to share data with law enforcement. There 
are no restrictions, virtually no restrictions, certainly none 
provided by ECPA, on a provider's ability to share that 
information with third parties for any purpose, commercial or 
otherwise. And we think that Congress may wish to consider 
whether ECPA properly strikes the privacy balance between 
consumers and the providers with whom they are engaged in 
commerce.
    Senator Coburn. All right. Thank you very much.
    Thank you, Mr. Chairman.
    Senator Franken. Thank you, Senator Coburn.
    Mr. Chairman.
    Chairman Leahy. Mr. Weinstein, you mentioned ECPA, and I am 
glad you did because I am going to be introducing a bill very 
shortly to update ECPA, the Electronic Communications Privacy 
Act. I think it is a very important Act. Many of us have a 
concern it does not apply to the mobile applications currently 
available, and that can be bad for consumers and also bad for 
law enforcement.
    Let me just point out the privacy requirements in ECPA only 
apply to either electronic communications service providers or 
remote computing service providers. But if Google or Apple or 
other application providers collect data automatically or 
generate data from a smartphone, they might not fall into 
either of the definitions. That would mean 
the government could just step in and obtain location and other 
sensitive information collected without obtaining a search 
warrant. I had mentioned a search warrant situation earlier 
when I spoke, but they might be able to do it without one.
    Does ECPA apply to providers of mobile applications? And if 
not, what are some of the changes we should make?
    Mr. Weinstein. Mr. Chairman, the answer really would be the 
same answer I would give if you asked me not about mobile 
application providers but if you asked me about Verizon or 
Google, or Apple, for that matter. As companies provide a 
broader range of services, a company may be considered a 
provider of electronic communications service for one service 
it provides, remote computing service for another service it 
provides, and neither for some other service it provides. So 
even a company like Verizon is clearly an ECS for its 
communications services. A company like Apple might be an RCS 
for its MobileMe remote back-up service. Google might be an RCS 
for Google Docs, but would be an ECS for Gmail.
    So a mobile app provider could be an ECS or an RCS or 
neither one. A lot of it depends not on the nature of the 
company but on the nature of the particular service. So----
    Chairman Leahy. Well, does that mean we have a gap in ECPA 
and we should be addressing it in the new legislation?
    Mr. Weinstein. I think that as all of these companies 
expand the range of services they provide, there are going to 
be gaps. There are going to be companies, whether more 
traditional companies or newer companies, that provide services 
that do not fall in one of the two categories. And so I do not 
have a particular proposal, but we would certainly be happy to 
work with you to explore where those gaps are and how they 
should be filled.
    Chairman Leahy. In the scenario I suggested, is this 
something where law enforcement could come in and get all this 
information without a search warrant and without going through 
a court?
    Mr. Weinstein. Well, if a company is not covered by ECPA, 
then we can get stored data using a subpoena or other legal 
process. A search warrant would not be required--in most 
instances.
    Chairman Leahy. Now, you mentioned Epsilon and Sony and the 
breach, and as I read more and more about it, what is there is 
more and more frightening. On three occasions, the 
Judiciary Committee has favorably reported my comprehensive 
data privacy and security bill. Among other things it would 
establish a national standard for notifying consumers about 
data breaches involving their personal information, and we will 
try again this Congress to get this passed. If there has been a 
data breach and your information is there, you would not have 
to rely on the good graces of the company that screwed up 
allowing the data breach; they would be required to notify you 
of it.
    How important is it for your Department and other law 
enforcement agencies to be notified of data security breaches 
so that they can look at whether it affects our criminal laws 
and national security? And then I will ask Ms. Rich a similar 
question.
    Mr. Weinstein. It is vital for law enforcement. If we do 
not know about a breach, we cannot investigate it, and if we 
find out about it too late, by the time we find out about it 
and begin investigating, the trail very well may have gone 
cold.
    There are, as I think you know, 46 or 47 State laws that in 
some fashion govern breach reporting, but only a few of them 
require the victim to notify law enforcement. Some of our 
biggest hacking and identity theft cases, a number of which I 
testified about in front of the Crime Subcommittee a month ago, 
were made possible because we got early reporting from the 
victim companies and we got cooperation from the victim 
companies throughout the investigation, and that was critical 
to our ability to follow the trail and find the hackers and 
find the people who stole personal data.
    The two things that law enforcement needs to be able to 
have a shot at making these cases are prompt victim reporting 
and, if there is customer notification, which there certainly 
should be, the opportunity to delay that notification, where 
appropriate, if law enforcement or national security needs 
dictate. But we think that breach reporting is vital to our 
ability to do our jobs, and we anticipate that this imminent 
cyber security package will contain a data breach proposal.
    Chairman Leahy. Ms. Rich.
    Ms. Rich. The FTC has long supported legislation to require 
data breach notification and data security. We play a 
complementary role to the Department of Justice in that they 
pursue the hackers, the malicious folks who get the data, but 
our perspective is it is extremely important to also shore up 
the protections of those companies that have the sensitive 
data. There are always going to be criminals, but it is very 
important that companies secure themselves, so they are not 
easy targets. And we believe legislation requiring notification 
and security is vital to that mission.
    Chairman Leahy. Thank you. And, again, Chairman Franken, I 
thank you for holding this hearing. I think it is extremely 
important. I will go off to some budget matters now, but I 
appreciate your doing this.
    Senator Franken. Please do that. Thank you, Mr. Chairman.
    Senator Blumenthal.
    Senator Blumenthal. Thank you, Senator Franken, for your 
leadership. Again, thank you, Senator Leahy, for your 
championing many of these privacy issues over decades, 
literally, and providing a model of that kind of leadership for 
us. And I want to thank our witnesses for being here, also 
Apple and Google and the consultants that we have, in this 
profoundly important hearing. And whatever challenging 
questions we may ask, I hope that we are all 
on the same side of this cause, because right now what we face, 
in my view, is literally a Wild West so far as the Internet is 
concerned. We can debate the legal niceties and technicalities, 
but the FTC statutes that prohibit unfair and deceptive 
practices simply do not provide the kind of targeted 
enforcement opportunity that I think is absolutely necessary, 
and I know the Department of Justice is going to be seeking 
additional authority, which is absolutely necessary. And just 
one area pertains to young people, children, which we have not 
discussed so far today, but which obviously raises very 
discrete and powerfully important issues.
    And so let me begin with Ms. Rich. Do you think that the 
present statutes sufficiently protect young people, children 
who are 13 and under, when we are talking about marketing, 
locational information, other kinds of privacy issues?
    Ms. Rich. We do have a very strong law, the Children's 
Online Privacy Protection Act, that applies to children 12 and 
under, and we are undertaking a review of that right now. One 
of the reasons we are reviewing the Rule is to see if it is 
keeping up with technology, and we have not reached the end of 
that process. But in a workshop we had on the topic, there was 
a fair amount of agreement from industry and consumer groups 
alike that that statute is sufficiently flexible to cover a lot 
of mobile activities across a broad swath of technologies.
    Senator Blumenthal. And do you agree, Mr. Weinstein?
    Mr. Weinstein. I do. I was thinking this morning I have 
two, soon to be three, little kids, and my three-year-old is 
better with my iPhone than I am. And it is terrifying, 
actually, to think about what kind of online threats will be 
out there by the time he is actually old enough to really be 
using my iPhone with permission.
    So I think that as we move into this space, it is important 
that any legal changes we make be technology neutral to the 
extent possible, and one of the geniuses of ECPA is that it has 
been flexible and adaptable over a period of 25 years as 
technologies change. But I do think that anything the Congress 
can do to protect kids in particular in this space is a worthy 
effort.
    Senator Blumenthal. And let me ask, Ms. Rich, referring to 
your description of privacy by design, in addition to the 
requirement that Senator Leahy is supporting that there be 
notification--and I strongly support that requirement. I think 
it is a basic, fundamental protection--shouldn't there be some 
requirement that companies design and safeguard this 
information when they structure these systems and also 
potentially liability if they fail to sufficiently safeguard 
that information, liability so that we provide incentives for 
companies to do the right thing?
    Ms. Rich. Absolutely. We have brought, using Section 5, 34 
cases against companies that failed to secure data, and we 
believe it is vital to hold companies accountable for that.
    Senator Blumenthal. And what about a private right of 
action?
    Ms. Rich. The Commission has not taken a position on 
legislation or private right of action.
    Senator Blumenthal. Because we had testimony from Professor 
John Savage of Brown University who said to us, and I am 
quoting, ``Computer industry insiders have solutions to many 
cyber security problems, but the incentives to adopt them are 
weak, primarily because security is expensive and there is no 
requirement they be adopted until disaster strikes.''
    Ms. Rich. Let me correct something I just said. The 
Commission has actually taken a position on data security. I 
was a little confused by the question. We strongly support data 
security and data breach legislation, absolutely, which 
includes civil penalties.
    Senator Blumenthal. Thank you.
    My time has expired, and I will be submitting some 
additional questions for the record. Thank you both.
    Senator Franken. Senator Whitehouse.
    Senator Whitehouse. Thank you, Chairman Franken.
    A quick question, and then a slightly longer one. The quick 
question is that both of you have had a chance to look into, 
you might call it, the dark side of the Internet, the dark 
underbelly of the Internet. And you are also people who use it 
and have families who use it, and so you both have the 
experience of the regular American at dealing with the Internet 
and having a certain measure of confidence in it. And you have 
a heightened awareness based on your professional obligations.
    Based on that, how well informed do you believe the average 
American is about the dangers and hazards that lurk out there 
on the Internet? And is this significant in terms of things as 
simple as willingness to download protective patches and get up 
to date with commercial off-the-shelf technology to protect 
yourself, setting aside other responses that the public might 
have if it were more informed? Can you quantify a little bit 
how well informed you think the average American is about these 
risks?
    Ms. Rich. We believe that consumers really have no idea of 
the layers of sharing that go on behind the scenes. So, for 
example, many consumers may like location services, and they 
may want to share their location information in order to obtain 
them. What they do not realize is that their location data as 
well as the device ID may then be flowing to service providers, 
to advertisers, to all sorts of other parties in the chain. And 
we believe that is why, when certain high-profile security 
breaches happen to companies like Epsilon who are service 
providers and behind the scenes, people are so shocked because 
they had no idea their data was there.
    Senator Whitehouse. Mr. Weinstein.
    Mr. Weinstein. You know, I think with the large population 
that we are talking about, I think that there is going to be 
great variation. But I venture to say that--and this is based 
on sort of professional and personal observation--the vast 
majority of people are not as informed as they should be. And, 
in fact, if nothing else comes out of the heightened awareness 
that the Apple and Google media frenzy has created and that 
this Subcommittee's interest has generated, I think it will be 
that people focus more on these issues.
    The fact is that these kinds of situations may or may not 
be criminal enforcement matters, but what they do highlight is 
the need for everybody to be more vigilant. Undoubtedly, 
providers can take steps to make sure that their user 
agreements and their privacy policies are more transparent and 
are easier for the average----
    Senator Whitehouse. Let me jump into that, if you do not 
mind, a little bit.
    Mr. Weinstein. Sure.
    Senator Whitehouse. Earlier in your answer you basically 
set up the traditional dichotomy, if you will, between a 
legitimate communication or application and something that is 
infected with malware and is probably a law enforcement problem 
if it could be discovered.
    We are now in a new area, kind of in between those two, 
where the product might actually be something that the 
subscriber would want. I can imagine a location application 
that told you whenever you were near a particular fast-food 
restaurant so they could ping you and say, ``Come on in for a 
Big Mac,'' or whatever it would be. And that might be something 
that somebody would want. It also might be something that 
somebody would really not want at all, and I think part of the 
concern here is that if you are loading an app, for instance, 
onto a smartphone, you know that you are loading one dimension 
of the app. You do not know what else is being attached onto 
that. And what should the FTC be doing by way of disclosure 
requirements to make sure that when you load an app, whoever 
has put that app on the menu, really, for people to choose 
among has fully disclosed that all of the elements are in it 
and it is not just a Trojan horse to attract you with a 
particular thing when its real purpose is to find out 
information about you to sell to other individuals?
    Where are you in terms of getting that transaction properly 
overseen and with some rules? I guess what you would call 
privacy by design in your earlier statement.
    Ms. Rich. It is a challenge in the mobile sphere because of 
the nature of the small screen, but the FTC has called on 
industry to develop simplified disclosures that are embedded in 
the interaction. So, for example, when you are downloading an 
app and it is going to share the information with third 
parties, it should tell you that there and then, not in some 
privacy policy that will take you a hundred screens to download 
and look at.
    So, I think there needs to be serious work done to improve 
the interaction between these companies and consumers. We also 
think that if it is not necessary to share data with other 
companies for the business model, it should not be happening. 
We have also seen that even when sharing is necessary for the 
business model, instead of sharing the limited slice of 
information that is needed, companies pull information off the 
whole device and share it with third parties. That is why 
privacy by
design is needed.
    Senator Whitehouse. And from your point of view, the Trojan 
horse analogy for some apps is a fair one.
    Ms. Rich. Yes.
    Senator Whitehouse. OK. Thank you.
    Senator Franken. Thank you, Senator Whitehouse.
    I am going to have one more question here for Ms. Rich, and 
the Ranking Member has one more question.
    Ms. Rich, in your testimony--and you were just talking 
about the little screen and signing off on privacy agreements. 
Anyway, in your testimony you emphasize the FTC's ability to 
protect consumers against deceptive trade practices. When an 
iPhone user activates her phone, she has to click and agree
to a 4,144-word software license agreement, and that tells 
users they can withdraw their consent to Apple's collection of 
location information at any time by simply turning off the 
location services button on their phones. I will add a copy of 
that agreement--this is it--to the record.
    [The agreement appears as a submission for the record.]
    Senator Franken. As it turns out, until about a week ago, 
turning off the switch did not stop the collection of location 
information by Apple, so I guess my question is: Ms. Rich, is 
that a deceptive trade practice?
    Ms. Rich. Well, I cannot comment on a specific company's 
practices, but I can say that if a statement is made by a 
company that is false, it is a deceptive practice. Similarly, 
as we have shown in our cases, if there is a misleading 
statement and then some sort of disclaimer in fine print, that 
could be a deceptive practice.
    So there is a lot we could do under our deception authority 
to challenge the types of practices you are talking about, 
although I am not going to comment on a specific company.
    Senator Franken. Thank you.
    Ranking Member.
    Senator Coburn. Mr. Chairman, I just have one comment. I 
think we need to be very careful on this idea of security 
because the greatest example I know is we spend $64 billion a 
year on IT in the Federal Government, and then on top of that, 
we spend tens of billions on security, and we are breached 
daily. So we should not be requesting a standard that we cannot 
even live up to at the Federal Government.
    So the concern is an accurate one, but I think we are going 
to have to work on what that standard would be, whether it is a 
good-faith effort or something. But to say somebody is liable 
for a breach of their security when we all know almost every 
system in the world can be breached today, we need to be 
careful with how far we carry that. And that is all I would 
add.
    Ms. Rich. Can I just address that briefly to say that we 
agree there is no such thing as perfect security, and we have 
always used a reasonableness standard. Many of the types of 
practices that would prevent breaches are things like not 
collecting more data than you need.
    Senator Coburn. I agree.
    Senator Franken. Senator Blumenthal, do you have another 
question?
    Senator Blumenthal. Yes, just to follow up on Senator 
Coburn's observation, as with any kind of liability or 
accountability, legal responsibility, there is a duty of care, 
and that duty of care can impose reasonable measures that 
common sense or technology would provide the means to do. And 
so I guess my question is: Why not some liability to ordinary 
consumers, imposed through Federal law, that would provide 
accountability for a standard of care available under modern 
technology, with a reasonable, sensible approach to 
responsibility?
    Ms. Rich. Yes, Senator, we agree with you. In the data 
security sphere, it is reasonable security. It is having a good 
process that assesses risks and addresses those risks. It is 
not perfection.
    Senator Blumenthal. And why not also require remedies in 
the case of a breach where that kind of accountability is 
imposed, for example, insurance or credit freezes, credit 
monitoring, as a matter of law, so that what is increasingly 
becoming standard practice would be imposed on all companies 
and provide the incentive to do more?
    Ms. Rich. Absolutely. We think that is important both to 
address what has happened to consumers and provide effective 
deterrence.
    Senator Blumenthal. Do you agree, Mr. Weinstein? I know you 
are speaking from outside the consumer protection area, but----
    Mr. Weinstein. Well, I am trying to stay in my lane, but, 
look, I think from a--I will make the general observation, and 
I think this touches on some issues we talked about at the 
hearing last month. There is no perfect system. Cyber security, 
true cyber security, requires sort of a multi-layered approach, 
requires laws that breaches be reported. It undoubtedly 
requires providers to make as much of an effort as they can to 
protect their systems. It requires
some public-private partnership, and I think that some of the 
proposals that will be in this package that you will be 
receiving address that issue. And it requires, I think, better 
work by everybody involved.
    Senator Blumenthal. Well, we look forward to the package 
that we will be receiving in hopefully a very short time. Thank 
you.
    Senator Franken. Thank you, Senator, and I want to thank 
Ms. Rich and Mr. Weinstein. Mr. Weinstein, good luck and 
congratulations with your new baby.
    We will now proceed to the second panel of this hearing. I 
think I will introduce our panel as they are making their 
transition to the table, just to move things along. Well, there 
seems to be a little chaos here. We will take a little moment 
of pause to think about the first panel and all the issues that 
were raised and thoughts that were expressed.
    [Pause.]
    Senator Franken. I would like to introduce our second panel 
of witnesses, and I want to thank you all for being here.
    Ashkan Soltani is a technology researcher and consultant 
specializing in consumer privacy and security on the Internet. 
He has more than 15 years of experience as a technical 
consultant to Internet companies and Federal Government 
agencies. Most recently, he worked as the technical consultant 
on the Wall Street Journal's ``What They Know'' series, 
investigating digital privacy issues. He has a master's degree 
in information science from the University of California at 
Berkeley and a B.A. in cognitive and computer science from the 
University of California at San Diego.
    Justin Brookman is the director of the Project on Consumer 
Privacy at the Center for Democracy and Technology. He was also 
the chief of the Internet Bureau of the New York Attorney 
General's Office. Under his leadership the Internet Bureau was 
one of the most active and aggressive law enforcement groups 
working on Internet issues. He received his J.D. from the New 
York University School of Law in 1998 and his B.A. in 
government and foreign affairs from the University of Virginia 
in 1995.
    Mr. Bud Tribble is the vice president of software 
technology at Apple. Tribble helped design the operating system 
for Mac computers. He was also the chief technology officer for 
the Sun-Netscape Alliance. Tribble earned a B.A. in physics at 
the University of California at San Diego and an M.D. and Ph.D. 
in biophysics and physiology at the University of Washington, 
Seattle.
    Alan Davidson is the director of public policy for the 
Americas at Google. He was previously associate director for 
the Center for Democracy and Technology and a computer 
scientist working at Booz, Allen & Hamilton, where he helped 
design information systems for NASA's Space Station Freedom. He
has an S.B. in mathematics and computer science and an S.M. in 
technology and policy from MIT and a J.D. from Yale Law School.
    Jonathan Zuck is the president of the Association for 
Competitive Technology. ACT represents small- and mid-sized 
information technology companies. Before joining ACT, Zuck 
spent 15 years as a professional software developer and an IT 
executive. He holds a B.S. from Johns Hopkins University and a 
master's in international relations from the Paul H. Nitze
School of Advanced International Studies at the Johns Hopkins 
University.
    I want to thank you all for being here today, and please 
give your opening statements. We will start from my left and 
your right. Mr. Soltani.

STATEMENT OF ASHKAN SOLTANI, INDEPENDENT PRIVACY RESEARCHER AND 
                   CONSULTANT, WASHINGTON, DC

    Mr. Soltani. Chairman Franken, Ranking Member Coburn, and 
distinguished Members of the Subcommittee, thank you for the 
opportunity to testify about mobile privacy and the location 
ecosystem.
    My name is Ashkan Soltani. I am a technology researcher and 
consultant specializing in privacy and security on the 
Internet. I should note the opinions here are my own and do not 
reflect the views of my previous employers.
    Mobile devices today are powerful computing machines. But 
unlike desktop computers, mobile devices introduce unique 
privacy challenges. Consumers carry their phones and tablets 
with them nearly everywhere they go, from their homes to their 
offices, from daycare to the grocery store.
    A device's location can be determined using a number of 
different technologies, including GPS, information about nearby 
cell towers and WiFi access points, and other network-based 
techniques. While their accuracy can vary depending on the 
technology being used, the resulting insights derived from this 
data can be sensitive and personal in nearly all the cases.
    If you imagine a historical trail of your whereabouts over 
the course of many days, it would be reasonably easy to deduce 
where you work, where you live, and where you play. This
information can reveal much about who you are as a person and 
how you spend your time. I believe this is why many consumers 
have been surprised by the recent stories of how their mobile 
devices have been collecting their location information and 
other sensitive data.
    With the exception of GPS, the process by which a device's 
location is determined can actually expose the location of that 
device to multiple parties. These parties include the wireless 
carrier, for example, AT&T and Verizon; the location service 
provider, such as Apple, Google, or Skyhook; and even the 
content provider used to deliver the information about that 
location, such as a mapping website or service.
    Researchers, including myself, recently confirmed that 
smartphones, such as the Apple iPhones and Google Android 
devices, send location information quietly in the background to 
Apple's and Google's servers, respectively, even when the 
device is not actively being used. That is, the background 
collection happens automatically unless the user is made aware 
of the practice and elects to turn it off. This is the default 
behavior when you purchase these devices.
    Furthermore, most smartphones keep a copy of historical 
location information directly on the device. Until recently, 
Apple's iPhone would retain an approximate log of your location 
history for about a year, stored insecurely on the phone and on 
any computer the device was backed up to. Anyone with access to
this file would be able to obtain a historical record of your 
approximate location, and there was no way to disable it.
    Many mobile smartphone platforms like Apple's iOS and 
Google Android also allow third parties to develop applications 
for the device: productivity software like e-mail, social 
networking tools like Facebook, and, of course, games. As 
reported in the Wall Street Journal last year, many popular 
apps transmit location information or the device's unique identifiers to
outside parties. For instance, if a user opens Yelp, a popular 
restaurant discovery app, not only does Yelp learn information 
about the user but so could Yelp's downstream advertising and 
analytics partners.
    This may be surprising to most customers since they may not 
have an explicit relationship with these downstream partners. 
This information is not limited to just location. Upon 
installation, many of these apps would have access to a user's 
phone number, address book, and even text messages.
Disclosures about the collection and use of consumer 
information are often ineffective or at times completely 
absent. Many disclosures are vague or too confusing for
the average consumer to understand, and they rarely mention 
specifics about data retention and information-sharing 
practices--things that a privacy conscious consumer would care 
about. Notably, nearly half of the popular apps analyzed by the 
Wall Street Journal lacked discernible privacy policies.
    To conclude, in order to make meaningful choices about 
their privacy, consumers need increased transparency into who
is collecting information about them and why. Clear definitions 
should be required for sensitive categories of information, 
such as location and other identifiable information. Software 
developers need to provide consumers with meaningful choice and 
effective opt-outs that allow consumers to control who they 
share information with and for what purpose. Only in an 
environment that fosters transparency and control will 
consumers be able to
take full advantage of all the benefits that mobile 
technologies have to offer.
    I thank the Committee for inviting me here today to 
testify, and I look forward to answering your questions.
    [The prepared statement of Mr. Soltani appears as a 
submission for the record.]
    Senator Franken. Thank you.
    Mr. Brookman.

  STATEMENT OF JUSTIN BROOKMAN, DIRECTOR, PROJECT ON CONSUMER 
  PRIVACY, CENTER FOR DEMOCRACY AND TECHNOLOGY, WASHINGTON, DC

    Mr. Brookman. Thank you very much, Chairman Franken, 
Ranking Member Coburn, Members of the Subcommittee. Thank you 
very much for the opportunity to testify here today. There
really could not be a more timely topic for the first hearing 
of this Subcommittee than the issue of mobile privacy. 
Consumers are enthusiastically embracing mobile devices, and 
they offer an amazing array of functionality that truly makes 
our lives better.
    However, many of the same privacy issues that have 
frustrated consumers in the online space are actually 
significantly heightened in the mobile environment. As opposed 
to websites, apps can access a far broader range of personal 
information such as contact information, access to a 
smartphone's camera or microphone, and precise geolocation 
information. At the same time, the tools that consumers have to 
see and control how apps share their personal information are 
actually weaker than they are on the Web.
    I have been invited here today to discuss the existing laws 
that govern mobile data flows and whether that framework has 
proven adequate to safeguard consumer information. The short 
answer is no. There is no comprehensive privacy law in the 
United States. There are a few sector-specific laws that govern 
relatively small sets of consumers' information. In the mobile 
space, I think it is fair to say that there is a patchwork of 
outdated and inapt laws that may apply at the margins, but do 
not offer consumers meaningful and consistent protections.
    Now, traditionally mobile devices were one area where there 
actually were strong protections over consumer data. The 
Communications Act and the associated CPNI rules historically 
required carriers to get a customer's affirmative permission to 
share or sell the relatively limited information around the 
traditional dumb phones, which is who you can call and whatnot. 
However, as cell carriers branched out into offering data plans 
for smartphones, the FCC opted not to extend CPNI rules to 
those information services, leaving the treatment of customer 
information about this new usage of mobile services 
unregulated.
    Furthermore, CPNI rules never applied to most of the 
players in the modern apps space, such as operating system and 
location providers like Apple and Google, apps makers, mobile 
advertising networks, and data brokers. So as the mobile data 
ecosystem has dramatically expanded, the relatively narrow CPNI 
rules, which at one point effectively covered everything, no 
longer offer sufficient protections for consumers in the mobile 
space.
    There are a couple other statutes that arguably apply at 
the margins, but they do not consistently protect consumers 
here. So one would be the Electronic Communications Privacy 
Act, which we discussed, which generally covers Government 
access to information, but does have some protections around 
certain companies in disclosing the contents of customer 
communications. Unfortunately, the definitions of this law were 
written in 1986, well before the modern apps ecosystem 
developed. The law could arguably be interpreted to cover some 
apps, but certainly not all, and probably it does not extend to 
the operating systems like Apple and Google. In short, the law 
does not really map well to mobile privacy issues, and 
certainly not consistently. Even if it did apply to all the 
players, without additional rules to require meaningful 
transparency and telling consumers what you are doing with 
their data, companies could just bury permissions to share data 
in terms of service agreements that consumers would be unlikely 
to read.
    Finally, some have tried to apply criminal statutes, like 
the Computer Fraud and Abuse Act, to mobile privacy issues. 
Last month, for example, it was reported that the U.S. Attorney 
from New Jersey was investigating certain apps for transmitting 
customer information without adequate disclosure. While I am 
sympathetic to the policy goals of requiring better disclosure 
from apps, I think it is probably not the ideal approach to use 
a very broad criminal statute designed to combat hacking and 
protect financial information to protect privacy. I may not 
like it when companies share my information, and I think that 
should be protected by the law, but I do not think people 
should necessarily go to jail for it.
    So assuming that none of these diverse laws actually 
applied, the baseline in this country is the FTC's prohibition 
on unfair or deceptive practices. The FTC has brought some 
incredibly important cases in this area, but the bar is still 
very low. The baseline rule for most consumer data is merely 
that companies cannot affirmatively lie about how they are 
treating your data, so many companies' response might just be 
not to make any representations at all. This is why privacy 
policies tend to be legalistic and vague. The easiest way for a 
company to get in trouble is to actually make a concrete 
statement about what they are doing.
    Indeed, in the mobile space, as Mr. Soltani testified, many 
apps makers do not make representations at all. Only a small 
percentage actually offer any privacy policies whatsoever. And 
so it is just not possible in the modern environment for people 
to figure out how their data is being stored by apps and 
shared. So we have long petitioned for a baseline comprehensive 
privacy law that requires companies to say what they are doing 
with data, to give some choice around secondary transfer of 
that data, secondary uses, and to tell companies to get rid of 
it when they are done.
    Furthermore, for sensitive information such as relating to 
religion or sexuality, health, financial, and most relevant to 
this hearing, precise geolocation information, we believe that 
an enhanced application of the fair information practice 
principles, including affirmative opt-in consent, should 
govern. For this type of information, we should err on the side 
of user privacy and against presuming assent to disclosure.
    Thank you very much for the opportunity to testify, and I 
look forward to your questions.
    [The prepared statement of Mr. Brookman appears as a 
submission for the record.]
    Senator Franken. Thank you, Mr. Brookman.
    And, by the way, for all of you, your complete written 
testimonies will be made part of the record.
    Mr. Tribble.

 STATEMENT OF GUY ``BUD'' TRIBBLE, M.D., PH.D., VICE PRESIDENT 
    OF SOFTWARE TECHNOLOGY, APPLE INC., CUPERTINO, CALIFORNIA

    Mr. Tribble. Good morning, Chairman Franken, Ranking Member 
Coburn, and Members of the Subcommittee. My name is Bud 
Tribble. I am the vice president for software technology for 
Apple. Thank you for the opportunity to further explain Apple's 
approach to mobile privacy, especially location privacy. I 
would like to use my limited time to emphasize a few key 
points.
    First, Apple is deeply committed to protecting the privacy 
of all of our customers. We have adopted a single comprehensive 
customer privacy policy for all of our products. This policy is 
available from a link on every page of Apple's website. We do 
not share personally identifiable information with third 
parties for their marketing purposes without our customers' 
explicit consent, and we require third-party application 
developers to agree to specific restrictions protecting our 
customers' privacy.
    Second, Apple does not track users' locations. Apple has 
never done so and has no plans to ever do so. Our customers 
want and expect their mobile devices to be able to quickly and 
reliably determine their current locations for specific 
activities such as shopping, traveling or finding the nearest 
restaurant. Calculating a phone's location using just GPS 
satellites can take up to several minutes. iPhone can reduce
this time to just a few seconds by using pre-stored WiFi 
hotspot and cell tower location data on the phone in 
combination with information about which hotspots and cell 
towers are currently receivable by the iPhone.
    In order to accomplish this goal, Apple maintains a secure 
crowdsourced data base containing information with known 
locations of cell towers and WiFi hotspots that Apple collects 
from millions of devices. It is important to point out that 
during this collection process, an Apple device does not 
transmit to Apple any data that is uniquely associated with the 
device or with that customer. This information is used to 
determine the locations of cell towers and WiFi hotspots for 
our crowdsourced data base.
    Third, by design, Apple gives customers control over 
collection and use of location data on all our devices. Apple 
has built a master location services switch into our iOS mobile 
operating system that makes it extremely easy to opt out 
entirely of location-based services. The user simply switches 
the location services off in the setting screen. When the 
switch is turned off, the device will not collect or transmit 
location information. Equally important, Apple does not allow 
any application to receive device location information without 
first receiving the user's explicit consent through a simple 
pop-up dialog box. The dialog box is mandatory and cannot be 
overridden. Customers may change their mind and opt out of 
location services for individual applications at any time by 
simple on-off switches. Parents can also use controls to 
password-protect and prevent access by their children to 
location services.
    Fourth, Apple remains committed to responding promptly and 
deliberately to all privacy and technology concerns that may 
arise. In recent weeks, there has been considerable attention 
given to the manner in which our devices store and use a cached 
subset of Apple's anonymized crowdsourced data base. The purpose
of this cache is to allow the device to more quickly and 
reliably determine a user's location. These concerns are 
addressed in detail in my written testimony. I want to reassure 
you that Apple was never tracking an individual's actual 
location from the information residing in that cache.
    Furthermore, the location data that was seen on the iPhone 
was not the past or present location of the iPhone but, rather, 
the location of WiFi hotspots and cell towers surrounding the 
iPhone's location. Apple did not have access to the cache on 
any individual user's phone at any time. Although the cache was 
not encrypted, it was protected from access by other apps on 
the phone. Moreover, cached location information was backed up 
on a customer's computer. It may or may not have been encrypted,
depending on what the user settings were.
    While we were investigating the cache, we found a bug that 
caused this cache to be updated from Apple's crowdsourced data 
base even when the location services switch had been turned 
off. This bug was fixed and other issues, including the size 
and the back-up of the cache, have been addressed in our latest 
free iOS software update released last week. In addition, in 
our next major iOS software release, the location information 
stored in the device's local cache will be encrypted.
    In closing, let me state again that Apple is strongly 
committed to giving our customers clear and transparent notice, 
choice, and control over their information, and we believe our 
products do so in a simple and elegant way. We share the 
Subcommittee's concern about the collection and misuse of any 
customer data, particularly location data, and appreciate this 
opportunity to explain our approach.
    I would be happy to answer any questions you may have.
    [The prepared statement of Mr. Tribble appears as a 
submission for the record.]
    Senator Franken. Thank you, Mr. Tribble.
    Mr. Davidson.

 STATEMENT OF ALAN DAVIDSON, DIRECTOR OF PUBLIC POLICY, GOOGLE 
                      INC., WASHINGTON, DC

    Mr. Davidson. Thank you, Chairman Franken, Ranking Member 
Coburn, and Members of the Subcommittee. My name is Alan 
Davidson, and I am the director of public policy for Google in 
North and South America. Thank you for this opportunity to 
testify at this important hearing before this new Subcommittee.
    Mobile devices and location services are now used routinely 
by tens of millions of Americans and create enormous benefits 
for our society. Those services will not be used, and they 
cannot succeed, without consumer trust. That trust must be 
built on a sustained effort by our industry to protect user 
privacy and security. With this in mind, at Google we have made 
our mobile location services opt-in only, treating this 
information with the highest degree of care.
    Google focuses on privacy protection throughout the life 
cycle of a product, starting with the initial design. This is 
the Privacy by Design concept that was discussed in the last 
panel.
    We subscribe to the view that, by focusing on the user, all 
else will follow. We use information where we can provide value 
to our users, and we apply the principles of transparency, 
control, and security. We are particularly sensitive when it 
comes to location information.
    As a start, on our Android mobile platform, all location 
sharing for Google services is opt-in. Here is how it works.
    When I first took my Android phone out of its box, one of 
the initial screens I saw asked me, in plain language, to 
affirmatively choose whether or not to share location 
information with Google. A screen shot of this process is 
included in our testimony and on the board over here. If the 
user does not choose to turn it on at set-up or does not go 
into their settings later to turn it on, the phone will not 
send any information back to Google's location servers. If they 
opt in, if the user opts in, all location data that is sent 
back to Google's location servers is anonymized and is not 
traceable to a specific user or device, and users can later 
change their mind and turn it off.
    Beyond this, we require every third-party application to 
notify users that it will be accessing location information 
before the user installs the app. The user has the opportunity 
to cancel the installation if they do not want information 
collected.
    We believe that this approach is essential for location 
services: highly transparent information for users about what 
is being collected, opt-in choice before the location 
information is collected, and high security standards to 
anonymize and protect information. Our hope is that this 
becomes a standard for the broader industry.
    We are doing all this because of our belief in the 
importance of location-based services. Many of you are already 
experiencing the benefits of these services, things as simple 
as seeing real-time traffic, using transit maps to aid your 
commute, or finding the closest gas station on your car's GPS. 
And it is 
not just about convenience. These services can be life savers. 
Mobile location services can help you find the nearest hospital 
or police station. They can let you know where to fill a 
prescription at one in the morning for a sick child. And we 
have only scratched the surface of what is possible.
    For example, Google is working with the National Center for 
Missing and Exploited Children to explore how to deliver AMBER 
alerts about missing children to those in the vicinity of the 
alert. And mobile services may soon be able to tell people in 
the path of a tornado or tsunami or guide them to an 
evacuation route in the event of a hurricane.
    These promising new services will not develop without 
consumer trust. The strong privacy and security practices that 
I have described are a start, but there are several privacy 
issues that require the attention of government, problems 
industry cannot solve on its own.
    As a start, we support the idea of comprehensive privacy 
legislation that could provide a basic framework to protect 
consumers online and offline. And we support action to improve 
data breach notification to replace the current confusing 
patchwork of State laws.
    And a critical area for Congress, and particularly for this 
Committee, is the issue of access, Government access, to a 
user's sensitive information. We live now under a 25-year-old 
surveillance law, ECPA, first written before web mail or text 
messaging was even invented. Most Americans do not understand 
that data stored online does not receive the Fourth Amendment 
protections given to that same information on a desktop. Nor do 
users know that the detailed location information collected by 
their wireless carrier can be obtained without a warrant.
    Google is a founding member of the Digital Due Process 
Coalition, a group of companies and public interest groups 
seeking to update these laws to meet the needs and expectations 
of 21st century consumers. We hope you will review its work, 
and in summary, I will just say we strongly support your 
involvement in this issue. We appreciate the chance to be here. 
We look forward to working with you to build consumer trust in 
these innovative new services.
    Thank you.
    [The prepared statement of Mr. Davidson appears as a 
submission for the record.]
    Senator Franken. Thank you very much, Mr. Davidson.
    Mr. Zuck.

  STATEMENT OF JONATHAN ZUCK, PRESIDENT, THE ASSOCIATION FOR 
             COMPETITIVE TECHNOLOGY, WASHINGTON, DC

    Mr. Zuck. Chairman Franken, Ranking Member Coburn, and 
distinguished Members of the Subcommittee, my name is Jonathan 
Zuck, and I am the president of the Association for Competitive 
Technology, and I want to thank you for holding this important 
hearing on privacy in the emerging mobile marketplace.
    As a representative of more than 3,000 small and medium-
size IT companies, a former software developer myself, and as 
spokesman for the people that write the applications for these 
mobile devices, I want to encourage you to treat the issue of 
privacy generally and of the mobile marketplace specifically in 
a holistic manner.
    The science of holistic processing is really known best for 
faces where we are able to recognize an entire face and not 
just see it as a nose, two eyes, and a mouth. You only need to 
watch a television commercial for a mobile device, such as an 
iPod or a Xoom or a Droid phone, to understand that the face of 
mobile computing is the applications. These ads showcase the 
hundreds of thousands of applications that are 
available for these devices, some of which we have already 
heard about in previous testimony today, that allow you to find 
out where you are, to find services and products that are close 
to you, et cetera. And these are exciting and dynamic 
applications that have been made available to users and that 
many users are using today.
    Location-based services and advertising offer a unique 
opportunity for Main Street businesses as well. A user 
searching for a particular product or service on their 
smartphone can receive an ad from a local small business based 
on their current location data. These ads have the benefit of 
reaching potential customers at the exact time a purchasing 
decision is being made for a much smaller cost than the 
newspaper circulars or TV ad that big-box stores are able to 
afford.
    This dynamic market, valued today at about $4 billion, is 
projected to grow to $38 billion by 2015. Application 
developers are enjoying a kind of renaissance brought about by 
the lower cost of entry in the development and delivery of 
consumer-facing applications. These applications we have all come to 
enjoy are made predominantly by small businesses--over 85 
percent of them are made by small businesses--and not just in 
Silicon Valley.
    The next time, Chairman Franken, you are drawing one of 
your famous maps, you will be able to reflect that over 70 
percent of these applications come from outside of California, 
including in places such as Moorhead, Minnesota, and Tulsa, 
Oklahoma.
    This is a national phenomenon with international 
implications for economic growth and recovery. We have an 
opportunity to meet the President's goals to double exports. We 
are in a period of rapid experimentation and delivery of new 
services with a complete focus on the customer. One benefit of 
small businesses taking the lead here is that they cannot 
afford to ignore the demands of their customers.
    Second, when approaching the issue of data privacy in a 
holistic manner, I think it is imperative, as we heard from the 
earlier panel, to remember that there is a whole lot of data. 
To focus on a particular new type of data collection is to 
truly cut off our nose to spite our face. There is more data, 
including location data, in large company data bases than the 
top thousand mobile applications could hope to collect in a 
lifetime. In fact, to focus on a particular type of data 
collection in a particularly new market would necessarily 
discriminate against the small businesses that are responsible 
for so much economic growth in the mobile sector while leaving 
larger players largely untouched.
    Finally, there are myriad laws in place to address 
legitimate privacy and consumer protection concerns, as was 
raised earlier. Whether it is unfair or deceptive trade 
practices at the State or Federal level, there are vehicles in 
place to address transgressions. Even the use of antitrust has 
been used in the past to deal with privacy issues.
    While I do not agree with all of the recommendations made 
by the Center for Democracy and Technology, I would agree that 
any approach to privacy legislation needs to be comprehensive 
and should focus on the data itself and how it is used and 
answer these general questions and not focus on a particular 
means of collection or a particular technology platform.
    There is legitimate concern among American consumers about 
their privacy. As we heard from Chairman Leahy, a number of 
Americans are concerned about their privacy. I think one of the 
ongoing frustrations of my constituents, and of small 
businesses in general, is that they find themselves time and 
time again doing the time without really having done the crime. 
It is as though once a week there is some kind of a big company 
news, like the Sony PlayStation debacle, Epsilon's data loss, 
and Google with Spy-Fi, collecting children's Social Security 
numbers, and Buzz. These are the issues that are really causing 
the concern and fear among customers, not the prospect of 
getting one more customized ad to their phone.
    Despite that fact, the rules that get created inevitably 
impact small businesses more than our larger brethren. The 
Google Buzz settlement is a good example of this phenomenon. 
The FTC has stated it would like to use the Google Buzz 
settlement as a model for regulation going forward for the 
entire industry. The true irony is that not only has Google 
brought this regulation to our doorstep, but the level of vertical 
integration they enjoy makes them immune to most of the 
consequences. Who is most likely to be affected by a law that 
affects the transfer of information to third parties? A small 
business that has to form partnerships in order to provide 
these services in an ever-changing marketplace or a huge 
company that can simply buy the third party, thereby 
circumventing the rule?
    The idea of holism dates back to Aristotle, who was the 
first to say the whole is more than the sum of its parts, and 
nowhere is that more true than in the mobile computing 
marketplace. Accordingly, I would like to encourage members of 
this Committee to take a step back from the headlines of today 
and look at the issue of privacy in a holistic manner.
    Thank you, and I look forward to your questions.
    [The prepared statement of Mr. Zuck appears as a submission 
for the record.]
    Senator Franken. Thank you, Mr. Zuck, and thank you all for 
being here today and for your thoughtful testimony.
    Mr. Tribble, last month I asked Apple in a letter why it 
was building a comprehensive location data base on iPhones and 
iPads and storing it on people's computers--when they synched 
up, of course. Apple's reply to my letter will be added to the 
record.
    [The information referred to appears as a submission for 
the record.]
    Senator Franken. But this is what Apple's CEO Steve Jobs 
said to the press: ``We build a crowdsourced data base of WiFi 
and cell tower hotspots, but those can be over 100 miles away 
from where you are. Those are not telling you anything about 
your location.''
    Yet in a written statement issued that same week, Apple 
explained that this very same data will ``help your iPhone 
rapidly and accurately calculate its location.'' Or as the 
Associated Press summarized it, `` `The data help the phone 
figure out its location,' Apple said.'' But Steve Jobs the same 
week said, ``Those are not telling you anything about your 
location.''
    Mr. Tribble, it does not appear to me that both these 
statements could be true at the same time. Does this data----
    Mr. Tribble. Senator--sorry.
    Senator Franken. I understand you are anticipating my 
question, so I will just ask and then you will answer it. Does 
this data indicate anything about your location, or doesn't it?
    Mr. Tribble. Senator, the data that is stored in the data 
base is the location of as many WiFi hotspots and cell phone 
towers as we can have. That data does not actually contain in 
our data bases any customer information at all. It is 
completely anonymous. It is only about the cell phone towers 
and the WiFi hotspots.
    However, when a portion of that data base is downloaded 
onto your phone, your phone also knows which hotspots and cell 
phone towers it can receive right now. So the combination of 
the data base of where those towers and hotspots are plus your 
phone knowing which ones it can receive right now is how the 
phone figures out where it is without the GPS.
    Senator Franken. OK. Mr. Soltani, consumers are hearing 
this a lot from both Apple and Google, and I think it is 
confusing because Apple basically said, yes, that file has 
location, but it is not your location. And when it separately 
came out that both iPhones and Android phones were also 
automatically sending certain location data to Apple and 
Google, they both said, yes, we are getting location but it is 
not your location.
    Mr. Soltani, tell me, whose location is it? Is it accurate? 
Is it anonymous? Can it be tied back to individual users?
    Mr. Soltani. Thank you, Senator. I think that is a great 
question. So, yes, in many cases, the location that this data 
refers to is actually the location of your device or somewhere 
near it. While it is true that in some rural areas this can be 
up to 100 miles away, in practice, for the average customer or 
the average consumer, it is actually much closer, on the order 
of about 100 feet, according to a developer of this technology, 
Skyhook.
    If you refer to Figure 3 of my testimony, you can see an 
example of this location as identified by one of these WiFi 
geolocation data bases. I took my location based on GPS and my 
location based on the strongest nearby WiFi signal in the 
Senate lobby just out here, and the dot on the left refers to 
my location as determined by GPS, and then the dot on the right 
shows my location as determined by this WiFi geolocation 
technology, and it was about 20 feet from where I was sitting 
on the bench. So, you know, depending on how you want to slice 
it, I would consider that my location.
    The files in these data bases contain time stamps that 
describe at what point I encountered some of these WiFi access 
points, so they could be used to trace a kind of trail about 
you.
    And then, finally, this data is sent back with 
identifiers such as IP addresses. We heard earlier from the 
gentleman from the DOJ, who was claiming that IP addresses are 
necessary to identify consumers--or criminals. To the degree 
that those IP addresses can be used to identify criminals, they 
become identifiable, and it is really difficult to call this 
stuff anonymous. Making those claims I think is not really 
sincere.
    Senator Franken. Because basically if you have--I mean, 
this location like in your illustration, you see that you are 
in the Hart Building.
    Mr. Soltani. Or near the entrance of the Hart Building.
    Senator Franken. Yes, yes. And so--well, let me ask Mr. 
Brookman the same question I asked Mr. Weinstein. My wireless 
company, companies like Apple and Google, and the mobile apps I 
have on my phone all can and do get my location or something 
very close to it. And my understanding, Mr. Brookman, is that 
in a variety of cases, under current law each of those entities 
may be free to disclose my location to almost anyone they want 
to without my knowing it and without my consent. Is that right? 
And if so, how exactly can they do this?
    Mr. Brookman. I think that's correct. As I mentioned 
before, the default law in this country for sharing of data is 
you can do whatever you want. The only thing you cannot do is 
what you have previously promised not to do with that data. So 
if someone like Apple or Google said, hey, if you give this 
location data to Google Maps, we promise not to share it with 
an advertising partner, under that scenario they would be 
prohibited under the FTC Act from sharing it.
    Otherwise, for most players in this space, I think it 
would be very hard to make a legal argument that they were 
under an affirmative requirement not to share data.
    Senator Franken. Thank you.
    Mr. Davidson and Mr. Tribble, let me ask you one last 
question because my time is running out. Your two companies run 
the biggest app markets in the world, and both of your 
companies say you care deeply about privacy. And yet neither of 
your stores requires that apps have a privacy policy. Would 
your companies be willing to commit to requiring apps in your 
stores to have a clear, understandable privacy policy? This 
would by no means fix everything, but it would be a simple 
first step and would show your commitment on this issue. Mr. 
Davidson.
    Mr. Davidson. Thanks. It is a great question. I would be 
happy to take it back. I think it is an extremely important 
issue that you raise about application privacy. At Google, we 
have tried to maximize the openness of our platform to allow 
lots of different small businesses to develop applications. We 
have relied on a permission-based model at Google so that 
before an application could get access to information, they 
have to ask permission from the user.
    You are asking about the next step, which is whether we put 
affirmative requirements on applications, and I would just say 
I will take that issue back to our leadership. I think it is a 
very good suggestion for us to think about.
    Senator Franken. Mr. Tribble.
    Mr. Tribble. Yes, I think that is a great question. What we 
do currently is we contractually require third-party app 
developers to provide clear and complete notice if they are 
going to do anything with the user's information or device 
information. So if you want to become an Apple developer and 
put an app in the app store, you sign an agreement with Apple 
that says you are going to do that.
    Now, it does not specifically require a privacy policy, but 
what I will say is that a privacy policy in this general area 
is probably not enough. I agree with the earlier panel that 
what we need to do, because people may not read a privacy 
policy, is put things in the user interface that make it clear 
to people what is happening with their information, and Apple 
thinks this way. For example, when an app is using your 
location data, we put a little purple icon right up next to the 
battery to let the user know that. Now, we say that in the 
privacy policy too, and the app should say that too. But we 
also could put something in the user interface to make it even 
more clear to the user.
    We also have an arrow that shows if an app has used your 
location in the last 24 hours, so transparency here goes beyond 
just what is in the privacy policy. It is designed into the app 
and the system, which itself provides feedback to the user 
about what is happening with their information.
    Senator Franken. Thank you. Just a yes or no, Mr. Soltani. 
Isn't it true that there is no mechanism for iPhones to notify 
users that their apps can disclose their information to 
whomever they want?
    Mr. Soltani. Yes.
    Senator Franken. OK. Thank you.
    Mr. Soltani. It is true.
    Senator Franken. Thank you.
    Senator Coburn.
    Senator Coburn. Let me defer to Senator Blumenthal. I have 
a meeting that I have to take for about five minutes, and then 
I will be back in.
    Senator Franken. Senator Blumenthal.
    Senator Blumenthal. Thank you, Mr. Chairman. Thank you, 
Senator Coburn.
    I want to focus on really the very broad area or issue of 
trust that Mr. Davidson raised, which I think goes to the core 
of much of what you do with the consent and acquiescence of 
consumers and, most particularly, the practice and goal of 
building wireless network maps. Both Apple and Google are 
engaged in that business activity, are you not?
    Mr. Tribble. Yes.
    Mr. Davidson. Yes.
    Senator Blumenthal. And, in particular, Mr. Davidson, I 
want to ask some questions about the Google Wi-Spy experience, 
scandal, debacle. All three terms have been used to refer to 
it. In particular, as you well know--and now we all know--for 
three years Google intercepted and collected bits of user 
information payload data--e-mails, passwords, browsing history, 
and other personal information--while driving around taking 
pictures of people's homes on the streets in the Street View 
program. The company first denied that it was collecting this 
information, did it not?
    Mr. Davidson. It did. We did not believe that we were--we 
did not know that we were.
    Senator Blumenthal. And then it denied that it was 
collecting it intentionally. Is that true?
    Mr. Davidson. I think we still believe we were not 
collecting it intentionally.
    Senator Blumenthal. And, in fact, this personal data and 
the interception and downloading of this personal data is 
contemplated, in fact, by a patent application that has been 
submitted by Google to both the U.S. Patent Office and 
internationally, does it not?
    Mr. Davidson. I am not specifically familiar with the 
details of the patent application.
    Senator Blumenthal. I think you have been provided with a 
copy----
    Mr. Davidson. Is that what this is here?
    Senator Blumenthal. Maybe you could have a look at it. Do 
you recognize the document? Have you seen it before?
    Mr. Davidson. I have not seen this document before, but I 
am probably roughly--I have not seen this document before.
    Senator Blumenthal. Are you familiar with the goal that it 
describes of, in fact, pinpointing the location of wireless 
routers to construct a wireless network map by intercepting and 
downloading the payload data in precisely the way that Google 
denies having done?
    Mr. Davidson. No, I am not--I apologize. I am not familiar 
with that aspect of this or really anything relating that to 
this patent's content, to the content of----
    Senator Blumenthal. Are you aware that this process may 
have been used in the Street View program to collect private 
confidential information and use it to construct the wireless 
network route?
    Mr. Davidson. I would be very surprised. I think it--we 
have tried to be very clear about the fact that it was not our 
policy to collect this information; it was not the company's 
intent to collect the content or payload information. I think 
we have been very specific about the fact that we never used 
that information.
    As you indicated, people at the company were quite 
surprised and, honestly, embarrassed to find out that we had 
been collecting it. So we have said before, this was a mistake, 
that we did not intend to collect this information, and we have 
tried very hard to work with regulators to make sure we are now 
doing the responsible thing. We have not used it, and we are 
working with the regulators around the world to figure out what 
to do with it, and in many cases we have destroyed it.
    Senator Blumenthal. Why would the company then submit a 
patent application for the process--that very process that it 
denies having used?
    Mr. Davidson. I am sorry I cannot speak to the specifics of 
this patent. We were not aware that this was a topic for 
today's hearing. But I will say generally we submit patent 
applications for many, many different things. Often they are 
fairly speculative. We probably do, I do not know, hundreds of 
patent applications a year, certainly scores. And it would not 
be surprising at all that in this area that is so important we 
would be looking for innovative ways to provide location-based 
services. But it was certainly--as we have said publicly, it 
was a mistake, and we certainly never intended to collect 
payload information.
    Senator Blumenthal. Well, in fact, the payload information 
would be extremely valuable in constructing this wireless 
network map, would it not?
    Mr. Davidson. I am not sure that we would say that. I think 
that what is most important is basically having the 
identification of a hotspot and a location, which is what we 
were collecting, and that is what we have used to create this 
kind of data base, as others have. And it is not obvious that 
small snippets of a few seconds of whatever happens to be 
broadcast in the clear from somebody's home at any given 
precise second when you are passing by with a car would 
necessarily be that valuable. And I think we certainly never 
intended to collect it.
    Senator Blumenthal. Would it be valuable, in your opinion, 
Mr. Tribble, to have that kind of payload data in constructing 
a wireless network map?
    Mr. Tribble. I am actually not sure how valuable----
    Senator Franken. Turn your microphone on.
    Mr. Tribble. Yes, Senator, I am actually not sure how 
valuable it would be. We do not collect that or use that in our 
mechanisms for geolocating, and, in fact, I checked with the 
engineering group, and they said it would be--they are not sure 
how you would do that. But they probably have not seen the 
patent, so I cannot really, I guess, specifically answer your 
question.
    Senator Blumenthal. Let me ask Mr. Brookman and Mr. Soltani 
whether you have an opinion as to whether payload data would be 
useful in strengthening the location network or map.
    Mr. Brookman. I am not a technologist, so I will mostly 
defer to Mr. Soltani. My instinct is that I do not think that 
it would be. The fact of primary interest is that here is a 
wireless access point. They may need to sense that it is 
sending information out technologically, but I do not believe 
that the content of that communication would be valuable at 
all.
    Mr. Soltani. I would concur with Justin. I think the small 
differentiation is--what you are referring to is whether the 
header information, which is not necessarily--there is a 
question of whether that is payload data. So Google collects 
the information about the hot spot, which includes the header 
information about the MAC address or the identifier for that 
hotspot, and I think that is the question, whether that is 
payload data.
    I would feel like it is also not payload data, but that 
remains to be determined by others.
    Senator Blumenthal. Let me turn back then to Mr. Davidson. 
What are the plans that Google has to use or dispose of the 
information that has been downloaded and collected?
    Mr. Davidson. We are in active conversation with many 
regulators, including your former office in the State of 
Connecticut, but regulators around the world. Some of them have 
asked us to destroy the data, and we have done so. Some of them 
are continuing their investigations.
    Our intent is to answer all the questions of any regulator 
who has got an interest in this fully. We do not intend to ever 
use this data. We intend to dispose of it in whatever form 
regulators tell us we should.
    Senator Blumenthal. And would you agree that collection of 
this data violates privacy rights and that it may, in fact, be 
illegal?
    Mr. Davidson. I think our position was that it was not 
illegal, but it was not our intent, either, and it was not how 
we expect to operate our services.
    Senator Blumenthal. If it was not illegal, do you not agree 
it should be?
    Mr. Davidson. I think this raises a really complicated 
question about what happens to things that get broadcast in the 
clear and what the obligations are about people hearing them. 
And I think it is a complicated question. It is an important 
question. But I think we have to be careful about it. I think 
the law appropriately says--regulates--I believe it regulates 
the use of that information. And as I have said before, we have 
no intention to use it.
    Senator Blumenthal. I will have additional questions, Mr. 
Chairman. My time has expired and I appreciate your indulgence. 
In the meantime, I would like these patents to be made a part 
of the record.
    Senator Franken. Absolutely.
    [The patents appear as a submission for the record.]
    Senator Franken. The Ranking Member.
    Senator Coburn. Thank you, Mr. Chairman.
    This is for both Apple and Google. You both have 
requirements for the people that supply apps for your systems. 
How do you enforce the requirements that you place on them? 
Specifically, how do you know that they are keeping their word? 
How do you know they are not using data different than what 
they have agreed to? How do you know they are not tracking?
    Mr. Tribble. Yes, Senator, so Apple curates the apps that 
are in our store. The way people get apps on their phone is 
through the Apple apps store.
    As I mentioned, we have requirements for the app 
developers. What we do is we examine apps, look at them. We do 
not look at their source code, but we run them, we try them 
out, we examine them before we even put them into the app 
store. If they do not meet our requirements, that----
    Senator Coburn. I understand that. But once they are in 
your app store----
    Mr. Tribble. Once they are in the app store, we do random 
audits on applications. Now, we have 350,000 apps. We do not 
audit every single one, just like the Federal Government does 
not audit every single tax return. But we do random audits and 
do things like examine the network traffic produced by that 
application to see if it is properly respecting the privacy of 
our customers.
    If we find an issue through that means or through public 
information, a blog, or a very active community of app users, 
we will investigate. And if we find a violation of our terms, 
including privacy terms or specific location handling terms, we 
will contact--we will have contacted them during the 
investigation and hopefully gotten them to fix it. But if they 
do not, we will notify them that their app will be removed from 
the store within 24 hours, and we will do that.
    Now, in fact, the overwhelmingly common case is that the 
app developers are highly incentivized to stay in the apps 
store. So during the investigation or if we warn them, 
typically they correct, and often that correction involves 
making sure they pop up a notice panel telling the customers 
what they are doing.
    Senator Coburn. Mr. Davidson.
    Mr. Davidson. So we have taken a slightly different 
approach at Google. We have strived to make sure that our 
platform is as open as possible, and we have chosen not to try 
to be a gatekeeper in terms of what applications people get 
access to. That is striking a balance, but we have tried to 
maximize openness, and we have taken a different approach to 
try to protect consumer privacy, which is to use the power of 
the device itself to make sure that people know what 
information is being shared. And so the device itself will tell 
you, when you want to install an application, what that 
application wants to have access to. And that we believe is a 
very powerful form of policing for users. But we do not then 
generally go back and try to make sure that every application 
does what it says it is going to do because we have, as I say, 
a large number, but we are also really trying to maximize the 
ability of small app developers to get online.
    Senator Coburn. Is that notification when you download that 
app in plain English where it is easily understood? Or is it a 
10-page deal that everybody scrolls down to and says, ``I 
accept''?
    Mr. Davidson. It is a terrific question. We have tried 
really hard to avoid that, so we do not show that ten-page 
thing that the lawyers write that says all the different things 
that may happen. It is plain language. It is rarely more than a 
screen. Sometimes you have to scroll down a little bit. And it 
says very specifically what pieces of information--not just 
location information, but all types of information that might 
be coming from the phone that that application has sought 
access to.
    And I will tell you personally I have seen applications 
that I have rejected, and I think hopefully a lot of people do 
this, when you say, well, why does my solitaire program need my 
contact data base? It does not and I should reject it.
    Senator Coburn. What is the motivation for the app 
producer--and, Mr. Zuck, you can comment on this, too--to have 
that information? Is it so they can re-use it and sell it?
    Mr. Davidson. I am sure that it is going to be a 
combination of things, and I am sure that in many cases they 
will be providing valuable services, so, you know, Foursquare 
or other location services that let you know if your friends 
are nearby. Twitter lets you look at tweets that are near your 
location. There are really valuable services out there that are 
going to be provided. Sometimes people might be using data to 
serve ads better or to build a data base of their own, and that 
is the kind of thing I think consumers need to decide whether 
they want to make that trade.
    Senator Coburn. Mr. Tribble, do you want to comment on 
that?
    Mr. Tribble. I think that there are a variety of reasons 
why third-party apps would want that kind of information and a 
variety of things that they would do with it. Again, what we 
require the apps to do is to tell the users before they do 
that. We let them have a way of choosing not to do it or to 
change their mind later. So it is an area where there is a lot 
of innovation. I am sure Mr. Zuck can tell you about that. And 
it is an important area in terms of privacy and rapidly 
evolving.
    Senator Coburn. Mr. Zuck.
    Mr. Zuck. Thank you, Senator. It is a very good question, 
and it is exciting here at the kids' table to be heard and not 
just seen, I suppose.
    Most of the privacy policies of these small businesses 
reflect the fact that most of these businesses are not 
collecting personal information, and those that are, very often 
their privacy policies extend from their other online presences 
or websites, et cetera.
    As to your question about the use and why they do it, most 
of the time it is some overt process where someone is actively 
checking in or doing something very specific where they know 
they are sharing information in order to get information. But 
the other use of the information is to allow for partnerships 
and revenue streams from ad networks. And so data is not stored 
by these small businesses in most cases, but actually 
transferred back to the likes of Google and Apple that are the 
ones that are actually accumulating the large data bases of 
data about these users.
    The one thing that is worth noting, though, is that this is 
another bite at the apple that these folks have with 
application developers and that there are terms of services for 
those ad networks as well. So that in sharing the information 
back to Apple or Google, there are restrictions on the kind of 
policies we have to have in place in order to share that 
information back with that ad network and to make use of that 
service.
    Senator Coburn. All right. Thank you very much. I will have 
additional questions for the record for Mr. Brookman and Mr. 
Soltani.
    Senator Coburn. Senator Whitehouse.
    Senator Whitehouse. Thank you very much.
    It strikes me that we are in a very new area in trying to 
think about what our take-off point should be. What existing 
models are a good analogy for where we are right now and where 
we should go is an interesting discussion to have, and I 
encourage each of you to take that as a question for the 
record, if you could for me, and get back to me in writing 
because that is a longer discussion than we have time for. But, 
you know, if you want to sell pharmaceuticals in this country, 
you can do so, but you have to disclose their side effects. If 
you want to operate on somebody in this country, you can do so, 
but you have to get their consent and list the things that 
could go wrong in the surgery. If you want to sell a consumer 
product in this country, you have to put appropriate warnings 
on, and if the product is dangerous, you have got to pull it 
back off the market. If you want to sell stock in this country, 
you have got to file a proper SEC filing so people know what 
the financial information behind the stock offering is and they 
can make an intelligent decision.
    In all of those different ways that we regulate conduct, we 
are trying to make, to your statement, Mr. Davidson, as open as 
possible a market, but not at the expense of people who are 
trying to take advantage of people.
    And so it worries me that the principle--we hear it from 
you in terms of ``as open as possible.'' We also hear it from 
the ISPs in terms of, ``Do not blame us for what comes across 
the pipes,'' even if it is crawling with malware and is really 
putting even potentially our National security at risk. ``We 
are just providing a service. We just want anything to go 
through.'' And that is not an argument that we allow to stand 
in pharmaceuticals, in consumer products, in surgery--really 
anywhere. We build an arena in which the market can work, but 
we make sure that the boundaries of the arena are the 
boundaries of safety. And I think we really need to be working 
on those boundaries, and I think that ``as open as possible'' 
is simply not an adequate standard to this task--as open as 
possible, yes, but within what controls. And I think that is 
the question that we have to be focusing on, and it is 
complicated by the fact that some of these things you want and 
you are choosing them; some of it rides along with that. I do 
not know how effective your program that allows you to check in 
and out, tell you what things it has access to, is in terms of 
the real-life consumer. What does a 14-year-old loading an app 
know about all these choices? How informed is that choice? So I 
am not sure that is a boundary that I am perfectly comfortable 
with.
    Mr. Tribble mentioned that you could change your mind later 
in the Apple system if you saw that something was going wrong. 
I am not sure, can you change your mind in yours? Or----
    Mr. Davidson. Absolutely. As I mentioned in my written and 
oral----
    Senator Whitehouse. How do you get prompted to----
    Mr. Davidson [continuing]. You can easily go back and 
change----
    Senator Whitehouse. How do you get prompted to once you 
have loaded the app?
    Mr. Davidson. Well, you can remove the application very, 
very easily. You can also change your settings in terms of, for 
example, the use of the location services that Google provides.
    Senator Whitehouse. But you have to be aware of it.
    Mr. Davidson. Absolutely. There is----
    Senator Whitehouse. So if you are not aware that somebody 
is selling your location information to somebody you are not 
interested in having it, you do not really get a second bite at 
that apple.
    Mr. Davidson. Well, and I think this is a tremendously 
important area, about the need to educate our consumers and 
users better because we believe you are right, that a lot of 
users do not understand all this. We have tried to make it very 
simple, and we have tried to strike the right balance. I do not 
think we--we do not say openness at all costs. What we have 
said is we are trying to maximize--I do not know if 
``maximize'' is the right word, but we are trying to increase 
openness. We tried to create a very open platform, and it is a 
different approach. It is not no holds barred. We take 
certain--we do have a content policy for our market. But I 
think the question is what is the appropriate way--who are the 
appropriate actors to go after? We do not go after trucking 
companies because they happen to carry faulty goods. We go 
after the manufacturers of those goods. And I would just say we 
are trying to strike the right balance, and we also need to 
really educate consumers. That is why a hearing like this is 
honestly so important because it does shed a lot of light, even 
as we try to give people information.
    Senator Whitehouse. You do go after the trucking company if 
the company knew what it was carrying.
    Mr. Davidson. And I think this is----
    Senator Whitehouse. And Google is in a better position to 
know what is being carried as a professional company that 
specializes and has vast resources than a 17-year-old who has 
been told by his friend that this is a cool app to load. So I 
would not be satisfied--I do not think that is a comfortable 
analogy either for you to rely on.
    The other thing, if somebody wants to take control of your 
computer and slave it to their botnet, they will try a lot of 
different ways to do it, and many of the ways in which they try 
this stuff will involve broadcast to thousands of people, and 
most people are careful enough to know better than to open the 
attachment or whatever. They are getting more sophisticated, 
and they are starting to add more personal data, so it is 
getting harder and harder to sort that out. But ordinarily you 
could have a success rate of only 1 in 1,000 and still be a 
pretty successful propagator of a botnet.
    And so it seems to me that there are some things for which 
even a very high failure rate is still not good. So even if 999 
of 1,000 of your customers said, ``Oh, I do not want them to do 
that,'' if somebody is putting these apps up not for the facial 
purpose, for the stated purpose, but because they have loaded a 
bunch of other stuff behind it that they want to use for an 
ulterior motive, what I called earlier a Trojan horse, you take 
it for one reason but that is not really why they are doing 
business with you. That is just their way to get in the door 
and into your computer and being able to take economic 
advantage of your information.
    It seems to me that there is some line that we want to draw 
that is an absolute line that says, even if you are--you know, 
you really should not be in a position where you are agreeing 
to this with as little information as you have, in the same way 
that you try to protect people from having their computers 
slaved to botnets by spam emails.
    So, again, I think we need to consider a little bit more 
sort of what our model is going to be here and then work off of 
that, and all I can say is that I have not yet heard a model 
here today that is convincing to me that it adequately protects 
both the Internet itself and the privacy interests. We have 
talked a lot about privacy, but, frankly, it is not just 
privacy that is at issue here. Once somebody is in your 
computer with an application, there are a lot of other ways 
they can cause mischief, and it could be all the way to 
outright malware rather than just some--it could be something 
that is ultimately illegal, not just something that is 
immediately unwelcome.
    So, anyway, I want to just thank Chairman Franken for 
having this hearing. I think it has been very interesting, very 
significant, and I think it is an issue where we have got a lot 
of work to do ahead of us, and I want to appreciate the 
participation of all of you. We all bring different 
perspectives to this. I do not think anybody's perspective is 
yet ideal. But together and working hard on this, I think that 
we can get something accomplished that will make the Internet 
safer and make people less vulnerable as consumers to abuse and 
make sure that it is clearer that you are getting what you pay 
for or what you load up when you choose to take on these 
applications.
    Much appreciation to the Chairman for his leadership on 
this.
    Senator Franken. Thank you, Senator Whitehouse.
    By the way, I apologize to the witnesses. I had to step out 
for a meeting on Minnesota flooding.
    Senator Schumer has stepped in, and I recognize you.
    Senator Schumer. Thank you. First let me thank you, Mr. 
Chairman, for having this very important hearing, and there are 
so many different types of issues and questions that have come 
up because we are in this brave new world where information is 
available much more freely and that creates new privacy 
concerns, and creating the balance is one of the most important 
things we can do at the beginning of this century. So I look 
forward to your leadership and the leadership of Senator Coburn 
as we try to balance the important benefits, and I am so glad 
you have stepped into this place.
    I always tell people that the Senate has so many different 
vacuums that, you know, somebody who is interested can sort of 
step into, and this is a classic example. So thanks for your 
leadership, Al.
    I am glad that the representatives--I have a particular 
area that I know some of you know I care about. There are a lot 
of these areas I care about, but I am going to talk on a couple 
today. Apple and Google have come here, and I thank you both 
for that. I want to ask about a slightly different aspect of 
balancing technology with public safety, and that is the 
smartphone applications that enable drunk driving.
    As you know, several weeks ago a number of my colleagues 
and I--Senators Udall, Lautenberg, Reed, and I--wrote letters 
to your companies calling your attention to the dangerous apps 
that were being sold in your app stores and asking you to take 
immediate--to immediately remove them. The apps we were talking 
about endangered public safety by allowing drunk drivers to 
avoid police checkpoints. I do not have to go into how bad 
drunk driving is in our country, and I just read those 
newspaper articles, particularly at prom time and Christmastime, 
of parents just looking so forlorn because they have lost a kid 
to drunk driving.
    Anyway, the DUI apps that were popping up in stores were 
terrifying because they undermined drunk-driving checkpoints. 
The apps, they have names like Buzz and Fuzz Alert, and they 
are intended to notify drivers in real time when they approach 
police drunk-driving checkpoints. There is only one purpose to 
these. We know what that is, and that is, to allow drivers to 
avoid the checkpoints and avoid detection. People often think 
twice about drunk driving, driving while drinking, because they 
know they could get stopped, with all the consequences, and 
these apps enable them not to.
    We brought these to the attention of RIM. They pulled the 
app down. I was disappointed that Google and Apple have not 
done the same, and I would like to ask you how you can justify 
to sell apps that put the public at serious risk. I know you 
agree with me that drunk driving is a terrible hazard, right? 
And I know each of your companies has different reasons for not 
removing these apps, so I would like to discuss them with you 
separately. First, Mr. Davidson, tell me your reasoning why 
Google has not removed this kind of application.
    Mr. Davidson. I will start by saying we do take this issue 
very seriously----
    Senator Schumer. I know. I do not doubt that.
    Mr. Davidson [continuing]. And we appreciate you raising 
it. As I actually just discussed with Senator Whitehouse, we 
have a policy on our application store, our application market 
and on our platform where we do try to maintain openness of 
applications and maximize it, and we do have a set of content 
policies regarding our Android marketplace. And although we 
evaluate each application separately, applications that share 
information about sobriety checkpoints are not a violation of 
our content policy.
    Senator Schumer. Let me ask you this: Would you allow an 
app that provided specific directions on how to cook 
methamphetamines? That does not explicitly violate the terms of 
your service but generates a public safety hazard.
    Mr. Davidson. I think it would be--it would be fairly fact 
specific. We do look at these things specifically. I think 
applications that are unlawful or that, you know, directly 
related to unlawful activity, I think we do take those down.
    Senator Schumer. So let me ask----
    Mr. Davidson. Malware we do take down. You are right. But 
we do have a fairly open policy about what we allow.
    Senator Schumer. Well, no one is disputing fairly open, and 
that is the motto of Google, and, you know, you are a company 
that has paid the price in a certain sense for those beliefs. 
So everyone respects the company. But my view is even under 
your present terms of prohibiting illegal behavior, this app 
would fit. But why wouldn't you then change the app to include 
at least this specifically so it does not--you know, I know if 
you had to draft generalized language, it might be trouble. But 
why wouldn't you do that?
    Mr. Davidson. Again, I think we have a set of content 
policies. We try to keep them broad, and I will just say you 
have raised what we think is an extremely important question. 
It is a question that we are actively discussing internally, 
and I will take this back and your concern back to our most 
senior leadership.
    Senator Schumer. So you will look at--if you do not believe 
under your current rules that this would be prohibited, you 
would look at specifically, at least narrowly trying to 
eliminate this app.
    Mr. Davidson. Yes.
    Senator Schumer. You agree it is a terrible thing; it is a 
bad thing.
    Mr. Davidson. We agree it is a bad thing. I agree it is a 
bad thing, Senator.
    Senator Schumer. And it probably causes death.
    Mr. Davidson. Senator, I think this is an extremely 
important issue.
    Senator Schumer. All right. Let us go to Mr. Tribble. Tell 
me why you have not. Different reasoning. That is why I am 
doing it separately.
    Mr. Tribble. Well, Senator, I share your abhorrence of 
drunk driving. As a physician who has worked in an emergency 
room, I have seen firsthand the tragedy that can come about due 
to drunk driving, so we are in complete and utter agreement on 
that. And, you know, Apple in this case is carefully examining 
this situation. One of the things we found is that some of 
these applications are actually publishing data on when and 
where the checkpoints are that are published by the police 
departments.
    Senator Schumer. No, not in the same time sequence.
    Mr. Tribble. In some cases the police department actually 
publishes when and where they are going to have a checkpoint. 
Now, not all of them do that, and there are variances to--there 
are theories on why they----
    Senator Schumer. How many police departments do that?
    Mr. Tribble. I have seen a map, for example, San Francisco, 
Ninth and Geary, we are going to be having a checkpoint 
tomorrow night. On the Web.
    Senator Schumer. Do they publish all of them?
    Mr. Tribble. I do not know. So we are looking into this. We 
think it is a very serious issue.
    Senator Schumer. It is sort of a weak reed, I think.
    Mr. Tribble. Well----
    Senator Schumer. I would bet to you that I do not know of a 
police department that in real time would publish where all 
these checkpoints would be. It would make no sense. And they 
publish it on their Web site?
    Mr. Tribble. As you know, they often publish in general 
that they are doing it. It was surprising----
    Senator Schumer. But what does that----
    Mr. Tribble. That means that they believe that these 
checkpoints provide a deterrent effect and that wider 
publicity----
    Senator Schumer. But that is a different type of 
checkpoint.
    Mr. Tribble. I agree. I am just saying we are in the 
process of looking into it. We think it is very serious. We 
definitely have a policy that we will not allow--encourage 
illegal activity. And----
    Senator Schumer. Apple has pulled bad apps before.
    Mr. Tribble. Absolutely.
    Senator Schumer. OK. You pulled one even about tasteless 
jokes. Well, this is worse than that, wouldn't you say?
    Mr. Tribble. Well, I would say that in some cases it is 
difficult to decide what the intent of these apps are. But if 
they intend to encourage people to break the law, then our 
policy is to pull them off the store.
    Senator Schumer. Then I would suggest that you look at--
just keeping that policy as is, it is a little different 
situation than Mr. Davidson. You would find that the intent of 
these apps is to encourage people to break the law.
    Mr. Tribble. And I will take that back, and we will----
    Senator Schumer. And it is different. I know my time is up. 
I apologize. And I would encourage you to make a distinction 
between a police department that says, ``Well, we usually have 
a checkpoint at Ninth and Geary,'' and an app that just talks 
about where the new checkpoints are and in real time. And you 
say they publish it.
    Mr. Tribble. Yes.
    Senator Schumer. They publish it two days later.
    Mr. Tribble. No, I understand that distinction, and I agree 
that is different.
    Senator Schumer. So you, too, Apple will take a serious 
look at this.
    Mr. Tribble. Yes, we will.
    Senator Schumer. I would like if you folks, both of you, 
could get me an answer, say two weeks from now, as to what 
your--is that too soon?
    Mr. Davidson. We could certainly give you a progress 
report.
    Mr. Tribble. Yes.
    Senator Schumer. How about a month from now as to what your 
internal examination has come up with, OK?
    [The information referred to appears as a submission for 
the record.]
    Senator Schumer. I thank you and I thank my colleague for 
indulging me in an extra two minutes. Thanks.
    Senator Franken. I was actually saying that we were going 
to go to a second round, not that you were two minutes over. I 
would never do that to the distinguished Senator from New York.
    I am going to indulge my prerogative as the Chair and go to 
a second round.
    Mr. Tribble, when you download an app on Android or an 
Android machine, it tells you if that app will access your 
location, your calendar, your contact list, and you get a 
chance to opt out of those. But an iPhone only asks you if you 
want to share your location with an app, nothing else. Don't 
you think it would be helpful for Apple to inform consumers if 
an app will be able to get information from their calendars or 
address books? What more can Apple do to inform consumers of 
the information that an app can access, do you think?
    Mr. Tribble. Well, in the case of those things that--you 
know, the app, we encourage, as I mentioned, and even require 
the app provider themselves to give notice and get consent from 
the consumer before they do that. Different from Google in 
those cases, we do not provide or attempt to provide technical 
means in all cases to prevent the app from getting at any and 
all information. In fact, we think that would be very 
difficult.
    However, specifically in the case of location, we do make 
sure that every single time an application--or for the first 
time an application asks to get access to that user's location, 
it pops up that dialog box that says, ``This app would like to 
use your location, yes or no.'' So I would say two things 
there. One of our priorities in this case has been on the 
especially sensitive nature of location and to provide 
technical measures or attempt to on the phone to provide that 
notice every single time when the app first asks.
    In the case of other information which may also be personal 
information, but maybe not, you know, to the same extent as 
where am I right now, we require the app to give notice and to 
get consent from the user, but we do not have a technical means 
to require that. And if we--it is not that we would not want 
to. We think that is difficult, and it is especially difficult 
because when you start to do that for every little piece of 
information, the screen that the user is confronted with in 
terms of yes/no, yes/no, yes/no potentially becomes very long 
and complex.
    Senator Franken. Google has a screen that contains a number 
of those, and it seems to work for you guys, right?
    Mr. Davidson. It works for us guys, yes.
    Senator Franken. OK. Mr. Tribble, the Ranking Member asked 
you how your companies enforce your own rules for apps. When 
you were in my office yesterday--and thank you for coming--I 
actually asked you this question. How many apps have you 
removed from your App Store because they shared information 
with third parties without users' consent?
    Mr. Tribble. As I mentioned to Senator Coburn, of course, 
our first defense is to not put them there in the first place, 
but if we find an app, we investigate, we work with the 
developer to get them to give proper notice, and we tell them 
at some point, if we find them violating, ``you are going to be 
off in 24 hours.'' In fact, I think all of the applications to 
date or the application vendors to date have fixed their 
applications rather than get yanked from the apps store in 
those cases.
    Senator Franken. So the answer to my question is zero?
    Mr. Tribble. Is zero.
    Senator Franken. OK. Thank you.
    Mr. Soltani, let me ask you a different question. Of all 
the things that you have seen, what is the most serious privacy 
threat that mobile devices pose today?
    Mr. Soltani. Senator, thank you for your question. I think 
the biggest take-away from this is that consumers are 
repeatedly surprised by the information that apps and platforms 
are accessing. Consumers are entrusting their computers and 
phones and other devices with a great deal of personal 
information, and to the degree that these platforms are not 
taking adequate steps to make this clear to consumers that 
others in the pipeline have access to this information, I think 
that is a problem. We have talked about the apps where, you 
know, a certain app might need access to--I think the example 
was it needed access to your location information and you said 
no. I do not think consumers would know whether apps would need 
access to certain types of information or not or could make 
those definitions clearer.
    Kind of stemming from that, we see the--it sounds like the 
providers of these platforms are actually surprised as well 
that they are collecting information. In the case of Street 
View, they were surprised that they were collecting the WiFi 
information, and in the case of the recent Apple episode, they 
were surprised--even a year ago they responded to this issue--
that they were collecting information for a year.
    And so I think, you know, we need improved transparency on 
this stuff, and in order to do that, we need clear definitions 
of what things like ``opt in'' mean. For example, the check box 
being checked by default and you have to uncheck that, is that 
really kind of opt in or is that opt out? Clear definitions----
    Senator Franken. It sounds like opt out to me.
    Mr. Soltani. Right. Clear definitions of what location is, 
you know, if it gets you within 20 feet, is that your location? 
And then most importantly, clear definitions of what ``third 
parties'' and ``first parties'' mean in this context.
    Senator Franken. Well, could you describe the results of 
the Wall Street Journal's investigation into mobile apps? 
Specifically, can you describe the information that apps are 
getting from users and sharing with third parties? And can you 
tell us--you said they are surprised--if the average user has 
any idea that this is happening?
    Mr. Soltani. Right. So I do not think most consumers would 
know that apps would access things like your location 
information or information stored on your device.
    Senator Franken. So your address book or----
    Mr. Soltani. Your address book, your contacts list. And 
then there was a case where Facebook, you would install the 
Facebook app, and it would synchronize your entire address book 
up to Facebook server. I think people were kind of surprised by 
that functionality. I do not think people realized what is the 
data that is held on the phone versus the data that is 
transmitted to websites, and then, even more, transmitted to 
downstream ad companies and other entities that are not even 
the website that builds the app.
    I think ultimately this might be an issue with regard to 
kind of the incentives are mixed. So in this context, we have 
Apple and Google as platform providers, but they are also 
advertisers, and they also make apps. And so in the example 
earlier of the trucking company carrying the problematic 
products, I think in this case we have the same companies that 
are the truck and the product, and it is really weird to figure 
out what the incentives should be for them to kind of do the 
right thing and make intelligent defaults. I think we have seen 
the defaults fall in favor of what is in their best interest--
obviously so. They are companies, right? They are commercial 
entities.
    Senator Franken. Thank you, Mr. Soltani, and thank you all.
    Senator Blumenthal.
    Senator Blumenthal. Thank you, Mr. Chairman.
    I want to thank all of you again for being here and for 
your very, very useful contribution to this hearing.
    Just by way of brief footnote to your conversation, Mr. 
Tribble, Dr. Tribble, earlier with Senator Schumer, you may or 
may not be aware, but sometimes police departments actually 
publicize checkpoints so that drunk drivers will go to 
alternative routes where they do not publicize the checkpoints. 
So there may be more strategy than you may be aware in some of 
the law enforcement practices that are involved here. But I 
welcome both your and Mr. Davidson's willingness to come back 
to Senator Schumer with your response. I think that is very 
welcome and commendable.
    I also want to welcome and commend Google's response on the 
notice issue in case of breaches, which I think is a very 
important source of support for notice legislation, and would 
ask, Dr. Tribble, I do not think I saw in your testimony--I may 
have overlooked it--any reference to the requirement for notice 
in case of breaches of confidentiality. Would Apple likewise 
support that kind of legislation?
    Mr. Tribble. I am actually not the policy person at Apple, 
but what I will say is that, in general, we think it is 
extremely important that information kept on our servers stays 
secure, and we do a lot to make sure that that is the case. And 
we think that if--I personally think if customers are at risk 
from important information that is leaked from servers, I, for 
example, as a consumer would like to know.
    Fortunately, Apple has not--you know, what we are 
discussing is not that here, but if that were to happen, I 
think that would be something that consumers would want to know 
about.
    Senator Blumenthal. Well, would it be Apple's practice to 
notify consumers in case of a breach as soon as possible?
    Mr. Tribble. Yes, I think we are--I believe we are subject 
to at least various State laws along those lines, breach 
notification, and although it is not my area of the company, I 
certainly believe that--I know we would comply with that and 
notify in case of a breach.
    Senator Blumenthal. And, again, I will be submitting 
questions that I am hoping that all the witnesses will respond 
to, and we are late into this hearing, but I would be very 
interested in knowing, and would welcome your response here if 
you can do it briefly, what additional measures you would 
suggest. As you may have heard earlier, we asked the panel 
before yours about requiring security measures, privacy by 
design so to speak, as well as remedies such as credit freezes, 
credit monitoring, insurance, in case of breaches and to 
prevent such breaches and would welcome any comments from the 
panel--or not. Whichever you would prefer.
    Mr. Brookman. Fortunately, I actually testified on this 
issue last week, so I have done a little bit of thinking about 
it.
    From a consumer perspective, there is actually already a 
pretty strong legal regime in place to require reasonable 
security practices. The FTC has brought 30-some-odd cases where 
companies failed to adequately secure data. And for data breach 
notification, 46 or 47 States have versions in place. So the 
legal regime right now already has pretty strong protections in 
place. The things we would probably look for are, one, more 
authority to the FTC, maybe greater capacity to bring more 
cases. I think the 35 they brought are great, but obviously 
more would be better. And penalty authority especially as well. 
The FTC does not have the ability to get civil penalties for 
violations of the FTC Act. I think if there were a stronger 
sword, a bigger stick, you would see better practices.
    Also, I think we would like to see other fair 
information practices put into law. So one idea that we keep 
bringing up is this idea of data minimization. If you have data 
sitting on your servers and you do not need it anymore, get rid 
of it. In both the Sony and the Epsilon case, data breach 
cases, it seemed they were holding old data they did not need 
anymore. Sony had a 2007 database with credit card numbers 
that they were not even using. Epsilon was keeping email 
addresses of people who had previously opted out. I have 
personally gotten email from companies I opted out from years 
ago saying, ``Oh, by the way, your data was breached here.''
    So I think putting into law protections for data 
minimization and stronger FTC authority would be valuable 
things here.
    Senator Blumenthal. Mr. Brookman, did Sony have in place 
adequate safeguards?
    Mr. Brookman. As I said, I am not a technologist. There 
have been a lot of press reports indicating that there are 
things they should have done better. Their servers were not 
patched to the latest security software. They were holding old 
data, and their password verification system probably should 
have been stronger.
    I am probably not the best person to testify to that. It is 
easy for me to sit back and say now that it seemed inadequate, 
but there are definitely strong security minds in this space 
who have criticized what they have done.
    Senator Blumenthal. Well, in fact, they acknowledged that 
much better, stronger safeguards should be in place going 
forward. Whether that is an implicit acknowledgment as to the 
inadequacy previously, we cannot ask them because they are not 
here today. But certainly they are going to upgrade or at least 
have promised to upgrade their safeguards.
    Mr. Brookman. Yes, they have said that they are going to 
put better protections in place, and so if there were 
greater consequences to data security breaches, such as FTC 
penalty authority, then hopefully companies would think about 
it more in advance rather than trying to append security and 
privacy after the fact.
    Senator Blumenthal. I have a bunch of other questions which 
I will submit to the witnesses and will not detain you to answer now, but 
thank you very much, Mr. Chairman.
    Senator Franken. Thank you, Senator Blumenthal.
    The hearing record will be held open for a week. In 
closing, I want to thank my friend, the Ranking Member. I want 
to thank all of you who testified today. Thank you all.
    As I said at the beginning of this hearing, I think the 
people have a right to know who is getting their information 
and the right to decide how that information is shared and 
used. After having heard today's testimony, I still have 
serious doubts that those rights are being respected in law or 
in practice. We need to think seriously about how to address 
this problem, and we need to address this problem now. Mobile 
devices are only going to become more and more popular. They 
will soon be the predominant way that people access the 
Internet, so this is an urgent issue that we will be dealing 
with.
    We will hold the record, as I said, open for a week for 
submission of questions, and this hearing is now adjourned.
    [Whereupon, at 12:39 p.m., the Subcommittee was adjourned.]
    [Questions and answers and submissions for the record 
follow.]


                            A P P E N D I X

              Additional Material Submitted for the Record

                              Witness List

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

              Prepared Statement of Hon. Patrick J. Leahy

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

                    Prepared Statements of Witnesses

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

Questions for Witnesses from Hon. Al Franken, Hon. Richard Blumenthal, 
                          and Hon. Tom Coburn


[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

                         Questions and Answers

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

                Miscellaneous Submissions for the Record


[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]

   Submissions for the Record Not Printed Due to Voluminous Nature, 
  Previously Printed by an Agency of the Federal Government, or Other 
Criteria Determined by the Committee, List of Material and Links Can Be 
                              Found Below:

    http://info.publicintelligence.net/GoogleWiFiSpy.pdf



