[Senate Hearing 112-851]
[From the U.S. Government Publishing Office]
S. Hrg. 112-851
WHAT FACIAL RECOGNITION TECHNOLOGY MEANS FOR PRIVACY AND CIVIL
LIBERTIES
=======================================================================
HEARING
before the
SUBCOMMITTEE ON PRIVACY,
TECHNOLOGY AND THE LAW
of the
COMMITTEE ON THE JUDICIARY
UNITED STATES SENATE
ONE HUNDRED TWELFTH CONGRESS
SECOND SESSION
__________
JULY 18, 2012
__________
Serial No. J-112-87
__________
Printed for the use of the Committee on the Judiciary
U.S. GOVERNMENT PRINTING OFFICE
86-599 WASHINGTON : 2014
-----------------------------------------------------------------------
For sale by the Superintendent of Documents, U.S. Government Printing
Office Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800; DC
area (202) 512-1800 Fax: (202) 512-2104 Mail: Stop IDCC, Washington, DC
20402-0001
COMMITTEE ON THE JUDICIARY
PATRICK J. LEAHY, Vermont, Chairman
HERB KOHL, Wisconsin CHUCK GRASSLEY, Iowa
DIANNE FEINSTEIN, California ORRIN G. HATCH, Utah
CHUCK SCHUMER, New York JON KYL, Arizona
DICK DURBIN, Illinois JEFF SESSIONS, Alabama
SHELDON WHITEHOUSE, Rhode Island LINDSEY GRAHAM, South Carolina
AMY KLOBUCHAR, Minnesota JOHN CORNYN, Texas
AL FRANKEN, Minnesota MICHAEL S. LEE, Utah
CHRISTOPHER A. COONS, Delaware TOM COBURN, Oklahoma
RICHARD BLUMENTHAL, Connecticut
Bruce A. Cohen, Chief Counsel and Staff Director
Kolan Davis, Republican Chief Counsel and Staff Director
------
Subcommittee on Privacy, Technology and the Law
AL FRANKEN, Minnesota, Chairman
CHUCK SCHUMER, New York TOM COBURN, Oklahoma
SHELDON WHITEHOUSE, Rhode Island ORRIN G. HATCH, Utah
RICHARD BLUMENTHAL, Connecticut LINDSEY GRAHAM, South Carolina
Alvaro Bedoya, Democratic Chief Counsel
Elizabeth Hays, Republican General Counsel
C O N T E N T S
----------
STATEMENTS OF COMMITTEE MEMBERS
Page
Franken, Hon. Al, a U.S. Senator from the State of Minnesota..... 1
prepared statement........................................... 123
Sessions, Hon. Jeff, a U.S. Senator from the State of Alabama.... 4
WITNESSES
Witness List..................................................... 37
Pender, Jerome M., Deputy Assistant Director, Criminal Justice
Information Services Division, Federal Bureau of Investigation,
U.S. Department of Justice, Clarksburg, West Virginia.......... 6
prepared statement........................................... 39
Mithal, Maneesha, Associate Director, Division of Privacy and
Identity Protection, Federal Trade Commission, Washington, DC.. 8
prepared statement........................................... 44
Martin, Brian, Director of Biometric Research, MorphoTrust USA,
Jersey City, New Jersey........................................ 14
prepared statement........................................... 57
Acquisti, Alessandro, Associate Professor, Heinz College and
CyLab, Carnegie Mellon University, Pittsburgh, Pennsylvania.... 16
prepared statement........................................... 63
Amerson, Larry, Sheriff, Calhoun County, Alabama, Anniston,
Alabama, on Behalf of the National Sheriffs' Association....... 18
prepared statement........................................... 75
Farahany, Nita A., Professor of Law, Duke Law School, and
Professor of Genome Sciences & Policy, Institute for Genome
Sciences & Policy, Duke University, Durham, North Carolina..... 19
prepared statement........................................... 81
Sherman, Rob, Manager of Privacy and Public Policy, Facebook,
Washington, DC................................................. 22
prepared statement........................................... 92
Lynch, Jennifer, Staff Attorney, Electronic Frontier Foundation,
San Francisco, California...................................... 23
prepared statement........................................... 100
QUESTIONS
Questions submitted by Hon. Al Franken for Jerome Pender,
Maneesha Mithal, Brian Martin, Alessandro Acquisti, Rob
Sherman, and Jennifer Lynch.................................... 127
QUESTIONS AND ANSWERS
Responses of Jerome Pender to questions submitted by Senator
Franken........................................................ 134
Responses of Maneesha Mithal to questions submitted by Senator
Franken........................................................ 137
Responses of Brian Martin to questions submitted by Senator
Franken........................................................ 139
Responses of Alessandro Acquisti to questions submitted by
Senator Franken................................................ 141
Responses of Rob Sherman to questions submitted by Senator
Franken........................................................ 145
Responses of Jennifer Lynch to questions submitted by Senator
Franken........................................................ 147
MISCELLANEOUS SUBMISSIONS FOR THE RECORD
Facebook, Facebook.com:
Approving and Removing Tag, instructions..................... 149
Detroit, Michigan, Code of Ordinance, City code.................. 150
Electronic Privacy Information Center (EPIC), Marc Rotenberg,
Executive Director, Ginger P. McCall, Director, Open Government
Program, and David Jacobs, Consumer Protection Fellow, July 18,
2012, joint letter............................................. 155
Federal Bureau of Investigation, Richard W. Vorder Bruegge,
Quantico, Virginia, report..................................... 156
Westlaw, Thomson Reuters:
Federal Anti-Protest law, Public Law 112-98, March 8, 2012... 174
Hawaii Anti-Protest law, Title 38, Chapter 852, HRS 852-1... 176
Illinois biometric privacy law, Chapter 740, Act 14.......... 179
Maryland Anti-Protest law, Maryland Code, Criminal Law, 10-
201........................................................ 185
Michigan law allowing anti-protest ordinance................. 202
Security Industry Association (SIA), Don Erickson, Chief
Executive Officer, Alexandria, Virginia, January 31, 2012,
letter......................................................... 266
Tag Suggestions, instructions: Windows Photo Viewer.......... 268
Texas biometric privacy law, V.T.C.A., Business & Commerce
503.001.................................................... 269
ADDITIONAL SUBMISSIONS FOR THE RECORD
Submissions for the record not printed due to voluminous nature,
previously printed by an agency of the Federal Government, or
other criteria determined by the Committee, list:.............. 273
EPIC Comments--January 31, 2012: http://www.ftc.gov/os/comments/
facialrecognitiontechnology/00083-0982624.pdf.................. 273
National Institute of Justice (NIJ), William A. Ford, Director,
State of Research, Development and Evaluation: https://
www.eff.org/sites/default/files/ford-State-of-Research-
Development-and-Evaluation-at-NIJ.pdf#page=17.................. 274
Farahany, Nita A., Testimony Attachment--Pennsylvania Law Review:
http://www.pennumbra.com/issues/pdfs/160-5/Farahany.pdf........ 273
WHAT FACIAL RECOGNITION TECHNOLOGY MEANS FOR PRIVACY AND CIVIL
LIBERTIES
----------
WEDNESDAY, JULY 18, 2012
U.S. Senate,
Subcommittee on Privacy, Technology, and the Law,
Committee on the Judiciary,
Washington, DC.
The Subcommittee met, pursuant to notice, at 2:36 p.m., in
Room SD-226, Dirksen Senate Office Building, Hon. Al Franken,
Chairman of the Subcommittee, presiding.
Present: Senators Franken, Whitehouse, and Blumenthal.
Also present: Senator Sessions.
OPENING STATEMENT OF HON. AL FRANKEN, A U.S. SENATOR FROM THE
STATE OF MINNESOTA
Chairman Franken. This hearing will be called to order.
Welcome to the fourth hearing of the Subcommittee on Privacy,
Technology, and the Law. Today's hearing will examine the use
of facial recognition technology by the Government and the
private sector and what that means for privacy and civil
liberties.
I want to be clear: There is nothing inherently right or
wrong with facial recognition technology. Just like any other
new and powerful technology, it is a tool that can be used for
great good. But if we do not stop and carefully consider the
way we use this technology, it could also be abused in ways
that could threaten basic aspects of our privacy and civil
liberties. I called this hearing so we can just start this
conversation.
I believe that we have a fundamental right to control our
private information, and biometric information is already among
the most sensitive of our private information, mainly because
it is both unique and permanent. You can change your password.
You can get a new credit card. But you cannot change your
fingerprint, and you cannot change your face--unless, I guess,
you go to a great deal of trouble.
Indeed, the dimensions of our faces are unique to each of
us--just like our fingerprints. And just like fingerprint
analysis, facial recognition technology allows others to
identify you with what is called a ``faceprint''--a unique file
describing your face.
But facial recognition creates acute privacy concerns that
fingerprints do not. Once someone has your fingerprint, they
can dust your house or your surroundings to figure out what you
have touched.
Once someone has your faceprint, they can get your name,
they can find your social networking account, and they can find
and track you in the street, in the stores that you visit, the
Government buildings you enter, and the photos your friends
post online. Your face is a conduit to an incredible amount of
information about you, and facial recognition technology can
allow others to access all of that information from a distance,
without your knowledge, and in about as much time as it takes
to snap a photo.
People think of facial recognition as something out of a
science fiction novel. In reality, facial recognition
technology is in broad use today. If you have a driver's
license, if you have a passport, if you are a member of a
social network, chances are good that you are part of a facial
recognition data base.
There are countless uses of this technology, and many of
them are innovative and quite useful. The State Department uses
facial recognition technology to identify and stop passport
fraud--preventing people from getting multiple passports under
different names. Using facial recognition technology, Sheriff
Larry Amerson of Calhoun County, Alabama, who is with us here
today, can make sure that a prisoner being released from the
Calhoun County jail is actually the same prisoner that is
supposed to be released. That is useful. Similarly, some of the
latest smartphones can be unlocked by the owner by just looking
at the phone and blinking.
But there are uses of this technology that should give us
pause.
In 2010, Facebook, the largest social network, began
signing up all of its then 800 million users in a program
called Tag Suggestions. Tag Suggestions made it easier to tag
close friends in photos, and that is a good thing.
But the feature did this by creating a unique faceprint for
every one of those friends. And in doing so, Facebook may have
created the world's largest privately held data base of
faceprints--without the explicit consent of its users. To date,
Tag Suggestions is an opt-out program. Unless you have taken
the time to turn it off, it may have already been used to
generate your faceprint.
Separately, last year, the FBI rolled out a Facial
Recognition Pilot program in Maryland, Michigan, and Hawaii
that will soon expand to three more States. This pilot lets
officers in the field take a photo of someone and compare it to
a Federal data base of criminal mug shots. The pilot can also
help ID a suspect in a photo from an actual crime. Already,
several other States are setting up their own facial
recognition systems independently of the FBI. These efforts
will catch criminals. In fact, they already have.
Now, many of you may be thinking that that is an excellent
thing, and I agree. But unless law enforcement facial
recognition programs are deployed in a very careful manner, I
fear that these gains could eventually come at a high cost to
our civil liberties.
I fear that the FBI pilot could be abused to not only
identify protesters at political events and rallies, but to
target them for selective jailing and prosecution, stifling
their First Amendment rights. Curiously enough, a lot of the
presentations on this technology by the Department of Justice
show it being used on people attending political events or
other public gatherings.
I also fear that without further protections, facial
recognition technology could be used on unsuspecting civilians
innocent of any crime, invading their privacy and exposing them
to potential false identifications.
Since 2010, the National Institute of Justice, which is a
part of DOJ, has spent $1.4 million to develop facial
recognition-enhanced binoculars that can be used to identify
people at a distance and in crowds. It seems easy to envision
facial recognition technology being used on innocent civilians
when all an officer has to do is look at them through his
binoculars or her binoculars.
But facial recognition technology has reached a point where
it is not limited to law enforcement and multi-billion-dollar
companies. It can also be used by private citizens. Last year,
Professor Alessandro Acquisti of Carnegie Mellon University,
who is testifying today, used a consumer-grade digital camera
and off-the-shelf facial recognition software to identify one
out of three students walking across a campus.
I called this hearing to raise awareness about the fact
that facial recognition already exists right here, today, and
we need to think about what that means for our society. I also
called this hearing to call attention to the fact that our
Federal privacy laws are almost totally unprepared to deal with
this technology.
Unlike what we have in place for wiretaps and other
surveillance devices, there is no law regulating law
enforcement use of facial recognition technology. And current
Fourth Amendment case law generally says that we have no
reasonable expectation of privacy in what we voluntarily expose
to the public; yet we can hardly leave our houses in the
morning without exposing our faces to the public. So law
enforcement does not need a warrant to use this technology on
someone. It might not even need to have a reasonable suspicion
that the subject has been involved in a crime.
The situation for the private sector is similar. Federal
law provides some protection against true bad actors that
promise one thing yet do another. But that is pretty much as
far as the law goes. If a store wants to take a photo of your
face when you walk in and generate a faceprint--without your
permission--they can do that. They might even be able to sell
it to third parties.
Thankfully, we have a little time to do better. While this
technology will in a matter of time be at a place where it can
be used quickly and reliably to identify a stranger, it is not
quite there just yet. And so I have called the FBI and Facebook
here today to challenge them to use their position as leaders
in their fields to set an example for others before this
technology is used pervasively.
The FBI already has some privacy safeguards in place. But I
still think that they could do more to prevent this technology
from being used to identify and target people engaging in
political protests or other free speech. I think the FBI could
do more to make sure that officers use this technology only
when they have good reason to think that someone is involved in
a crime. I also think that if the FBI did these things, law
enforcement agencies around the country would follow.
For their part, Facebook allows people to use Tag
Suggestions only on their close friends. But I think Facebook
could still do more to explain to its users how it uses facial
recognition and to give them better choices about whether or
not to participate in Tag Suggestions. I think that Facebook
could make clear to its users just how much data it has and how
it will and will not use its large and growing data base of
faceprints. And I think that if Facebook did these things, they
would establish a best practice against which other social
networks would be measured.
My understanding is that for the past few months, Facebook
Tag Suggestions has been temporarily disabled to allow for some
technical maintenance. It seems to me that Facebook has the
perfect opportunity to make changes to its facial recognition
program when it brings Tag Suggestions back online.
I am also calling the Federal Trade Commission to testify
because they are in the process of actually writing best
practices for the use of this technology in industry. I urge
the Commission to use this as an opportunity to guarantee
consumers the information and choices they need to make
informed decisions about their privacy.
In the end, though, I also think that Congress may need to
act, and it would not be the first time it did. In the era of
J. Edgar Hoover, wiretaps were used freely with little regard
to privacy. Under some Supreme Court precedents of that era, as
long as the wiretapping device did not actually penetrate the
person's home or property, it was deemed constitutionally
sound--even without a warrant. And so in 1968, Congress passed
the Wiretap Act. Thanks to that law, wiretaps are still used to
stop violent and serious crimes. But police need a warrant
before they get a wiretap. And you cannot wiretap someone just
because they are a few days late on their taxes. Wiretaps can
be used only for certain categories of serious crimes.
I think that we need to ask ourselves whether Congress is
in a similar position today as it was 50 or 60 years ago before
the passage of the Wiretap Act. I hope the witnesses today will
help us consider this and all of the different questions raised
by this technology.
I was going to turn it over to my friend and Ranking
Member, Senator Coburn, but I do not think he would have a lot
to say at this moment.
[Laughter.]
Chairman Franken. I am sure he will have some great
questions.
What I would like to do is introduce our first panel of
witnesses. But before I do, I would like to give my esteemed
colleague, Senator Sessions, the opportunity to make an
introduction of the sheriff, who is going to be on the second
panel from your own State.
STATEMENT OF HON. JEFF SESSIONS, A U.S. SENATOR FROM THE STATE
OF ALABAMA
Senator Sessions. That would be wonderful. Thank you, Mr.
Chairman. Those are remarks that we need to think about as we
go forward with new technologies, and it takes some effort to
get to the bottom of it.
I am honored to take a few moments to introduce my friend,
Sheriff Larry Amerson, who has served for 18 years as sheriff
in Calhoun County, Alabama, and Anniston. He is a graduate of
Jacksonville State University, one of my superb universities,
with a B.A. in law enforcement. Before finally becoming
sheriff, he served for 14 years as deputy sheriff in Calhoun County. He
currently serves as the 71st president of the National
Sheriffs' Association and is also the chairman of the National
Sheriffs' Institute Education and Training Committee and vice
chair of the Court Security Committee. He is a certified jail
manager and past member of the FBI Criminal Justice Information
System's Southern Working Group, and that Criminal Justice
Information System is a lot of what we will be talking about
today, how that system works.
Sheriff, it is great to see you. Thank you for coming, and
I am pleased to have this opportunity to introduce you.
Mr. Chairman, could I just say a couple of things?
Chairman Franken. Absolutely.
Senator Sessions. I would like to come back if you would
allow me, but I might not be able to.
Chairman Franken. I understand.
Senator Sessions. We need to look at facial recognition and
see how it works and where it can be beneficial consistent with
our constitutional rights and privileges that we value in our
country. But it is a matter that I have dealt with for a long
time, and there are a lot of people who would like to see a
major enhancement of the facial identification system used at
airports for security and that sort of thing. And there are
some fundamental weaknesses at this point with that as a
practical matter.
The fingerprint has been in use for 50 years, I guess.
Virtually every criminal in America has had his fingerprint
placed in records that can be ascertained by even a local
police officer at his police car. He can have people put their
hands on a machine, and it will read that to see if the ID he
presented may be false and he may be somebody else, maybe a
fugitive from justice. So the fingerprint system is really,
really proven. And you have the criminal histories that are
available to law officers when they produce that.
So if we start with the facial recognition--and maybe it is
time to start with some of that. But if we start with it, we do
not have many people in it. There are not that many people who
have been identified who have had their visage imprinted and
can be drawn. And terrorists around the world, presumably we do
not have their facial things, where we may have been collecting
their fingerprints for years.
Secretary Ridge, when he was Homeland Security Secretary,
tried to figure a way to deal with the situation at the
airports. A lot of people wanted to use facial recognition, Mr.
Chairman, because they thought it would be quicker, people
would just go right on through the system. But, you know, I
would ask a simple question: If there is no bank of visages,
what good is it? And why couldn't you use a fingerprint
situation where you put your fingerprint in, the computer reads
it, even if you check through and you go down and wait to get
on the plane, if a minute, five minutes, three minutes later,
it comes back this is a terrorist, you can go down and get the
man.
When he left, I would say I was kind of pleased. I had not
talked to him for some time about it. He said, ``Well, I have
one bit of advice for my successor: Emphasize the
fingerprint.'' So I felt like that was the conclusion he had
reached.
So I do not know how far you can go with utilizing the face
system effectively. I was a Federal prosecutor for 15 years.
Knowing how the system works today, I know it would take many
years to get it to compete with the fingerprint system for
basic law enforcement work. But, Mr. Chairman, there could be
certain things, like in a jail. You suggested that. There are
other things that could work right now.
So thank you for giving me the opportunity to share those
thoughts. You have got a great panel of witnesses. I salute you
for investing the time and effort to wrestle with these
important issues.
Chairman Franken. Well, thank you for your very well-made
comments, and these are questions that we are starting to deal
with in today's hearing, so thank you.
Senator Sessions. If I come back, I would like to ask some
of those. If not, I will try to submit it for the record, if
you do not mind.
Chairman Franken. Absolutely.
Senator Sessions. Thank you.
Chairman Franken. Maybe we should call it, after listening
to you, ``visage recognition technology.''
[Laughter.]
Chairman Franken. Just to confuse people, I would like to
do that. Now I would like to introduce our first panel of
witnesses.
Jerome Pender is the Deputy Assistant Director of the
Operations Branch at the FBI's Criminal Justice Information
Services Division. He manages information technology for many of the
FBI's biometric systems and helps oversee the deployment of a
pilot facial recognition program as part of the FBI's Next
Generation Identification Initiative. Prior to joining the FBI,
Mr. Pender served as the executive director of Information
Technology for UBS Warburg. He holds a master's degree in
computer science from Johns Hopkins and is a graduate of the
United States Air Force Academy. Thank you for being here.
Maneesha Mithal is the Associate Director of the Federal
Trade Commission's Division of Privacy and Identity Protection.
She oversees work on commercial privacy, data security, and
credit reporting, and works to ensure companies comply with the
FTC Act's unfair or deceptive practices provision. Before
joining the FTC, Ms. Mithal was an attorney at the Washington
office of Covington & Burling. She earned her undergraduate and
law degrees from Georgetown University.
Thank you again, both of you, for being here today. I
really hope that your presence here will mark the start of a
productive dialogue about this technology going forward. Your
complete written testimony will be made a part of the record.
You each have about 5 minutes for opening remarks that you
would like to make.
Mr. Pender, would you like to begin?
STATEMENT OF JEROME M. PENDER, DEPUTY ASSISTANT DIRECTOR,
CRIMINAL JUSTICE INFORMATION SERVICES DIVISION, FEDERAL BUREAU
OF INVESTIGATION, U.S. DEPARTMENT OF JUSTICE, CLARKSBURG, WEST
VIRGINIA
Mr. Pender. Certainly. Thank you. Mr. Chairman, I would
like to thank the Subcommittee for the opportunity to discuss
the FBI's Next Generation Identification Program, NGI. The FBI
is committed to ensuring appropriate privacy protections are in
place as we deploy NGI technologies, including facial
recognition, and that the capabilities are implemented and
operated with transparency and full disclosure.
The FBI began collecting criminal history on a national
level in 1924. From 1924 until 1999, fingerprints and
associated criminal history information, including mug shot
photographs, were received in the U.S. mail and processed
manually. In 1999, with the launching of the Integrated
Automated Fingerprint Identification System, fingerprints were
searched, processed, and stored using automation.
The NGI Program, which is on scope, on schedule, and on
cost, and 60 percent deployed, is enabling the FBI to meet its
criminal justice mission. It will use facial recognition to
automate for the first time the processing of mug shots.
NGI is being deployed in seven separate increments.
Increment four includes the facial recognition system. It was
deployed as a pilot in February 2012 and is scheduled for full
operational capability in the summer of 2014. The objective of
the pilot is to conduct image-based facial recognition searches
of the FBI's national repository and provide investigative
candidate lists to agencies submitting queries.
The goals of the pilot are to test the facial recognition
processes, resolve policy and processing issues, solidify
privacy protection procedures, and address user concerns.
The pilot provides a search of the national repository of
photos consisting of criminal mug shots, which were taken at
the time of a criminal booking. Only criminal mug shot photos
are used to populate the national repository. Query photos and
photos obtained from social networking sites, surveillance
cameras, and similar sources are not used to populate the
national repository. It contains approximately 12.8 million
photos.
The Facial Recognition Pilot permits authorized law
enforcement agencies to submit queries for a facial recognition
search of the national repository. It can be queried by
authorized criminal justice agencies for criminal justice
purposes.
Access is subject to all rules regarding access to FBI CJIS
systems information and subject to dissemination rules for
authorized criminal justice agencies. The investigative
response provided to a submitting agency will include the
number of candidates requested, in ranked order, along with a
caveat noting that the response should only be used as an
investigative lead.
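[By way of illustration, the sketch below shows the shape of
such a ranked, caveated response: gallery faceprints are scored
against a probe and only the top candidates are returned as
leads. This is a minimal sketch, not the FBI's actual matcher;
the cosine scoring, the 128-dimension faceprints, and the
function names are assumptions made for the example.]

    import numpy as np

    def cosine(a, b):
        # Similarity in [-1, 1]; higher means the faceprints are more alike.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def rank_candidates(probe, gallery, k=20):
        # Score every enrolled mug shot faceprint against the probe and
        # return the k best matches in ranked order, flagged as leads only.
        ranked = sorted(gallery, key=lambda pid: cosine(probe, gallery[pid]),
                        reverse=True)
        return {"caveat": "Investigative lead only; not a positive ID.",
                "candidates": ranked[:k]}

    # Demo with random stand-in faceprints; real ones come from a face model.
    rng = np.random.default_rng(1)
    gallery = {f"mugshot-{i}": rng.standard_normal(128) for i in range(1000)}
    probe = gallery["mugshot-42"] + 0.05 * rng.standard_normal(128)
    print(rank_candidates(probe, gallery, k=5))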
In accordance with Section 208 of the E-Government Act of
2002, facial recognition was initially addressed by the FBI's
June 9, 2008, Interstate Photo System Privacy Impact
Assessment, or PIA. In coordination with the FBI's Office of
the General Counsel, the 2008 PIA is currently in the process
of being renewed by way of Privacy Threshold Analysis, with an
emphasis on facial recognition. An updated PIA is planned and
will address all evolutionary changes since the preparation of
the 2008 PIA.
Each participating pilot State or agency is required to
execute a Memorandum of Understanding, MOU, that details the
purpose, authority, scope, disclosure, and use of information,
and the security rules and procedures associated with piloting.
Pilot participants are advised that all information is treated
as ``law enforcement sensitive'' and protected from
unauthorized disclosure.
Information derived from the pilot search requests and
resulting responses are to be used only as an investigative
lead. Results are not to be considered as positive
identifications.
In February 2012, the State of Michigan successfully
completed an end-to-end Facial Recognition Pilot transaction
and is currently submitting facial recognition searches to
CJIS. MOUs have also been executed with Hawaii and Maryland;
South Carolina, Ohio, and New Mexico are engaged in the MOU
review process for Facial Recognition Pilot participation.
In summary, the FBI's Next Generation Identification
Program is on scope, on schedule, on cost, and 60 percent
deployed. The Facial Recognition Pilot which began operation in
February 2012 searches criminal mug shots and provides
investigative leads. The Facial Recognition Pilot is evaluating
and solidifying policies, procedures, and privacy protections.
Full operational capability for facial recognition is scheduled
for the summer of 2014.
Thank you.
[The prepared statement of Mr. Pender appears as a
submission for the record.]
Chairman Franken. Thank you, Mr. Pender.
Ms. Mithal.
STATEMENT OF MANEESHA MITHAL, ASSOCIATE DIRECTOR, DIVISION OF
PRIVACY AND IDENTITY PROTECTION, FEDERAL TRADE COMMISSION,
WASHINGTON, D.C.
Ms. Mithal. Thank you, Chairman Franken. I am Maneesha
Mithal with the Federal Trade Commission. I appreciate the
opportunity to present the Commission's testimony on the
commercial uses of facial recognition technology, the potential
benefits, and privacy implications.
Imagine a world where you are walking down the street and a
stranger takes a picture of you with their smartphone. The
stranger is then able to pull up not only your name but where
you live, how much you paid for your house, and who your close
friends are.
Imagine another scenario where you walk into a store and a
digital sign scans your face, links you with a loyalty card,
and greets you with a message: ``Jane Doe, I see you have
bought Slimfast before. Here is a coupon for $1 off your next
purchase.''
These scenarios are not far from becoming a reality. Some
consumers might think they are innovative and they want to
participate in them. Others may find them invasive. Today
facial recognition is being used commercially for a variety of
purposes, many of them beneficial to consumers. For example, as
you mentioned, companies are using the technology to allow
consumers to unlock their smartphones using their faces rather
than their passwords, to allow consumers to upload their faces
to a website to try on makeup, hair styles, and eyeglasses, and
to help consumers manage and organize photos.
In December 2011, the Commission hosted a workshop to
examine these current and future uses of facial recognition, as
well as the privacy implications they raise. In my statement
today, I would like to discuss four themes that emerged from
the workshop and conclude by setting forth our next steps in
this area.
First, many workshop participants highlighted the recent
growth in the commercial use of facial recognition
technologies. Until recently, because of high costs and limited
accuracy, companies did not widely use these technologies.
However, several recent developments have brought steady
improvements. For example, better quality digital cameras and
lenses create higher-quality images from which biometric data
can be more easily extracted. Recent technological advances
have been accompanied by a rapid growth in the availability of
online photos. For example, approximately 2.5 billion photos
are uploaded to Facebook each month. As a result, companies do
not need to purchase proprietary sets of identified images,
thereby lowering costs and making facial recognition
technologies commercially viable for a broad range of entities.
Second, we learned about current applications of facial
recognition technologies. In one application, the technology
can simply be used for pure facial detection--that is, to
determine that a photo has a face in it. Current uses include
refining search engine results to include only those results
that contain a face, locating faces in images in order to blur
them, or ensuring that the frame for a video chat feed actually
includes a face.
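[As a concrete illustration of pure facial detection, the
sketch below counts the faces in an image using OpenCV's bundled
Haar-cascade detector. This is one plausible off-the-shelf
approach, not the specific software any workshop participant
described; the file name is a placeholder.]

    import cv2  # OpenCV; install with: pip install opencv-python

    # Load the frontal-face Haar cascade that ships with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # "photo.jpg" is a placeholder path; detection runs on grayscale pixels.
    gray = cv2.cvtColor(cv2.imread("photo.jpg"), cv2.COLOR_BGR2GRAY)

    # Each hit is a bounding box (x, y, width, height). No identity is
    # computed--this only answers "is there a face here?"
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"Found {len(faces)} face(s)")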
In another application, the technology allows companies to
assess characteristics of facial images. For instance,
companies can identify moods or emotions from facial
expressions to determine a player's engagement with a video
game or a viewer's excitement during a movie.
Companies can also determine demographic characteristics of
a face such as age and gender to deliver targeted ads in real
time in retail spaces.
The use of facial recognition technology that potentially
raises the most privacy concerns is the use to identify
anonymous individuals in images. One of the most prevalent
current uses of this application is to enable semiautomated
photo tagging or photo organization on social networks and in
photo management applications.
Third, in addition to these current uses, panelists
discussed the ways in which facial recognition could be
implemented in the future. For example, will it become feasible
to use facial recognition to identify previously anonymous
individuals in public places or in previously unidentified
photos online? In a 2011 study, which we will be hearing about,
Carnegie Mellon researchers were able to identify individuals
in previously unidentified photos from a dating site by using
facial recognition technology to match them to their Facebook
profile photos.
Finally, panelists discussed the privacy concerns
associated with facial recognition. For example, a mobile app
that could, in real-time, identify previously anonymous
individuals on the street or in a bar and correlate a name with
a person's physical address could raise serious physical safety
concerns.
Following the workshop, Commission staff has been
developing a report that builds on the principles that the
Commission outlined in its March 2012 privacy report. Those
principles are: privacy by design, simplified choice, and
improved transparency. The report discusses the application of
these principles to the realm of facial recognition, and we
should be issuing a report in the coming months.
Thank you for the opportunity to provide the Commission's
views, and we look forward to working with Congress on this
important issue.
[The prepared statement of Ms. Mithal appears as a
submission for the record.]
Chairman Franken. Thank you, Ms. Mithal.
Mr. Pender, the FBI allows searches of its facial
recognition data base. They are done only for criminal justice
purposes, and that is a good thing. But the term ``criminal
justice purpose'' is kind of broad, so I am concerned that this
system allows law enforcement to identify and target people
marching in a rally or protesting in front of a courthouse
because in all three States where the pilot is operating, it is
technically a crime to block a sidewalk or obstruct the
entrance to a building.
Mr. Pender, has the FBI issued a rule prohibiting or
discouraging jurisdictions from using facial recognition
technology in a way that could stifle free speech? And if not,
will the FBI consider doing this?
Mr. Pender. Certainly as we are deploying the NGI system,
we are extremely concerned to make sure that we have
appropriate protections in it to ensure there is not any
invasion of privacy or those sorts of things.
``Criminal justice purpose'' is defined
in 28 CFR Section 20.3(b), and it lists nine particular
activities that are part of the administration of criminal
justice. In the scenario that you mentioned about the
protesters and potentially blocking the sidewalk, I think you
are implying that an officer is taking a photo of someone for
blocking the sidewalk on the pretext of putting them into some
type of data base. So I can say a few things about that.
First of all, the only photos that will go into the data
base are the criminal mug shot photos, so the probe photos that
are being searched through the system do not ever go into the
data base.
Then as regards to whether or not the particular person
blocking the sidewalk could even be searched, the officer would
have to clearly articulate which of those administration of
criminal justice functions that they are trying to perform, and
the way you have laid out the scenario there, you are implying
that they are not really interested in blocking the sidewalk.
They are using it as a pretext for something else, and that
would not be a valid use of the system under the current rules.
Again, we take this very seriously, so that is certainly
the reason that we are deploying the system slowly in a pilot
phase to work out any details, make sure that there is
appropriate training and guidance in place, and so that is an
important part of our process.
One of the things that the MOUs that we sign with the
agencies that are going to access the system require is an
audit process, so the local agencies are required to audit the
use of the system on an annual basis to detect any type of
misuse. And then, in addition to that, within our FBI CJIS
Division we have an audit unit that goes out and does triennial
audits of the same agencies, and that is done as a little bit
of a safety net, a double-check on the audits, as well as to be
sure that the audit processes are in place and being done
effectively.
In those audits, if any misuse is detected, there is a full
range of options that is defined in the sanctions process, and
that could range from administrative letters, that sort of
thing, to removal of access from the system, either on an
individual or an agency basis, if the controls are not
effective, up to and including criminal prosecution for misuse.
Chairman Franken. OK. How do you define ``misuse'' ? First
of all, have any audits been produced yet?
Mr. Pender. The audit process that I am talking about is
with regards to access to criminal history in general. It has
been longstanding for the last many decades. The photos are
part of that criminal history data base, so all of those same
standards apply.
At this point, we have not done any audits specific to the
use of facial recognition. That is what we are in the process
of developing through the pilot.
Chairman Franken. OK. So is there anything that explicitly
in your pilot discourages the use of this technology at a rally
or a political event?
Mr. Pender. I cannot think of something that says you
should not use this at a political event. I think it is defined
in the terms of the positive where it is allowed to be used,
and that would be outside of what is permitted. But certainly
we are--that is the purpose of doing the slow deployment, is to
identify if there are particular gray areas that need to be
trained----
Chairman Franken. Part of the reason I bring this up is
that the FBI's own presentations of this technology--I do not
know if we have a blow-up of this, but it shows it being used
to identify people at a political rally. That is what the FBI
did. So that is--you know, I mean, this is done by the Obama
administration. It is at an Obama rally. One of them is. And
one is at a Hillary rally, and, you know, they have made up.
[Laughter.]
Chairman Franken. She is a great Secretary of State. But
they might be sending the wrong message, don't you think?
Mr. Pender. I am not familiar with that particular
presentation. I am not familiar with the photos, but certainly
if there are photos of a political rally, what we are--the NGI
system that we are deploying and what we are doing, we
absolutely have no intention of going out. It absolutely will
be limited to the mug shot photos and the criminal history data
base.
Chairman Franken. OK. In a similar vein, will the FBI
consider telling States in its facial recognition program that
they should use the technology to identify someone only if they
have a probable cause that they have been involved in a
criminal activity?
Mr. Pender. The mug shot photos are part of the criminal
history data base, and so this is an issue that we have been
working with for many years on when is it appropriate to
distribute information out of the criminal history data base.
And so in April 2001, there were some questions about that, and
we sent out what we call a contributor letter that clarifies
when it is appropriate to use the system or not. And the
language in that particular letter says that the officer must
clearly articulate one of the administration of criminal
justice purposes that they are administering, and if they are
basing it on the detection or apprehension function, they have
to have an articulable suspicion or a reasonable basis for the
search.
So, again, that was in the context of criminal history, but
mug shots are part of that. And certainly as we are deploying
the system----
Chairman Franken. Well, I understand that the mug shots are
the data base from which they are looking. I am wondering who
they choose to search, I mean, who they choose to take a
picture of, say, to see if they match the data base. That is
what I am asking.
Mr. Pender. Right. The probe photos are photos that they
are searching against the data base. They have to be able to
have that articulable suspicion or reasonable basis for
performing the search. And certainly, again, that is the reason
for going slowly. We have a series of working groups that we
are working with, our State and local partners from the
Advisory Policy Board, as Senator Sessions was talking about,
that were working on it and making sure that the policies are
clear, that we have appropriate training programs in place as
well. Prior to accessing our NCIC system, for example, an
individual is required to have training and a certification
test that is repeated every two years to maintain the current
certification. And we require annual training on security
practices as well.
So if there are appropriate enhancements that we need to
make specific to facial recognition, we are very open to doing
that.
Chairman Franken. OK. Thank you.
Ms. Mithal, my understanding is that the Commission is in
the process of proposing best practices for the commercial use
of facial recognition. I want to urge you to make a very simple
rule one of your best practices; that is, if a company wants to
create a unique faceprint for someone to identify them, they
need to get their permission first. Will the Commission do
that?
Ms. Mithal. Thank you. As I mentioned, the Commission is
considering best practices, and I am certainly sure that that
is one of the issues that they are considering, and I will take
it back to them that you have requested us to consider this.
The other thing I would note is that in our March 2012
privacy report, we talked about the importance of providing
consumers with meaningful choice when their information is
collected. At a minimum, what we think that means is that a
disclosure has to be provided very clearly outside the privacy
policy so that consumers can make informed decisions about
their data.
Chairman Franken. That does not sound like a yes. I do not
think this is a heavy lift, frankly. While Federal law says
nothing about this, two States--Illinois and Texas--both
require a company to get a customer's consent before they
create a biometric for them. So, at least in theory, this is
already the standard that national companies have to meet, and
without objection, I would like to enter these laws into the
record.
[The information appears as a submission for the record.]
Chairman Franken. Could you pass this on to the Commission?
I will give it to you.
Ms. Mithal. We will take a look, and I will pass it on,
yes. Thank you.
Chairman Franken. Thank you. Thank you very much.
Ms. Mithal, when a social network or an app company is
creating a faceprint to identify someone in a photo, what is
the Commission's position on the kind of notice they need to
provide their users? Is the best practice to tell their users,
you know, ``We are going to create a unique faceprint for you''
? Or is it something less than that?
Ms. Mithal. Sir, again, this is exactly the type of issue
the Commission is currently considering, and I cannot get in
front of my Commission on this. They are really considering
these issues. But if you look at what the Commission has said
publicly in terms of our privacy report, we have called for
transparency. And what that means is clear, simple, concise
notices, not in legalese.
Chairman Franken. OK. Clear, simple, and precise.
Ms. Mithal. Concise.
Chairman Franken. Concise. Oh, I am sorry.
Ms. Mithal. Precise would be good, too.
Chairman Franken. Thank you for that validation.
[Laughter.]
Chairman Franken. OK. Well, I want to thank you both for
your testimony and call the second panel. Thank you, Ms. Mithal
and Mr. Pender.
Ms. Mithal. Thank you.
Mr. Pender. Thank you.
Chairman Franken. We have now our second panel, and let me
introduce them while they take their seats.
We have Mr. Brian Martin, who is director of Biometric
Research for MorphoTrust USA, a leading biometrics company that
supplies facial recognition technology to the Federal
Government and many State governments. Mr. Martin has over 15
years of experience in biometrics and has helped develop
numerous biometric technologies involving iris, fingerprint,
and facial recognition. He earned his Ph.D. in physics from the
University of Pittsburgh. I called Mr. Martin to be our star
technical witness who can begin our second panel by explaining
how the technology actually works.
Alessandro Acquisti is an associate professor of
information technology and public policy at Carnegie Mellon
University, where his research focuses on the economics of
privacy. Professor Acquisti is at the helm of not just one but
several pioneering studies evaluating the privacy implications
of facial recognition technology. He has received numerous
awards for his research and expertise on privacy issues.
Professor Acquisti earned a master's and Ph.D. in information
systems from UC-Berkeley and received a master's in economics
from Trinity College, Dublin, and from the London School of
Economics.
Sheriff Larry Amerson, whom Senator Sessions introduced
earlier, is the president of the National Sheriffs' Association
and is also serving in his 18th year as sheriff of Calhoun
County, Alabama, and that is in Anniston as the county seat?
Mr. Amerson. Yes, sir.
Chairman Franken. As part of his mission to modernize
police operations, Sheriff Amerson is overseeing the
implementation of iris and facial recognition in Calhoun County
jails and in the field. Sheriff Amerson has had a long,
successful career in law enforcement. Sheriff Amerson earned
his bachelor's degree in law enforcement from Jacksonville
State University.
Nita Farahany is an associate professor of law at the Duke
University School of Law and is a leading scholar on the
ethical, legal, and social implications of emerging
technologies. She was appointed in 2010 by President Obama to
serve on the Presidential Commission on the Study of Bioethical
Issues. Professor Farahany has written on the application of
the Fourth Amendment to emerging technology. She received her
bachelor's degree from Dartmouth College and a J.D. and Ph.D.
in philosophy from Duke University.
Rob Sherman is the manager of privacy and public policy at
Facebook. He manages policy matters involving privacy,
security, and online trust. Prior to joining Facebook, Mr.
Sherman was an attorney at Covington & Burling, where he
focused his practice on issues relating to privacy and online
security. Mr. Sherman received his law degree from the
University of Michigan and his undergraduate degree from the
University of Maryland.
Jennifer Lynch is a staff attorney at the Electronic
Frontier Foundation, where she focuses on Government
transparency and privacy issues. Ms. Lynch has written and
spoken on biometrics collection, including the Government's use
of facial recognition technology. Before joining EFF, she
served as a clinical teaching fellow with the Samuelson Law,
Technology, and Public Policy Clinic at the UC-Berkeley School
of Law and clerked for Judge A. Howard Matz in the Central
District of California. She received both her undergraduate and
law degrees from UC-Berkeley.
Thank you all for joining us, and your complete written
testimonies will be made part of the record. You each have
approximately five minutes for any opening remarks that you
would like to make. Mr. Martin, please start us off.
STATEMENT OF BRIAN MARTIN, PH.D., DIRECTOR OF BIOMETRIC
RESEARCH, MORPHOTRUST USA, JERSEY CITY, NEW JERSEY
Mr. Martin. Thank you. Good afternoon, Chairman Franken.
Thank you for asking MorphoTrust to testify on the capabilities
of face recognition.
As the director of Biometric Research for MorphoTrust, my
team is responsible for the biometric technologies used by the
U.S. Department of State, the Department of Defense, the FBI,
and numerous motor vehicle/driver's license systems. I am here
today to testify on the state-of-the-art of face recognition.
First, I would like to briefly explain how face recognition
works. Now, face recognition is not new. The idea has been
around for almost half a century. But only in the late 1990s
did these ideas become commercialized. The different approaches
are varied. They can be 2-D, a regular image; they can be 3-D
from a special 3-D scanner. Face recognition can look at the
shape of the face, or it can even look at microscopic features
like your pores and wrinkles on your skin.
In all cases, though, modern face recognition approaches
are vastly more complicated than commonly perceived, where
people say, oh, they are just measuring, you know, the distance
between the eyes and the nose or something.
While there are several different approaches to face
recognition, there are some general steps common to all of them.
The first is what is called face detection, and this is exactly
what your camera is doing when it tries to focus on the face.
It is just trying to see if there is a face in the image.
Another step is called feature registration and extraction,
and this is maybe the more interesting case because this is
where the individualized features of the face are extracted
from an image and stored in a binary format, which you have
called a ``faceprint'' or ``facial template.''
Now, these faceprints are vendor-specific, meaning they are
not very useful outside of the face recognition system. They
contain no more information than what was in the original
image. They do not contain meta data or identity data about the
person. They are just a different representation of what was
already in the image. And they cannot be reverse engineered, so
you cannot regenerate the image from the faceprint.
After you have two or more faceprints, then you can perform
facial matching, and facial matching, in the state of the art,
can be as fast as tens of millions of matches per second on a
modern computer. Typically, the faster you match, the less
accurate the match is. This accuracy has been benchmarked by
the U.S. Government since the early 1990s, and in a recent
report from the National Institute of Standards and Technology
in 2010, they said that the best face recognition algorithms
are over 100 times better than they were a decade ago. So this
means essentially from their report that an algorithm can
determine if two faces belong to the same person 99.7 percent
of the time, while only making a mistake about one in 1,000
times. In fact, face recognition is as good as a human if the
human is not a trained expert.
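[To make the matching step concrete, the sketch below
implements 1:1 verification over faceprints represented as
fixed-length numeric vectors. The cosine scoring, the
128-dimension vectors, and the 0.6 threshold are illustrative
assumptions; real vendors use proprietary templates and tuned
thresholds.]

    import numpy as np

    def compare_faceprints(a, b):
        # Cosine similarity: near 1.0 for the same face, near 0 for strangers.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(probe, enrolled, threshold=0.6):
        # 1:1 verification: is the probe the person it claims to be?
        return compare_faceprints(probe, enrolled) >= threshold

    # Demo with random stand-in faceprints; a real extractor would derive
    # them from face images.
    rng = np.random.default_rng(0)
    enrolled = rng.standard_normal(128)
    same_person = enrolled + 0.2 * rng.standard_normal(128)  # new photo
    stranger = rng.standard_normal(128)
    print(verify(same_person, enrolled))  # True: high similarity
    print(verify(stranger, enrolled))     # False: near-zero similarity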
Now, these accuracy numbers are for a staged or controlled
setting. When you have variable lighting, when the person is
not looking directly at the camera, or when it is a low-
resolution image, then the accuracy does decrease, and that is
an active area of research.
Furthermore, when I quoted this 99 percent number, this is
for verification when you are trying to determine if you are
who you say you are, say, for instance, unlock your phone. Much
more demanding is the application of identification where you
are trying to determine an unknown identity from a gallery of
individuals. So this would be where you are trying to generate
an investigative lead from a mug shot data base.
Identification is more complicated because it is
essentially like performing many verifications. So if you had
to perform a million verifications, then you are going to have
a higher false positive rate because you have more chances to
make a mistake. And that is why with identification
applications, there is almost always a human in the loop, and
this is even the case when you have a photo-tagging feature and
you have to sit there and you actually have to tell that
algorithm, ``Did you make a mistake or not? '' ``Yes, this is
who the photo-tagging algorithm thinks it is.''
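[Mr. Martin's scaling point can be put in numbers. Assuming,
for illustration, a one-in-1,000 false match rate per comparison
and statistically independent comparisons, the chance of at
least one false match grows quickly with gallery size, which is
why identification systems return ranked leads for human review
rather than a single positive identification.]

    def prob_false_match(fmr, gallery_size):
        # Chance of at least one false match across a whole gallery search,
        # assuming each comparison errs independently with probability fmr.
        return 1.0 - (1.0 - fmr) ** gallery_size

    for n in (1, 1_000, 1_000_000):
        print(f"gallery of {n:>9,}: {prob_false_match(0.001, n):.3f}")
    # gallery of         1: 0.001
    # gallery of     1,000: 0.632
    # gallery of 1,000,000: 1.000 (a false match is practically certain)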
So to summarize, and maybe speculate on the future a little
bit, I do not think that the accuracy of face recognition for
good-quality images will continue to improve at the rate that
it has in the last 10 years. However, for the uncontrolled
cases, where you are not looking at the camera, I do think that
over the next couple decades, there will be a substantial
improvement in accuracy to help these forensic type of face
cases.
Thank you for the opportunity to address the Subcommittee.
I look forward to answering any of your questions.
[The prepared statement of Mr. Martin appears as a
submission for the record.]
Chairman Franken. Thank you, Mr. Martin.
Mr. Acquisti.
STATEMENT OF ALESSANDRO ACQUISTI, ASSOCIATE PROFESSOR, HEINZ
COLLEGE AND CYLAB, CARNEGIE MELLON UNIVERSITY, PITTSBURGH,
PENNSYLVANIA
Mr. Acquisti. Thank you, Chairman Franken. It is an honor
to appear before you today. I will discuss four findings from
research on privacy and face recognition.
The first finding is that while early computer algorithms
vastly underperform humans in detecting and recognizing faces,
modern ones have progressed to a point that they can outperform
humans in certain tasks and can be found in consumer
applications. Nowadays, billboards predict the age of
pedestrians, cameras estimate the demographics of crowds in a
bar, and online social networks identify people and tag their
names in photos.
The second finding is that the convergence of face
recognition, online social networks, and data mining will make
it possible to identify people online and offline and infer
sensitive information about them, starting from anonymous
faces, and using only public data.
In one experiment we completed last year, we took anonymous
photos from a popular dating site where people used pseudonyms
to protect their privacy, compared them using face recognition
to public but identified photos from Facebook, and identified
about 10 percent of the anonymous members of the dating site.
In another experiment, we identified about one-third of the
participants, students on a college campus, simply taking
photos of them on a webcam and comparing these photos in real
time to images from Facebook.
In a final experiment, we predicted the interests and
Social Security numbers for some of the participants of the
second experiment, combining face recognition with the
algorithms we had developed in 2009 to predict SSNs from public
data. We also developed a phone application which completes the
process I just described on the mobile device in real time
showing on the device screen the predicted sensitive
information of the target subject overlaid on their face, and
this is a screen shot of the application there.
Social Security numbers are just an example of many
sensitive data it is possible to infer, starting from an
anonymous face and using public data. The results we obtained
are not yet scalable to the entire American population due to
computational costs, false positives, availability of facial
images. But each of these hurdles is being overcome by software
and hardware improvements. In fact, some entities already have
access to more powerful computational tools and larger and more
accurate repositories of data than we do.
In particular, online social networks are accumulating the
largest known data bases of facial images, often tagged or
linked to identified profiles, providing a public connection
between a person's facial biometrics and their real names.
The third finding is that the process through which face
recognition can undermine our notions of privacy and anonymity
has already started, and its consequences will be nuanced and
complex. Your phone will remind you of the name of someone
at a party. However, it will also tell a stalker in a bar where
you live. The hotel will greet you as you arrive in the lobby.
However, also such person may infer your credit score the
moment you enter the dealership and also predict in real time
based on your online posts a psychological profile for you,
and, therefore, nudge you to accept the steepest price for a
car. An agency will be able to find missing children in an
online data base; however, another agency could chill free
speech by identifying via remote, high-definition cameras all
the thousands of participants in a peaceful protest.
The fourth finding is that, depending on which goals
Congress intends to achieve in this area, different approaches
may be considered: price of technologies, more commercial
applications, legislation. However, if privacy and civil
liberties are the concern here, it is not a given, not
guaranteed that industry self-regulatory approaches will
suffice. I say this for two reasons. One reason is that facial
biometric data is particularly valuable. It provides a
permanent, ubiquitous, and invisible means for identification
and tracking online and offline.
Firms that control data bases of facial biometrics will be
able to provide valuable identity recognition services to
others.
Hence, competition for control over the data will be fierce and
will likely come at the cost of individuals' privacy.
The second reason is that recent history in the markets for
personal data suggests that firms will engage in progressively
more invasive applications of face recognition over time.
Current users of face recognition are limited not just by
computational costs but by fear of consumer backlash. These
initial applications that we see, however, could be considered
as ``bridgeheads.'' In a way, they are designed to habituate us
into accepting progressively more expansive services. Consider
the frequency with which companies such as Facebook have engaged
in changes to settings and defaults associated with users'
privacy so as to nudge users into disclosing and sharing more.
Why? Because information is power. In the 21st century, the
wealth of data accumulated about individuals and the staggering
progress of behavioral research in using the data to influence
individual behavior make it so that control over personal
information implies power over the person. As control is
tilting from data subjects to data holders, it is the balance
of power between different entities that is at stake.
Thank you.
[The prepared statement of Mr. Acquisti appears as a
submission for the record.]
Chairman Franken. Thank you, Mr. Acquisti.
Sheriff Amerson, please.
STATEMENT OF LARRY AMERSON, SHERIFF, CALHOUN COUNTY, ALABAMA,
ANNISTON, ALABAMA, ON BEHALF OF THE NATIONAL SHERIFFS'
ASSOCIATION
Mr. Amerson. Mr. Chairman, thank you for inviting me to
testify today on behalf of the National Sheriffs'
Association. Chartered in 1940, the National Sheriffs'
Association is a professional association dedicated to serving
the Office of Sheriff and its affiliates throughout law
enforcement with education, training, and information
resources. NSA represents thousands of sheriffs, their
deputies, and other law enforcement professionals, and
concerned citizens nationwide.
I applaud the Subcommittee for holding this important
hearing on the implications of facial recognition for privacy
and civil liberties. These are critical concerns that
rightfully need to be debated and the rights of innocent
citizens protected from unwarranted interference in their
privacy and everyday lives.
On the other hand, new technologies, especially facial
recognition, already implemented in law enforcement, national
defense, and the fight against terrorism, are a critical tool
in protecting the rights of citizens, in ensuring the accurate
identification of suspects, prisoners, and potential terrorists
while protecting the safety of our citizens and law
enforcement officers.
There is a critical balance between protecting the rights
of law-abiding citizens and providing law enforcement agencies
with the most advanced tools to combat crime, properly identify
suspects, catalogue those incarcerated in prisons and jails,
and defend America from acts of terrorism.
Most importantly, advances in facial recognition technology
over the last 10 years will move us from total reliance on
fingerprinting, where it can take hours and even days to
identify a suspect, fugitive, or person being booked into a
jail, to the immediate identification of those known to have
criminal records or who are wanted by law enforcement. It
will surprise many in the room today to know that there is no
national data base of those incarcerated in America's jails at
any one time. The use of facial recognition to provide instant
identification of those incarcerated or under arrest will
eliminate many problems while protecting innocent civilians and
law enforcement officers.
For instance, utilizing facial recognition in law
enforcement would:
• Interconnect law enforcement and intel
organizations to instantly share vital information with
accurate identification results;
• Establish a national data base of those
incarcerated, past and present, wanted fugitives, felons, and
persons of interest among all law enforcement agencies;
• Allow officers to quickly determine who they are
encountering and provide notification if a suspect is wanted or
a convicted felon;
• Provide a simple, cost-effective, software-based solution
delivered on Windows-based computers with inexpensive, non-
proprietary, off-the-shelf cameras, yielding huge cost
savings;
• Demonstrate new capabilities in alias detection,
fugitive apprehension, and the speed of suspect recognition;
• Ensure correct identification of prisoners being
released and reduce costs associated with administrative
procedures;
• Establish a complete national data base of
incarcerated persons for the first time in U.S. history; no
longer could wanted criminals escape detection and arrest due
to inefficient processes.
While fingerprints take hours or days to analyze, some
advanced facial recognition in use today by U.S. law
enforcement is as accurate as fingerprints, but results are
obtained in seconds, not hours, in identifying criminals and
perpetrators attempting to use false identities and aliases.
It is also important to point out that facial recognition
comes in two general forms, two-dimensional and three-
dimensional. Only All-aspect 3-D Facial systems can protect the
privacy of participants who agree to be enrolled, except in
law enforcement or Homeland Security applications. All-aspect
3-D cannot search on 2-D facial photographs and cannot be
invasive of privacy by design. All-aspect 3-D facial
recognition systems remove skin color and facial hair and,
therefore, have no profiling capability.
Currently, the National Sheriffs' Association, the Bureau
of Prisons, and the United States Marshals Service are all in
support of utilizing this new three-dimensional, holographic
imaging technology to eliminate errors in identification;
detect false identities; and immediately identify
dangerous suspects, fugitives, or terrorists rather than
learning who they are after they are released on traffic
offenses or let go without suspicion because immediate
identification is not possible.
Accidental releases, sometimes of dangerous felons, could
also be eliminated. This technology has been in use for over
eight years in Georgia detention facilities with data bases of
approximately five million inmates without a single erroneous
release.
And just last year, a dangerous murderer was released from
the District of Columbia jail after switching a wristband with
another inmate. This cannot happen with facial recognition.
In closing, the proper utilization of facial recognition
for intelligence or law enforcement uses can protect civil
liberties, save millions of dollars, and instantly identify
fugitives, felons, and dangerous suspects while saving lives.
Thank you, Mr. Chairman. I will be glad to answer any
questions you may have.
[The prepared statement of Mr. Amerson appears as a
submission for the record.]
Chairman Franken. Thank you, Sheriff.
Ms. Farahany.
STATEMENT OF NITA A. FARAHANY, PROFESSOR OF LAW, DUKE LAW
SCHOOL, AND PROFESSOR OF GENOME SCIENCES & POLICY, INSTITUTE FOR
GENOME SCIENCES & POLICY, DUKE UNIVERSITY, DURHAM, NORTH
CAROLINA
Ms. Farahany. Thank you. Chairman Franken and distinguished
Members of the Subcommittee, thank you for the opportunity to
express my views about facial recognition technology and its
implications for privacy and civil liberties.
My fellow witnesses today have canvassed the science behind
facial recognition technology and the myriad of privacy
concerns about its use. Rather than repeat what has already
been said, I will focus my comments on why I believe that law
enforcement use of these technologies is not, in itself, a
Fourth Amendment search, let alone an unreasonable one.
Although the Supreme Court has not yet addressed this issue, as
Senator Franken acknowledged earlier, the doctrine in analogous
cases supports this view.
A novel feature of facial recognition technology is that
the first step of the investigative process--scanning a face of
interest--can be done from a distance and without the awareness
of the individual being scanned. No physical contact,
proximity, or detention of an individual is necessary for law
enforcement to obtain a faceprint.
A faceprint is a form of identifying information that is
the bread and butter of law enforcement: information about the
physical likeness and other descriptive features of a suspect,
which is routine practice for investigators to collect. Except
in extraordinary circumstances, individuals have received only
minimal constitutional protection against law enforcement
collection of their personally identifying information.
The Fourth Amendment guarantees the right of the people to
be secure in their person, houses, papers, and effects against
unreasonable searches and seizures. A Fourth Amendment search
only occurs when the Government intrudes upon a legally
cognizable interest of an individual. This technology may be
used in different ways which may require different Fourth
Amendment analyses. It may be used from afar without a
subject's awareness or during a brief investigative stop based
on reasonable suspicion. Under either approach, I believe that
the facial scanning itself is not a search, let alone an
unreasonable one.
If the police use facial recognition from afar without an
individual's awareness, then no Fourth Amendment search has
occurred. Neither his person nor his effects has been
disturbed, and he lacks any legal source to support a
reasonable expectation of hiding his facial features from
Government view. He has chosen to present his face to the
world, and he must expect that the world, including the police,
may be watching.
Cameras and machines may now be doing the scanning, but for
constitutional purposes, this is no different from a police
officer scanning faces in public places. This has never been
thought to be a Fourth Amendment search. But even if the use of
this technology did constitute a search, it would likely be a
constitutionally reasonable one, consistent with the Fourth
Amendment.
Since the Court primarily uses property rights to inform
Fourth Amendment privacy interests, it measures the
reasonableness of a search based on the physical intrusiveness
of the search rather than the personal indignity that one may
have endured by having their personal information revealed.
Mere observation without any physical intrusion is not
tantamount to a search, and certainly not to an unreasonable
one.
The police might instead choose to use facial scanning
technology during a brief investigative stop, which requires a
slightly different constitutional analysis. Beginning with
Terry v. Ohio, the Court has held that if a police officer has
a reasonable suspicion that somebody has committed, is
committing, or is about to commit a crime, the police may
detain the individual without a warrant. A facial recognition
scan used to achieve the same end is not constitutionally
distinguishable. Such stops are Fourth Amendment searches, and
a person is seized while they are detained. But using facial
scanning during the stop is unlikely to change the Fourth
Amendment reasonableness. The individual privacy interest that
the Court recognizes during stop-and-frisk detentions is the
personal security of that individual and the interest against
interference with his free movement, not the secrecy of his
personal identity. In other words, the Court has not included
secrecy of personally identifying information as a relevant
privacy concern to determine the reasonableness of a stop.
The second step of the process, which is probing a data
base for an identity match, is now a commonplace practice by
law enforcement in other contexts. They regularly check local
and national data bases to find the identity of individuals by
using their license plates, Social Security numbers,
fingerprints, or DNA, and all of this is nothing more than an
automated version of what police have done for centuries:
compare information acquired in the world with information held
at police headquarters looking for a match.
Ultimately, the privacy concern advanced in most debates
regarding facial recognition technology is whether an
individual has a right to secrecy of their personal
information. The Court has never recognized a Fourth Amendment
privacy interest in the mere secrecy of identifying
information. This is likely because intrusions upon possession
and privacy are the core individual interests protected by the
Fourth Amendment. And so from the beginning, the Court has
turned to property law to inform Fourth Amendment interests.
Indeed, when the Court first encountered the modern
investigative technique of wiretapping, which, like facial
recognition, enables investigators to obtain evidence without
physical interference, the Court found no search had occurred.
Now, to be sure, the Court has subsequently extended the
Fourth Amendment beyond property. The Court has held that the
Fourth Amendment applies to tangible and intangible interests
such as private conversations. But even with this expanded view
of individual interests, an individual who is facially scanned
in public cannot reasonably claim that the police have searched
or seized something that he has sought to seclude from public
view. Instead, he must argue that he has a reasonable
expectation of privacy in his personal identity associated with
his facial features. Under current doctrine, courts would
properly reject such a claim.
Most recently, in United States v. Jones, the Court
revisited this analysis. But what remains after Jones is an
incomplete picture of which individual interests beyond real
property interests, if any, the Fourth Amendment protects. The
Jones majority emphasized that trespass upon property and the
Katz expectation-of-privacy framework co-exist under Fourth
Amendment jurisprudence. But under either analysis, without
trespass upon real property or upon information that a person
has sought to hide, there is no legitimate source of law upon
which a reasonable expectation of privacy could be founded.
Again, I thank you for the opportunity to appear before you
today, and I look forward to your questions.
[The prepared statement of Ms. Farahany appears as a
submission for the record.]
Chairman Franken. Thank you, Doctor.
Mr. Sherman.
STATEMENT OF ROB SHERMAN, MANAGER OF PRIVACY AND PUBLIC POLICY,
FACEBOOK, WASHINGTON, D.C.
Mr. Sherman. Chairman Franken, Members of the Subcommittee,
my name is Robert Sherman. I am the manager of privacy and
public policy at Facebook.
Facebook is committed to building innovative tools that
enhance people's online experiences while giving them control
over their personal information. We appreciate the opportunity
to share our views on what the use of facial recognition
technology means for our users.
Today I will describe how we use facial recognition
technology as a part of our photo-sharing product, the
important controls that we offer, and how Facebook safeguards
the data that we use.
At the outset, I want to provide some background on why we
offer photo-sharing features on Facebook. We learned early on
how important photo sharing was to our users when we realized
that people were frequently changing their profile photos to
show friends recent snapshots. In response, we built tools that
allowed people to upload and share photos, and we continue to
build on those tools today.
One component of our photo sharing on Facebook is tagging,
which is the 21st century version of handwriting captions on
the backs of photos to label important events like birthdays or
reunions and the people who participated. Tags promote
transparency and control on Facebook because Facebook lets a
person know when she is tagged. This allows the person included
in the photo to interact with the user who uploaded it or to
take action if she does not like the photo, for example,
removing the tag or requesting that the photo be deleted.
Our Tag Suggestion tool uses facial recognition technology
to automate the process of identifying and, if the user
chooses, tagging her friends in the photo she uploads. Tag
Suggestions work by identifying similarities among photos in
which a person has been tagged. We use this information to
create a template that allows us to offer recommendations about
whom a user should tag when she uploads a photo. The user can
then accept or reject that recommendation.
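[For illustration: a template-based suggestion flow of the kind
described above can be sketched as follows. The names,
structures, and threshold are assumptions made for the example,
not Facebook's actual implementation.]

    import numpy as np

    def build_template(tagged_embeddings: list[np.ndarray]) -> np.ndarray:
        # Summarize the photos a person has been tagged in as a
        # single averaged, normalized face template.
        t = np.mean(tagged_embeddings, axis=0)
        return t / (np.linalg.norm(t) + 1e-12)

    def suggest_tag(face: np.ndarray,
                    friend_templates: dict[str, np.ndarray],
                    threshold: float = 0.7) -> str | None:
        # Search only the uploader's friends (never strangers) and
        # return the best match as a suggestion the user may
        # accept or reject.
        scores = {name: float(np.dot(face, t))
                  for name, t in friend_templates.items()}
        if not scores:
            return None
        best = max(scores, key=scores.get)
        return best if scores[best] >= threshold else None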
Use of our photo-sharing tools continues to grow. In fact,
as you noted, Mr. Chairman, a few months ago we took our Tag
Suggestion feature down to improve its efficiency, and we plan
to restore it soon.
Individual control is the hallmark of Facebook's Tag
Suggestion feature. It includes four important protections.
First, we are transparent about the use of the technology.
Across our site, we describe Tag Suggestions and the controls
that we offer. This includes providing information in our data
use policy, on our Help Center, on our Privacy Settings page,
and on our Facebook blog.
Second, Tag Suggestions only use data people have
voluntarily provided to Facebook and derive information from
that data to automate the process of future tagging. We do not
collect any new information as a part of this process.
Third, Facebook's technology matches only against a person's
friends and does not enable people to identify random strangers.
Fourth, through an easy-to-use privacy setting, Facebook
enables people to prevent the use of their images in tag
suggestions. If a user makes that selection, Facebook will not
include her name when suggesting tags for uploaded photos. And
we will delete the template in which we stored the user's
facial recognition data if one was previously created.
In addition to these controls, we protect facial
recognition data from unauthorized disclosure to third parties,
including to law enforcement. Two aspects of our technology
significantly limit its usefulness to third parties. First, our
templates are encrypted, and they work only with our
proprietary software, so they would be useless to a third
party. Second, our software is designed to search only a
limited set of potential matches, namely, an individual user's
friends, and is not used to identify strangers.
Last, we share our users' private information with law
enforcement only in very limited circumstances and consistent
with our terms of service and applicable law. A dedicated team
of professionals scrutinizes each request for legal sufficiency
and compliance with Facebook's internal requirements. We are
one of a handful of major Internet companies that promote
transparency in this process by publishing our law enforcement
guidelines on our website.
I hope that my testimony has helped the Members of this
Subcommittee understand how Facebook uses facial recognition
technology and, more importantly, the privacy and security
protections that define our implementation. We look forward to
continuing our discussion with Members of Congress about the
important issues raised in today's hearing.
Thank you again for the opportunity to testify, and I look
forward to answering any questions that you have.
[The prepared statement of Mr. Sherman appears as a
submission for the record.]
Chairman Franken. Well, thank you, Mr. Sherman.
Ms. Lynch.
STATEMENT OF JENNIFER LYNCH, STAFF ATTORNEY, ELECTRONIC
FRONTIER FOUNDATION, SAN FRANCISCO, CALIFORNIA
Ms. Lynch. Mr. Chairman, thank you very much for the
invitation to testify on the important topic of facial
recognition today. My name is Jennifer Lynch, and I am an
attorney with the Electronic Frontier Foundation in San
Francisco. We are a nonprofit, and for over 20 years, we have
been focused on protecting privacy and defending civil
liberties in new technology.
Today, and in my written testimony, I would like to address
the implications of government and private sector use of facial
recognition on privacy and civil liberties and on the laws that
do or do not apply.
The collection of biometrics, including facial recognition,
may seem like science fiction or something out of a movie like
``Minority Report,'' but it is already a well-established part
of our lives in the United States. The FBI and the DHS have the
largest biometrics data bases in the world, with over 100
million records each, and DHS alone collects 300,000
fingerprints every day. Both of these and other agencies in the
Federal Government are working quickly to add extensive facial
recognition capabilities to these data bases.
The scope of Government-driven biometrics data collection
is well matched by private sector collection. Facebook, for
example, uses facial recognition by default to scan all images
uploaded to its site, and its 900 million members upload
300,000 photos every day. Face.com, which is the company that
developed Facebook's facial recognition system and was recently
acquired by Facebook, stated in March that it had indexed 31
billion face images. Other companies, from Google and Apple to
smartphone app developers, also provide facial recognition
services to their customers, and biometrics are used by private
companies to track employee time, to prevent unauthorized
access to computers or facilities or even the gym. And private
companies like Morpho, represented on the panel here today,
and others are building out large facial recognition
systems for governments and agencies around the world.
For example, Morpho has developed facial recognition
technology for 41 of the 50 DMVs in the United States and for
the FBI. And companies like this often retain access to the
data that is collected.
So facial recognition is here to stay, and yet at the same
time many Americans do not even realize that they are already
in a facial recognition data base.
Facial recognition technology, like other biometrics
programs that collect, store, share, and combine sensitive and
unique data, poses critical threats to privacy and to civil
liberties. Biometrics in general are immutable, readily
accessible, individuating, and can be highly prejudicial. And
facial recognition takes the risks inherent in other biometrics
to a new level. Americans cannot take precautions to prevent
the collection of their image. We walk around in public. Our
image is always exposed to the public. Facial recognition
allows for covert, remote, and mass capture and identification
of images, and the photos that may end up in a data base
include not just a person's face but also what she is wearing,
what she might be carrying, and who she is associated with.
This creates threats to free expression and to freedom of
association that are not evident in other biometrics.
Americans should also be concerned about the extensive
sharing of biometric data that is already occurring at the
government- and private-sector level. Data accumulation and
sharing can be good for identifying people, for verifying
identities, and for solving crimes. But it can also create
social stigma when people end up in criminal data bases and
their image is searched constantly. And it can perpetuate
racial and ethnic profiling and inaccuracies throughout the
system. It can also allow for Government tracking and
surveillance on a level that has not before been possible.
Americans cannot participate in society today without
exposing their faces to public view. And, similarly, connecting
with friends, family, and the broader world through social
media has quickly become a daily--and many would say
necessary--experience for Americans of all ages. Though face
recognition implicates important First and Fourth Amendment
values, it is unclear whether the Constitution would protect
against the challenges it presents. Without legal protections
in place, it could be relatively easy for the government or
private companies to amass a data base of images on all
Americans. This presents opportunities for Congress to develop
legislation to protect Americans. The Constitution creates a
baseline, but Congress can and has legislated significant
additional privacy protections. As I discuss in more detail in
my written testimony, Congress could use statutes like the
Wiretap Act or the Video Privacy Protection Act as models for
this legislation.
Given that facial recognition and the accompanying privacy
concerns are not going away, it is imperative that Congress and
the rest of the United States act now to limit unnecessary
biometrics collection, to instill proper protections on data
collection, transfer, and search, to ensure accountability, to
mandate independent oversight, to require appropriate legal
process before government collection, and to define clear rules
for data sharing at all levels. All of these are necessary to
preserve the democratic and constitutional values that are
bedrock to American society.
Thank you once again for the invitation to testify today. I
look forward to your questions.
[The prepared statement of Ms. Lynch appears as a
submission for the record.]
Chairman Franken. Thank you all for your testimony.
Just for the sake of the record, I want to clarify that
Facebook users upload 300 million photos to the site a day, not
300,000. I will add a document to the record to that effect. I
would not want to underestimate the power of Facebook.
[The information appears as a submission for the record.]
Chairman Franken. Professor Acquisti, one of the things I
think is so special about your work is that it really shows us
how a face can be a real conduit between your online world and
your offline world in a way that other biometrics are not. Can
you tell us why facial recognition technology is so sensitive
and how it compares to taking someone's fingerprint and
analyzing that?
Mr. Acquisti. Senator, I believe facial biometrics are a
more powerful and sensitive biometric than fingerprints. Not
only are they permanent--starting in childhood your face
changes, but computers are learning to predict these
changes--but your face can be changed, as you mentioned
earlier, only at very great cost. Also, this biometric can be
captured remotely. In fact, with a gigapixel camera, a very
remote shot can be sufficient to make a good, effective
faceprint of someone's face. Remote capture means that this
happens without the person's consent or even knowledge.
Also, the technology to capture facial images and do
matching is becoming ubiquitous. Your phone probably can do it,
my phone, iPad, and so forth.
Also, unlike fingerprints, which are not usually publicly
available online, facial data is, as our experiment showed and
studies by others have shown, widely available online.
And, finally, as you mentioned, a face is truly the conduit
between your different personas: who you are on the street, in
real life; who you are online, perhaps on a dating site; and
who you are on a social network. And the face, therefore,
allows these different sides of your life that you wanted to
keep, perhaps, compartmentalized to be connected.
Plus there is also the issue of the sensitive inferences one
can make starting from a face, which is perhaps another story,
but it is related to this topic as well.
Chairman Franken. Thank you.
Mr. Sherman, you have heard from almost everyone else at
this hearing that facial recognition technology is extremely
powerful and extremely sensitive. Why doesn't Facebook turn its
facial recognition feature off by default and give its users
the choice to turn it on?
Mr. Sherman. Well, Senator, I think you are right to say
that, like all of the other information that we store about our
users, it is important that we take appropriate steps to
protect information. We take that responsibility very
seriously. And in terms of implementing choice throughout our
site, we do that in a lot of ways, using a number of different
mechanisms.
As you point out, with regard to the tag suggestion feature
specifically, it is turned on by default, and we give people
the opportunity to go in and disable it if they do not want to
use it.
The reason for that in part is we think that is the
appropriate choice because Facebook itself is an opt-in
experience. People choose to be on Facebook because they want
to share with each other. Beyond that, tag suggestions are only
used in the context of an opt-in friend relationship on
Facebook, which means that you would not be suggested to
somebody as a potential tag for a photo unless both parties to
the relationship had already decided to communicate with one
another on Facebook, had already seen each other's photos. So
we are actually not exposing any additional information to
anybody as a part of this process.
And so given those things and the fact that we do a lot to
be transparent and to let people know about the feature, we
think that it is the right choice to let people who are
uncomfortable with it decide to opt out.
Chairman Franken. I understand what you are saying. We are
just going to have to disagree on this a little bit. I just
think that this information is so sensitive that it is the kind
of thing that users should have to consciously opt themselves
into. I will note that Facebook's competitor Google leaves
their facial recognition feature off by default on its social
network and then lets users opt into it. But I am worried about
how Facebook handles the choices that it does give its users
about this technology.
Mr. Sherman, on page six of your written testimony, you
write that, ``Through an easy-to-use privacy setting, people
can choose whether we will use our facial recognition
technology to suggest that their friends tag them in photos.''
This is the screen that Facebook users get when they go to
their privacy settings to find out about tag suggestions.
Nowhere on this screen or on the screen that you get when you
click ``Learn More'' do you see the words ``facial
recognition'' or anything that describes facial recognition.
Those words are elsewhere in your Help Center, but right now
you have to go through six different screens to get there. I am
not sure that is easy to use.
How can users make an informed decision about facial
recognition in their privacy settings if you do not actually
tell them in their privacy settings that you are using facial
recognition?
Mr. Sherman. Well, the screen shot that you have displayed
does not use the words ``facial recognition.'' I believe that
the ``Learn More'' link at the bottom leads to the page in our
Help Center. We have a series of frequently asked questions
that we provide to users that explains in detail how----
Chairman Franken. This is the page that it links to.
[Laughter.]
Chairman Franken. And nowhere does it talk about a facial
recognition page, right?
Mr. Sherman. I have not done that, so I do not know that----
Chairman Franken. You have not done that?
Mr. Sherman. I have done that. I did not create the visual,
so I do not know that, but I can tell you that----
Chairman Franken. What haven't you done?
Mr. Sherman. I am sorry. I just have not seen the visual. I
think the page that you are looking at is one of the pages in
our Help Center that provides information about how tagging
works on Facebook. The Help Center content that you are talking
about, which I think is available from that page, does describe
facial recognition, uses the words ``facial recognition''
specifically, and provides some detail about the way in which
the templates that we use, the files that include the facial
recognition data are stored.
Chairman Franken. It is my understanding, am I right, that
that is six clicks away?
Mr. Sherman. I am not sure about the number. I do not think
that is right, but I am not sure.
Chairman Franken. OK. You are head of this at Facebook?
Mr. Sherman. I am one of many people who work on privacy at
Facebook.
Chairman Franken. What is your title?
Mr. Sherman. I am the manager of privacy and public policy.
Chairman Franken. Thank you, Mr. Sherman.
Mr. Sherman. Thank you.
Chairman Franken. Ms. Lynch, you are a privacy and civil
liberties lawyer. It is your job to interpret the law in a way
that protects privacy and civil liberties. Can you summarize
for us in a few sentences what concrete legal protections there
are with respect to the use of facial recognition technology by
the government and by the private sector?
Ms. Lynch. Well, I think at the Federal level it is pretty
clear that there are no specific laws that regulate facial
recognition or that regulate the collection of images to be put
into a facial recognition data base, whether from the
government or the private sector.
That said, the Constitution creates a baseline. I think we
have seen in the U.S. v. Jones case that was decided in January
that the Supreme Court and several other courts are concerned
about collection of information on us when we are in public.
And, also, the FTC, of course, has some ability to regulate
companies that are engaged in deceptive or unfair trade
practices. And then there are two State laws, which you
mentioned earlier, in Illinois and Texas, that would govern the
collection of biometrics on citizens within those States.
Chairman Franken. Thank you.
Right now, I know Senator Blumenthal has been here for a
while. Since I am chairing this, I am going to be here. I want
to be conscious of your time, so why don't I turn the
questioning over to you, Senator?
Senator Blumenthal. Thank you, Mr. Chairman.
Mr. Sherman, let me first thank Facebook for being so
cooperative in the Password Protection Act that I proposed,
with the support of a number of other Members of the Judiciary
Committee, that prohibits employers from compelling passwords
and other such information that provides access to private
personal accounts from being divulged in the course of
employment, whether it is applications for employment or
prospective employment or existing employment.
Why does Facebook not require or not permit the kind of
opt-in procedure that Senator Franken mentioned?
Mr. Sherman. Well, we do not provide--we have implemented
tag suggestions in a way that does not require people to opt in
for a number of reasons, including the fact that, as I
mentioned, Facebook is an opt-in service and the fact that we
provide tag suggestions only in the context of existing
friendships.
I think we also work very hard to be transparent with
people about how the feature works. We provide information
about the tool on a lot of different places on the site. And we
also think that there are benefits both in terms of social
engagement and also in terms of privacy associated with photo
tagging. And we think that making it easier for people to tag
people on Facebook, again, people that they already know and
already are in relationships with, promotes those benefits. It
gives people the ability to know that they are in photos that
have been posted on Facebook and to exercise control over them
if they want to do so.
Senator Blumenthal. Does Facebook share facial recognition
data with any third parties?
Mr. Sherman. We do not.
Senator Blumenthal. Is there anything in your guidelines or
company practices that precludes it?
Mr. Sherman. As I mentioned, we publish on our website our
law enforcement guidelines, which I think may be the
circumstance that you are talking about, and with regard to
that information, first, we--as far as I know, we have never
received a request from law enforcement for the information
that you are talking about. I think that reflects the fact that
the templates that we have would not be useful outside of our
service. They just cannot be used by law enforcement. I think
there are other technologies that law enforcement might use.
And I think beyond that there is a very rigorous standard that
we describe in our policies under which we would provide any
non-public personal information to law enforcement.
Senator Blumenthal. And what about going beyond law
enforcement? Is there anything in your guidelines or practices
that precludes sharing with non-law enforcement?
Mr. Sherman. I do not know whether we have said
specifically with regard to facial recognition information, but
we have a data use policy which we publish on our website which
provides significant detail about the restrictions, and the
general standard is that we do not disclose personal
information to third parties without our users' consent.
Senator Blumenthal. Does Facebook allow third-party apps to
collect facial recognition data from users?
Mr. Sherman. Just to make sure I understand your question,
Senator, the facial recognition data that is in our data bases,
the templates?
Senator Blumenthal. Correct.
Mr. Sherman. No, we do not provide those to any apps.
Senator Blumenthal. And just assume that someone signs up
for Facebook--you mentioned that it is, obviously, voluntary--
and he or she does not want to have facial data stored,
collected, used by Facebook. What are the options available to
that person?
Mr. Sherman. So if a person signs up for Facebook and does
not want facial recognition data to be collected or used about
that person, the person can go to their Privacy Center, click
on Tagging, and then the option to turn off the tag suggestion
feature is there. If they do that, two things will happen: one,
we will not suggest them to any of their friends when their
friends upload photos; and, two, if a facial recognition
template was created, it will be deleted. In the circumstance
that I think you are describing, we probably would not have a
facial recognition template in the first instance.
If a user wanted to allow the use of the feature but to
exercise other kinds of control, we offer that as well. For
example, the user can be notified when he or she is tagged, can
remove the tag from the photo. If he or she does that, then
that removes that from the template that we use to power our
tag suggestions feature.
And, finally, the user can choose to exercise control
before any photo in which he or she is tagged shows up on his
or her timeline.
Senator Blumenthal. Now that Facebook is considering
allowing children under 13 to sign up for Facebook accounts,
which obviously implicates a number of privacy concerns of a
different nature and magnitude, does Facebook have any new
policies or plans to develop new policies and what will those
policies be regarding facial recognition technology on pictures
of children who use Facebook?
Mr. Sherman. Well, Senator, as you know, our current policy
is that children under 13 are not allowed on Facebook, and we
have a number of technical and procedural measures that we put
in place to try to prevent children under 13 from gaining
access to our service in violation of that policy.
There have been some studies that have come out recently
that have suggested that children, despite our efforts, are
gaining access to Facebook, and in many cases with the
assistance of their parents. And so one of the things that has
been suggested is that we provide tools for parents to manage
their children's access to Facebook if they do get on.
We are in the process of thinking about those. Those are
really important issues, and protecting children and all of our
users is a high priority at Facebook. And we are thinking
through the right way to manage those questions. So we have not
made any final decision about what we would do, if anything,
about changing our under-13 policy.
What I can tell you is we do implement the tag suggestion
feature in a slightly different way for children who are over--
for teenagers, excuse me, who are over 13 but under 17. In
those cases, the tag suggestion feature is off by default, and
the teenagers can turn it on if they want to do so, but it is
not on by default.
Senator Blumenthal. Wouldn't it make sense to simply
preclude images of children under 13 from being collected or
stored in any way?
Mr. Sherman. Well, I mean, I think certainly there are
difficult questions, and the one that you raise is one of a
large number of questions that we would have to confront if we
decided to allow children under 13. It is something certainly
that we would consider actively, but until we make a decision
about changing our policy, I think it is premature to say
exactly how we would implement it.
Senator Blumenthal. Well, I am going to ask that Facebook
commit to not collecting or storing facial recognition
data for anyone under 13 if you decide to go ahead. I think it
is a matter of public policy and public safety that Facebook
adopt that kind of policy if you decide to go ahead.
Mr. Sherman. OK, thank you. We absolutely appreciate the
feedback, and if we go in that direction, that is something we
will certainly consider.
Senator Blumenthal. Thank you.
Thank you, Mr. Chairman.
Chairman Franken. Thank you, Senator Blumenthal.
I just want to also correct the record that MorphoTrust has
32 driver's license contracts that include facial recognition,
not 40.
Professor Acquisti, a month or two ago, a company called
Face.com released an iPhone app that allowed you to point your
iPhone at someone and have a little box pop up above that
person's face on your screen that told you their name. The app
was only supposed to work on your friends, but soon after the
release of this app, a well-respected security researcher who
has testified before this Subcommittee, Ashkan Soltani,
revealed that the app could easily be hacked in a way that
would appear to allow it to identify strangers.
Facebook has since purchased Face.com and shut down this
app. But were you familiar with this app and the vulnerability
that it created or had? What did it tell you about the state of
privacy when it comes to facial recognition technology? Is this
something we should be thinking about?
Mr. Acquisti. Senator, yes, I have been following the news
and the research about Klik, this app. I will make a few
points.
One is that this app shows that the studies we presented
last year are not just theoretical experiments. They happen in
reality. The reality of mobile, real-time face recognition is
coming much faster than some people may have believed.
A second point is that the vulnerability Ashkan Soltani
found shows that there are inherent risks in this technology in
that it clusters and aggregates very sensitive information
which becomes a desirable target for hackers and third parties.
Soltani was able, through the vulnerability he discovered, to
get access to non-public photos of individuals as well as to
private data of other users, which means that conceivably he
could have used these additional photos for face recognition
not just of his own friends but friends of friends and many
other people in the network.
Which leads me to the third point. Currently, the
limitations in this app come mostly from two directions. One is
computational cost. In experiments we did, we were working on
data bases of hundreds of thousands of images; therefore, we
could do a match in real time. If we had tried to do it against
300 million Americans or, in fact, 90 billion photos, it would
take hours and hours and hours. However, this limit is
transient; it is not systemic in the sense that cloud computing
clusters are getting faster and faster. Therefore, we cannot
guarantee that what is not possible to do now, extrapolating
our results nationwide to the entire population, will not be
possible five years out.
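[For illustration: the computational limit described here is
the linear scaling of brute-force search. The throughput figure
below is an assumed placeholder, not a benchmark.]

    COMPARISONS_PER_SEC = 1_000_000  # assumed single-node throughput

    # Campus-scale gallery, U.S. population, web-scale photo counts:
    for gallery_size in (100_000, 300_000_000, 90_000_000_000):
        seconds = gallery_size / COMPARISONS_PER_SEC
        print(f"{gallery_size:>14,} faces -> {seconds:>10,.1f} s per probe")
    # Doubling cluster throughput halves these times, which is why
    # the witness calls the limit transient rather than systemic.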
The second limitation is, as I mentioned in my testimony,
that there is a sort of self-restraint among the providers of
these services, which can be found in statements such as, ``Don't
worry. This only works with your friends. Only your friends
will be able to tag you.'' Well, this is now. There is no
guarantee that a few years from now it will be friends of
friends or some years later it will be anyone in the network.
In fact, the history of social media and online social networks
in general shows that there is this progressive nudging of
users toward more and more disclosure. So this is to me one of
the concerns we have in this area.
Chairman Franken. Well, then, I will turn to Mr. Martin. I
am going to try to get everybody in here. We are really talking
about how fast this technology is improving, and that is sort
of what I was just asking Mr. Acquisti. What are we
approaching? What kind of world are we approaching in terms of
how quickly and reliably this technology can identify unknown
individuals walking down a city street? I know we are not quite
there yet, but tell me how fast this technology is improving
and how far we are from that world.
Mr. Martin. There is not a black-and-white answer to this.
So certainly, today, if you have a small data base of
individuals, a few thousand or even tens of thousands, and you
had a controlled situation where somebody was walking through a
metal detector but still they did not know the camera was on
them, then you could reliably do identification on that small
data base, say if you had a watchlist of criminals or
terrorists or something.
In the case where you now expand the data base to the size
of multiple millions and you are just shooting a camera outside
the window down the street, you cannot reliably do that for a
large data base. What you could do is, for instance, have some
humans that look at the results, and if you only were looking
for a few people, not millions of people, then you could shoot
something out the window and probably try to find a suspect.
But certainly the technology is not there to do that on a large
scale with 300 million people or a billion people. And even if
you have more processors and it is faster, I do not think you
are going to be there in the next several years.
Chairman Franken. What about the scenario of going into--a
guy goes into a bar, takes a picture of a woman, wants to stalk
her, can find out where she lives?
Mr. Martin. Some of the arguments here were that that is a
concern that you can do something like that, and I think the
only way it would be viable today is that you would need some
additional information. Like you would have to know that she is
a friend of somebody on Facebook and you are a friend with that
person and you have access to see who their friends are. Then
potentially you could look at images off of the Internet and
link up that extra metadata that is on her profile with that
picture and find out that information.
But even just from the science side of it, taking a picture
in the bar where it is dark and the person is not looking at
your camera unless you ask them for a good picture, it is
technically very hard even to do the face recognition matching,
apart from the fact that you need to have all this linking
information to get it to work. So it is not easy.
Chairman Franken. Sometimes you would say, ``Hey''----
Mr. Martin. ``Can I get a picture of you? '' Right.
Chairman Franken. A flash, and there it is.
Mr. Martin. Right. So if you did that, though, then the
question is: What is the data base that you are going to search
against?
Chairman Franken. I just want to ask this with Mr. Acquisti
and Mr. Sherman. Mr. Acquisti said that the social networks--
the privacy policy has sort of loosened in a way. What did you
mean by that in terms of--let us just get a little dialogue
maybe between the two of you just on this. Has Facebook done
that? Have they loosened their privacy policies? You are
nodding, Ms. Farahany, so--I just go to whoever is nodding.
That is my role as Chairman.
[Laughter.]
Chairman Franken. If you want to get called on, nod.
Ms. Farahany. I am happy to nod and be called on. I think
Facebook and other social media sites are changing our
expectations of privacy. So I think part of the reason why the
Fourth Amendment analysis is useful here is that it is tied to
what does society expect to be able to keep private. And in
today's world, we are moving toward much greater transparency.
As I have been listening to the conversation, it does not seem
like it is facial recognition itself that anybody is afraid of.
It is linking it to other information that people are
frightened by. And I think that is right, which is, there is
nothing inherently frightening about having your face seen. We
have it seen in public all the time. We do not try to hide it
from view. It is the aggregation of data that frightens people.
And so what is it, if anything, we should be doing about
aggregation of data? Well, Congress has already taken a number
of initiatives to keep some types of personal information
private, like your health information, financial transactions,
your genetic information for certain types of uses through the
Genetic Information Nondiscrimination Act. But we do not stop
the flow of information. We say there are certain applications
of the information which are limited or impermissible. And I
think there is nothing about for me personally--and this may be
because, you know, I am a user of Facebook and somebody who is
comfortable with greater transparency. There is nothing
frightening to me about somebody having a photograph taken of
me or even going into every store or every place on the street
and having a photograph taken of me. It is the ability to make
a complete dossier about me and know a lot of other
information.
And so if there is something about the use and application
that we are frightened about, I think that is an appropriate
place for Congress to focus very targeted interest, but it may
not be facial recognition technology it should be focusing on
then. It is the act of data aggregation itself and who can
aggregate data, for what purpose, and to whom they can package
and sell it.
Chairman Franken. OK. Now, you are nodding, so that means
you are going to be called on.
Mr. Acquisti. I was nodding, Senator. In my written
testimony, I made a short list of examples where Facebook
indeed changed something--settings, defaults--to unilaterally
create more disclosure or more sharing. The examples include
Facebook News Feed in 2006; Tagging in 2009; changes in privacy
settings in early 2010; changing of the cache time limits in
2010--that refers to how long third-party developers can keep
your data; the introduction of Facebook Places in 2010, which
allows others to tag you when you go in a certain location; the
switch to the ``Timeline'' in early 2012, initially voluntary,
then compulsory; and, more recently, the switching of users to
Facebook email addresses rather than other providers'. So there is
an extensive list of examples showing this trend.
Chairman Franken. How do you respond to that?
Mr. Sherman. Well, I think the examples that Professor
Acquisti is offering are examples of ways in which we have
changed our service, and I think you would want Facebook to
innovate, you would want Facebook to continue to offer new and
better products to our users, and that is something that we try
to do every day. Anytime we make any change to our service,
including the changes that Professor Acquisti referred to, we
have a robust privacy process that includes professionals from
all across our organization who review those changes to make
sure that they are consistent with the commitments that we have
made to our users and that they will help us maintain the trust
of our users, because, after all, if people do not trust us,
then they will not use our service, and that is something we
very much want people to do. And I think if we did make a
change of any sort--and I think in the instances that he has
described--we let our users know about that and give them the
ability to make choices about them.
Chairman Franken. OK. And did it involve information
retrospectively? In other words, did it involve loosening the
privacy on information they had already put in there that they
did not know would--I am saying this out of ignorance here. I
am just asking.
Mr. Sherman. There may be instances where we would change a
default, so for new people who come onto the site, things might
work in a slightly different way, and we would be very clear
with them about how that works. But we have committed to the
FTC that when we have information that we already have that is
covered by a privacy setting, we will not disclose it in a way
that materially exceeds the privacy setting after that has been
done.
Chairman Franken. OK. Thank you.
I want to go to Ms. Lynch in kind of a final question, but
I have not talked to the sheriff yet, and I want to thank you
for being with us. I know that right now Calhoun County is
about to roll out a facial recognition system for the field. If
your deputy pulls someone over and that person refuses to
identify him- or herself, this system will allow you to see if
they are a wanted criminal or someone with an arrest record.
Now, I know that the data base of photos you are using for
this field system is still going to be a data base of mug shots
from arrests.
Mr. Amerson. Right.
Chairman Franken. It is not going to be the data base from
the Department of Motor Vehicles. Can you tell us why you
decided to stick with the criminal data base and not use a
bigger data base like the DMV's?
Mr. Amerson. I think the key is for us to focus on the
people that are of interest to us. Ordinary, honest people
going about their daily business are not of interest to us. Our
interests are people who are committing crimes, people who are
wanted for questioning about crimes. There would have to be a
very certain degree of information available for us to do
that. But, again, the key to us is locating wanted criminals so
that we can locate and arrest them and take them off the
street.
Chairman Franken. Thank you.
Ms. Lynch, if Congress were to pass a law governing law
enforcement use of facial recognition technology, what are the
two or three protections you think need to be included?
Ms. Lynch. Well, I think first we have to look at how law
enforcement is getting the data. So law enforcement is
currently getting data in general in two different ways. One is
directly, so let us say they are bringing a suspected criminal
into a police department and fingerprinting them, or they are
collecting an image on the street. And then the second way that
law enforcement gets data is from a private company or a third
party--bank records or data from Facebook, submitting a warrant
to Facebook. And I think in both of those situations, we would
like to see a warrant based on probable cause to get access to
the data.
Facial recognition data and faceprints and photographs are
pretty sensitive data, and even though we do share our
faces with the public and we share our images with third
parties, there has been a lot of significant research done to
show that people still have an expectation of privacy in this
information. Even though we are sharing it with our friends and
our family and our networks, we are not necessarily expecting
that that data should be shared with the Government. And I
think based on that, we do have a reasonable expectation of
privacy in the data that would justify a warrant standard. So
that is the first thing.
I think the second thing that I would like to see is that
there would be some data minimization requirements put in
place. This could be minimization of how much data the
government collects, so that instead of getting 10 pictures of
a person, or crowd photos that include a person, it is limited
to mug shots, like the sheriff said. So that is one
way of minimizing the data collection. Another is if the
government is collecting crowd photo data for an individual
investigation, that that crowd photo be deleted once the
investigation is concluded, or that other faces in the crowd be
scrubbed so that they are not identifiable. So that is the
second.
And then the third thing that I would like to see is that
data gathered for one purpose is not combined with data
gathered for another. So, for example, right now the FBI has
two separate parts to its fingerprint data base. It has the
records collected for civil purposes, like employment. If you
are a Federal employee, if you are a lawyer in California, or
if you are applying for a job working with children, your
fingerprints are collected and put in the FBI's civil
fingerprint data base. And that is kept separate from the
criminal data base, where all of the fingerprints of anybody
arrested in the United States go. And, currently, although
those are kept separate, the FBI is planning to incorporate a
master name system that would allow searching of both data
bases at the same time, and I think this raises a lot of
privacy and civil liberties implications that we have not
discussed. And even though we are talking about fingerprints
here, when the FBI incorporates facial recognition into its
data base--and it is supposed to do that by 2014--it will be
searching facial recognition-ready photographs as well.
Chairman Franken. Thank you.
I have a note here that Professor Farahany has a plane to
catch. Is that correct?
Ms. Farahany. My flight is at seven.
Chairman Franken. I am sorry?
Ms. Farahany. I said my flight is at seven.
Chairman Franken. Let us see. It is rush hour. Is it
National or Dulles? Dulles.
[Laughter.]
Chairman Franken. Are you checking any bags?
[Laughter.]
Chairman Franken. OK. Well, I will ask my last question,
and then you can get out of here.
Mr. Sherman, once you generate a faceprint for somebody,
you can use it down the road in countless ways, even though you
might not do it now. You could. I would like for you to tell us
on the record how Facebook will and will not use its faceprints
going forward. We did have the matter of some changes in
policy. For example, can you assure us that Facebook will not
share or sell users' faceprints, along with the software needed
to use them, to third parties?
Mr. Sherman. Well, Senator Franken, I think it is difficult
to know in the future what Facebook will look like five or 10
years down the road, and so it is hard to respond to that
hypothetical. What I can tell you is that we have a robust
process, as I have described, to vet any changes that we would
make along those lines. We also have relationships with the
Federal Trade Commission, the Irish Data Protection
Commissioner, which regulates our Irish affiliate, and consumer
groups like the Electronic Frontier Foundation. We talk with
them regularly about changes that we are making or are planning
to make. I think if we were to make a change that would be
concerning, those are certainly groups that would express
concern, and we obviously would be transparent with any change
with our users.
Chairman Franken. Well, I think that is a fair answer. Your
company has every right not to lock itself into future business
decisions and to keep your options open. But perhaps that is
why Congress should be looking at this and considering whether
we need to put in place protections so that users' faceprints
are never shared or sold without their explicit permission, for
example.
Well, I want to thank you all for joining us. Ms. Farahany,
you--you are all permitted to bolt.
[Laughter.]
Chairman Franken. But I want to thank you and, again, your
complete written testimonies will be made part of the record.
In closing, I want to thank Ranking Member Coburn, and I
want to thank each of the witnesses who appeared with us today.
I will add a statement from EPIC to the record.
[The statement appears as a submission for the record.]
Chairman Franken. We are adjourned. Thank you. Thank you
all.
[Whereupon, at 4:35 p.m., the Subcommittee was adjourned.]
[Questions and answers and submissions for the record
follow.]
A P P E N D I X
Additional Material Submitted for the Record
Witness List
[GRAPHIC] [TIFF OMITTED] 86599.001-86599.002
Prepared Statements of Witnesses
[GRAPHIC] [TIFF OMITTED] 86599.003-86599.086
Subcommittee Chairman Prepared Statement
[GRAPHIC] [TIFF OMITTED] 86599.087-86599.090
Questions for Witnesses from Senator Al Franken
[GRAPHIC] [TIFF OMITTED] 86599.091-86599.096
Questions and Answers
[GRAPHIC] [TIFF OMITTED] 86599.097-86599.112
Miscellaneous Submissions for the Record
[GRAPHIC] [TIFF OMITTED] 86599.113-86599.236
Submissions for the Record Not Printed Due to Voluminous Nature,
Previously Printed by an Agency of the Federal Government, or Other
Criteria Determined by the Committee, List of Material and Links Can Be
Found Below:
EPIC Comments--January 31, 2012:
http://www.ftc.gov/os/comments/facialrecognitiontechnology/00083-0982624.pdf
National Institute of Justice (NIJ), William A. Ford,
Director, State of Research, Development and Evaluation:
https://www.eff.org/sites/default/files/ford-State-of-Research-Development-and-Evaluation-at-NIJ.pdf#page=17
Farahany, Nita A., Testimony Attachment--Pennsylvania Law
Review:
http://www.pennumbra.com/issues/pdfs/160-5/Farahany.pdf
[GRAPHIC] [TIFF OMITTED] 86599.237