Fig. 1: Sample advertisement with in-ad privacy notice

With the launch of our Ads Preferences Manager (www.google.com/ads/preferences), Google became the first major industry player to empower users to review and edit the interest categories we use to target ads. The Ads Preferences Manager enables a user to see the interest categories Google associates with the cookie stored on her browser, to add interest categories that are relevant to her, and to delete any interest categories that do not apply or that she does not wish to be associated with. I should also clarify that Google does not serve interest-based ads based on sensitive interest categories such as health status or categories relating to children under 13.

The Ads Preferences Manager also permits users to opt out of interest-based ads altogether. Google implements this opt-out preference by setting an opt-out cookie that has the text ``OPTOUT'' where a unique cookie ID would otherwise be set. We have also developed tools to make our opt-out cookie permanent, even when users clear other cookies from their browser (see www.google.com/ads/preferences/plugin). We are encouraged that others are using the open-source code for this plugin, released by Google, to create their own persistent opt-out tools.
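To make the mechanism concrete, here is a minimal Python sketch of the two cookie states just described: a targeting cookie carrying a unique ID, and an opt-out cookie carrying the literal text ``OPTOUT'' in its place. The cookie name and domain are illustrative assumptions, not the actual values Google's ad systems use.

    from http import cookies

    # Sketch only: the cookie name and domain here are hypothetical, not Google's.

    def make_targeting_cookie(unique_id: str) -> str:
        """Normal case: the ad network sets a unique ID it can recognize later."""
        c = cookies.SimpleCookie()
        c["id"] = unique_id
        c["id"]["domain"] = ".example-adnetwork.com"
        c["id"]["path"] = "/"
        return c["id"].OutputString()

    def make_opt_out_cookie() -> str:
        """Opt-out case: the literal text OPTOUT replaces the unique ID, so
        the cookie carries no identifier that could single out a browser."""
        c = cookies.SimpleCookie()
        c["id"] = "OPTOUT"
        c["id"]["domain"] = ".example-adnetwork.com"
        c["id"]["path"] = "/"
        return c["id"].OutputString()

    print(make_targeting_cookie("22a5f3e9c1d0"))  # id=22a5f3e9c1d0; Domain=.example-adnetwork.com; Path=/
    print(make_opt_out_cookie())                  # id=OPTOUT; Domain=.example-adnetwork.com; Path=/

Because the opt-out cookie carries no unique value, it cannot be used to recognize a browser; the plugin mentioned above simply keeps this one cookie from being cleared along with the rest.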
Fig. 2: Ads Preferences Manager

As an engineer, I like to evaluate things by looking at the data. In this case, we have begun to receive information about how users are interacting with the Ads Preferences Manager. While our data are preliminary, we have discovered that, for every user who has opted out, about four change their interest categories and remain opted in, and about ten view their settings but do nothing. We take from this that online users appreciate transparency and control, and become more comfortable with data collection and use when we offer it on their terms and in full view.

Control Through Data Portability

Providing our users with control over their personal information must also mean giving them the ability to easily take their data with them if they decide to leave. Starting with our Gmail service and now covering more than 25 Google products where users create and store personal information, a cadre of Google engineers--self-named the ``Data Liberation Front''--has built tools to allow our users to ``liberate'' their data if they choose to switch providers or to stop using one of our services. The critical insight of these engineers was to recognize that users should never feel stuck using a service because they are unable to easily retrieve the content they created and transfer it to another service or provider at no additional cost. Every user of Gmail, Picasa, Reader, YouTube, Calendar, Apps for Business, Docs, iGoogle, Maps, and many other products already has access to data portability tools, and the team continues to work on additional products. Detailed information for users is available at www.dataliberation.org.
Fig. 3: Data Liberation Front

Data portability has benefits for our users and for Google. First, it keeps our product teams on their toes--they know just how easy it is for their users to move to a competitor's product, and understand that their success depends upon continuing to be responsive to privacy and product concerns and acting quickly to address them. Second, allowing our users the freedom to leave honors our commitment to put users in control.

In considering the testimony today and as the Committee develops its approach to consumer privacy, I urge you to consider the role that data portability can play in ensuring that consumer-facing businesses remain accountable for their privacy choices. Regulators should encourage this kind of ``user empowerment by design'' as an effective means of ensuring respect for user privacy without chilling innovation.

One-stop Shop for Transparency and Control: the Google Dashboard

Google developed the Google Dashboard (www.google.com/dashboard) to provide users with a one-stop, easy-to-use control panel to manage the use and storage of personal information associated with their Google accounts and products--from Gmail to Picasa to Search. With the Dashboard, a user can see and edit the personally identifiable data stored with her individual Google account. A user also can change her password or password recovery options using Dashboard, and click to manage various products' settings, contacts stored with the account, or documents created or stored through Google Docs. Dashboard also lets a user manage chat data, by choosing whether or not to save it in her Google account.
Fig. 4: Google Dashboard

Industry-leading Security: Encrypted Search and Gmail

Along with transparency and user control, good security is vital in maintaining user trust. Google faces complex security challenges while providing services to millions of people every day, and we have world-class engineers working at Google to help secure information. In fact, my own research background is in security. In a 1999 paper, ``Why Johnny Can't Encrypt,'' I argued that security tools must be simple and usable to be effective. Unfortunately, it is sometimes the case that security technology is so complicated that it isn't usable, and thus ineffective. I have continued that theme at Google, working to build user-friendly, simple security features into our products.

For example, Google recently became the first (and still only) major webmail provider to offer session-wide secure socket layer (SSL) encryption by default. Usually recognized by a web address starting with ``https'' or by a ``lock'' icon, SSL encryption is regularly used for online banking or transactions. As our Gmail lead engineer wrote:

    In 2008, we rolled out the option to always use https--encrypting your mail as it travels between your web browser and our servers. Using https helps protect data from being snooped by third parties. . . . We initially left the choice of using it up to you because there's a downside: https can make your mail slower since encrypted data doesn't travel across the web as quickly as unencrypted data. Over the last few months, we've been researching the security/latency tradeoff and decided that turning https on for everyone was the right thing to do.

We hope other companies will soon join our lead. We also hope to see our competitors adopt another security tool we offer our users: encryption for search queries. Users can simply type in ``encrypted.google.com'' and encrypt their search queries and results. As we said in our blog post about encrypted search, ``an encrypted connection is created between your browser and Google. This secured channel helps protect your search terms and your search results pages from being intercepted by a third party on your network.''

And in March Google launched a system to notify users about suspicious activities associated with their accounts. By automatically matching a user's IP address to broad geographical locations, Google can help detect anomalous behavior, such as a log-in appearing to come from one continent only a few hours after the same account holder logged in from a different continent. Thus, someone whose Gmail account may have been compromised will be notified and given the opportunity to change her password, protecting her own account and her Gmail contacts.
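The login-anomaly check described above lends itself to a simple illustration. The following Python sketch flags a login when the travel speed implied by the previous login's coarse, IP-derived location is physically implausible; the 900 km/h threshold and the function names are assumptions for illustration, not Google's actual implementation, which weighs many more signals.

    from dataclasses import dataclass
    from math import radians, sin, cos, asin, sqrt

    @dataclass
    class Login:
        timestamp: float  # seconds since epoch
        lat: float        # latitude from coarse IP geolocation (assumed given)
        lon: float        # longitude from coarse IP geolocation (assumed given)

    def distance_km(a: Login, b: Login) -> float:
        """Great-circle distance between two login locations (haversine formula)."""
        dlat = radians(b.lat - a.lat)
        dlon = radians(b.lon - a.lon)
        h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
        return 2 * 6371 * asin(sqrt(h))  # Earth radius ~6,371 km

    def is_suspicious(prev: Login, curr: Login, max_kmh: float = 900.0) -> bool:
        """Flag a login if reaching it from the previous one would require
        traveling faster than a commercial airliner."""
        hours = (curr.timestamp - prev.timestamp) / 3600
        if hours <= 0:
            return True
        return distance_km(prev, curr) / hours > max_kmh

    # Example: a login from another continent two hours after the last one.
    paris = Login(timestamp=0.0, lat=48.9, lon=2.4)
    new_york = Login(timestamp=2 * 3600.0, lat=40.7, lon=-74.0)
    print(is_suspicious(paris, new_york))  # True: ~5,800 km in 2 hours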
Fig. 5: Recent Account Activity Warning

Similarly, we built Google Chrome with security in mind from the beginning, including features such as:
  Safe Browsing, which warns a user before he visits a site that is suspected of phishing or of containing malware;

  Sandboxing, which works automatically to help prevent web browser processes from harming one another or a user's computer; and

  Automatic updates that deliver security patches to users as quickly as possible.

Google also conducts extensive security research and provides free security resources to the broader Internet community. We make security tools available for free to webmasters to help them operate more secure sites, as well as to application developers to help them build more secure applications. For example, we recently released a tool called ``skipfish'' under an open source license to help identify web application vulnerabilities through fully automated, active security reconnaissance.

The Challenges of Designing for Privacy and Security

In addition to discussing Google's efforts to offer transparency, user control, and security, I also want to discuss just two of the many challenges that I and others in similar roles face as we try to build privacy and security into innovative products. The first relates to data collection and use. The second involves how best to communicate to individuals how to manage their privacy.

Every day we receive information from our users' interaction with our products and services. That information may be in the form of an e-mail that we process, store, and protect in our Gmail product--or it could be generated by the interaction between a user's computer and our servers, such as a search query and the IP address associated with a specific computer or network of computers.

We are often asked why we retain this query and IP address data--which can be very sensitive even if it does not personally identify individuals. We certainly treat this data with strong security, and seek to build in transparency and user controls where appropriate--including tools like our Ads Preferences Manager. We also voluntarily anonymize IP addresses after 9 months. But this data is actually tremendously helpful to us in improving our products and protecting our networks from hackers, spammers, and fraudsters. For example, bad actors continually seek to manipulate our search ranking, launch denial-of-service attacks, and scam our users via e-mail spam or malware. We use our log files to track, block, and keep ahead of the bad guys.

We also use information like IP addresses and search queries to develop products like Flu Trends (www.google.com/flutrends). A team of our engineers found that examining certain search terms on an aggregate basis can provide a good indicator of flu activity. Of course, not every person who searches for ``flu'' is actually sick, but a pattern emerges when many flu-related search queries are added together. By counting how often we see these search queries, we can estimate how much flu is circulating in different countries and regions around the world. Our results have been published in the journal Nature. For epidemiologists, this is an exciting development, because early detection of a disease outbreak can reduce the number of people affected. If a new strain of influenza virus emerges under certain conditions, a pandemic could ensue with the potential to cause millions of deaths. Our up-to-date influenza estimates may enable public health officials and health professionals to better respond to seasonal epidemics and pandemics.
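The aggregate-counting idea behind Flu Trends can be sketched in a few lines of Python. The query terms and the log below are made up, and the published system fits a statistical model against official surveillance data rather than reporting raw shares, but the core point is the same: individual queries mean little, while regional aggregates track outbreak activity.

    from collections import Counter

    # Illustrative term list only; the real system selected terms statistically.
    FLU_TERMS = {"flu", "flu symptoms", "fever", "influenza"}

    def flu_query_share(query_log: list[tuple[str, str]]) -> dict[str, float]:
        """Given (region, query) pairs, return the fraction of each region's
        queries that are flu-related--an aggregate signal, with no per-user data."""
        totals, flu_counts = Counter(), Counter()
        for region, query in query_log:
            totals[region] += 1
            if query.lower() in FLU_TERMS:
                flu_counts[region] += 1
        return {region: flu_counts[region] / totals[region] for region in totals}

    log = [("US-GA", "flu symptoms"), ("US-GA", "weather"),
           ("US-WA", "flu"), ("US-WA", "restaurants"), ("US-WA", "news")]
    print(flu_query_share(log))  # {'US-GA': 0.5, 'US-WA': 0.3333...}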
  Even-handed application. A pro-innovation privacy framework must apply even-handedly to all personal data regardless of source or means of collection. Thus, offline data collection and processing should, where reasonable, involve similar data protection obligations.

  Recognition of benefits and costs. As with any regulatory policy, it is appropriate to examine the benefits and costs of legislating in this area, including explicit attention to actual harm and compliance costs.

  Security requirements and breach notification. We pride ourselves at Google on industry-leading security features, including the use of encryption for our search and Gmail services that I discussed earlier. A thorough privacy framework should promote uniform, reasonable security principles, including data breach notification procedures.

  Clear process for compelled access. The U.S. law governing government access to stored communications is outdated and out of step with what is reasonably expected by those who use cloud computing services. The problems in the law threaten the growth, adoption, and innovation of cloud technologies without a corresponding benefit. As part of the Digital Due Process coalition, we are working to address this issue. The Committee can play an important role in encouraging clear rules for compelled access to user data.

  Consistency across jurisdictions. Generally, Internet users neither expect nor want different baseline privacy rules based on the local jurisdiction in which they or the provider reside. Moreover, in many instances, strict compliance with differing state or national privacy protocols would actually diminish consumer privacy, since it would require Internet companies to know where consumers are located at any given time.

Any new privacy law must also offer baseline protections on which providers can innovate. A pro-innovation privacy framework offers providers the flexibility both to develop self-regulatory structures and to individually innovate in privacy practices and tools. The advertising industry and online publisher efforts to develop self-regulatory rules for interest-based advertising, for example, are a strong example of the need for and utility of industry-driven efforts.

As I have discussed, Google has been a leader in developing innovative privacy tools. Continued innovation in the privacy space is vital for users. Unfortunately, compliance-based or overly complex rules can lock in a specific privacy model that may quickly become obsolete or insufficient due to the speed with which Internet services evolve. A principles-based model encourages innovation and competition in privacy tools. A baseline framework needs to encourage the development of innovative tools like the ones I've described.

We believe that stable, baseline principles set by law can permit flexible, adaptive structures to develop on top--much like the stable protocols and standards at the physical and network layers of the Internet allow flexible and innovative development at the content and application layers. With comprehensive, baseline privacy legislation establishing ground rules for all entities, self-regulatory standards and best practices of responsible industry actors will evolve over time. On top of that structure, individual companies will be free (and encouraged) to create innovative privacy tools and policies rather than stick with potentially outdated compliance structures.
Conclusion

Chairman Rockefeller, Ranking Member Hutchison, and members of the Committee, thank you for inviting me to testify today. We at Google appreciate the opportunity to discuss online privacy and how our company has helped lead in the effort to protect our users by providing them with transparency, user control, and security. I look forward to answering any questions you might have about our efforts, and Google looks forward to working with members of the Committee and others in the development of better privacy protections. Thank you.

The Chairman. Thank you, Dr. Whitten. Now Mr. Jim Harper, Director of Information Policy Studies at The Cato Institute.

STATEMENT OF JIM HARPER, DIRECTOR OF INFORMATION POLICY STUDIES, THE CATO INSTITUTE

Mr. Harper. Thank you, Mr. Chairman. Good afternoon. Thanks for inviting me to testify today. And I definitely appreciate that you're educating the Committee and the public about consumer online privacy. My 21-page single-spaced written testimony----

[Laughter.]

Mr. Harper.--is only a brief glance at the many issues that are involved in privacy regulation and fair information practices. I suspect that the much more useful 1-page executive summary is what'll benefit you and your staff in your early examination of the issue.

What it says is that privacy is a complicated human interest. When people talk about privacy, they may mean desire for fair treatment, they may mean security from identity fraud and other crimes, they may mean distaste for being marketed to as objects of crass commercialism, and they may mean something more like liberty or autonomy.

I think the strongest sense of the word ``privacy'' refers to control of personal information. That is, having the ability to selectively reveal things about yourself so that you can craft the image you portray to all the different communities that you interact with in your life.

As we've seen in discussion here today, the online environment is new and different. Many people literally don't know how to control information about themselves. Other technologists with me on the panel today are doing good work, I think, to try to rectify that, but it won't be easy.

I may play ``skunk at the garden party'' when I say that I have doubts about the capacity of fair information practices and regulatory solutions to solve these problems and deliver privacy. Fair information practices have a long history, nearly 40 years, and there are many good practices, described by fair information practices, that many companies should probably do. But, just like there are many different senses of privacy, there are many different data practices that matter in different degrees at different times. So, blanket use of fair information practices is probably inappropriate and unhelpful.

In my written testimony I focused heavily on notice and the failure of notice, really, over the last decade, to deliver privacy like many thought it would, 10 years ago. I think the short-notice project is wonderful and fine, but I don't hold out much hope that it will lead to an upwelling of privacy awareness, like I think we all would like to have.

I also emphasize how changing business models and changing Internet protocols make it difficult to regulate, prospectively, in ways that'll work. Regulations may prevent new protocols--even worse, new ways of interacting online--from coming into existence. This would be a pity, because it would deny all of us the next generation of Internet-enabled innovations.
It would also be a pity if privacy regulation were to lock in competitive advantages for the companies that are leading the pack today. For all the good they do consumers, the companies represented by my copanelists at the table, I think, should always be met by searing competition. And companies can use the legislative and regulatory process to lock out competition, foreclosing new business models as privacy-problematic.

Before I conclude, I want to change hats, really briefly, and talk about an issue that I know is on the mind of many people, and that's targeted advertising. Targeted advertising is sensitive, I think, because it represents a loss of control over personal information, like we've talked about. It also objectifies consumers, as such, rather than treating them as human beings who laugh and cry and aspire and get frustrated and fall in love. I think I understand that concern, but it doesn't motivate me as a privacy advocate.

But, what I want to talk about is my experience as the operator of a small website. As I noted in my written testimony, I run a website called washingtonwatch.com. It had about 1.6 million visitors last year, which is pretty good. One bill has 150,000 comments, I'll tell you, so I'm quite aware of the passions that unemployment compensation generates.

I run the site in my spare time, and I've built it with my own funds, over several years. I'm fond of joking that it's the reason why I don't have a boat in my driveway. In fact, it might be the reason why I don't have a driveway. I run Google ads to help defray the costs. AdSense is a pretty good product, though I am looking around. Amazon has a pretty cool thing going right now, called Omakase.

Here's the thing. I have tons of features that I want to add to washingtonwatch.com, and I decide to add new features when I feel like I have the money to do it. OK? I pay my Web developers about twice what I make per hour to work on the site. Of course, my sob story doesn't matter, but I probably stand in the shoes of many small Website operators and bloggers who choose whether they're going to add more content and more features based on whether they can afford it.

Targeted advertising is a way for sites, small and large, to support themselves better so that they can do more cool stuff for American citizens and consumers. Targeted ads, I think it's clear from economic study, are more valuable than contextual ads, and more valuable than noncontextual, just blanket advertising. My point is only this: Curtailing targeted advertising in the name of privacy involves tradeoffs with other important consumer issues. And these things are all important to discuss.

Thanks, again, so much for inviting me to testify today. Happy to answer your questions.

[The prepared statement of Mr. Harper follows:]

Prepared Statement of Jim Harper, Director of Information Policy Studies, The Cato Institute

Executive Summary

Privacy is a complicated human interest. People use the word ``privacy'' to refer to many different things, but its strongest sense is control of personal information, which exists when people have legal power to control information and when they exercise that control consistent with their interests and values.

Direct privacy legislation or regulation is unlikely to improve on the status quo. Over decades, a batch of policies referred to as ``fair information practices'' have failed to take hold because of their complexity and internal inconsistencies.
Even modest regulation like mandated privacy notices has not produced meaningful improvements in privacy. Consumers generally do not read privacy policies, and they either do not consider privacy much of the time or they value other things more than privacy when they interact online.

The online medium will take other forms with changing times, and regulations aimed at an Internet dominated by the World Wide Web will not work with future uses of the Internet. Privacy regulations that work ``too well'' may make consumers worse off overall, not only by limiting their access to content, but by giving supernormal profits to today's leading Internet companies and by discouraging consumer-friendly innovations.

The ``online'' and ``offline'' worlds are collapsing rapidly together, and consumers do not have separate privacy interests for one and the other. Likewise, people do not have privacy interests in their roles as consumers that are separate from their interests as citizens. If the Federal Government is going to work on privacy protection, it should start by getting its own privacy house in order.

Chairman Rockefeller, Ranking Member Hutchison, and members of the Committee, thank you for inviting me to address your hearing on ``Consumer Online Privacy.'' My name is Jim Harper, and I am Director of Information Policy Studies at the Cato Institute. In that role, I study and write about the difficult problems of adapting law and policy to the challenges of the information age. Cato is a market liberal, or libertarian, think tank, and I pay special attention to preserving and restoring our Nation's founding traditions of individual liberty, limited government, free markets, peace, and the rule of law. My primary focus is on privacy and civil liberties, and I serve as an advisor to the Department of Homeland Security as a member of its Data Integrity and Privacy Advisory Committee.

I am not a technologist, but a lawyer familiar with technology issues. As a former committee counsel in both the House and Senate, I understand lawmaking and regulatory processes related to technology and privacy. I have maintained a website called Privacilla.org since 2000,\1\ cataloguing many dimensions of the privacy issue, and I also maintain an online Federal legislative resource called WashingtonWatch.com,\2\ which has had over 1.6 million visitors in the last year.
---------------------------------------------------------------------------
    \1\ http://www.privacilla.org
    \2\ http://www.washingtonwatch.com
    Disclosure: WashingtonWatch.com defrays some costs of its otherwise money-losing operation by running Google AdSense ads.
---------------------------------------------------------------------------

What is Privacy?

Your hearing to explore consumer online privacy is welcome. There are many dimensions to privacy, and it is wise to examine all of them, making yourselves aware of the plethora of issues and considerations before turning to legislation or regulation.

People use the word ``privacy'' to describe many concerns in the modern world, including fairness, personal security, seclusion, and autonomy or liberty. Given all those salutary meanings, everyone wants ``privacy,'' of course. Few concepts have been discussed so much without ever being solidly defined. But confusion about the meaning of the word makes legislation or regulation aimed at privacy difficult.

``Privacy'' sometimes refers to the interest violated when a person's sense of seclusion or repose is upended.
Telephone calls during the dinner hour,\3\ for example, spam e-mails,\4\ and--historically--the quartering of troops in private homes \5\ undermine privacy and the vaunted ``right to be let alone.'' \6\
---------------------------------------------------------------------------
    \3\ See Federal Communications Commission, ``Unwanted Telephone Marketing Calls'' web page http://www.fcc.gov/cgb/consumerfacts/tcpa.html.
    \4\ The CAN-SPAM Act of 2003 (15 U.S.C. 7701, et seq., Public Law No. 108-187) was intended to remedy the problem of spam, but spam remains a huge share of the SMTP traffic on the Internet. See Jim Harper, ``CAN-SPAM Didn't--Not By a Long Shot,'' Cato@Liberty (Nov. 6, 2006) http://www.cato-at-liberty.org/2006/11/06/can-spam-didnt-not-by-a-long-shot/.
    \5\ See U.S. Const. amend. III (barring quartering of troops in peacetime).
    \6\ Olmstead v. United States, 277 U.S. 438 (1928) (Brandeis, J., dissenting). Unfortunately, the Olmstead case was not about ``seclusion'' but control of information traveling by wire.
---------------------------------------------------------------------------

For some, it is marketing that offends privacy--or at least targeted marketing based on demographic or specific information about consumers. Many people feel something intrinsic to individual personality is under attack when people are categorized, labeled, filed, and objectified for commerce based on data about them. This is particularly true when incomplete data fails to paint an accurate picture. The worst denial of personality occurs in the marketing area when data and logic get it wrong, serving inappropriate marketing communications to hapless consumers. A couple who recently lost their baby receives a promotion for diapers or children's toys, for example. Or mail for a deceased parent continues coming long after his or her passing. In the informal sector, communities sometimes attack individuals because of the inaccurate picture gossip paints on the powerful medium of the Internet.\7\
---------------------------------------------------------------------------
    \7\ In his book, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, George Washington University Law School professor Daniel Solove details the story of ``Dog Poop Girl,'' for example, who was selected for worldwide ridicule when a photo of her failing to clean up after her pooch was uploaded and disseminated over the Internet. Daniel Solove, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (New Haven: Yale University Press, 2007) pp. 1-4.
---------------------------------------------------------------------------

The ``privacy'' damage is tangible when credit bureaus and other reputation providers paint an incomplete or wrong picture. Employers and credit issuers harm individual consumers when they deny people work or credit based on bad data or bad decision rules.\8\
---------------------------------------------------------------------------
    \8\ Congress passed the Fair Credit Reporting Act (codified at 15 U.S.C. 1681 et seq.) in 1970 intending to produce fairness in the credit reporting world, which is still an area of difficulty for consumers.
---------------------------------------------------------------------------

Other kinds of ``privacy'' violations occur when criminals acquire personal information and use it for their malign purposes. The scourge of identity theft is a well-known ``privacy'' problem. Drivers Privacy Protection Acts \9\ passed in many state legislatures and in the U.S.
Congress after actress Rebecca Schaeffer was murdered in 1989. Her stalker got her residence information from the California Department of Motor Vehicles. In a similar notable incident a decade later, Vermont murderer Liam Youens used a data broker to gather information as part of an Internet-advertised obsession with the young woman he killed.\10\
---------------------------------------------------------------------------
    \9\ The Federal Drivers Privacy Protection Act, Public Law No. 103-322, amended by Public Law 106-69, prohibits the release or use by any State DMV (or officer, employee, or contractor thereof) of personal information about an individual obtained by the department in connection with a motor vehicle record. It sets penalties for violations and makes violators liable on a civil action to the individual to whom the released information pertains.
    \10\ See Remsburg v. Docusearch, Inc. (N.H. 2003) http://www.courts.state.nh.us/supreme/opinions/2003/remsb017.htm.
---------------------------------------------------------------------------

``Privacy'' is also under fire when information demands stand between people and their freedom to do as they please. Why on earth should a person share a phone number with a technology retailer when he or she buys batteries? The U.S. Department of Homeland Security has worked assiduously in what is now called the ``Secure Flight'' program to condition air travel on the provision of accurate identity information to the government, raising the privacy costs of otherwise free movement. Laws banning or limiting medical procedures dealing with reproduction offend ``privacy'' in another sense of the word.\11\ There are a lot of privacy problems out there, and many of them blend together.
---------------------------------------------------------------------------
    \11\ See Griswold v. Connecticut, 381 U.S. 479 (1965); Roe v. Wade, 410 U.S. 113 (1973).
---------------------------------------------------------------------------

Privacy as Control of Personal Information

The strongest and most relevant sense of the word ``privacy,'' which I will focus on here, though, is its ``control'' sense--privacy as control over personal information. Privacy in this sense is threatened by the Internet, which is an unusual new medium for many people over the age of eighteen.

In his seminal 1967 book Privacy and Freedom, Alan Westin characterized privacy as ``the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.'' \12\ A more precise, legalistic definition of privacy in the control sense is: the subjective condition people experience when they have power to control information about themselves and when they have exercised that power consistent with their interests and values.\13\ The ``control'' sense of privacy alone has many nuances, and I will parse them here briefly.
---------------------------------------------------------------------------
    \12\ Alan F. Westin, Privacy and Freedom, p. 7 (New York: Atheneum 1967).
    \13\ See generally, Jim Harper, ``Understanding Privacy--and the Real Threats to It,'' Cato Policy Analysis No. 520 (Aug. 4, 2004) http://www.cato.org/pub_display.php?pub_id=1652.
---------------------------------------------------------------------------

Importantly, privacy is a subjective condition. It is individual and personal. One person cannot decide for another what his or her sense of privacy is or should be.
To illustrate this, one has only to make a few comparisons:

  Some Americans are very reluctant to share their political beliefs, refusing to divulge any of their leanings or the votes they have cast. They keep their politics private. Their neighbors may post yard signs, wear brightly colored pins, and go door-to-door to show affiliation with a political party or candidate. The latter have a sense of privacy that does not require withholding information about their politics.

  Health information is often deemed intensely private. Many people closely guard it, sharing it only with doctors, close relatives, and loved ones. Others consent to have their conditions, surgeries, and treatments broadcast on national television and the Internet to help others in the same situation. More commonly, they relish the attention, flowers, and cards they receive when an illness or injury is publicized.

Privacy varies in thousands of ways from individual to individual and from circumstance to circumstance. An important conclusion flows from the observation that privacy is subjective: government regulation in the name of privacy can be based only on guesses about what ``privacy'' should look like. Such rules can only ape the privacy-protecting decisions that millions of consumers make in billions of daily actions, inactions, transactions, and refusals.

Americans make their highly individual privacy judgments based on culture, upbringing, experience, and the individualized costs and benefits of interacting and sharing information. The best way to protect true privacy is to leave decisions about how personal information is used to the people affected. Regulatory mandates that take decision-making power away from people will prevent them from striking the balances that make them the best off they can be. Sometimes it is entirely rational and sensible to share information.
At its heart, privacy is a product of autonomy and personal responsibility. Only empowered, knowledgeable citizens can formulate and protect true privacy for themselves, just as they individually pursue other subjective conditions, like happiness, piety, or success.

The Role of Law

The legal environment determines whether people have the power to control information about themselves. Law has dual, conflicting effects on privacy: much law protects the privacy-enhancing decisions people make, while other laws undermine individuals' power to control information.

Various laws foster privacy by enforcing individuals' privacy-protecting decisions. Contract law, for example, allows consumers to enter into enforceable agreements that restrict the sharing of information involved in, or derived from, transactions. Thanks to contract, one person may buy foot powder from another and elicit as part of the deal an enforceable promise never to tell another soul about the purchase. In addition to explicit terms, privacy-protecting confidentiality has long been an implied term in many contracts for professional and fiduciary services, like law, medicine, and financial services. Alas, legislation and regulation of recent vintage have undermined those protections.\14\
---------------------------------------------------------------------------
    \14\ The Gramm-Leach-Bliley Act and Federal regulations under the Health Insurance Portability and Accountability Act institutionalized sharing of personal information with government authorities and various ``approved'' institutions. See 15 U.S.C. 6802(e)(5)&(8); various subsections of 45 C.F.R. 164.512.
---------------------------------------------------------------------------

Many laws protect privacy in other areas. Real property law and the law of trespass mean that people have legal backing when they retreat into their homes, close their doors, and pull their curtains to prevent others from seeing what goes on within. The law of battery means that people may put on clothes and have all the assurance law can give that others will not remove their clothing and reveal the appearance of their bodies without permission.

Whereas most laws protect privacy indirectly, a body of U.S. state law protects privacy directly. The privacy torts provide baseline protection for privacy by giving a cause of action to anyone whose privacy is invaded in any of four ways.\15\
---------------------------------------------------------------------------
    \15\ Privacilla.org, ``The Privacy Torts: How U.S. State Law Quietly Leads the Way in Privacy Protection,'' (July 2002) http://www.privacilla.org/releases/Torts_Report.html.
---------------------------------------------------------------------------

The four privacy causes of action, available in nearly every state, are:
  Intrusion upon seclusion or solitude, or into private affairs;

  Public disclosure of embarrassing private facts;

  Publicity that places a person in a false light in the public eye; and

  Appropriation of one's name or likeness.

While those torts do not mesh cleanly with privacy as defined here, they are established, baseline, privacy-protecting law. Law is essential for protecting privacy, but much legislation plays a significant role in undermining privacy. Dozens of regulatory, tax, and entitlement programs deprive citizens of the ability to shield information from others. You need only look at the Internal Revenue Service's Form 1040 and related tax forms to see that.

Consumer Knowledge and Choice

I wrote above about the role of personal responsibility in privacy protection. Perhaps the most important, but elusive, part of privacy protection is consumers' exercise of power over information about themselves consistent with their interests and values. This requires consumers and citizens to be aware of the effects their behavior will have on exposure of information about them.

Technology and the world of commerce are rapidly changing, and personal information is both ubiquitous and mercurial. Unfortunately, there is no horn that sounds when consumers are sufficiently aware, or when their preferences are being honored. But study of other, more familiar, circumstances reveals how individuals have traditionally protected privacy.
Consider privacy protection in the physical world. For millennia, humans have accommodated themselves to the fact that personal information travels through space and air. Without understanding how photons work, people know that hiding the appearance of their bodies requires them to put on clothes. Without understanding sound waves, people know that keeping what they say from others requires them to lower their voices.

From birth, humans train to protect privacy in the ``natural'' environment. Over millions of years, humans, animals, and even plants have developed elaborate rules and rituals of information sharing and information hiding based on the media of light and sound. Tinkering with these rules and rituals today would be absurd. Imagine, for instance, a privacy law that made it illegal to observe and talk about a person who appeared naked in public without giving the nudist a privacy notice and the opportunity to object. People who lacked the responsibility to put on clothes might be able to sue people careless enough to look at them and recount what they saw. A rule like that would be ridiculous. The correct approach is for consumers to be educated about what they reveal when they interact online and in business so that they know to wear the electronic and commercial equivalents of clothing.

Of all the online privacy concerns, perhaps the most fretting has been done about ``behavioral advertising''--sometimes referred to as ``psychographic profiling'' to get us really worked up. What is truly shocking about this problem, though, is that the remedy for most of it is so utterly simple: exercising control over the cookies in one's browser.

Cookies are small text files that a website will ask to place in the memory of computers that visit it. Many cookies have distinct strings of characters in them that allow the website to ``recognize'' the computer when it visits the site again. When a single domain places content across the web as a ``third party''--something many ad networks do--it can recognize the same computer many places and gain a sense of the interests of the user. A short simulation of this mechanism appears after the note below.

The solution is cookie control: in the major browsers (Firefox and Internet Explorer), one must simply go to the ``Tools'' pull-down menu, select ``Options,'' then click on the ``Privacy'' tab to customize one's cookie settings. In Firefox, one can decline to accept all third-party cookies (shown inset), neutering the cookie-based data collection done by ad networks. In Internet Explorer, one can block all cookies, block all third-party cookies, or even choose to be prompted each time a cookie is offered.\16\
---------------------------------------------------------------------------
    \16\ These methods do not take care of an emerging tracker known as ``Flash cookies,'' which must be disabled another way, but consumers aware of their ability and responsibility to control cookies can easily meet the growth of Flash cookies. See ``Flash Player Help'' web page, Global Privacy Settings panel, http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager02.html.
---------------------------------------------------------------------------
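The following Python sketch simulates the third-party cookie mechanism just described, and the effect of declining such cookies. The site and ad-network names are made up; this is a sketch of the recognition logic, not any particular network's code.

    import uuid

    class Browser:
        """Simulates a browser's cookie jar for third-party ad networks."""

        def __init__(self, accept_third_party: bool):
            self.accept_third_party = accept_third_party
            self.cookies: dict[str, str] = {}  # cookies keyed by ad-network domain

        def visit(self, site: str, ad_network: str) -> None:
            """The page at `site` loads an ad from `ad_network` (a third party)."""
            cookie = self.cookies.get(ad_network)
            if cookie:
                print(f"{ad_network} recognizes this browser on {site}: {cookie}")
            elif self.accept_third_party:
                cookie = uuid.uuid4().hex[:12]  # the distinct string of characters
                self.cookies[ad_network] = cookie
                print(f"{ad_network} sets a new ID on {site}: {cookie}")
            else:
                print(f"{ad_network} gets no cookie on {site}; no profile accrues")

    tracked = Browser(accept_third_party=True)
    tracked.visit("news.example", "ads.example")
    tracked.visit("recipes.example", "ads.example")  # recognized across sites

    blocked = Browser(accept_third_party=False)
    blocked.visit("news.example", "ads.example")
    blocked.visit("recipes.example", "ads.example")  # still anonymous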
Again, consumers educated about what they reveal when they interact online can make decisions about how to behave that will protect privacy much better--in all online contexts--than consumers unaware of how the world around them works.

Can Direct Regulation Protect Privacy Better?

Above, I wrote about how law protects people's privacy-protecting decisions. This unfortunately leaves them with the responsibility of making those decisions. Naturally, most privacy advocates--myself included--believe that people do not do enough to protect their privacy. Consciously or not, people seem to prioritize the short-term benefits of sharing personal information over the long-term costs to their privacy. This poses the question: Can direct regulation protect consumers' privacy better than they can protect themselves?

There is a decades-long history behind principles aimed at protecting privacy and related interests, principles that are often put forward as a framework for legislative or regulatory directives. In the early 1970s, a group called ``The Secretary's Advisory Committee on Automated Personal Data Systems'' within the Department of Health, Education, and Welfare did an important study of record-keeping practices in the computer age. The intellectual content of its report, commonly known as the ``HEW Report,'' \17\ formed much of the basis of the Privacy Act of 1974. The report dealt extensively with the use of the Social Security Number as the issues stood at that time.
---------------------------------------------------------------------------
    \17\ ``Records, Computers and the Rights of Citizens: Report of the Secretary's Advisory Committee on Automated Personal Data Systems,'' Department of Health, Education, and Welfare [now Department of Health and Human Services] (July, 1973) http://www.aspe.dhhs.gov/datacncl/1973privacy/tocprefacemembers.htm.
---------------------------------------------------------------------------

The HEW report advocated the following ``fair information practices'':