[Federal Register Volume 86, Number 227 (Tuesday, November 30, 2021)]
[Notices]
[Pages 67925-67927]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2021-25999]


-----------------------------------------------------------------------

DEPARTMENT OF COMMERCE

National Telecommunications and Information Administration


Privacy, Equity, and Civil Rights Listening Sessions

AGENCY: National Telecommunications and Information Administration, 
U.S. Department of Commerce.

ACTION: Notice of open meeting.

-----------------------------------------------------------------------

SUMMARY: The National Telecommunications and Information Administration 
(NTIA) will convene three virtual Listening Sessions about issues and 
potential solutions at the intersection of privacy, equity, and civil 
rights. The sessions will help provide data for a report on the 
ways in which commercial data flows of personal information can lead to 
disparate impact and outcomes for marginalized or disadvantaged 
communities.

DATES: The meetings will be held on December 14, 15, and 16, 2021, from 
1:00 p.m. to 3:30 p.m., Eastern Standard Time.

ADDRESSES: The meetings will be held virtually, with online slide share 
and dial-in information to be posted at https://www.ntia.gov/.

FOR FURTHER INFORMATION CONTACT: Travis Hall, National 
Telecommunications and Information Administration, U.S. Department of 
Commerce, 1401 Constitution Avenue NW, Room 4725, Washington, DC 20230; 
telephone: (202) 482-3522; email: [email protected]. Please direct media 
inquiries to NTIA's Office of Public Affairs: (202) 482-7002; email: 
[email protected].

SUPPLEMENTARY INFORMATION: 
    Background and Authority: The National Telecommunications and 
Information Administration (NTIA) is the President's principal advisor 
on telecommunications and information policy issues.\1\ In this role, 
NTIA studies and develops policy advice about the impact of technology 
and the internet on privacy. This includes examining the extent to 
which technology implementations, business models, and related data 
processing are adequately addressed by the U.S.'s current privacy 
protection framework.\2\ Importantly, NTIA has long acknowledged that 
privacy is a matter of contextual data flow and use rather than simply 
being a question of publicity.\3\ Increasingly,

[[Page 67926]]

scholarship has shown that marginalized or underserved communities are 
especially in need of robust privacy protections.\4\ These studies have 
shown that not only are these communities often materially 
disadvantaged with regard to the marginal effort required to 
adequately manage privacy controls, but they are also often at 
increased risk of suffering harm from losses of privacy or misuse of 
collected data.
---------------------------------------------------------------------------

    \1\ See 47 U.S.C. 902(b)(2)(D), (H).
    \2\ NTIA Blog, ``NTIA Releases Comments on a Proposed Approach 
to Protecting Consumer Privacy'' (Nov. 13, 2018), https://www.ntia.doc.gov/press-release/2018/ntia-releases-comments-proposed-approach-protecting-consumer-privacy (commenters generally 
emphasized the need for changes to the U.S. privacy framework); see 
also, GAO, Consumer Privacy: Changes to Legal Framework Needed To 
Address Gaps (June 2019), https://www.gao.gov/products/gao-19-621t 
(same); Congressional Research Service, Data Protection Law: An 
Overview (March 25, 2019), https://fas.org/sgp/crs/misc/R45631.pdf 
(``Recent high-profile data breaches and other concerns about how 
third parties protect the privacy of individuals in the digital age 
have raised national concerns over legal protections of Americans' 
electronic data.''); Thorin Klosowski, The State of Consumer Privacy 
Laws In The US (And Why It Matters), Wirecutter (Sept. 6, 2021), 
https://www.nytimes.com/wirecutter/blog/state-of-privacy-laws-in-us/ 
(describing consumer privacy laws in the United States and providing 
legal experts' characterizations of their inadequacy); Press 
Release, ``Wicker, Blackburn Introduce Federal Privacy Legislation'' 
(July 28, 2021), https://www.commerce.senate.gov/2021/7/wicker-blackburn-introduce-federal-data-privacy-legislation (``the need for 
federal privacy legislation is imperative''); Business Roundtable 
Letter to Senate Commerce Committee Urging Passage of a Federal 
Consumer Data Privacy Law (Oct. 4, 2021), https://www.businessroundtable.org/business-roundtable-letter-to-senate-commerce-committee-urging-passage-of-a-federal-consumer-data-privacy-law.
    \3\ See Internet Policy Task Force, Commercial Data Privacy and 
Innovation in the Internet Economy: A Dynamic Policy Framework 18 (Dec. 16, 
2010), https://www.ntia.doc.gov/files/ntia/publications/iptf_privacy_greenpaper_12162010.pdf; White House, Consumer Data 
Privacy in a Networked World: A Framework for Protecting Privacy and 
Promoting Innovation in the Global Digital Economy 16 (Feb. 23, 2012); 
see also, Helen Nissenbaum, Privacy in Context (Nov. 2009). 
NTIA considers problematic uses and problematic collection to both 
fall under the umbrella of a ``privacy harm,'' an idea that is well-
established in the literature. (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3782222, 21-22 (``Privacy harms are highly 
contextual, with the harm depending upon how the data is used, what 
data is involved, and also how the data might be combined with other 
data'')).
    \4\ Danielle Keats Citron, Cyber Civil Rights, 89 Boston U. L. 
Rev. 61 (2008); Khiara Bridges, The Poverty of Privacy Rights, 
Stanford University Press (2017); Mary Madden, Michele Gilman, Karen 
Levy & Alice Marwick, Privacy, Poverty, and Big Data: A Matrix Of 
Vulnerabilities For Poor Americans, 95 Wash. U. L. Rev. 53 (2017); 
Alvaro Bedoya, Privacy As Civil Right, 50 New Mexico L. Rev. 3 
(2020); Scott Skinner-Thompson, Privacy At The Margins, Cambridge 
University Press (2020); Sara Sternberg Greene, Stealing (Identity) 
From The Poor (2021), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3781921; Michele Gilman, Feminism, Privacy, 
And Law In Cyberspace, Oxford Handbook of Feminism and Law in the 
U.S. (2021 Forthcoming), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3779323.
---------------------------------------------------------------------------

    The Administration has highlighted that there is a national 
imperative to promote equity and increase support for communities and 
individuals that have been ``historically underserved, marginalized, 
and adversely affected by persistent poverty and inequality.'' \5\ As 
stated in the Executive Order on Advancing Racial Equity and Support 
for Underserved Communities Through the Federal Government: 
``[e]ntrenched disparities in our laws and public policies, and in our 
public and private institutions, have often denied [. . .] equal 
opportunity to individuals and communities.'' \6\ These entrenched 
disparities persist in the digital economy, and the collection, 
processing, sharing, and use of data can directly affect--both 
positively and negatively--structural inequities present in our 
society.
---------------------------------------------------------------------------

    \5\ Exec. Order No. 13,985, 86 FR 7009 (Jan. 20, 2021), https://www.federalregister.gov/documents/2021/01/25/2021-01753/advancing-racial-equity-and-support-for-underserved-communities-through-the-federal-government.
    \6\ Id.
---------------------------------------------------------------------------

    The following examples underscore how commercial collection and use 
of personal information, even for legitimate purposes, often results in 
disparate outcomes for marginalized and underserved communities:
    • Digital advertising systems have been shown to often 
reproduce historical patterns of discrimination by enabling 
discriminatory targeting by advertisers.\7\ Even when targeting 
criteria do not include protected traits, targeted advertising can be 
used to perpetuate discrimination using proxy indicators of race, 
gender, disability, and other characteristics.\8\
---------------------------------------------------------------------------

    \7\ Muhammad Ali et al., Discrimination Through Optimization: 
How Facebook's Ad Delivery Can Lead to Skewed Outcomes, Computers 
and Society (April 19, 2019), https://arxiv.org/abs/1904.02095.
    \8\ Id.
---------------------------------------------------------------------------

    • Data brokers, health insurance companies, and their 
subsidiaries are using information such as neighborhood safety, 
bankruptcies, gun ownership, and inferred hobbies to 
determine coverage for people they deem more likely to require more 
expensive care.\9\ These assessments can rely on unreliable and 
discriminatory heuristics or proxies for characteristics such as race, 
socioeconomic status, or disability--or, as one salesman joked, ``God 
forbid you live on the wrong street these days. You're going to get 
lumped in with a lot of bad things.''\10\
---------------------------------------------------------------------------

    \9\ Marshall Allen, Health Insurers Are Vacuuming Up Details 
About You--And It Could Raise Your Rates, Pro Publica (July 17, 
2018), https://www.propublica.org/article/health-insurers-are-vacuuming-up-details-about-you-and-it-could-raise-your-rates; Sarah 
Jeong, Insurers Want To Know How Many Steps You Took Today, The New 
York Times (April 10, 2019), https://www.nytimes.com/2019/04/10/opinion/insurance-ai.html (``But when it comes to insurance in 
particular, there are unanswered questions about the kind of biases 
that are acceptable. Discrimination based on genetics has already 
been deemed repugnant, even if it's perfectly rational. Poverty 
might be a rational indicator of risk, but should society allow 
companies to penalize the poor?'').
    \10\ Marshall Allen, Health Insurers Are Vacuuming Up Details 
About You--And It Could Raise Your Rates, Pro Publica (July 17, 
2018), https://www.propublica.org/article/health-insurers-are-vacuuming-up-details-about-you-and-it-could-raise-your-rates; see 
also, Rachel Goodman, Big Data Could Set Insurance Premiums. 
Minorities Could Pay the Price, ACLU (July 19, 2018), https://www.aclu.org/blog/racial-justice/race-and-economic-justice/big-data-could-set-insurance-premiums-minorities-could (``Existing health 
disparities mean that data will consistently show members of certain 
groups to be more likely to need more health care. What will happen, 
then, if this data starts being used against those groups? We know, 
for example, that Black women are much more likely to experience 
serious complications from pregnancy than white women. So, health 
insurers might conclude that a woman who is Black and recently 
married is likely to cost them more money than a white woman in the 
same position''); Starre Vartan, Racial Bias Found in a Major Health 
Care Risk Algorithm, Scientific American (Oct. 24, 2019), https://www.scientificamerican.com/article/racial-bias-found-in-a-major-health-care-risk-algorithm/ (``A study published Thursday in Science 
has found that a health care risk-prediction algorithm, a major 
example of tools used on more than 200 million people in the U.S., 
demonstrated racial bias--because it relied on a faulty metric 
[previous patients' health care spending as a proxy for medical 
needs].'').
---------------------------------------------------------------------------

    • Software implemented by a university to predict whether 
students will struggle academically used race as a strong predictor for 
poor performance.\11\ Black students were flagged ``high risk'' for 
dropping out of certain subjects, such as science and math, at elevated 
rates, a designation that researchers warned could improperly lead 
advisors to encourage students to change to ``easier'' majors.\12\
---------------------------------------------------------------------------

    \11\ Todd Feathers, Major Universities Are Using Race as a 
``High Impact Predictor'' of Student Success, The Markup (March 2, 
2021), https://themarkup.org/news/2021/03/02/major-universities-are-using-race-as-a-high-impact-predictor-of-student-success.
    \12\ Feathers, supra note 11.
---------------------------------------------------------------------------

    In light of these and many more examples, it is critical for 
policymakers to understand how information policy can reduce data-
driven discrimination and disparate treatment. In service of these 
objectives, NTIA announces through this Notice three virtual Listening 
Sessions, which aim to advance the policy conversation on how to 
alleviate the disproportionate privacy harms suffered by marginalized 
or underserved communities. NTIA's upcoming Listening Sessions are 
intended as an opportunity to build the factual record for further 
policy development in this area. The information gathered from these 
Listening Sessions will inform a subsequent Request for Comment, and 
together these efforts will provide the basis for NTIA to draft a 
report. Possible topics include, but are not limited to:
    • The role and adequacy of current civil rights laws, 
related protections, and enforcement thereof in mitigating privacy 
harms against marginalized communities.
    • The interplay between current civil rights laws and 
related protections, on the one hand, and current privacy laws and 
proposed reforms, on the other.
    • Data brokers and secondary markets for data.
    • Exploitation of data or commercially available software 
for stalking or harassment based on protected class status.
    • Workplace tracking and surveillance that may be 
discriminatory.
    • Hiring, credit, lending, and housing algorithms and 
advertisements.
    • Intersectional privacy needs of groups such as trans 
individuals, the unhoused, or people with disabilities.
    The format of the Listening Sessions will include a mix of keynote 
speeches, moderated panel discussions, and open forums for members of 
the public to share their perspectives. The first Listening Session 
will be held on December 14, 2021, on the intersection of civil rights 
law and privacy. The second Listening Session will be held on December 
15, 2021, and will address the

[[Page 67927]]

way in which the collection, use, and processing of personal and 
personally sensitive data affect structural inequities. The final 
Listening Session will focus on solutions to the gaps and problems 
identified in the first two sessions, and will be held on December 16, 
2021.
    NTIA intends to publish a Notice and Request for Comments in the 
Federal Register that will be informed by the input received during the 
Listening Sessions. Members of the public unable to participate in the 
Listening Sessions are encouraged to respond to the forthcoming Request 
for Comments.
    Time and Date: NTIA will convene three virtual Listening Sessions 
on December 14, 15, and 16, 2021, from 1:00 p.m. to 3:30 p.m., Eastern 
Standard Time. The exact times of the meetings are subject to change. 
Please refer to NTIA's website, https://www.ntia.gov, for the most 
current information.
    Place: The meetings will be held virtually, with online slide share 
and dial-in information to be posted at https://www.ntia.gov. Please 
refer to NTIA's website, https://www.ntia.gov, for the most current 
information.
    Other Information: The meetings are open to the public and the press 
on a first-come, first-served basis. The virtual meetings are 
accessible to people with disabilities. Individuals requiring 
accommodations such as real-time captioning, sign language 
interpretation, or other ancillary aids should notify Travis Hall at 
(202) 482-3522 or [email protected] at least seven (7) business days prior 
to the meeting. Access details for the meeting are subject to change. 
Please refer to NTIA's website, https://www.ntia.gov/, for the most 
current information.

    Dated: November 23, 2021.
Kathy D. Smith,
Chief Counsel, National Telecommunications and Information 
Administration.
[FR Doc. 2021-25999 Filed 11-29-21; 8:45 am]
BILLING CODE 3510-60-P