<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="billres.xsl"?>
<!DOCTYPE bill PUBLIC "-//US Congress//DTDs/bill.dtd//EN" "bill.dtd">
<bill bill-stage="Introduced-in-Senate" dms-id="A1" public-private="public" slc-id="S1-MUR22061-W95-T6-P6L"><metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
<dublinCore>
<dc:title>117 S3608 IS: Nudging Users to Drive Good Experiences on Social Media Act</dc:title>
<dc:publisher>U.S. Senate</dc:publisher>
<dc:date>2022-02-09</dc:date>
<dc:format>text/xml</dc:format>
<dc:language>EN</dc:language>
<dc:rights>Pursuant to Title 17 Section 105 of the United States Code, this file is not subject to copyright protection and is in the public domain.</dc:rights>
</dublinCore>
</metadata>
<form>
<distribution-code display="yes">II</distribution-code><congress>117th CONGRESS</congress><session>2d Session</session><legis-num>S. 3608</legis-num><current-chamber>IN THE SENATE OF THE UNITED STATES</current-chamber><action><action-date date="20220209">February 9, 2022</action-date><action-desc><sponsor name-id="S311">Ms. Klobuchar</sponsor> (for herself and <cosponsor name-id="S410">Ms. Lummis</cosponsor>) introduced the following bill; which was read twice and referred to the <committee-name committee-id="SSCM00">Committee on Commerce, Science, and Transportation</committee-name></action-desc></action><legis-type>A BILL</legis-type><official-title>To require the Federal Trade Commission to identify content-agnostic platform interventions to reduce the harm of algorithmic amplification and social media addiction on covered platforms, and for other purposes.</official-title></form><legis-body><section id="S1" section-type="section-one"><enum>1.</enum><header>Short title</header><text display-inline="no-display-inline">This Act may be cited as the <quote><short-title>Nudging Users to Drive Good Experiences on Social Media Act</short-title></quote> or the <quote><short-title>Social Media NUDGE Act</short-title></quote>.</text></section><section id="idA164397993C843C284BDAC0010730FC6"><enum>2.</enum><header>Findings</header><text display-inline="no-display-inline">Congress finds the following:</text><paragraph id="idece01fedbb54480d9687dfa04840789b"><enum>(1)</enum><text>Social media platforms can have significant impacts on their users, both positive and negative. However, social media usage can be associated with detrimental outcomes, including on a user's mental and physical health. 
Design decisions made by social media platforms, such as decisions affecting the content a user might see on a social media platform, may drive or exacerbate these negative or detrimental outcomes.</text></paragraph><paragraph id="id5B6F3EFDF28F47218B7157F5DDDA37F1"><enum>(2)</enum><text>Viral harmful content often spreads on social media platforms. Social media platforms do not consistently enforce their terms of service and content policies, leading to supposedly prohibited content often being shown to users and amplified by such platforms. </text></paragraph><paragraph id="ida663517314954211877a2ac3049f7b46"><enum>(3)</enum><text>Social media platforms often rely heavily on automated measures for content detection and moderation. These social media platforms may rely on such automated measures due to the large quantity of user-generated content on their platforms. However, evidence suggests that even state-of-the-art automated content moderation systems currently do not fully address the harmful content on social media platforms.</text></paragraph><paragraph id="id10E7C11D90B94E17A110EFC798595219"><enum>(4)</enum><text>Significant research has found that content-agnostic interventions, if made by social media platforms, may help significantly mitigate these issues. These interventions could be readily implemented by social media platforms to provide safer user experiences. 
Such interventions include the following:</text><subparagraph id="id7EF1DF2430014AA4A5E4F296360F380E"><enum>(A)</enum><text>Nudges to users and increased platform viewing options, such as screen time alerts and grayscale phone settings, which may reduce addictive platform usage patterns and improve user experiences online.</text></subparagraph><subparagraph id="id027C015B308942C09CD9619D2EB6AE3E"><enum>(B)</enum><text>Labels and alerts that require a user to read or review user-generated content before sharing such content.</text></subparagraph><subparagraph id="idD644A5B644874D388BCD46B3AC816332"><enum>(C)</enum><text>Prompts to users, which may help users identify manipulative and microtargeted advertisements.</text></subparagraph><subparagraph id="id74B4BCE7038B4DF9BB3D04250689FD8F"><enum>(D)</enum><text>Other research-supported content-agnostic interventions.</text></subparagraph></paragraph><paragraph id="idA9A633244FEC4CC482C6D1BF4ECE0C28"><enum>(5)</enum><text>Evidence suggests that increased adoption of content-agnostic interventions would lead to improved outcomes of social media usage. 
However, social media platforms may be hesitant to independently implement content-agnostic interventions that will reduce negative outcomes associated with social media use.</text></paragraph></section><section id="idF0A91DDD88D84D61B7E1DB00887A5029"><enum>3.</enum><header>Study on content-agnostic interventions</header><subsection id="id7125A98259934E5F9770877CAF91F908"><enum>(a)</enum><header>Study To identify content-Agnostic interventions</header><text>The Director of the National Science Foundation (in this section referred to as the <term>Director</term>) shall enter into an agreement with the National Academies of Sciences, Engineering, and Medicine (in this section referred to as the <term>Academies</term>) to conduct ongoing studies to identify content-agnostic interventions that covered platforms could implement to reduce the harms of algorithmic amplification and social media addiction on covered platforms. The initial study shall—</text><paragraph id="idB9E276962CE14D0F93C2BE5227531F13"><enum>(1)</enum><text>identify ways to define and measure the negative mental or physical health impacts related to social media, including harms related to algorithmic amplification and social media addiction, through a review of—</text><subparagraph id="id3194AB64C8654F42BC1DA60DCB7D7644"><enum>(A)</enum><text>a wide variety of studies, literature, reports, and other relevant materials created by academic institutions, civil society groups, and other appropriate sources; and</text></subparagraph><subparagraph id="id5AE06FE6A2B34E798ADC6A8918944340" commented="no" display-inline="no-display-inline"><enum>(B)</enum><text>relevant internal research conducted by a covered platform or third-party research in the possession of a covered platform that is voluntarily submitted to the Academies by the covered platform (through a process, established by the Academies, with appropriate privacy safeguards); </text></subparagraph></paragraph><paragraph 
id="idD9F87BC6AE834FE9ADCA888D4717E824"><enum>(2)</enum><text>identify research-based content-agnostic interventions, such as reasonable limits on account creation and content sharing, to combat problematic smartphone use and other negative mental or physical health impacts related to social media, including through a review of the materials described in subparagraphs (A) and (B) of paragraph (1);</text></paragraph><paragraph id="idFEE31B2CC0704DACB325F7579420F7CA"><enum>(3)</enum><text>provide recommendations on how covered platforms may be separated into groups of similar platforms for the purpose of implementing content-agnostic interventions, taking into consideration factors including any similarity among the covered platforms with respect to—</text><subparagraph id="id694D1357A89040169423C242A52FF87B"><enum>(A)</enum><text>the number of monthly active users of the covered platform and the growth rate of such number;</text></subparagraph><subparagraph id="idC1B9BD44A121434A855A94AF86D800ED"><enum>(B)</enum><text>how user-generated content is created, shared, amplified, and interacted with on the covered platform;</text></subparagraph><subparagraph id="id588A13ECF5E147B18EEAA74DA720C309"><enum>(C)</enum><text>how the covered platform generates revenue; and</text></subparagraph><subparagraph id="id5F9FF5467F4B423FA515A4DC4ADDC5A3"><enum>(D)</enum><text>other relevant factors for providing recommendations on how covered platforms may be separated into groups of similar platforms;</text></subparagraph></paragraph><paragraph id="id2E603EE2C4674F22A07A70607DA63275"><enum>(4)</enum><text>for each group of covered platforms recommended under paragraph (3), provide recommendations on which of the content-agnostic interventions identified in paragraph (2) are generally applicable to the covered platforms in such group; </text></paragraph><paragraph id="idb9487908afad443c96a4a8ef95a5fd59"><enum>(5)</enum><text>for each group of covered platforms recommended under 
paragraph (3), provide recommendations on how the covered platforms in such group could generally implement each of the content-agnostic interventions identified for such group under paragraph (4) in a way that does not alter the core functionality of the covered platforms, considering—</text><subparagraph id="idb34f0399a90d4a648b133031d6e0980d"><enum>(A)</enum><text>whether the content-agnostic intervention should be offered as an optional setting or feature that users of a covered platform could manually turn on or off with appropriate default settings to reduce the harms of algorithmic amplification and social media addiction on the covered platform without altering the core functionality of the covered platform; and</text></subparagraph><subparagraph id="id3003a938242b4c25ac15f7ce9f0ee925"><enum>(B)</enum><text>other means by which the content-agnostic intervention may be implemented and any associated impact on the experiences of users of the covered platform and the core functionality of the covered platform;</text></subparagraph></paragraph><paragraph id="id3AA8EF457D18433BAC46C4A24D542A37"><enum>(6)</enum><text>for each group of covered platforms recommended under paragraph (3), define metrics generally applicable to the covered platforms in such group to measure and publicly report in a privacy-preserving manner the impact of any content-agnostic intervention adopted by the covered platform; and</text></paragraph><paragraph id="idE777A2F647A045E2A683936202570A94"><enum>(7)</enum><text>identify data and research questions necessary to further understand the negative mental or physical health impacts related to social media, including harms related to algorithmic amplification and social media addiction.</text></paragraph></subsection><subsection id="idDB9C04A0755C4D3C9DFFA3FC2485E68D"><enum>(b)</enum><header>Requirement To submit additional research</header><text>If a covered platform voluntarily submits internal research to the Academies under subsection 
(a)(1)(B), the covered platform shall, upon the request of the Academies and not later than 60 days after receiving such a request, submit to the Academies any other research in the platform's possession that is closely related to such voluntarily submitted research.</text></subsection><subsection id="idF7DCC7BFB8B8415889C26FC8AC79818F"><enum>(c)</enum><header>Reports</header><paragraph id="id4B8939CA706746A0B6509A2CE366A339"><enum>(1)</enum><header>Initial study report</header><text>Not later than 1 year after the date of enactment of this Act, the Academies shall submit to the Director, Congress, and the Commission a report containing the results of the initial study conducted under subsection (a), including recommendations for how the Commission should establish rules for covered platforms related to content-agnostic interventions as described in paragraphs (1) through (5) of subsection (a).</text></paragraph><paragraph id="id447A9AD72C6F47C085EE1525F367249B"><enum>(2)</enum><header>Updates</header><text>Not later than 2 years after the effective date of the regulations promulgated under section 4, and every 2 years thereafter during the 10-year period beginning on such date, the Academies shall submit to the Director, Congress, and the Commission a report containing the results of the ongoing studies conducted under subsection (a). 
Each such report shall— </text><subparagraph id="id3A3816EF6A4D462EA5280815A735AAE0"><enum>(A)</enum><text>include analysis and updates to earlier studies conducted, and recommendations made, under such subsection;</text></subparagraph><subparagraph id="id69f89b0108c14295b93673035898ddca"><enum>(B)</enum><text>be based on— </text><clause id="id80736C381B12418486DE443FD1976197"><enum>(i)</enum><text>new academic research, reports, and other relevant materials related to the subject of previous studies, including additional research identifying new content-agnostic interventions; and</text></clause><clause id="id050a3485b8f54fd5997a13cc62075c31"><enum>(ii)</enum><text>new academic research, reports, and other relevant materials about harms occurring on covered platforms that are not being addressed by the content-agnostic interventions being implemented by covered platforms as a result of the regulations promulgated under section 4;</text></clause></subparagraph><subparagraph id="id0d70cff903cf4334aad0f8f8dc9624ba"><enum>(C)</enum><text>include information about the implementation of the content-agnostic interventions by covered platforms and the impact of the implementation of the content-agnostic interventions; and</text></subparagraph><subparagraph id="id59f8ddf6c6b24ae7a87c80584a95886c"><enum>(D)</enum><text>include an analysis of any entities that have newly met the criteria to be considered a covered platform under this Act since the last report submitted under this subsection.</text></subparagraph></paragraph></subsection></section><section id="id336DF641807842E3ABDC3AF7587078BD"><enum>4.</enum><header>Implementation of content-agnostic interventions</header><subsection id="id5C71C13993C44E9083D5BC85F0772805"><enum>(a)</enum><header>Determination of applicable content-Agnostic interventions</header><paragraph id="id67ED1EF935894B64910BE202A2EAD7E5"><enum>(1)</enum><header>In general</header><text>Not later than 60 days after the receipt of the initial study 
report under section 3(c)(1), the Commission shall initiate a rulemaking proceeding for the purpose of promulgating regulations in accordance with section 553 of title 5, United States Code—</text><subparagraph id="idEB585FC05AB240E685A416DB5AEA4790"><enum>(A)</enum><text>to determine how covered platforms should be grouped together;</text></subparagraph><subparagraph id="idE10C3EA905A9473997C25916269ACF75"><enum>(B)</enum><text>to determine which content-agnostic interventions identified in such report shall be applicable to each group of covered platforms identified in the report; and</text></subparagraph><subparagraph id="id1DBD2372373945279300A3C07F40CC0B"><enum>(C)</enum><text>to require each covered platform to implement and measure the impact of such content-agnostic interventions in accordance with subsection (b).</text></subparagraph></paragraph><paragraph id="id7F0C2FD908C145A5AE99C8FA3E3B8750"><enum>(2)</enum><header>Considerations</header><text>In the rulemaking proceeding described in paragraph (1), the Commission— </text><subparagraph id="id3EDB205E55DA4D77878ED73A046F5879"><enum>(A)</enum><text>shall consider the report under section 3(c)(1) and its recommendations; and</text></subparagraph><subparagraph id="id4539C31EABC94E198137344CF2BC3D05"><enum>(B)</enum><text>shall not promulgate regulations requiring any covered platform to implement a content-agnostic intervention that is not discussed in such report.</text></subparagraph></paragraph><paragraph id="id3634849DC7E5451181EC688A3BB2F4EB" commented="no"><enum>(3)</enum><header>Notification to covered platforms</header><text>The Commission shall, not later than 30 days after the promulgation of the regulations under this subsection, provide notice to each covered platform of the content-agnostic interventions that are applicable to the platform pursuant to the regulations promulgated under this subsection.</text></paragraph></subsection><subsection 
id="id60585CEC235142348CC0A721F2FA7BD5"><enum>(b)</enum><header>Implementation of content-Agnostic interventions</header><paragraph id="id0307A7227A6646E0BF809E2BB6FEF38A"><enum>(1)</enum><header>In general</header><subparagraph id="idF4ED6DD3C55347C196CEF76E024D3002"><enum>(A)</enum><header>Implementation plan</header><clause id="id73DD39AA1FD44CD3897DEE05A51D6B06"><enum>(i)</enum><header>In general</header><text>Not later than 60 days after the date on which a covered platform receives the notice from the Commission required under subsection (a)(3), the covered platform shall submit to the Commission a plan to implement each content-agnostic intervention applicable to the covered platform (as determined by the Commission) in an appropriately prompt manner. If the covered platform reasonably believes that any aspect of an applicable intervention is not technically feasible for the covered platform to implement, would substantially change the core functionality of the covered platform, or would pose a material privacy or security risk to consumer data stored, held, used, processed, or otherwise possessed by such covered platform, the covered platform shall include in its plan evidence supporting these beliefs in accordance with paragraph (2). 
</text></clause><clause id="id49ACCEE023CE4574BE782DBFA76ED060"><enum>(ii)</enum><header>Commission determination</header><text>Not later than 30 days after receiving a covered platform’s plan under clause (i), the Commission shall determine whether such plan includes details related to the appropriately prompt implementation of each content-agnostic intervention applicable to the covered platform, except for any aspect of an intervention for which the Commission determines the covered platform is exempt under paragraph (2).</text></clause><clause id="id5FE875D51F36495B8E9C81EECC376811"><enum>(iii)</enum><header>Appeal or revised plan</header><subclause id="idC7FE231ADD4740E1B3E7A235CED0D8A3"><enum>(I)</enum><header>In general</header><text>Subject to subclause (II), if the Commission determines under clause (ii) that a covered platform's plan does not satisfy the requirements of this subsection, not later than 90 days after the issuance of such determination, the covered platform shall—</text><item id="id2494AACCAAF243B79036365191A96DD1"><enum>(aa)</enum><text>appeal the determination by the Commission to the United States Court of Appeals for the Federal Circuit; or</text></item><item id="id2C81E4CA326E491A834B4C14A1C8D574"><enum>(bb)</enum><text>submit to the Commission a revised plan for a Commission determination pursuant to clause (ii).</text></item></subclause><subclause id="id9862476C860A443793E7FF033E47D044"><enum>(II)</enum><header>Limitation</header><text>If a covered platform submits 3 revised plans to the Commission for a determination pursuant to clause (ii) and the Commission determines that none of the revised plans satisfy the requirements of this subsection, the Commission may find that the platform is not acting in good faith in developing an implementation plan and may require the platform to implement, pursuant to a plan developed for the platform by the Commission, each content-agnostic intervention applicable to the platform (as determined by 
the Commission) in an appropriately prompt manner.</text></subclause></clause></subparagraph><subparagraph id="id18A25303A95341CA9690728B98561DCB"><enum>(B)</enum><header>Statement of compliance</header><text>Not less frequently than annually, each covered platform shall make publicly available on their website and submit to the Commission, in a machine-readable format and in a privacy-preserving manner, the details of—</text><clause id="id5AA25EEFB77F43D6919E495DC20252DD"><enum>(i)</enum><text>the covered platform's compliance with the required implementation of content-agnostic interventions; and</text></clause><clause id="idED3E86C27B5C4FCABA43285EA323ABDF"><enum>(ii)</enum><text>the impact (using the metrics defined by the Director of the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine pursuant to section 3(a)(6)) of such content-agnostic interventions on reducing the harms of algorithmic amplification and social media addiction on covered platforms.</text></clause></subparagraph></paragraph><paragraph id="id635B9C0D4020434D84EBA919F60C1779"><enum>(2)</enum><header>Feasibility, functionality, privacy, and security exemptions</header><subparagraph id="idEF9DFA0B98384D808C54B3FE84DCC6D4"><enum>(A)</enum><header>Statement of inapplicability</header><text>Not later than 60 days after the date on which a covered platform receives the notice from the Commission required under subsection (a)(3), a covered platform seeking an exemption from any aspect of such rule may submit to the Commission—</text><clause id="id799D015D7DB64DC191E25475BE8B7F91"><enum>(i)</enum><text>a statement identifying any specific aspect of a content-agnostic intervention applicable to such covered platform (as determined by the Commission under subsection (a)) that the covered platform reasonably believes—</text><subclause id="id88F32702F35B48F2B2E0E18D2BF381C1"><enum>(I)</enum><text>is not technically feasible for the covered platform to 
implement;</text></subclause><subclause id="id26FC68A93C18432497BA463B70A07693"><enum>(II)</enum><text>will substantially change the core functionality of the covered platform; or</text></subclause><subclause id="idD04C609919D4469C95FF865C7D6DB2AF"><enum>(III)</enum><text>will create a material and imminent privacy or security risk to the consumer data stored, held, used, processed, or otherwise possessed by such covered platform; and</text></subclause></clause><clause id="idFDF5CC0C60DF4BAD827A04C14898278C"><enum>(ii)</enum><text>specific evidence supporting such belief, including any relevant information regarding the core functionality of the covered platform.</text></clause></subparagraph><subparagraph id="id04850FD8AC694316B48D9D0A3CEA99F8"><enum>(B)</enum><header>Determination by the Commission</header><text>Not later than 30 days after receiving a covered platform’s statement under subparagraph (A), the Commission shall determine whether the covered platform shall be exempt from any aspect of a content-agnostic intervention discussed in the covered platform’s statement.</text></subparagraph><subparagraph id="id3727E612E424441399857D751CAA0DA9"><enum>(C)</enum><header>Appeal or revised plan</header><text>Not later than 90 days after a determination issued under subparagraph (B), a covered platform may—</text><clause id="id536DE7D31AE54F86942E8EF65B941930"><enum>(i)</enum><text>appeal the determination by the Commission to the United States Court of Appeals for the Federal Circuit; or</text></clause><clause id="id1E7AC9D3BFA14F8BA361A3E47B516C37"><enum>(ii)</enum><text>submit to the Commission a revised plan, including details related to the prompt implementation of any content-agnostic intervention for which the covered platform requested an exemption that the Commission subsequently denied, for a Commission determination pursuant to paragraph (1)(A)(ii).</text></clause></subparagraph></paragraph></subsection></section><section 
id="id254A8199C7AB4779A00488CDBE300DC8"><enum>5.</enum><header>Transparency report</header><text display-inline="no-display-inline">Not later than 180 days after the date of enactment of this Act, and semiannually thereafter, each covered platform shall publish a publicly available, machine-readable report about the content moderation efforts of the covered platform with respect to each language spoken by not less than 100,000 monthly active users of the covered platform in the United States. Such report shall include the following:</text><paragraph id="id1A6CE952B48B421C9533B9213B257FB2"><enum>(1)</enum><header>Content moderators</header><text>The total number of individuals employed or contracted by the covered platform during the reporting period to engage in content moderation for each language, broken down by the number of individuals retained as full-time employees, part-time employees, and contractors of the covered platform and reported in a privacy-preserving manner.</text></paragraph><paragraph id="ide5bd9b7e4440453bbc19f4556ebaebb5"><enum>(2)</enum><header>Random sample of viewed content</header><text>Information related to a random sample of publicly visible content accounting for 1,000 views each month. Each month, covered platforms shall calculate the total number of views for each piece of publicly visible content posted during the month and sample randomly from the content in a manner such that the probability of a piece of content being sampled is proportionate to the total number of views of that piece of content during the month. 
Covered platforms shall report the following information about each piece of sampled content (with appropriate redactions to exclude the disclosure of illegal content):</text><subparagraph id="id03eed31b1f5b4493925416558e11b28e"><enum>(A)</enum><text>The text, images, audio, video, or other creative data associated with each such piece of content.</text></subparagraph><subparagraph id="iddf56d00b557a4161a8a2b62130fde53e"><enum>(B)</enum><text>The details of the account or accounts that originally posted the content.</text></subparagraph><subparagraph id="id4d82280bc5d6444ea039223f01be92e6"><enum>(C)</enum><text>The total number of views of each such piece of content during the month. </text></subparagraph></paragraph><paragraph id="id7AA391D58A60414A8661A439B5A7AC48"><enum>(3)</enum><header>High reach content</header><text>Content moderation metrics broken down by language to assess the prevalence of harmful content on the covered platform, including, for each language, the 1,000 most viewed pieces of publicly visible content each month, including the following (with appropriate redactions to exclude the disclosure of illegal content):</text><subparagraph id="idA93B5EC277CD461FBBCC2778B31B1B9A"><enum>(A)</enum><text>The text, images, audio, video, or other creative data associated with each such piece of content.</text></subparagraph><subparagraph id="id42063C05B570433E852D81A9095AC264"><enum>(B)</enum><text>The details of—</text><clause id="idAFEE82672FAA4AFA9562CBCFCF007BAC"><enum>(i)</enum><text>the account that originally posted the content; and</text></clause><clause id="id318A2F5618104777A042836ADC130606"><enum>(ii)</enum><text>any account whose sharing or reposting of the content accounted for more than 5 percent of the views of the content.</text></clause></subparagraph></paragraph><paragraph id="idAF6C6EA437E148D4A08D4B8107B9B722"><enum>(4)</enum><header>Removed and moderated content</header><subparagraph 
id="id50B99CFCB1ED4467A22CF0208A0C7D57"><enum>(A)</enum><header>In general</header><text>Aggregate metrics for user-generated content that is affected by any automated or manual moderation system or decision, including, as calculated on a monthly basis and reported in a privacy-preserving manner, the number of pieces of user-generated content and the number of views of such content that were—</text><clause id="id94fac63772f14094a5011f3aba5eaea9"><enum>(i)</enum><text>reported to the covered platform by a user of the covered platform;</text></clause><clause id="id89172bc21aa94a80a9e565eeecc2a9b8"><enum>(ii)</enum><text>flagged by the covered platform by an automated content detection system;</text></clause><clause id="id418ed1c95547410687278097a93ed987"><enum>(iii)</enum><text>removed from the covered platform and not restored;</text></clause><clause id="id683de466d3c448398f41bfc202f761b1"><enum>(iv)</enum><text>removed from the covered platform and later restored; or</text></clause><clause id="id2979ab01ab224d39accc59d57a20806b"><enum>(v)</enum><text>labeled, edited, or otherwise moderated by the covered platform following a user report or flagging by an automated content detection system.</text></clause></subparagraph><subparagraph id="idf663eb47aca44eda87edeff703541ace"><enum>(B)</enum><header>Requirements for metrics</header><text>The metrics reported under subparagraph (A) shall be broken down by—</text><clause id="idce205a1eed1f43c5833169ca85063f05"><enum>(i)</enum><text>the language of the user-generated content;</text></clause><clause id="id54e2313565b34428b67a3cbbaec35b22"><enum>(ii)</enum><text>the topic of the user-generated content, such as bullying, hate speech, drugs and firearms, violence and incitement, or any other category determined by the covered platform to categorize such content; and</text></clause><clause id="id1CE285813FD149ABAE56732BA25B068E"><enum>(iii)</enum><text>if the covered platform has a process for publicly verifying that an 
account on the platform belongs to a prominent user or public figure, whether the creator of the content is a politician or journalist with a verified account.</text></clause></subparagraph></paragraph></section><section id="id97B9FD9695B84B7496CD8285BD5A60B7"><enum>6.</enum><header>Enforcement</header><subsection id="id28acd2c0c6144223880ff054416ae6f5"><enum>(a)</enum><header>Unfair or deceptive acts or practices</header><text>A violation of section 3(b), 4, or 5 or a regulation promulgated under section 4 shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act (<external-xref legal-doc="usc" parsable-cite="usc/15/57a">15 U.S.C. 57a(a)(1)(B)</external-xref>).</text></subsection><subsection id="id182937c051a345a9ba434509c0bf4c47"><enum>(b)</enum><header>Powers of the Commission</header><paragraph id="ide32811f640664bd49e0d201378d8520d"><enum>(1)</enum><header>In general</header><text>The Commission shall enforce this Act in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (<external-xref legal-doc="usc" parsable-cite="usc/15/41">15 U.S.C. 41 et seq.</external-xref>) were incorporated into and made a part of this Act.</text></paragraph><paragraph id="id72473A3A302543A0AF89326482BF3C1A"><enum>(2)</enum><header>Privileges and immunities</header><text>Any person who violates section 4 or 5 or a regulation promulgated under section 4 shall be entitled to the privileges and immunities provided in the Federal Trade Commission Act (<external-xref legal-doc="usc" parsable-cite="usc/15/41">15 U.S.C. 
41 et seq.</external-xref>).</text></paragraph><paragraph id="idbda9fc8b4d8e4628881af4f174f339f3"><enum>(3)</enum><header>Enforcement guidelines and updates</header><text>Not later than 1 year after the date of enactment of this Act, the Commission shall issue guidelines that outline any policies and practices of the Commission related to the enforcement of this Act in order to promote transparency and deter violations of this Act. The Commission shall update the guidelines as needed to reflect current policies, practices, and changes in technology, but not less frequently than once every 4 years.</text></paragraph><paragraph id="id502d9670a9e448819ffa409fb9572b35"><enum>(4)</enum><header>Authority preserved</header><text>Nothing in this Act shall be construed to limit the authority of the Commission under any other provision of law.</text></paragraph></subsection></section><section id="id2ED4A11ED23B41A9A1706279D690D01E"><enum>7.</enum><header>Definitions</header><text display-inline="no-display-inline">In this Act:</text><paragraph id="idB513D85411104A57836BA50FB68AB6E1"><enum>(1)</enum><header>Algorithmic amplification</header><text>The term <term>algorithmic amplification</term> means the promotion, demotion, recommendation, prioritization, or de-prioritization of user-generated content on a covered platform to other users of the covered platform through a means other than presentation of content in a reverse-chronological or chronological order.</text></paragraph><paragraph id="id5171552F2E074990835B7275B80FEC0A"><enum>(2)</enum><header>Commission</header><text>The term <term>Commission</term> means the Federal Trade Commission.</text></paragraph><paragraph id="id857BAE317CBC4216B1FBBD8DBABA042C"><enum>(3)</enum><header>Content moderation</header><text>The term <term>content moderation</term> means the intentional removal, labeling, or altering of user-generated content on a covered platform by the covered platform or an automated or human system controlled by 
the covered platform, including decreasing the algorithmic ranking of user-generated content, removing user-generated content from algorithmic recommendations, or any other action taken in accordance with the covered platform’s terms of service, community guidelines, or similar materials governing the content allowed on the covered platform.</text></paragraph><paragraph id="id9429FDC233994144ADF6DC94B621870E"><enum>(4)</enum><header>Content-agnostic intervention</header><text>The term <term>content-agnostic intervention</term> means an action that can be taken by a covered platform to alter a user's experience on the covered platform or the user interface of the covered platform that does not—</text><subparagraph id="id516F1BD1B0484B33B5FF9D23085D393D"><enum>(A)</enum><text>rely on the substance of user-generated content on the covered platform; or</text></subparagraph><subparagraph id="id3C57C55D0BBB4B22BD00F30B9BD45EC5"><enum>(B)</enum><text>alter the core functionality of the covered platform.</text></subparagraph></paragraph><paragraph id="id2DB970344A5D438697284B9ED3B0B011"><enum>(5)</enum><header>Covered platform</header><text>The term <term>covered platform</term> means any public-facing website, desktop application, or mobile application that—</text><subparagraph id="id78A115E0D2514C98AE1FECA2D6F77A5E"><enum>(A)</enum><text>is operated for commercial purposes;</text></subparagraph><subparagraph id="idF21272B809114319871EAEEA324B8A12"><enum>(B)</enum><text>provides a forum for user-generated content;</text></subparagraph><subparagraph id="idACB79EA6088644FB9D00C9CF9760D290"><enum>(C)</enum><text>is constructed such that the core functionality of the website or application is to facilitate interaction between users and user-generated content; and</text></subparagraph><subparagraph id="id51224B49AEAC4068B49750755C77BE6E"><enum>(D)</enum><text>has more than 20,000,000 monthly active users in the United States for a majority of the months in the previous 
12-month period. </text></subparagraph></paragraph><paragraph id="ideca06034763c446297f5fe0f5a783fdb"><enum>(6)</enum><header>Privacy-preserving manner</header><text>The term <term>privacy-preserving manner</term> means, with respect to a report made by a covered platform, that the information contained in the report is presented in a manner in which it is not reasonably capable of being used, either on its own or in combination with other readily accessible information, to uniquely identify an individual.</text></paragraph><paragraph id="id0900e69c928a4c80945839c6cfa4e0a6"><enum>(7)</enum><header>User</header><text>The term <term>user</term> means a person that uses a covered platform, regardless of whether that person has an account or is otherwise registered with the platform.</text></paragraph><paragraph id="id573C94FD82D345E8A628E0BF244CC72D"><enum>(8)</enum><header>User-generated content</header><text>The term <term>user-generated content</term> means any content, including text, images, audio, video, or other creative data that is substantially created, developed, or published on a covered platform by any user of such covered platform.</text></paragraph></section></legis-body></bill> 