<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="billres.xsl"?>
<!DOCTYPE bill PUBLIC "-//US Congress//DTDs/bill.dtd//EN" "bill.dtd">
<bill bill-stage="Introduced-in-Senate" dms-id="A1" public-private="public" slc-id="S1-MIR22B15-VJD-8N-HR8"><metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
<dublinCore>
<dc:title>117 S5259 IS: Shielding Children's Retinas from Egregious Exposure on the Net Act</dc:title>
<dc:publisher>U.S. Senate</dc:publisher>
<dc:date>2022-12-14</dc:date>
<dc:format>text/xml</dc:format>
<dc:language>EN</dc:language>
<dc:rights>Pursuant to Title 17 Section 105 of the United States Code, this file is not subject to copyright protection and is in the public domain.</dc:rights>
</dublinCore>
</metadata>
<form>
<distribution-code display="yes">II</distribution-code><congress>117th CONGRESS</congress><session>2d Session</session><legis-num>S. 5259</legis-num><current-chamber>IN THE SENATE OF THE UNITED STATES</current-chamber><action><action-date date="20221214">December 14, 2022</action-date><action-desc><sponsor name-id="S346">Mr. Lee</sponsor> introduced the following bill; which was read twice and referred to the <committee-name committee-id="SSCM00">Committee on Commerce, Science, and Transportation</committee-name></action-desc></action><legis-type>A BILL</legis-type><official-title>To require certain interactive computer services to adopt and operate technology verification measures to ensure that users of the platform are not minors, and for other purposes.</official-title></form><legis-body display-enacting-clause="yes-display-enacting-clause"><section section-type="section-one" id="S1"><enum>1.</enum><header>Short title</header><text display-inline="no-display-inline">This Act may be cited as the <quote><short-title>Shielding Children's Retinas from Egregious Exposure on the Net Act</short-title></quote> or the <quote><short-title>SCREEN Act</short-title></quote>.</text></section><section commented="no" id="id85EF0D61523F4377B158491592282DBC"><enum>2.</enum><header>Findings; sense of Congress</header><subsection commented="no" id="idE9EBD1CD55D640ABB626A6213D6B2936"><enum>(a)</enum><header>Findings</header><text>Congress finds the following:</text><paragraph id="id8bb40cf9ddd54945bfed5884d680ac8c"><enum>(1)</enum><text>Over the 3 decades preceding the date of enactment of this Act, Congress has passed several bills to protect minors from access to online pornographic content, including title V of the Telecommunications Act of 1996 (<external-xref legal-doc="public-law" parsable-cite="pl/104/104">Public Law 104–104</external-xref>) (commonly known as the <quote>Communications Decency Act</quote>), section 231 of the Communications Act of 1934 (<external-xref 
legal-doc="usc" parsable-cite="usc/47/231">47 U.S.C. 231</external-xref>) (commonly known as the <quote>Child Online Protection Act</quote>), and the Children’s Internet Protection Act (title XVII of division B of <external-xref legal-doc="public-law" parsable-cite="pl/106/554">Public Law 106–554</external-xref>). </text></paragraph><paragraph id="id51c6b3f655bd45c1af63bfa6202f55bf"><enum>(2)</enum><text>With the exception of the Children's Internet Protection Act (title XVII of division B of <external-xref legal-doc="public-law" parsable-cite="pl/106/554">Public Law 106–554</external-xref>), the Supreme Court of the United States has struck down the previous efforts of Congress to shield children from pornographic content, finding that such legislation served a <quote>compelling government interest</quote> but that it was not the least restrictive means to achieve such interest. In Ashcroft v. ACLU, 542 U.S. 656 (2004), the Court even suggested at the time that <quote>blocking and filtering software</quote> could conceivably be a <quote>primary alternative</quote> to the requirements passed by Congress.</text></paragraph><paragraph id="id4cec2b34ff7f4ed6a406695debc8466d"><enum>(3)</enum><text>In the nearly 2 decades since the Supreme Court of the United States suggested the use of <quote>blocking and filtering software</quote>, such technology has proven to be ineffective in protecting minors from accessing online pornographic content. The Kaiser Family Foundation has found that filters do not work on 1 in 10 pornography sites accessed intentionally and 1 in 3 pornography sites accessed unintentionally. 
Further, it has been proven that children are able to bypass <quote>blocking and filtering</quote> software by employing strategic searches or other measures to circumvent the software completely.</text></paragraph><paragraph id="id5a771360dca746c89ce6be3fbc0e0a73"><enum>(4)</enum><text>Additionally, Pew Research has revealed studies showing that only 39 percent of parents use blocking or filtering software for their minor’s online activities, meaning that 61 percent of children have restrictions on their internet access only when they are at school or at a library.</text></paragraph><paragraph id="idbc89c483fd2944d795b2c837a3088ecd"><enum>(5)</enum><text>Seventeen States have now recognized pornography as a public health hazard that leads to a broad range of individual harms, societal harms, and public health impacts.</text></paragraph><paragraph id="ided95fd046571415fac237a813c69fa55"><enum>(6)</enum><text>It is estimated that 80 percent of minors between the ages of 12 and 17 have been exposed to pornography, with 54 percent of teenagers seeking it out. 
The internet is the most common source for minors to access pornography, with pornographic websites receiving more web traffic in the United States than Twitter, Netflix, Pinterest, and LinkedIn combined.</text></paragraph><paragraph id="id8bcf923ecdd34456ac17cdd023017c9d"><enum>(7)</enum><text>Exposure to online pornography has created unique psychological effects for minors, including anxiety, addiction, low self-esteem, body image disorders, an increase in problematic sexual activity at younger ages, and an increased desire among minors to engage in risky sexual behavior.</text></paragraph><paragraph id="id4876dc633de94c3daccf5d93dc7722fb"><enum>(8)</enum><text>The Supreme Court of the United States has recognized on multiple occasions that Congress has a <quote>compelling government interest</quote> to protect the physical and psychological well-being of minors, which includes shielding them from <quote>indecent</quote> content that may not necessarily be considered <quote>obscene</quote> by adult standards.</text></paragraph><paragraph id="id3d19cbb5a52a4a95bfb1eeca3a8def20"><enum>(9)</enum><text>Because <quote>blocking and filtering software</quote> has not produced the results envisioned nearly 2 decades ago, it is necessary for Congress to pursue alternative policies to enable the protection of the physical and psychological well-being of minors.</text></paragraph><paragraph id="id0a45323e2c9e46eaab505a731659b84e"><enum>(10)</enum><text>The evolution of our technology has now enabled the use of age verification technology that is cost-efficient, is not unduly burdensome, and can be operated narrowly in a manner that ensures only adults have access to a website’s online pornographic content.</text></paragraph></subsection><subsection id="id16b107e8744649419f3d7a7cd726b058"><enum>(b)</enum><header>Sense of Congress</header><text>It is the sense of Congress that—</text><paragraph id="ide58c2e40bdab440bad5465b2cbb68fb6"><enum>(1)</enum><text>shielding minors from 
access to online pornographic content is a compelling government interest that protects the physical and psychological well-being of minors; and</text></paragraph><paragraph id="id535fdcd9158d469787c494abbcf458b3"><enum>(2)</enum><text>requiring interactive computer services that are in the business of creating, hosting, or making available pornographic content to enact technological measures that shield minors from accessing pornographic content on their platforms is the least restrictive means for Congress to achieve its compelling government interest.</text></paragraph></subsection></section><section id="idBB5CF2CB6E934FD7B6B70736AE86ADB3"><enum>3.</enum><header>Definitions</header><text display-inline="no-display-inline">In this Act:</text><paragraph id="id7139AA8B5290401C9E955F181E66416A"><enum>(1)</enum><header>Child pornography; minor</header><text>The terms <term>child pornography</term> and <term>minor</term> have the meanings given those terms in section 2256 of title 18, United States Code.</text></paragraph><paragraph id="idA3E8DC17ACBD45BE901A3B84AE8A6C4A"><enum>(2)</enum><header>Commission</header><text>The term <term>Commission</term> means the Federal Communications Commission.</text></paragraph><paragraph id="id92D854BB068D440EB8585C10F36CB179"><enum>(3)</enum><header>Covered platform</header><text>The term <term>covered platform</term>—</text><subparagraph id="id50FDF1DBF3C34F88B76A5933118DC28B"><enum>(A)</enum><text>means an entity—</text><clause id="id0CC7D9468C574BFEBDC6AD84B3777B10"><enum>(i)</enum><text>that is an interactive computer service;</text></clause><clause id="idB8D56A2232EE4F49A1D67086721D1334"><enum>(ii)</enum><text>that—</text><subclause id="id1C6269F204D848B6984FB81C204F26AB"><enum>(I)</enum><text>is engaged in interstate or foreign commerce; or</text></subclause><subclause id="id0BF588A0657040EE97C33E29DD580858"><enum>(II)</enum><text>purposefully avails itself of the United States market or a portion thereof; 
and</text></subclause></clause><clause id="idD7E8683B6057490890B24B6CDD374E38"><enum>(iii)</enum><text>for which it is in the regular course of the trade or business of the entity to create, host, or make available content that meets the definition of harmful to minors under paragraph (4) and that is provided by the entity, a user, or other information content provider, with the objective of earning a profit; and </text></clause></subparagraph><subparagraph id="idA491E8326E9C4559A63A8B244A7F77AE"><enum>(B)</enum><text>includes an entity described in subparagraph (A) regardless of whether—</text><clause id="id211B9AA0431E498B81B4D7FF8C44FCD6"><enum>(i)</enum><text>the entity earns a profit on the activities described in subparagraph (A)(iii); or</text></clause><clause id="id0766022D6DAA4028A67249B2ED440E5D"><enum>(ii)</enum><text>creating, hosting, or making available content that meets the definition of harmful to minors under paragraph (4) is the sole source of income or principal business of the entity. 
</text></clause></subparagraph></paragraph><paragraph id="id436411DC622946338B7967907E702B78"><enum>(4)</enum><header>Harmful to minors</header><text>The term <term>harmful to minors</term>, with respect to a picture, image, graphic image file, film, videotape, or other visual depiction, means that the picture, image, graphic image file, film, videotape, or other depiction—</text><subparagraph id="idB39F762C3FA042B8920508BD970D0D97"><enum>(A)</enum><clause commented="no" display-inline="yes-display-inline" id="id4EADB83620264C5A990DC072649A5C91"><enum>(i)</enum><text>taken as a whole and with respect to minors, appeals to the prurient interest in nudity, sex, or excretion;</text></clause><clause indent="up1" id="idFBE42A17DBD24A7A93AED41C6F9FF3CA"><enum>(ii)</enum><text>depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or lewd exhibition of the genitals; and</text></clause><clause indent="up1" id="id6B6C0815A127461C8174C5A1870BC43D"><enum>(iii)</enum><text>taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors;</text></clause></subparagraph><subparagraph display-inline="no-display-inline" commented="no" id="idD3EC701A3A8C467F8FB316C010F71C8A"><enum>(B)</enum><text>is obscene; or</text></subparagraph><subparagraph display-inline="no-display-inline" commented="no" id="id8EF13EC2E9AB445DA281271713659DB8"><enum>(C)</enum><text>is child pornography.</text></subparagraph></paragraph><paragraph commented="no" display-inline="no-display-inline" id="id057F97632954443FA3C9C8E4E05C89A9"><enum>(5)</enum><header>Information content provider; interactive computer service</header><text>The terms <term>information content provider</term> and <term>interactive computer service</term> have the meanings given those terms in section 230(f) of the Communications Act of 1934 
(<external-xref legal-doc="usc" parsable-cite="usc/47/230">47 U.S.C. 230(f)</external-xref>).</text></paragraph><paragraph commented="no" display-inline="no-display-inline" id="idFBE5A0B30FC443F29DB877745397A97B"><enum>(6)</enum><header>Sexual act; sexual contact</header><text>The terms <term>sexual act</term> and <term>sexual contact</term> have the meanings given those terms in section 2246 of title 18, United States Code.</text></paragraph><paragraph commented="no" display-inline="no-display-inline" id="id8966BC323E8347C5AB1D323C38D82AAD"><enum>(7)</enum><header>Technology verification measure</header><text>The term <term>technology verification measure</term> means technology that—</text><subparagraph commented="no" display-inline="no-display-inline" id="idDA78FE22E1534FFB93D58FA704729D8E"><enum>(A)</enum><text>employs a system or process to determine whether it is more likely than not that a user of a covered platform is a minor; and</text></subparagraph><subparagraph commented="no" display-inline="no-display-inline" id="idE3269A41E12846EFBA199BA03A31E1E4"><enum>(B)</enum><text>prevents access by minors to any content on a covered platform.</text></subparagraph></paragraph></section><section id="id96ff307be20b441cb755d13ac8642880"><enum>4.</enum><header>Technology verification measures</header><subsection id="id895ECAE1266F4F2D9D9295FBB8DAE6C3"><enum>(a)</enum><header>Rule making</header><text>The Commission shall—</text><paragraph id="id437C3EB4D0DB4D9896BABE2BF4C4BEEF"><enum>(1)</enum><text>not later than 30 days after the date of enactment of this Act, issue a notice of proposed rule making to require covered platforms to adopt and operate technology verification measures on the platform to ensure that—</text><subparagraph id="idA83FEBC6C5EA4D1DBDDBFAD35E016871"><enum>(A)</enum><text>users of the covered platform are not minors; and</text></subparagraph><subparagraph id="id29D0D854B7A24A029A488C20A124DC92"><enum>(B)</enum><text>minors are prevented from 
accessing any content on the covered platform that is harmful to minors; and</text></subparagraph></paragraph><paragraph id="idCA9A79BADCD24D0B94017A5F11AC8C27"><enum>(2)</enum><text>not later than 1 year after issuing the notice of proposed rule making under paragraph (1), issue the final rule.</text></paragraph></subsection><subsection id="idA1D81D906F2A4159AF7946B1757ED12D"><enum>(b)</enum><header>Requirements</header><text>The rule described in subsection (a) shall—</text><paragraph id="id8329845456f54b4f9eab42a90c3e0afe"><enum>(1)</enum><text>set the applicable verification standards and metrics to which a covered platform using a technology verification measure is required to adhere when determining whether it is more likely than not that a user of the covered platform is not a minor;</text></paragraph><paragraph id="id574E19672B684E59AD5CEBCB953B3FC0"><enum>(2)</enum><text>require covered platforms to—</text><subparagraph id="id5106DED3951F43D79DF78A343DF643B1"><enum>(A)</enum><text>adopt technology verification measures that adhere to the standards and metrics set by the Commission under paragraph (1); and</text></subparagraph><subparagraph id="idF0990591A0DD4D2E8C2ED863773EB619"><enum>(B)</enum><text>make publicly available the verification process that the covered platform is employing to comply with the requirements under this Act;</text></subparagraph></paragraph><paragraph id="idA01F8546F8CB4258947E7D1B0F943AA8"><enum>(3)</enum><text>provide that requiring a user to confirm that the user is not a minor shall not be sufficient to satisfy the requirements under subparagraphs (A) and (B) of subsection (a)(1);</text></paragraph><paragraph id="idF08EDE61D5F54F9BA41F7A512942D2E0"><enum>(4)</enum><text>subject the Internet Protocol (IP) addresses of all users, including known virtual private network IP addresses, of a covered platform to the requirements described in subparagraphs (A) and (B) of subsection (a)(1) unless the covered platform (or third party 
described in paragraph (6)), according to standards set by the Commission, determines, based on available technology, that a user is not located within the United States;</text></paragraph><paragraph id="id5b3e4d7ceef643eda4c5c1c84be4b2b2"><enum>(5)</enum><text>permit covered platforms to choose the technology verification measure that ensures the verification of users in accordance with the standards and metrics set by the Commission under paragraph (1), provided that the technology verification measure employed by the covered platform prohibits a minor from accessing the platform or any information on the platform that is obscene, child pornography, or harmful to minors; </text></paragraph><paragraph id="idDF6FA26A1C304C82AD5647D1822922B0"><enum>(6)</enum><text>permit covered platforms to contract with a third party to employ technology verification measures, and provide that use of such a third party shall not relieve the covered platform of the requirements under subparagraphs (A) and (B) of subsection (a)(1) or the enforcement actions described in section 6; </text></paragraph><paragraph id="id424F6A291B3E4BBD8845AB57360F82D7"><enum>(7)</enum><text>require the Commission to establish a process for each covered platform to submit only such documents or other materials as are necessary for the Commission to ensure full compliance with the requirements of the rule; and</text></paragraph><paragraph id="id6397509d0ad0424fb9c8430f7d6f36dc"><enum>(8)</enum><text>require the Commission to—</text><subparagraph id="id2A27DF54950D4B5E91A06EC46002E7C1"><enum>(A)</enum><text>conduct regular audits of covered platforms to ensure compliance with the requirements under this subsection; and</text></subparagraph><subparagraph id="id9C9DC92966E143AFA68C6922CAD2626F"><enum>(B)</enum><text>make public the terms and processes for the audits conducted under subparagraph (A), including the processes for any third party conducting an audit on behalf of the 
Commission.</text></subparagraph></paragraph></subsection><subsection id="id74258253bb8542b08728742c89783366"><enum>(c)</enum><header>Compliance</header><paragraph id="id21D454B0B5A74A0BB00839A07B1CCEED"><enum>(1)</enum><header>Deadline</header><text>Not later than 180 days after the date on which the final rule is issued under subsection (a)(2), each covered platform shall comply with the requirements under the final rule.</text></paragraph><paragraph id="idfa216eb9971c4811b4f57af06a4181de"><enum>(2)</enum><header>Appropriate documents, materials, and measures</header><text>The Commission shall prescribe the appropriate documents, materials, or other measures required to ensure full compliance with the requirements of the final rule issued under subsection (a)(2). </text></paragraph></subsection><subsection id="id7E360ED29B384FED8F1998D22E12CD61"><enum>(d)</enum><header>Contracting with third parties</header><text>The Commission may create a process to contract with independent third-party auditors to conduct regular audits on behalf of the Commission under subsection (b)(8). 
</text></subsection><subsection id="id977F662912E1461A9F275CF8852CA425"><enum>(e)</enum><header>Rule of construction</header><text>Nothing in this section shall be construed to require a covered platform to submit any information that identifies, is linked to, or is reasonably linkable to a user of the covered platform or a device that is linked or reasonably linkable to a user of the covered platform.</text></subsection></section><section id="id488004C533D14DAD860F2998D4BC145A"><enum>5.</enum><header>Consultation requirements</header><text display-inline="no-display-inline">In issuing the rule required under section 4, the Commission shall consult with the following individuals, including with respect to the applicable standards and metrics for making a determination on whether it is more likely than not that a user of a covered platform is not a minor:</text><paragraph id="id7f9a29f6d93c4feb895cbb99ff0e5b9f"><enum>(1)</enum><text>Individuals with experience in computer science and software engineering.</text></paragraph><paragraph id="ida1106d83665c40f79cbe3ce40bd15563"><enum>(2)</enum><text>Individuals with experience in—</text><subparagraph id="id908E9D7BD08F41A6A4246D03933E7A6F"><enum>(A)</enum><text>advocating for online child safety; or</text></subparagraph><subparagraph id="idEEB4C375B3E04574A627DB8BFDBBA54D"><enum>(B)</enum><text>providing services to minors who have been victimized by online child exploitation.</text></subparagraph></paragraph><paragraph id="ida23c28e8f6224996b000fb390430fb20"><enum>(3)</enum><text>Individuals with experience in consumer protection and online privacy.</text></paragraph><paragraph id="id867e4d88e5bf473bb7ddb08358a72c14"><enum>(4)</enum><text>Individuals who supply technology verification measure products or have expertise in technology verification measure solutions.</text></paragraph><paragraph id="id398d4beae4c54dadbd811345f99da4f3"><enum>(5)</enum><text>Individuals with experience in data security and 
cryptography.</text></paragraph></section><section id="id082c3410fcf14131a249ed882af2467e"><enum>6.</enum><header>Civil penalty for violations</header><subsection id="id88750b62e09041e69206f44e2e7bf713"><enum>(a)</enum><header>Notification</header><text>If the Commission has a sound basis to conclude that a covered platform has violated the final rule issued under section 4, the Commission shall notify the covered platform with a brief description of the specific violation and recommended measures to correct the violation.</text></subsection><subsection id="ida2b0b41eda104602ae27d1c6d238759a"><enum>(b)</enum><header>Penalty</header><paragraph id="id0EC6A20F86284388B573ADB960C22781"><enum>(1)</enum><header>In general</header><text>A covered platform that has not provided evidence of compliance or corrected a violation that has been noticed by the Commission under subsection (a) within 72 hours of the receipt of such notice shall be subject to a civil penalty in an amount that is not more than $25,000 per violation. </text></paragraph><paragraph id="id1D5248C83BEE4DFABFAFA15F18D9C99F"><enum>(2)</enum><header>Separate violations</header><text>For the purposes of paragraph (1), each day of violation of the final rule issued under section 4 shall constitute a separate violation. 
</text></paragraph><paragraph id="idBAE95070F2454F49A0FF9C08515E98C3"><enum>(3)</enum><header>Appeal</header><text>A covered platform may appeal any civil penalty issued by the Commission under this subsection in an appropriate district court of the United States.</text></paragraph><paragraph id="id302882911083473B8A32A7394D7A8211"><enum>(4)</enum><header>Use of amounts</header><text>Any amounts collected under this subsection shall be used by the Commission to carry out enforcement of the final rule issued under section 4.</text></paragraph></subsection><subsection id="id5944560cc8cb4a87b7dad7cf437a72a6"><enum>(c)</enum><header>Enforcement</header><text>The Commission may—</text><paragraph id="idC419DF7046144B369739A0A232DBA37A"><enum>(1)</enum><text>file a claim in an appropriate district court of the United States to enforce this section;</text></paragraph><paragraph id="id0027D99A759748B690FB9F935339EAC3"><enum>(2)</enum><text>seek a temporary or permanent injunction from an appropriate district court of the United States on such terms as the court deems reasonable to prevent or restrain a violation of the final rule issued under section 4; </text></paragraph><paragraph id="idd0a9e125ae804ddb9ae967ed09a3a792"><enum>(3)</enum><text>after 30 days of non-compliance with section 4 and a demonstration of a lack of good faith on the part of the covered platform to comply with section 4, seek a permanent or temporary injunction to restrict the operation of the covered platform within the United States; and</text></paragraph><paragraph id="id7e26f28fe60e4a818cde3c97580127b1"><enum>(4)</enum><text>after 30 days of non-compliance with section 4 and a demonstration of a lack of good faith on the part of the covered platform to comply with section 4, seek an order to allow the Commission to prohibit a covered platform from engaging in any online economic transactions within the United States. 
</text></paragraph></subsection><subsection id="id97193C47C43648C0AA74498B0E1CF2E9"><enum>(d)</enum><header>Duration</header><text>The terms of an injunction or an order issued under paragraph (2), (3), or (4) of subsection (c) with respect to a covered platform shall only be valid until such time as the covered platform demonstrates to the Commission full compliance with the requirements of the final rule issued under section 4. </text></subsection></section><section id="id517E35E9BFCB4155836DFED12B2E5727"><enum>7.</enum><header>GAO report</header><text display-inline="no-display-inline">Not later than 2 years after the date on which covered platforms are required to comply with the final rule issued under section 4(a)(2), the Comptroller General of the United States shall submit to Congress a report that includes—</text><paragraph id="idcab77d3b179d4969b6cd823ff7801cbc"><enum>(1)</enum><text>an analysis of the effectiveness of the technology verification measures required under the final rule;</text></paragraph><paragraph id="id606B55FAB07F450E88228C47E8398F45"><enum>(2)</enum><text>recommendations to the Commission for improvements to the final rule; and</text></paragraph><paragraph id="idC44847BAB05E4F1C8D4CC51B86FED843"><enum>(3)</enum><text>recommendations to Congress on future legislative improvements. </text></paragraph></section><section id="id8d8e68b5613d45ffbe9dd6cbe98a2fb0"><enum>8.</enum><header>Severability clause</header><text display-inline="no-display-inline">If any provision of this Act, or the application of such a provision to any person or circumstance, is held to be unconstitutional, the remaining provisions of this Act, and the application of such provisions to any other person or circumstance, shall not be affected thereby. </text></section></legis-body></bill> 

