[Congressional Record Volume 170, Number 143 (Monday, September 16, 2024)]
[Senate]
[Pages S6056-S6058]
From the Congressional Record Online through the Government Publishing Office [www.gpo.gov]

  SA 3277. Mr. SCHUMER (for himself, Mr. Rounds, and Mr. Heinrich) 
submitted an amendment intended to be proposed by him to the bill S. 
4638, to authorize appropriations for fiscal year 2025 for military 
activities of the Department of Defense, for military construction, and 
for defense activities of the Department of Energy, to prescribe 
military personnel strengths for 
such fiscal year, and for other purposes; which was ordered to lie on 
the table; as follows:

       At the appropriate place in title XVI, insert the 
     following:

     SEC. ___. PHYSICAL AND CYBERSECURITY PROCUREMENT REQUIREMENTS 
                   FOR ARTIFICIAL INTELLIGENCE SYSTEMS.

       (a) Definitions.--In this section:
       (1) Artificial intelligence.--The term ``artificial 
     intelligence'' has the meaning given such term in section 
     5002 of the National Artificial Intelligence Initiative Act 
     of 2020 (15 U.S.C. 9401).
       (2) Covered artificial intelligence technology.--The term 
     ``covered artificial intelligence technology'' means an 
     artificial intelligence system procured by the Department of 
     Defense and all components of the development and deployment 
     lifecycle of that artificial intelligence system, including 
     source code, numerical parameters (such as model weights) of 
     the trained artificial intelligence system, details of any 
     methods and algorithms used to develop that system, data used 
     in the development of the system, and software used for 
     evaluating the trustworthiness of the artificial intelligence 
     system during development or deployment.
       (3) Covered entity.--The term ``covered entity'' means an 
     entity that enters into a contract with the Department of 
     Defense under which the entity engages in the development, 
     deployment, storage, or transportation of a covered 
     artificial intelligence technology.
       (b) Security Framework.--
       (1) In general.--The Secretary of Defense shall develop a 
     framework describing best practices for artificial 
     intelligence cybersecurity and physical security to mitigate 
     risks to the Department of Defense from the use of covered 
     artificial intelligence technologies.
       (2) Coverage of relevant aspects of security.--The 
     framework developed under paragraph (1) shall cover all 
     relevant aspects of the security of artificial intelligence 
     systems, including the following:
       (A) Workforce risks, such as insider threat risks.
       (B) Supply chain risks, such as data poisoning risks.
       (C) Risks relating to adversarial tampering with artificial 
     intelligence systems.
       (D) Risks relating to unintended exposure or theft of 
     artificial intelligence systems.
       (3) Risk-based framework.--The framework developed under 
     paragraph (1) shall be risk-based, with higher security 
     levels corresponding proportionally to the national security 
     or foreign policy risks posed by the covered artificial 
     intelligence technology being stolen or tampered with.
       (4) Use of existing frameworks.--To the maximum extent 
     feasible, the framework developed under paragraph (1) shall--
       (A) draw on existing cybersecurity references, such as the 
     NIST Special Publication 800 series; and
       (B) be implemented as an extension or augmentation of 
     existing cybersecurity frameworks developed by the Department 
     of Defense, such as the Cybersecurity Maturity Model 
     Certification framework.
       (5) Addressing extreme security risks.--
       (A) Highly capable cyber threat actors.--The framework 
     developed under paragraph (1) shall take into account that 
     the most highly capable artificial intelligence systems may 
     be of great interest to the most highly capable cyber threat 
     actors, such as intelligence and defense agencies of peer and 
     near-peer nations.
       (B) Security levels.--The Secretary of Defense shall ensure 
     that cybersecurity frameworks provided for contractors 
     contain security levels designed to mitigate risks posed by 
     cyber threat actors described in subparagraph (A), with the 
     highest levels being similar in scope to the level of 
     protection offered by national security systems.
       (C) General design with specific components.--To the extent 
     feasible, any additional security levels developed under 
     subparagraph (B) shall be designed generally for all software 
     systems, but may contain components designed specifically for 
     highly capable artificial intelligence systems.
       (c) Security Requirements.--
       (1) In general.--The Secretary may amend the Defense 
     Federal Acquisition Regulation Supplement, or take other 
     similar action, to require covered entities to implement the 
     best practices described in the framework developed under 
     subsection (b).
       (2) Risk-based rules.--Requirements implemented in rules 
     developed under paragraph (1) shall be as narrowly tailored 
     as practicable to the specific covered artificial 
     intelligence technologies developed, deployed, stored, or 
     transported by a covered entity, and shall be calibrated 
     according to the different tasks involved in development, 
     deployment, storage, or transportation of components of those 
     covered artificial intelligence technologies.
       (3) Cost-benefit consideration.--
       (A) In general.--In implementing paragraph (1), the 
     Secretary shall--
       (i) consider the costs and benefits, to the Department and 
     to United States national security and technological 
     leadership, of imposing security requirements on covered 
     entities; and
       (ii) to the extent feasible, design requirements in a way 
     that minimizes costs and maximizes benefits.
       (B) Weighing costs of slowing down development.--In 
     carrying out subparagraph (A), the Secretary shall, in 
     particular, weigh the costs of slowing down artificial 
     intelligence development and deployment against the benefits 
     of mitigating national security risks and potential security 
     risks to the Department of Defense from using commercial 
     software.
       (d) Reporting Requirements.--Not later than 180 days after 
     the date of the enactment of this Act, the Secretary shall 
     submit to the congressional defense committees an update on 
     the status of implementation of the requirements of this 
     section.

     SEC. ___. PUBLIC-PRIVATE CYBERSECURITY PARTNERSHIP FOR HIGHLY 
                   CAPABLE ARTIFICIAL INTELLIGENCE SYSTEMS.

       (a) Establishment Required.--Not later than 180 days after 
     the date of the enactment of this Act, the Assistant 
     Secretary of Defense for Cyber Policy shall establish a 
     public-private partnership body to address cybersecurity 
     threats to highly capable artificial intelligence systems.
       (b) Forum for Engagement.--The partnership body established 
     under subsection (a) shall serve as a forum for engagement 
     between the Department of Defense and commercial industry 
     partners to align and enhance cybersecurity frameworks and 
     practices applicable to both national security systems and 
     artificial intelligence systems at risk from sophisticated 
     state actors.
       (c) Purpose.--The public-private partnership body established 
     under subsection (a) shall--
       (1) convene regular engagements to discuss cybersecurity 
     threats specific to highly capable artificial intelligence 
     systems, with a focus on both current and emerging threats 
     posed by state-sponsored cyber actors;
       (2) facilitate the development, sharing, and alignment of 
     best practices and robust cybersecurity frameworks between 
     the Department and commercial industry to protect artificial 
     intelligence systems;
       (3) promote collaborative threat intelligence sharing 
     between the Department and commercial entities, with 
     particular attention to vulnerabilities in artificial 
     intelligence systems used in critical infrastructure, defense 
     operations, and sensitive national security functions; and
       (4) develop recommendations for cybersecurity policy 
     enhancements aimed at safeguarding artificial intelligence 
     technologies from state-sponsored cyber attacks and report 
     findings and policy recommendations to Congress on an annual 
     basis.
       (d) Participants.--The public-private partnership body 
     established under subsection (a) shall include representatives 
     from--
       (1) the Department of Defense, including--
       (A) the Office of the Assistant Secretary of Defense for 
     Cyber Policy;
       (B) the Under Secretary of Defense for Intelligence and 
     Security;
       (C) the Chief Information Officer of the Department of 
     Defense;
       (D) the Chief Digital and Artificial Intelligence Officer 
     of the Department of Defense;
       (E) the Defense Advanced Research Projects Agency;
       (F) the National Security Agency;
       (G) United States Cyber Command; and
       (H) such other Department of Defense agencies with 
     responsibilities for cybersecurity or artificial intelligence 
     systems as the Assistant Secretary considers relevant;
       (2) commercial industry companies with expertise in highly 
     capable artificial intelligence systems or cybersecurity 
     practices, including--
       (A) cloud computing and artificial intelligence service 
     providers;
       (B) cybersecurity companies;
       (C) artificial intelligence research and development 
     companies;
       (D) telecommunications companies; and
       (E) such other industry leaders as the Assistant Secretary 
     identifies as relevant and appropriate; and
       (3) federally funded research and development centers, 
     national laboratories, and academic institutions with 
     demonstrated expertise in artificial intelligence or 
     cybersecurity.
       (e) Meetings.--The engagements described in subsection 
     (c)(1) shall include convenings not less frequently than 
     semiannually--
       (1) to identify key threats to artificial intelligence 
     systems in both the Department and commercial sectors, with 
     an emphasis on threats posed by sophisticated state actors;
       (2) to align the most robust cybersecurity frameworks 
     applicable to national security systems and those artificial 
     intelligence systems used in commercial sectors that are 
     deemed critical to national security; and
       (3) to assess the cybersecurity readiness of artificial 
     intelligence systems and artificial intelligence developers 
     and providers and make recommendations to improve protective 
     measures against cyber threats to artificial intelligence 
     systems and artificial intelligence developers and providers.
       (f) Reporting Requirements.--Not later than one year after 
     the establishment of the public-private partnership body 
     under subsection (a), and not less frequently than once each 
     year thereafter, the Assistant Secretary shall submit to the 
     congressional defense committees a report summarizing--
       (1) the key findings from the meetings held under 
     subsection (e), including identified cybersecurity 
     vulnerabilities in artificial intelligence systems;
       (2) recommendations for enhancing cybersecurity policy and 
     practices to protect artificial intelligence systems across 
     both the Department and commercial sectors; and
       (3) an analysis of the progress made in aligning Department 
     and commercial cybersecurity frameworks to address state-
     sponsored cyber threats.