[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[S. 3952 Introduced in Senate (IS)]

<DOC>






119th CONGRESS
  2d Session
                                S. 3952

To establish artificial intelligence standards, metrics, and evaluation 
 tools, to support artificial intelligence research, development, and 
 capacity building activities, to promote innovation in the artificial 
 intelligence industry by ensuring companies of all sizes can succeed 
                  and thrive, and for other purposes.


_______________________________________________________________________


                   IN THE SENATE OF THE UNITED STATES

                           February 26, 2026

     Mr. Young (for himself, Ms. Cantwell, Mrs. Blackburn, and Mr. 
 Hickenlooper) introduced the following bill; which was read twice and 
   referred to the Committee on Commerce, Science, and Transportation

_______________________________________________________________________

                                 A BILL


 
To establish artificial intelligence standards, metrics, and evaluation 
 tools, to support artificial intelligence research, development, and 
 capacity building activities, to promote innovation in the artificial 
 intelligence industry by ensuring companies of all sizes can succeed 
                  and thrive, and for other purposes.

    Be it enacted by the Senate and House of Representatives of the 
United States of America in Congress assembled,

SECTION 1. SHORT TITLE; TABLE OF CONTENTS.

    (a) Short Title.--This Act may be cited as the ``Future of 
Artificial Intelligence Innovation Act of 2026''.
    (b) Table of Contents.--The table of contents for this Act is as 
follows:

Sec. 1. Short title; table of contents.
Sec. 2. Sense of Congress.
    TITLE I--VOLUNTARY ARTIFICIAL INTELLIGENCE STANDARDS, METRICS, 
       EVALUATION TOOLS, TESTBEDS, AND INTERNATIONAL COOPERATION

Sec. 100. Definitions.
Subtitle A--Center for Artificial Intelligence Standards and Innovation 
                              and Testbeds

Sec. 101. Center for Artificial Intelligence Standards and Innovation.
Sec. 102. Interagency coordination and program to facilitate artificial 
                            intelligence testbeds.
Sec. 103. National Institute of Standards and Technology and Department 
                            of Energy testbed to identify, test, and 
                            synthesize new materials.
Sec. 104. Coordination, reimbursement, and savings provisions.
Sec. 105. Progress report.
                 Subtitle B--International Cooperation

Sec. 111. International coalitions on innovation, development, and 
                            alignment of standards with respect to 
                            artificial intelligence.
       Subtitle C--Identifying Regulatory Barriers to Innovation

Sec. 121. Comptroller General of the United States identification of 
                            risks and obstacles relating to artificial 
                            intelligence and Federal agencies.
   TITLE II--ARTIFICIAL INTELLIGENCE RESEARCH, DEVELOPMENT, AND CAPACITY 
                           BUILDING ACTIVITIES

Sec. 201. Public data for artificial intelligence systems.
Sec. 202. Federal grand challenges in artificial intelligence.
             TITLE III--RESEARCH SECURITY AND OTHER MATTERS

Sec. 301. Research security.
Sec. 302. Expansion of authority to hire critical technical experts.
Sec. 303. Certifications and audits of temporary fellows.

SEC. 2. SENSE OF CONGRESS.

    It is the sense of Congress that policies affecting artificial 
intelligence should maximize the potential, development, and use of 
artificial intelligence to benefit all private and public stakeholders.

    TITLE I--VOLUNTARY ARTIFICIAL INTELLIGENCE STANDARDS, METRICS, 
       EVALUATION TOOLS, TESTBEDS, AND INTERNATIONAL COOPERATION

SEC. 100. DEFINITIONS.

    In this title:
            (1) Artificial intelligence.--The term ``artificial 
        intelligence'' has the meaning given such term in section 5002 
        of the National Artificial Intelligence Initiative Act of 2020 
        (15 U.S.C. 9401).
            (2) Artificial intelligence model.--The term ``artificial 
        intelligence model'' means a component of an artificial 
        intelligence system that is--
                    (A) derived using mathematical, computational, 
                statistical, or machine-learning techniques; and
                    (B) used as part of an artificial intelligence 
                system to produce outputs from a given set of inputs.
            (3) Artificial intelligence system.--The term ``artificial 
        intelligence system'' means an engineered or machine-based 
        system that--
                    (A) can, for a given set of objectives, generate 
                outputs such as predictions, recommendations, or 
                decisions influencing real or virtual environments; and
                    (B) is designed to operate with varying levels of 
                autonomy.
            (4) Critical infrastructure.--The term ``critical 
        infrastructure'' has the meaning given such term in section 
        1016(e) of the Uniting and Strengthening America by Providing 
        Appropriate Tools Required to Intercept and Obstruct Terrorism 
        (USA PATRIOT ACT) Act of 2001 (42 U.S.C. 5195c(e)).
            (5) Federal laboratory.--The term ``Federal laboratory'' 
        has the meaning given such term in section 4 of the Stevenson-
        Wydler Technology Innovation Act of 1980 (15 U.S.C. 3703).
            (6) Foundation model.--The term ``foundation model'' means 
        an artificial intelligence model trained on broad data at scale 
        and adaptable to a wide range of downstream tasks.
            (7) National laboratory.--The term ``National Laboratory'' 
        has the meaning given such term in section 2 of the Energy 
        Policy Act of 2005 (42 U.S.C. 15801).
            (8) Testbed.--The term ``testbed'' means a facility or 
        mechanism, virtual or otherwise, equipped for conducting 
        rigorous, transparent, and replicable testing of tools and 
        technologies, including artificial intelligence systems, to 
        help evaluate the functionality, trustworthiness, usability, 
        and performance of those tools or technologies.

Subtitle A--Center for Artificial Intelligence Standards and Innovation 
                              and Testbeds

SEC. 101. CENTER FOR ARTIFICIAL INTELLIGENCE STANDARDS AND INNOVATION.

    The National Institute of Standards and Technology Act (15 U.S.C. 
271 et seq.) is amended by inserting after section 22A (15 U.S.C. 278h-
1) the following:

``SEC. 22B. CENTER FOR ARTIFICIAL INTELLIGENCE STANDARDS AND 
              INNOVATION.

    ``(a) Definitions.--In this section:
            ``(1) Agency.--The term `agency' has the meaning given the 
        term `Executive agency' in section 105 of title 5, United 
        States Code.
            ``(2) Artificial intelligence.--The term `artificial 
        intelligence' has the meaning given such term in section 5002 
        of the National Artificial Intelligence Initiative Act of 2020 
        (15 U.S.C. 9401).
            ``(3) Artificial intelligence blue-teaming.--The term 
        `artificial intelligence blue-teaming' means an effort to 
        conduct operational vulnerability evaluations and provide 
        mitigation techniques to entities that have a need for an 
        independent technical review of the security posture of an 
        artificial intelligence system.
            ``(4) Artificial intelligence red-teaming.--The term 
        `artificial intelligence red-teaming' means structured 
        adversarial testing efforts of an artificial intelligence 
        system.
            ``(5) Federal laboratory.--The term `Federal laboratory' 
        has the meaning given such term in section 4 of the Stevenson-
        Wydler Technology Innovation Act of 1980 (15 U.S.C. 3703).
            ``(6) Foundation model.--The term `foundation model' means 
        an artificial intelligence model trained on broad data at scale 
        and adaptable to a wide range of downstream tasks.
            ``(7) Synthetic content.--The term `synthetic content' 
        means information, such as images, videos, audio clips, and 
        text, that has been significantly modified or generated by 
        algorithms, including by an artificial intelligence system.
            ``(8) Testbed.--The term `testbed' means a facility or 
        mechanism, virtual or otherwise, equipped for conducting 
        rigorous, transparent, and replicable testing of tools and 
        technologies, including artificial intelligence systems, to 
        help evaluate the functionality, trustworthiness, usability, 
        and performance of those tools or technologies.
            ``(9) Watermarking.--The term `watermarking' means the act 
        of embedding provenance and authenticity information that is 
        intended to be difficult to remove into outputs generated by 
        artificial intelligence systems or in original content, 
        including outputs such as text, images, audio, videos, software 
        code, or any other digital content or data, for the purposes of 
        verifying and maintaining the authenticity, integrity, and 
        reliability of the output or the identity or characteristics of 
        its provenance, modifications, or conveyance.
    ``(b) Establishment of Center for Artificial Intelligence Standards 
and Innovation.--
            ``(1) In general.--Not later than 90 days after the date of 
        the enactment of the Future of Artificial Intelligence 
        Innovation Act of 2026, the Director shall establish a center 
        on artificial intelligence within the Institute.
            ``(2) Designation.--The center established pursuant to 
        paragraph (1) shall be known as the `Center for Artificial 
        Intelligence Standards and Innovation' (in this section the 
        `Center').
            ``(3) Mission.--The mission of the Center is to assist the 
        private sector and agencies in developing voluntary best 
        practices for the robust assessment of artificial intelligence 
        systems, which may be contributed to or inform the work on such 
        practices in standards development organizations.
    ``(c) Functions.--
            ``(1) In general.--The functions of the Center, which the 
        Center shall carry out in coordination with the laboratories of 
        the Institute, include the following:
                    ``(A) Using publicly available or voluntarily 
                provided information, assessing artificial intelligence 
                systems and developing guidelines and best practices to 
                measure and improve the secure development, deployment, 
                and use of artificial intelligence technology.
                    ``(B) Supporting artificial intelligence red-
                teaming, sharing best practices, and coordinating on 
                building testbeds and test environments with allies and 
                international partners of the United States.
                    ``(C) Developing and publishing physical and 
                cybersecurity tools, methodologies, best practices, 
                voluntary guidelines, and other supporting information 
                to assist persons who maintain systems used to create 
                or train artificial intelligence models with 
                discovering and mitigating vulnerabilities and attacks, 
                including manipulation through data poisoning, 
                including those that may be exploited by foreign 
                adversaries.
                    ``(D) Establishing artificial intelligence blue-
                teaming capabilities to support mitigation approaches 
                and partnering with industry to address the reliability 
                of artificial intelligence systems.
                    ``(E) Developing tools, methodologies, best 
                practices, and voluntary guidelines for detecting 
                synthetic content, authenticating content and tracking 
                of the provenance of content, labeling original and 
                synthetic content, such as by watermarking, and 
                evaluating software and systems relating to detection 
                and labeling of synthetic content.
                    ``(F) Coordinating or developing metrics and 
                methodologies for testing artificial intelligence 
                systems, including the following:
                            ``(i) Cataloging existing artificial 
                        intelligence metrics and evaluation 
                        methodologies used in industry and academia.
                            ``(ii) Testing the efficacy of existing 
                        metrics and evaluations.
                            ``(iii) Documenting tools that assess 
                        reliability, accuracy, and robustness.
                    ``(G) Coordinating with counterpart international 
                institutions, partners, and allies to support global 
                interoperability in the development of research and 
                testing of standards relating to artificial 
                intelligence.
                    ``(H) Producing resources for Federal agencies to 
                conduct their own evaluations of artificial 
                intelligence systems to best fulfill their missions.
                    ``(I) Convening meetings on a semiannual basis with 
                Federal agencies and the private sector--
                            ``(i) to share information and best 
                        practices on building artificial intelligence 
                        evaluations; and
                            ``(ii) to accelerate the development and 
                        adoption of national standards for artificial 
                         intelligence systems in sectors including 
                         biotechnology, agriculture, and health care.
                    ``(J) Examining safeguards and best practices to 
                secure artificial intelligence systems from cyber 
                attacks.
                    ``(K) Examining safeguards and best practices to 
                protect against unintended use of artificial 
                intelligence for the purpose of developing chemical, 
                biological, radiological, nuclear, and energy-security 
                threats or hazards.
                    ``(L) Providing, in consultation with the Secretary 
                of Homeland Security and the Director of the 
                Cybersecurity and Infrastructure Security Agency, a 
                toolkit for best practices in anticipating, responding 
                to, and recovering from cybersecurity incidents 
                involving artificial intelligence systems. Such toolkit 
                may include guidance on remediating and responding to 
                known artificial intelligence-specific vulnerabilities.
                    ``(M) Developing, and curating, in consultation 
                with the Secretary of Labor, a list of high-priority 
                occupations for training for the advancement and 
                deployment of artificial intelligence.
                    ``(N) Developing best practices on minimum data 
                quality standards for the use of biological, materials 
                science, chemical, physical, and other scientific areas 
                in artificial intelligence model training.
                    ``(O) Examining, in consultation with the heads of 
                other relevant Federal agencies, the vulnerabilities in 
                the supply chain of hardware, including semiconductors 
                and microelectronics, that are critical to enabling the 
                development and deployment of artificial intelligence.
                    ``(P) Examining ways in which artificial 
                intelligence may be used by the Federal Government in 
                combating fraud and other unfair or deceptive 
                practices.
                    ``(Q) Identifying proven, scalable, and interoperable 
                techniques and metrics to promote the development of 
                artificial intelligence.
    ``(d) Center for Artificial Intelligence Standards and Innovation 
Consortium.--
            ``(1) Establishment.--
                    ``(A) In general.--Not later than 180 days after 
                the date of the enactment of the Future of Artificial 
                Intelligence Innovation Act of 2026, the Director shall 
                establish a consortium of stakeholders from academic or 
                research communities, Federal laboratories, private 
                industry, including companies of all sizes with 
                different roles in the use of artificial intelligence 
                systems (including developers, deployers, evaluators, 
                and users), and civil society with expertise in matters 
                relating to artificial intelligence, to support the 
                Center in carrying out the functions set forth under 
                subsection (c).
                    ``(B) Designation.--The consortium established 
                pursuant to subparagraph (A) shall be known as the 
                `Center for Artificial Intelligence Standards and 
                Innovation Consortium'.
            ``(2) Consultation.--The Director shall consult with the 
        consortium established under this subsection not less 
        frequently than quarterly.
            ``(3) Annual reports to congress.--Not later than 1 year 
        after the date of the enactment of the Future of Artificial 
        Intelligence Innovation Act of 2026 and not less frequently 
        than once each year thereafter, the Director shall submit to 
        the Committee on Commerce, Science, and Transportation of the 
        Senate and the Committee on Science, Space, and Technology of 
        the House of Representatives a report summarizing the 
        contributions of the members of the consortium established 
        under this subsection in support of the efforts of the Center.
    ``(e) Voluntary Artificial Intelligence Testing Standards.--In 
carrying out the functions under subsection (c), the Director shall 
support and contribute to the development of voluntary, consensus-based 
technical standards for testing artificial intelligence system 
components, including by addressing, as the Director considers 
appropriate, the following:
            ``(1) Physical infrastructure for training or developing 
        artificial intelligence models and systems, including cloud 
        infrastructure.
            ``(2) Physical infrastructure for operating artificial 
        intelligence systems, including cloud infrastructure.
            ``(3) Data for training artificial intelligence models.
            ``(4) Data for evaluating the functionality and 
        trustworthiness of trained artificial intelligence models and 
        systems.
            ``(5) Trained or partially trained artificial intelligence 
        models and any resulting software systems or products.
            ``(6) Human-in-the-loop testing of artificial intelligence 
        models and systems.
    ``(f) Matters Relating to Disclosure and Access.--
            ``(1) FOIA exemption.--Any confidential content, as deemed 
        confidential by the contributing private sector person, shall 
        be exempt from public disclosure under section 552(b)(3) of 
        title 5, United States Code.
            ``(2) Limitation on access to content.--Access to a 
        contributing private sector person's voluntarily provided 
        confidential content, as deemed confidential by the 
        contributing private sector person, shall be limited to the 
        contributing private sector person and the Center.
            ``(3) Aggregated information.--The Director may make 
        aggregated, deidentified information available to contributing 
        companies, the public, and other agencies, as the Director 
        considers appropriate, in support of the purposes of this 
        section.
    ``(g) Rule of Construction.--Nothing in this section shall be 
construed to provide the Director any enforcement authority that was 
not in effect on the day before the date of the enactment of the Future 
of Artificial Intelligence Innovation Act of 2026.
    ``(h) Prohibition on Access to Resources for Entities Under Control 
of Certain Foreign Governments.--
            ``(1) Definitions.--In this subsection:
                    ``(A) Covered nation.--The term `covered nation' 
                has the meaning given that term in section 4872 of 
                title 10, United States Code.
                    ``(B) Ownership, control, or influence of the 
                government of a covered nation.--The term `ownership, 
                control, or influence of the government of a covered 
                nation', with respect to an entity, means the 
                government of a covered nation--
                            ``(i) has the power to direct or decide 
                        matters affecting the entity's management or 
                        operations in a manner that could--
                                    ``(I) result in unauthorized access 
                                to classified information; or
                                    ``(II) adversely affect performance 
                                of a contract or agreement requiring 
                                access to classified information; and
                            ``(ii) exercises that power--
                                    ``(I) directly or indirectly;
                                    ``(II) through ownership of the 
                                entity's securities, by contractual 
                                arrangements, or other similar means;
                                    ``(III) by the ability to control 
                                or influence the election or 
                                appointment of one or more members to 
                                the entity's governing board (such as 
                                the board of directors, board of 
                                managers, or board of trustees) or its 
                                equivalent; or
                                    ``(IV) prospectively (such as by 
                                not currently exercising the power but 
                                having the ability to do so).
            ``(2) In general.--An entity under the ownership, control, 
        or influence of the government of a covered nation may not 
        access any of the resources of the Center.
            ``(3) Criteria for identification.--The Director, working 
        with the heads of the relevant Federal agencies, shall 
        establish criteria to determine if any entity that seeks to 
        utilize the resources of the Center is under the ownership, 
        control, or influence of the government of a covered nation.''.

SEC. 102. INTERAGENCY COORDINATION AND PROGRAM TO FACILITATE ARTIFICIAL 
              INTELLIGENCE TESTBEDS.

    (a) Definitions.--In this section:
            (1) Appropriate committees of congress.--The term 
        ``appropriate committees of Congress'' means--
                    (A) the Committee on Commerce, Science, and 
                Transportation and the Committee on Energy and Natural 
                Resources of the Senate; and
                    (B) the Committee on Science, Space, and Technology 
                of the House of Representatives.
            (2) Director.--The term ``Director'' means the Director of 
        the National Science Foundation.
            (3) Institute.--The term ``Institute'' means the National 
        Institute of Standards and Technology.
            (4) Secretary.--The term ``Secretary'' means the Secretary 
        of Energy.
            (5) Under secretary.--The term ``Under Secretary'' means 
        the Under Secretary of Commerce for Standards and Technology.
    (b) Program Required.--Not later than 1 year after the date of the 
enactment of this Act, the Under Secretary and the Secretary, in 
coordination with the Director, shall jointly establish a testbed 
program to encourage collaboration and support partnerships between the 
National Laboratories, Federal laboratories, the National Institute of 
Standards and Technology, the National Artificial Intelligence Research 
Resource pilot program established by the Director, or any successor 
program, and public and private sector entities, including companies of 
all sizes, to conduct tests, evaluations, and security or vulnerability 
risk assessments, and to support research and development, of 
artificial intelligence systems, including measurement methodologies 
developed by the Institute, in order to develop standards and encourage 
development of a third-party ecosystem.
    (c) Activities.--In carrying out the program required by subsection 
(b), the Under Secretary and the Secretary--
            (1) may use the advanced computing resources, testbeds, and 
        expertise of the National Laboratories, Federal laboratories, 
        the Institute, the National Science Foundation, and private 
        sector entities to run tests and evaluations on the 
        capabilities and limitations of artificial intelligence 
        systems;
            (2) shall use existing solutions to the maximum extent 
        practicable;
            (3) shall develop automated and reproducible tests and 
        evaluations for artificial intelligence systems to the extent 
        that is practicable;
            (4) shall assess the computational resources necessary to 
        run tests and evaluations of artificial intelligence systems;
            (5) shall research methods to effectively minimize the 
        computational resources needed to run tests, evaluations, and 
        security assessments of artificial intelligence systems;
            (6) shall, where practicable, develop tests and evaluations 
        for artificial intelligence systems that are designed for high-
        , medium-, and low-computational intensity;
            (7) shall prioritize assessments by identifying security 
        vulnerabilities of artificial intelligence systems, including 
        the establishment of, and utilization of existing, classified 
        testbeds at the National Laboratories if necessary, including 
        with respect to--
                    (A) autonomous offensive cyber capabilities;
                    (B) cybersecurity vulnerabilities in the artificial 
                intelligence software ecosystem and beyond;
                    (C) chemical, biological, radiological, nuclear, 
                critical infrastructure, and energy-security threats or 
                hazards; and
                    (D) such other capabilities as the Under Secretary 
                or the Secretary determines necessary; and
            (8) shall organize a hackathon to test artificial 
        intelligence systems for security risks and vulnerabilities.
    (d) Consideration Given.--In carrying out the activities required 
by subsection (c), the Under Secretary and the Secretary shall take 
under consideration the applicability of any tests, evaluations, and 
risk assessments to artificial intelligence systems trained using 
primarily biological sequence data that could be used to enhance an 
artificial intelligence system's ability to contribute to the creation 
of a pandemic or biological weapon, including those systems used for 
gene synthesis.
    (e) Metrics.--The Under Secretary and the Secretary shall jointly 
develop metrics to assess--
            (1) the effectiveness of the program in encouraging 
        collaboration and supporting partnerships as described in 
        subsection (b); and
            (2) the impact of the program on public and private sector 
        integration and use of artificial intelligence systems.
    (f) Use of Existing Program.--In carrying out the program required 
by subsection (b), the Under Secretary, the Secretary, and the Director 
may use a program that was in effect on the day before the date of the 
enactment of this Act.
    (g) Evaluation and Findings.--Not later than 3 years after the 
start of the program required by subsection (b), the Under Secretary 
and the Secretary shall jointly--
            (1) evaluate the success of the program in encouraging 
        collaboration and supporting partnerships as described in 
        subsection (b), using the metrics developed pursuant to 
        subsection (e);
            (2) evaluate the success of the program in encouraging 
        public and private sector integration and use of artificial 
        intelligence systems by using the metrics developed pursuant to 
        subsection (e); and
            (3) submit to the appropriate committees of Congress the 
        evaluation conducted pursuant to paragraph (1) and the findings 
        of the Under Secretary, the Secretary, and the Director with 
        respect to the testbed program.
    (h) Consultation.--In carrying out subsection (b), the Under 
Secretary and the Secretary shall consult, as the Under Secretary and 
the Secretary consider appropriate, with the following:
            (1) Industry, including private artificial intelligence 
        laboratories, companies of all sizes, and representatives from 
        the United States financial sector.
            (2) Academia and institutions of higher education.
            (3) Civil society.
    (i) Establishment of Voluntary Foundation Models Test Program.--In 
carrying out the program under subsection (b), the Under Secretary and 
the Secretary shall jointly carry out a test program to provide 
vendors of foundation models, as well as vendors of artificial 
intelligence virtual agents and robots that incorporate foundation 
models, the opportunity to voluntarily test foundation models across a 
range of modalities, such as models that ingest and output text, 
images, audio, video, software code, and mixed modalities.
    (j) Matters Relating to Disclosure and Access.--
            (1) Limitation on access to content.--Access to a 
        contributing private sector person's voluntarily provided 
        confidential content, as deemed confidential by the 
        contributing private sector person, shall be limited to the 
        contributing private sector person and the Institute.
            (2) Aggregated information.--The Under Secretary and the 
        Secretary may make aggregated, deidentified information 
        available to contributing companies, the public, and other 
        agencies, as the Under Secretary considers appropriate, in 
        support of the purposes of this section.
            (3) FOIA exemption.--Any confidential content, as deemed 
        confidential by the contributing private sector person, shall 
        be exempt from public disclosure under section 552(b)(3) of 
        title 5, United States Code.
    (k) Rule of Construction.--Nothing in this section shall be 
construed to require a person to disclose any information, including 
information--
            (1) relating to a trade secret or other protected 
        intellectual property right;
            (2) that is confidential business information; or
            (3) that is privileged.
    (l) Sunset.--The programs required by subsections (b) and (i) and 
the requirements of this section shall terminate on the date that is 7 
years after the date of the enactment of this Act.

SEC. 103. NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY AND DEPARTMENT 
              OF ENERGY TESTBED TO IDENTIFY, TEST, AND SYNTHESIZE NEW 
              MATERIALS.

    (a) In General.--The Secretary of Commerce, acting through the 
Under Secretary of Commerce for Standards and Technology, and the 
Secretary of Energy may use the program established under section 
102(b) to advance materials science and energy storage and optimization 
and to support advanced manufacturing for the benefit of the United 
States economy through the use of artificial intelligence, autonomous 
laboratories, and artificial intelligence integrated with emerging 
technologies, such as quantum hybrid computing and robotics.
    (b) Support for Accelerated Technologies.--The Secretary of 
Commerce and the Secretary of Energy shall ensure that technologies 
accelerated under subsection (a) are supported by advanced algorithms 
and models, uncertainty quantification, and software and workforce 
development tools to produce benchmark data, model comparison tools, 
and best practices guides.
    (c) Public-Private Partnerships.--In carrying out subsection (a), 
the Secretary of Commerce and the Secretary of Energy shall, in 
consultation with industry, civil society, and academia, enter into 
such public-private partnerships as the Secretaries jointly determine 
appropriate.
    (d) Resources.--In carrying out this section, the Secretaries may--
            (1) use science and technology resources from the 
        Manufacturing USA Program, the Hollings Manufacturing Extension 
        Partnership, the National Laboratories, Federal laboratories, 
        and the private sector; and
            (2) use the program established under section 102(b).

SEC. 104. COORDINATION, REIMBURSEMENT, AND SAVINGS PROVISIONS.

    (a) Coordination and Duplication.--The Secretary of Commerce shall 
take such actions as may be necessary to ensure no duplication of 
activities carried out under this subtitle with the activities of--
            (1) research entities of the Department of Energy, 
        including--
                    (A) the National Laboratories; and
                    (B) the Advanced Scientific Computing Research 
                program; and
            (2) relevant industries.
    (b) National Laboratory Resources.--Any advanced computing 
resources, testbeds, expertise, or other resources of the Department of 
Energy or the National Laboratories that are provided to the National 
Science Foundation, the National Institute of Standards and Technology, 
or any other applicable entities under this subtitle shall be 
provided--
            (1) on a reimbursable basis; and
            (2) pursuant to a reimbursable agreement.
    (c) Waiver.--The Secretary may waive the requirements set forth in 
subsection (b) if the Secretary determines the waiver is necessary or 
appropriate to carry out the missions of the Department of Commerce.
    (d) Savings Provision.--Nothing in this subtitle shall be 
construed--
            (1) to modify any requirement or authority provided under 
        section 5501 of the National Artificial Intelligence Initiative 
        Act of 2020 (15 U.S.C. 9461); or
            (2) to allow the Secretary of Commerce (including the Under 
        Secretary of Commerce for Standards and Technology or the 
        Director of the Center for Artificial Intelligence Standards 
        and Innovation) or the Director of the National Science 
        Foundation to use monetary resources of the Department of 
        Energy or any National Laboratory.

SEC. 105. PROGRESS REPORT.

    (a) In General.--Not later than 1 year after the date of the 
enactment of this Act, the Under Secretary of Commerce for Standards 
and Technology shall, in coordination with the Secretary of Commerce 
and the Secretary of Energy, submit to Congress a report on the 
implementation of sections 102 and 103.
    (b) Contents.--The report submitted pursuant to subsection (a) 
shall include the following:
            (1) A description of the reimbursable agreements, 
        statements of work, and associated project schedules and 
        deliverables for the testbed program established pursuant to 
        section 102(b) and section 103(a).
            (2) Details on the total amount of reimbursable agreements 
        entered into pursuant to section 104(b).
            (3) Such additional information as the Under Secretary 
        determines appropriate.

                 Subtitle B--International Cooperation

SEC. 111. INTERNATIONAL COALITIONS ON INNOVATION, DEVELOPMENT, AND 
              ALIGNMENT OF STANDARDS WITH RESPECT TO ARTIFICIAL 
              INTELLIGENCE.

    (a) In General.--The Under Secretary of Commerce for Standards and 
Technology (in this section referred to as the ``Under Secretary'') and 
the Secretary of Energy (in this section referred to as the 
``Secretary'') shall jointly lead information exchange and coordination 
among Federal agencies and communication from Federal agencies to the 
private sector of the United States and like-minded governments of 
foreign countries to ensure effective Federal engagement in the 
development and use of international technical standards for artificial 
intelligence.
    (b) Requirements.--To support private sector-led engagement and 
ensure effective Federal engagement in the development and use of 
international technical standards for artificial intelligence, the 
Under Secretary shall seek to form alliances or coalitions with like-
minded governments of foreign countries--
            (1) to support the private sector-led development and 
        adoption of standards or alignment with respect to artificial 
        intelligence;
            (2) to encourage technical standards developed in the 
        United States to be adopted by international standards 
        organizations and to advocate for international approaches to 
        governance of artificial intelligence that promote innovation 
        and counter influence from foreign adversaries;
            (3) to facilitate international collaboration on 
        innovation, science, and advancement in artificial intelligence 
        research and development, including data sharing, expertise, 
        and resources;
            (4) to develop the government-to-government infrastructure 
        to support the activities described in paragraphs (1) through 
        (3), using existing bilateral and multilateral agreements to 
        the extent practicable;
            (5) to work with like-minded governments on identifying 
        best practices to maintain cybersecurity of artificial 
        intelligence models; and
            (6) to work in coordination with the Secretary of State, 
        the National Security Council, and the National Science 
        Foundation to develop, implement, and share information on 
        complementary technology protection measures, including in 
        basic research and higher education, to mitigate risks of 
        exploitation by foreign adversaries.
    (c) Criteria for Participation.--In forming an alliance or 
coalition of like-minded governments of foreign countries under 
subsection (b), the Secretary of Commerce, the Secretary of Energy, the 
Secretary of State, and the Director, in consultation with the heads of 
relevant agencies, shall jointly establish technology trust criteria--
            (1) to ensure all partner countries have a high level of 
        scientific and technological advancement; and
            (2) to support the principles for international standards 
        development as detailed in the Committee Decision on World 
        Trade Organization Agreement on Technical Barriers to Trade 
        (Annex 2 of Part 1 of G/TBT/1), on international standards, 
        such as transparency, openness, and consensus-based decision 
        making.
    (d) Consultation on Innovation and Advancements in Artificial 
Intelligence.--In forming an alliance or coalition under subsection 
(b), the Director, the Secretary of Commerce, and the Secretary of 
State shall consult with the Secretary of Energy and the Director of 
the National Science Foundation on approaches to innovation and 
advancements in artificial intelligence.
    (e) Security and Protection of Intellectual Property.--The 
Director, the Secretary of Commerce, the Secretary of Energy, and the 
Secretary of State shall jointly ensure that an alliance or coalition 
formed under subsection (b) is only undertaken with countries that--
            (1) have in place sufficient intellectual property 
        protections, safety standards, and risk management approaches 
        relevant to innovation and artificial intelligence; and
            (2) develop and coordinate research security measures, 
        export controls, and intellectual property protections relevant 
        to innovation, development, and standard-setting relating to 
        artificial intelligence.
    (f) Limitation on Eligibility of the People's Republic of China.--
            (1) In general.--The People's Republic of China is not 
        eligible to participate in an alliance or coalition of like-
        minded governments of foreign countries under subsection (b) 
        until the United States Trade Representative determines in a 
        report to Congress required by section 421 of the U.S.-China 
        Relations Act of 2000 (22 U.S.C. 6951) that the People's 
        Republic of China has come into compliance with the commitments 
        it made in connection with its accession to the World Trade 
        Organization.
            (2) Report required.--Upon the submission of a report 
        described in paragraph (1), the officials specified in 
        paragraph (3) shall jointly submit to Congress a report that 
        includes the following:
                    (A) A detailed justification for why government-to-
                government information exchange and coordination with 
                the Government of the People's Republic of China is in 
                the national security interests of the United States.
                    (B) An assessment of the risks and potential 
                effects of such coordination, including any potential 
                for the transfer under an alliance or coalition 
                described in paragraph (1) of technology or 
                intellectual property capable of harming the national 
                security interests of the United States.
                    (C) A detailed justification for how the officials 
                specified in paragraph (3) intend to address human 
                rights concerns in any scientific and technology 
                collaboration proposed to be conducted by such an 
                alliance or coalition.
                    (D) An assessment of the extent to which those 
                officials will be able to continuously monitor the 
                commitments made by the People's Republic of China in 
                participating in such an alliance or coalition.
                    (E) Such other information relating to such an 
                alliance or coalition as those officials consider 
                appropriate.
            (3) Officials specified.--The officials specified in this 
        paragraph are the following:
                    (A) The Director.
                    (B) The Secretary of Commerce.
                    (C) The Secretary of Energy.
                    (D) The Secretary of State.
    (g) Rule of Construction.--Nothing in this section shall be 
construed--
            (1) to prohibit a person (as defined in section 551 of 
        title 5, United States Code) from participating in an 
        international standards body; or
            (2) to constrain separate engagement with emerging 
        economies on artificial intelligence.

       Subtitle C--Identifying Regulatory Barriers to Innovation

SEC. 121. COMPTROLLER GENERAL OF THE UNITED STATES IDENTIFICATION OF 
              RISKS AND OBSTACLES RELATING TO ARTIFICIAL INTELLIGENCE 
              AND FEDERAL AGENCIES.

    (a) Report Required.--Not later than 1 year after the date of the 
enactment of this Act, the Comptroller General of the United States 
shall submit to Congress a report on regulatory impediments to 
innovation in artificial intelligence systems.
    (b) Contents.--The report submitted pursuant to subsection (a) 
shall include the following:
            (1) Significant examples of Federal statutes and 
        regulations that directly affect the innovation of artificial 
        intelligence systems, including the ability of companies of all 
        sizes to compete in artificial intelligence, which should also 
        account for the effect of voluntary standards and best 
        practices developed with contributions from the Federal 
        Government.
            (2) An evaluation of the progress in government adoption of 
        artificial intelligence and use of artificial intelligence to 
        improve the quality of government services.
            (3) An evaluation of, and examples of, where artificial 
        intelligence assists Federal agencies in delivering services to 
        the public, including in combating fraud, and ways to increase 
        opportunities for increased use of such artificial intelligence 
        systems by the Federal Government.
            (4) Examples of Federal laws and regulations relating to 
        infrastructure and energy that unduly burden artificial 
        intelligence systems.
            (5) Based on the findings of the Comptroller General with 
        respect to paragraphs (1) through (4), such recommendations as 
        the Comptroller General may have for legislative or 
        administrative action to increase the rate of innovation in 
        artificial intelligence systems.

   TITLE II--ARTIFICIAL INTELLIGENCE RESEARCH, DEVELOPMENT, CAPACITY 
                          BUILDING ACTIVITIES

SEC. 201. PUBLIC DATA FOR ARTIFICIAL INTELLIGENCE SYSTEMS.

    (a) In General.--Title LI of the National Artificial Intelligence 
Initiative Act of 2020 (15 U.S.C. 9411 et seq.) is amended by adding at 
the end the following new section:

``SEC. 5103A. PUBLIC DATA FOR ARTIFICIAL INTELLIGENCE SYSTEMS.

    ``(a) List of Priorities.--
            ``(1) In general.--To expedite the development of 
        artificial intelligence systems in the United States, the 
        Director of the Office of Science and Technology Policy (in 
        this section referred to as the `Director') shall, acting 
        through the National Science and Technology Council and the 
        Interagency Committee and in consultation with the Advisory 
        Committee on Data for Evidence Building established under 
        section 315 of title 5, United States Code, develop a list of 
        priorities for Federal investment in creating or improving 
        curated, publicly available Federal Government data for 
        training and evaluating artificial intelligence systems and 
        identify an appropriate location to host curated datasets.
            ``(2) Requirements.--
                    ``(A) In general.--The list developed pursuant to 
                paragraph (1) shall--
                            ``(i) prioritize data that will advance 
                        novel artificial intelligence systems in the 
                        public interest;
                            ``(ii) prioritize datasets that are the 
                        result of scientific research that was funded 
                        by the Federal Government; and
                            ``(iii) prioritize datasets unlikely to 
                        independently receive sufficient private sector 
                        support to enable their creation, absent 
                        Federal funding.
                    ``(B) Datasets identified.--In carrying out 
                subparagraph (A)(ii), the Director shall identify 20 
                datasets to be prioritized.
            ``(3) Considerations.--In developing the list under 
        paragraph (1), the Director shall consider the following:
                    ``(A) Applicability to the initial list of 
                societal, national, and geostrategic challenges set 
                forth by subsection (b) of section 10387 of the 
                Research and Development, Competition, and Innovation 
                Act (42 U.S.C. 19107), or any successor list.
                    ``(B) Applicability to the initial list of key 
                technology focus areas set forth by subsection (c) of 
                such section, or any successor list.
                    ``(C) Applicability to other major United States 
                economic sectors, such as agriculture, health care, 
                transportation, manufacturing, biotechnology, 
                communications, weather services, and positive utility 
                to small- and medium-sized United States businesses.
                    ``(D) Opportunities to improve datasets in effect 
                before the date of the enactment of the Future of 
                Artificial Intelligence Innovation Act of 2026.
                    ``(E) Inclusion of data representative of the 
                entire population of the United States.
                    ``(F) Potential national security threats to 
                releasing datasets, consistent with the United States 
                Government approach to data flows.
                    ``(G) Requirements of laws in effect.
                    ``(H) Applicability to the priorities listed in the 
                National Artificial Intelligence Research and 
                Development Strategic Plan of the National Science and 
                Technology Council, dated October 2016, and subsequent 
                updates, and the priorities listed in Winning the Race, 
                America's AI Action Plan, dated July 2025.
                    ``(I) Ability to use data already made available to 
                the National Artificial Intelligence Research Resource 
                Pilot program or any successor program.
                    ``(J) Coordination with other Federal open data 
                efforts, as applicable.
                    ``(K) Requirements for researchers funded by the 
                Federal Government to disclose nonproprietary, 
                nonsensitive datasets that are used by artificial 
                intelligence models during the course of research and 
                development.
                    ``(L) Opportunities for the National Science 
                Foundation to maintain integrated, interoperable, and 
                multimodal datasets readily providing access to 
                scientific and engineering demonstration projects.
            ``(4) Public input.--Before finalizing the list required by 
        paragraph (1), the Director shall implement public comment 
        procedures for receiving input and comment from private 
        industry, academia, civil society, and other relevant 
        stakeholders.
    ``(b) Interagency Committee.--In carrying out this section, the 
Interagency Committee--
            ``(1) may establish or leverage existing initiatives, 
        including through public-private partnerships, for the creation 
        or improvement of curated datasets identified in the list 
        developed pursuant to subsection (a)(1), including methods for 
        addressing data scarcity;
            ``(2) may apply the priorities set forth in the list 
        developed pursuant to subsection (a)(1) to the enactment of 
        Federal public access and open government data policies;
            ``(3) shall ensure consistency with Federal provisions of 
        law relating to privacy, including the technology and privacy 
        standards applied to the National Secure Data Service under 
        section 10375(f) of the Research and Development, Competition, 
        and Innovation Act (42 U.S.C. 19085(f)); and
            ``(4) shall ensure that no data sharing is permitted with 
        any country that the Secretary of Commerce, in consultation 
        with the Secretary of Defense, the Secretary of State, the 
        Secretary of Energy, and the Director of National Intelligence, 
        determines to be engaged in conduct that is detrimental to the 
        national security or foreign policy of the United States.
    ``(c) Availability of Datasets.--Datasets that are created or 
improved pursuant to this section--
            ``(1) shall, in the case of a dataset created or improved 
        by a Federal agency, be made available to the comprehensive 
        data inventory developed and maintained by the Federal agency 
        pursuant to section 3511(a) of title 44, United States Code, in 
        accordance with all applicable regulations; and
            ``(2) may be made available to the National Artificial 
        Intelligence Research Resource pilot program established by the 
        Director of the National Science Foundation, and the applicable 
        programs established by the Department of Energy, in accordance 
        with Executive Order 14110 (88 Fed. Reg. 75191; relating to 
        safe, secure, and trustworthy development and use of artificial 
        intelligence), or any successor program.
    ``(d) Report.--Not later than 1 year after the date of the 
enactment of the Future of Artificial Intelligence Innovation Act of 
2026, the Director shall, acting through the National Science and 
Technology Council and the Interagency Committee, submit to the 
Committee on Commerce, Science, and Transportation of the Senate and 
the Committee on Science, Space, and Technology of the House of 
Representatives a report that includes--
            ``(1) best practices in developing publicly curated 
        artificial intelligence datasets;
            ``(2) lessons learned and challenges encountered in 
        developing the curated artificial intelligence datasets;
            ``(3) principles used for artificial intelligence-ready 
        data;
            ``(4) recommendations relating to artificial intelligence-
        ready data standards and potential processes for development of 
        such standards;
            ``(5) recommendations for maintaining and expanding the 
        availability of high-quality data sets;
            ``(6) recommendations for methods to increase incentives 
        for researchers supported by the Federal Government to release 
        high-quality publicly available datasets in a manner that 
        protects against risks of disclosure of personally identifiable 
        information and against national and economic security risks; 
        and
            ``(7) recommendations for establishing secure compute 
        environments at the National Science Foundation to enable 
        secure artificial intelligence use cases for controlled access 
        to restricted Federal data.
    ``(e) Rules of Construction.--
            ``(1) In general.--Nothing in this section shall be 
        construed to require the Federal Government or other 
        contributors to disclose any information--
                    ``(A) relating to a trade secret or other protected 
                intellectual property right;
                    ``(B) that is confidential business information; or
                    ``(C) that is privileged.
            ``(2) Disclosure to public datasets.--Except as 
        specifically provided for in this section, nothing in this 
        section shall be construed to prohibit the head of a Federal 
        agency from withholding information from a public dataset.''.
    (b) Clerical Amendments.--The table of contents at the beginning of 
section 2 of the William M. (Mac) Thornberry National Defense 
Authorization Act for Fiscal Year 2021 and the table of contents at the 
beginning of title LI of such Act are both amended by inserting after 
the items relating to section 5103 the following new item:

``5103A. Public data for artificial intelligence systems.''.

SEC. 202. FEDERAL GRAND CHALLENGES IN ARTIFICIAL INTELLIGENCE.

    (a) In General.--Title LI of the National Artificial Intelligence 
Initiative Act of 2020 (15 U.S.C. 9411 et seq.), as amended by section 
201, is further amended by adding at the end the following new section:

``SEC. 5107. FEDERAL GRAND CHALLENGES IN ARTIFICIAL INTELLIGENCE.

    ``(a) Establishment of Program.--
            ``(1) In general.--Not later than 1 year after the date of 
        the enactment of the Future of Artificial Intelligence 
        Innovation Act of 2026, the Director of the Office of Science 
        and Technology Policy (acting through the National Science and 
        Technology Council) and the Interagency Committee may establish 
        a program to award prizes, using the authorities and processes 
        established under section 24 of the Stevenson-Wydler Technology 
        Innovation Act of 1980 (15 U.S.C. 3719), to eligible 
        participants as determined by the co-chairs of the Interagency 
        Committee pursuant to subsection (e).
            ``(2) Purposes.--The purposes of the program required by 
        paragraph (1) are as follows:
                    ``(A) To expedite the development of artificial 
                intelligence systems in the United States.
                    ``(B) To stimulate artificial intelligence 
                research, development, and commercialization that 
                solves or advances specific, well-defined, and 
                measurable challenges in 1 or more of the categories 
                established pursuant to subsection (b).
    ``(b) Federal Grand Challenges in Artificial Intelligence.--
            ``(1) List of priorities.--The Director of the Office of 
        Science and Technology Policy (acting through the National 
        Science and Technology Council) and the Interagency Committee, 
        in consultation with industry, civil society, and academia, 
        shall identify, and annually review and update as the Director 
        considers appropriate, a list of priorities for Federal grand 
        challenges in artificial intelligence pursuant to the purposes 
        set forth under subsection (a)(2).
            ``(2) Initial list.--
                    ``(A) Contents.--The list established pursuant to 
                paragraph (1) may include the following priorities:
                            ``(i) To overcome challenges with 
                        engineering of and applied research on 
                        microelectronics, including through integration 
                        of artificial intelligence with emerging 
                        technologies, such as neuromorphic and quantum 
                        computing, or with respect to the physical 
                        limits on transistors, advanced interconnects, 
                        and memory elements.
                            ``(ii) To promote transformational or long-
                        term advancements in computing and artificial 
                        intelligence technologies through--
                                    ``(I) next-generation algorithm 
                                design;
                                    ``(II) next-generation compute 
                                capability;
                                    ``(III) generative and adaptive 
                                artificial intelligence for design 
                                applications;
                                    ``(IV) photonics-based 
                                microprocessors and optical 
                                communication networks, including 
                                electrophotonics;
                                    ``(V) the chemistry and physics of 
                                new materials;
                                    ``(VI) biotechnology, such as 
                                modeling a single cell;
                                    ``(VII) energy use or energy 
                                efficiency;
                                    ``(VIII) techniques to establish 
                                cryptographically secure content 
                                provenance information; or
                                    ``(IX) safety and controls for 
                                artificial intelligence applications.
                            ``(iii) To promote explainability and 
                        mechanistic interpretability of artificial 
                        intelligence systems.
                            ``(iv) To advance fundamental understanding 
                        of artificial intelligence, including through 
                        breakthroughs in theoretical, computational, 
                        and experimental methods that discover new and 
                        transformative paradigms that explain the 
                        advanced capabilities of artificial 
                        intelligence in domains such as the following:
                                    ``(I) Interpretability.
                                    ``(II) Control.
                                    ``(III) Steerability.
                                    ``(IV) Robustness against foreign 
                                adversaries.
                            ``(v) To develop artificial intelligence 
                        solutions, including through integration among 
                        emerging technologies such as neuromorphic and 
                        quantum computing, to overcome barriers relating 
                        to innovations in advanced manufacturing in the 
                        United States, including areas such as--
                                    ``(I) materials, nanomaterials, and 
                                composites;
                                    ``(II) rapid, complex design;
                                    ``(III) sustainability and 
                                environmental impact of manufacturing 
                                operations;
                                    ``(IV) predictive maintenance of 
                                machinery;
                                    ``(V) improved part quality;
                                    ``(VI) process inspections;
                                    ``(VII) worker safety; and
                                    ``(VIII) robotics.
                            ``(vi) To develop artificial intelligence 
                        solutions in sectors of the economy, such as 
                        expanding the use of artificial intelligence in 
                        maritime vessels, including in navigation and 
                        in the design of propulsion systems and fuels.
                            ``(vii) To develop artificial intelligence 
                        solutions to improve border security, including 
                        solutions relevant to the detection of 
                        fentanyl, illicit contraband, and other illegal 
                        activities.
                            ``(viii) To develop artificial intelligence 
                        for science applications.
                            ``(ix) To develop cybersecurity for 
                        artificial intelligence-related intellectual 
                        property, such as artificial intelligence 
                        systems and artificial intelligence algorithms, 
                        including robustness, resilience, and security 
                        from foreign adversaries.
                            ``(x) To develop artificial intelligence 
                        solutions to modernize code and software 
                        systems that are deployed in government 
                        agencies and critical infrastructure and are at 
                        risk of maintenance difficulties due to code 
                        obsolescence or challenges finding expertise in 
                        outdated code bases.
                            ``(xi) To develop solutions to reduce the 
                        energy consumption in developing, deploying, 
                        and maintaining data-efficient and high-
                        performance artificial intelligence models.
                            ``(xii) To develop methods to prevent 
                        misuse of artificial intelligence systems for 
                        malicious purposes.
                            ``(xiii) To find applications of artificial 
                        intelligence in wireless communications 
                        systems, including cellular networks and 
                        cybersecurity efforts.
                            ``(xiv) To advance the capabilities of 
                        artificial intelligence, robotics, and 
                        automation for physical laboratory 
                        infrastructure and cloud laboratories.
            ``(3) Consultation on identification and selection of grand 
        challenges.--The Director of the Office of Science and 
        Technology Policy, the Director of the National Institute of 
        Standards and Technology, the Director of the Defense Advanced 
        Research Projects Agency, such agency heads as the Director of 
        the Office of Science and Technology Policy considers relevant, 
        and the National Artificial Intelligence Advisory Committee 
        shall each identify and select artificial intelligence research 
        and development grand challenges in which eligible participants 
        will compete to solve or advance for prize awards under 
        subsection (a).
            ``(4) Public input on identification.--The Director of the 
        Office of Science and Technology Policy shall also seek public 
        input on the identification of artificial intelligence research 
        and development grand challenges under subsection (a).
            ``(5) Problem statements; success metrics.--For each 
        priority for a Federal grand challenge identified under 
        paragraph (1) and the grand challenges identified and selected 
        under paragraph (3), the Director of the Office of Science and 
        Technology Policy shall--
                    ``(A) establish a specific and well-defined grand 
                challenge problem statement and ensure that such 
                problem statement is published on a website linking out 
                to relevant prize competition listings on the website 
                Challenge.gov, or successor website, that is managed by 
                the General Services Administration; and
                    ``(B) establish and publish on the website 
                Challenge.gov, or successor website, clear targets, 
                success metrics, and validation protocols for the prize 
                competitions designed to address each grand challenge, 
                in order to provide specific benchmarks that will be 
                used to evaluate submissions to the prize competition.
    ``(c) Federal Investment Initiatives Authorized.--Subject to the 
availability of amounts appropriated for this purpose, the Secretary of 
Commerce, the Secretary of Transportation, and the Director of the 
National Science Foundation may, consistent with the missions or 
responsibilities of each Federal agency, establish 1 or more prize 
competitions under section 24 of the Stevenson-Wydler Technology 
Innovation Act of 1980 (15 U.S.C. 3719), challenge-based acquisitions, 
or other research and development investments that each agency head 
deems appropriate consistent with the list of priorities established 
pursuant to subsection (b)(1).
    ``(d) Requirements.--
            ``(1) In general.--The Director of the Office of Science 
        and Technology Policy shall develop requirements for--
                    ``(A) the process for prize competitions under 
                subsections (a) and (c), including eligibility criteria 
                for participants, consistent with the requirements 
                under paragraph (2); and
                    ``(B) testing, judging, and verification procedures 
                for submissions to receive a prize award under 
                subsection (c).
            ``(2) Eligibility requirement and judging.--
                    ``(A) Eligibility.--In accordance with the 
                requirement described in section 24(g)(3) of the 
                Stevenson-Wydler Technology Innovation Act of 1980 (15 
                U.S.C. 3719(g)(3)), a recipient of a prize award under 
                subsection (c)--
                            ``(i) that is a private entity shall be 
                        incorporated in and maintain a primary place of 
                        business in the United States; and
                            ``(ii) who is an individual, whether 
                        participating singly or in a group, shall be a 
                        citizen or permanent resident of the United 
                        States.
                    ``(B) Judges.--In accordance with section 24(k) of 
                the Stevenson-Wydler Technology Innovation Act of 1980 
                (15 U.S.C. 3719(k)), a judge of a prize competition 
                under subsection (c) may be an individual from the 
                private sector.
            ``(3) Agency leadership.--Each agency head carrying out an 
        investment initiative under subsection (c) shall ensure that--
                    ``(A) for each prize competition or investment 
                initiative carried out by the agency head under such 
                subsection, there is--
                            ``(i) a positive impact on the economic 
                        competitiveness of the United States;
                            ``(ii) a benefit to United States industry;
                            ``(iii) to the extent possible, leveraging 
                        of the resources and expertise of industry and 
                        philanthropic partners in shaping the 
                        investments; and
                            ``(iv) in a case involving development and 
                        manufacturing, use of advanced manufacturing in 
                        the United States; and
                    ``(B) all research conducted for purposes of the 
                investment initiative is conducted in the United 
                States.
    ``(e) Reports.--
            ``(1) Notification of winning submission.--Not later than 
        60 days after the date on which a prize is awarded under 
        subsection (c), the agency head awarding the prize shall submit 
        to the Committee on Commerce, Science, and Transportation of 
        the Senate, the Committee on Science, Space, and Technology of 
        the House of Representatives, and such other committees of 
        Congress as the agency head considers relevant a report that 
        describes the winning submission to the prize competition and 
        its benefits to the United States.
            ``(2) Biennial report.--
                    ``(A) In general.--Not later than 2 years after the 
                date of the enactment of the Future of Artificial 
                Intelligence Innovation Act of 2026, and biennially 
                thereafter, the heads of agencies described in 
                subsection (c) shall submit to the Committee on 
                Commerce, Science, and Transportation of the Senate, 
                the Committee on Science, Space, and Technology of the 
                House of Representatives, and such other committees of 
                Congress as the agency heads consider relevant a report 
                that includes--
                            ``(i) a description of the activities 
                        carried out by the agency heads under this 
                        section;
                            ``(ii) a description of the active 
                        competitions and the results of completed 
                        competitions under subsection (c); and
                            ``(iii) efforts to provide information to 
                        the public on active competitions under 
                        subsection (c) to encourage participation.
                    ``(B) Public accessibility.--The agency heads 
                described in subsection (c) shall make the biennial 
                report required under subparagraph (A) publicly 
                accessible, including by posting the biennial report on 
                a website in an easily accessible location, such as the 
                GovInfo website of the Government Publishing Office.
    ``(f) Accessibility.--In carrying out any competition under 
subsection (c), the head of an agency shall post the active prize 
competitions and available prize awards under subsection (b) to 
Challenge.gov, or successor website, after the grand challenges are 
selected and the prize competitions are designed pursuant to 
subsections (c) and (e) to ensure the prize competitions are widely 
accessible to eligible participants.
    ``(g) Sunset.--This section shall terminate on the date that is 5 
years after the date of the enactment of the Future of Artificial 
Intelligence Innovation Act of 2026.''.
    (b) Comptroller General of the United States Studies and Reports.--
            (1) Initial study.--
                    (A) In general.--Not later than 1 year after the 
                date of enactment of this Act, the Comptroller General 
                of the United States shall conduct a study of Federal 
                prize competitions, which shall include an assessment 
                of the efficacy and impact of prize competitions 
                generally.
                    (B) Elements.--The study conducted under 
                subparagraph (A) shall include, to the extent 
                practicable, the following:
                            (i) A survey of all existing, current, and 
                        ongoing Federal prize competitions carried out 
                        under authorities enacted before the date of 
                        the enactment of this Act.
                            (ii) An assessment of those existing, 
                        current, and ongoing Federal prize competitions 
                        that includes addressing--
                                    (I) whether and what technology or 
                                innovation would have been developed in 
                                the absence of the prize competitions;
                                    (II) whether the prize competitions 
                                shortened the timeframe for the 
                                development of the technology or 
                                innovation;
                                    (III) whether the prize competition 
                                was cost effective;
                                    (IV) what, if any, other benefits 
                                were gained from conducting the prize 
                                competitions;
                                    (V) whether the use of a more 
                                traditional policy tool such as a grant 
                                or contract would have resulted in the 
                                development of a similar technology or 
                                innovation;
                                    (VI) whether prize competitions 
                                might be designed differently in a way 
                                that would result in a more effective 
                                or revolutionary technology being 
                                developed;
                                    (VII) what are appropriate metrics 
                                that could be used for determining the 
                                success of a prize competition, and 
                                whether those metrics differ when 
                                evaluating near-term and long-term 
                                impacts of prize competitions; and
                                    (VIII) suggested best practices of 
                                prize competitions.
                    (C) Congressional briefing.--Not later than 540 
                days after the date of the enactment of this Act, the 
                Comptroller General shall provide the Committee on 
                Science, Space, and Technology and the Committee on 
                Energy and Natural Resources of the Senate and the 
                Committee on Energy and Commerce of the House of 
                Representatives a briefing on the findings of the 
                Comptroller General with respect to the study conducted 
                under subparagraph (A).
                    (D) Report.--Not later than 540 days after the date 
                of the enactment of this Act, the Comptroller General 
                shall submit to the congressional committees specified 
                in subparagraph (C) a report on the findings and 
                recommendations of the Comptroller General from the 
                study conducted under subparagraph (A).
            (2) Interim study.--
                    (A) In general.--The Comptroller General of the 
                United States shall conduct a study of the Federal 
                prize challenges implemented under section 5108 of the 
                National Artificial Intelligence Initiative Act of 
                2020, as added by subsection (a), which shall include 
                an assessment of the efficacy and effect of such prize 
                competitions.
                    (B) Elements.--The study conducted under 
                subparagraph (A) shall include, to the extent 
                practicable, the following:
                            (i) A survey of all Federal prize 
                        competitions implemented under section 5108 of 
                        the National Artificial Intelligence Initiative 
                        Act of 2020, as added by subsection (a).
                            (ii) An assessment of the Federal prize 
                        competitions implemented under such section, 
                        which shall include addressing the same 
                        considerations as set forth under paragraph 
                        (1)(B)(ii).
                            (iii) An assessment of the efficacy, 
                        impact, and cost-effectiveness of prize 
                        competitions implemented under section 5108 of 
                        the National Artificial Intelligence Initiative 
                        Act of 2020, as added by subsection (a), 
                        compared to other Federal prize competitions.
                    (C) Congressional briefing.--Not later than 1 year 
                after completing the study required by subparagraph 
                (A), the Comptroller General shall provide the 
                Committee on Science, Space, and Technology and the 
                Committee on Energy and Natural Resources of the Senate 
                and the Committee on Energy and Commerce of the House 
                of Representatives a briefing on the findings of the 
                Comptroller General with respect to the study conducted 
                under subparagraph (A).
                    (D) Report.--Not later than 180 days after the date 
                of the enactment of this Act, the Comptroller General 
                shall submit to the congressional committees specified 
                in subparagraph (C) a report on the findings and 
                recommendations of the Comptroller General with respect 
                to the study conducted under subparagraph (A).
    (c) Clerical Amendments.--The table of contents at the beginning of 
section 2 of the William M. (Mac) Thornberry National Defense 
Authorization Act for Fiscal Year 2021 and the table of contents at the 
beginning of title LI of such Act, as amended by section 201, are both 
amended by inserting after the items relating to section 5107 the 
following new item:

``5108. Federal grand challenges in artificial intelligence.''.

             TITLE III--RESEARCH SECURITY AND OTHER MATTERS

SEC. 301. RESEARCH SECURITY.

    The activities authorized under this Act shall be carried out in 
accordance with the provisions of subtitle D of title VI of the Research 
and Development, Competition, and Innovation Act (42 U.S.C. 19231 et 
seq.; enacted as part of division B of Public Law 117-167) and section 
223 of the William M. (Mac) Thornberry National Defense Authorization 
Act for Fiscal Year 2021 (42 U.S.C. 6605).

SEC. 302. EXPANSION OF AUTHORITY TO HIRE CRITICAL TECHNICAL EXPERTS.

    (a) In General.--Subsection (b) of section 6 of the National 
Institute of Standards and Technology Act (15 U.S.C. 275) is amended, 
in the second sentence, by striking ``15'' and inserting ``30''.
    (b) Modification of Sunset.--Subsection (c) of such section is 
amended by striking ``under section (b) shall expire on the date that 
is 5 years after the date of the enactment of this section'' and 
inserting ``under subsection (b) shall expire on December 30, 2035''.

SEC. 303. CERTIFICATIONS AND AUDITS OF TEMPORARY FELLOWS.

    (a) Definitions.--In this section:
            (1) Agency.--The term ``agency'' has the meaning given such 
        term in section 3502 of title 44, United States Code.
            (2) Committees of jurisdiction.--The term ``committees of 
        jurisdiction'' means--
                    (A) the Committee on Commerce, Science, and 
                Transportation and the Committee on Energy and Natural 
                Resources of the Senate; and
                    (B) the Committee on Energy and Commerce and the 
                Committee on Science, Space, and Technology of the 
                House of Representatives.
            (3) Critical and emerging technologies.--The term 
        ``critical and emerging technologies'' means a subset of 
        artificial intelligence and other critical and emerging 
        technologies included in the list of such technologies 
        identified and maintained by the National Science and 
        Technology Council of the Office of Science and Technology 
        Policy.
            (4) Inherently governmental function.--The term 
        ``inherently governmental function'' has the meaning given such 
        term in section 5 of the Federal Activities Inventory Reform 
        Act of 1998 (Public Law 105-270; 31 U.S.C. 501 note) and 
        includes the meaning given such term in subpart 7.5 of part 7 
        of the Federal Acquisition Regulation, or successor regulation.
            (5) Temporary fellow.--The term ``temporary fellow'', with 
        respect to an agency, means a fellow, contractor, consultant, 
        or any other person performing work for the agency who is not a 
        Federal Government employee.
    (b) Certification.--
            (1) In general.--Prior to performing any work for an agency 
        under this Act relating to artificial intelligence and other 
        critical and emerging technologies, a temporary fellow and the 
        head of the agency shall sign a certification that the 
        temporary fellow will not perform any inherently governmental 
        functions.
            (2) Submittal.--Not later than 30 days after the date on 
        which the head of an agency signs a certification under 
        paragraph (1), the head of the agency shall submit a copy of 
        the certification to the Director of the Office of Management 
        and Budget and the chairpersons and ranking members of the 
        committees of jurisdiction.
    (c) Audit.--
            (1) In general.--For each agency using a temporary fellow 
        to carry out this Act, the inspector general of the agency 
        shall perform an annual audit of the use of temporary fellows 
        by the agency, which includes--
                    (A) the number of temporary fellows used by the 
                agency;
                    (B) the entities paying any temporary fellow for 
                their work for the agency;
                    (C) the work temporary fellows are performing for 
                the agency;
                    (D) the authorities under which the agency hired 
                the temporary fellows; and
                    (E) whether the temporary fellows and the agency 
                are complying with the requirements of subsection (b).
            (2) Submittal to congress.--Not later than 30 days after 
        the date on which the inspector general of an agency completes 
        an audit under paragraph (1), the head of the agency shall 
        submit to the chairpersons and ranking members of the 
        committees of jurisdiction and the Director of the Office of 
        Management and Budget a report containing the findings of the 
        inspector general with respect to the audit.
                                 <all>