[House Hearing, 110 Congress]
[From the U.S. Government Publishing Office]


 
  NUCLEAR SMUGGLING DETECTION: RECENT TESTS OF ADVANCED SPECTROSCOPIC 
                            PORTAL MONITORS

=======================================================================

                                HEARING

                               before the

                        SUBCOMMITTEE ON EMERGING
                        THREATS, CYBERSECURITY,
                       AND SCIENCE AND TECHNOLOGY

                                 of the

                     COMMITTEE ON HOMELAND SECURITY
                        HOUSE OF REPRESENTATIVES

                       ONE HUNDRED TENTH CONGRESS

                             SECOND SESSION

                               __________

                             MARCH 5, 2008

                               __________

                           Serial No. 110-99

                               __________

       Printed for the use of the Committee on Homeland Security
                                     

[GRAPHIC] [TIFF OMITTED] TONGRESS.#13


                                     

  Available via the World Wide Web: http://www.gpoaccess.gov/congress/
                               index.html

                               __________

                     U.S. GOVERNMENT PRINTING OFFICE
43-964 PDF                 WASHINGTON DC:  2008
---------------------------------------------------------------------
For Sale by the Superintendent of Documents, U.S. Government Printing Office
Internet: bookstore.gpo.gov  Phone: toll free (866) 512-1800; (202) 512ï¿½091800  
Fax: (202) 512ï¿½092104 Mail: Stop IDCC, Washington, DC 20402ï¿½0900012008


                     COMMITTEE ON HOMELAND SECURITY

               Bennie G. Thompson, Mississippi, Chairman

Loretta Sanchez, California          Peter T. King, New York
Edward J. Markey, Massachusetts      Lamar Smith, Texas
Norman D. Dicks, Washington          Christopher Shays, Connecticut
Jane Harman, California              Mark E. Souder, Indiana
Peter A. DeFazio, Oregon             Tom Davis, Virginia
Nita M. Lowey, New York              Daniel E. Lungren, California
Eleanor Holmes Norton, District of   Mike Rogers, Alabama
Columbia                             David G. Reichert, Washington
Zoe Lofgren, California              Michael T. McCaul, Texas
Sheila Jackson Lee, Texas            Charles W. Dent, Pennsylvania
Donna M. Christensen, U.S. Virgin    Ginny Brown-Waite, Florida
Islands                              Gus M. Bilirakis, Florida
Bob Etheridge, North Carolina        David Davis, Tennessee
James R. Langevin, Rhode Island      Paul C. Broun, Georgia
Henry Cuellar, Texas
Christopher P. Carney, Pennsylvania
Yvette D. Clarke, New York
Al Green, Texas
Ed Perlmutter, Colorado
Bill Pascrell, Jr., New Jersey

       Jessica Herrera-Flanigan, Staff Director & General Counsel

                     Rosaline Cohen, Chief Counsel

                     Michael Twinchek, Chief Clerk

                Robert O'Connor, Minority Staff Director

                                 ______

   SUBCOMMITTEE ON EMERGING THREATS, CYBERSECURITY, AND SCIENCE AND 
                               TECHNOLOGY

               James R. Langevin, Rhode Island, Chairman

Zoe Lofgren, California              Michael T. McCaul, Texas
Donna M. Christensen, U.S. Virgin    Daniel E. Lungren, California
Islands                              Ginny Brown-Waite, Florida
Bob Etheridge, North Carolina        Paul C. Broun, Georgia
Al Green, Texas                      Peter T. King, New York (Ex 
Bill Pascrell, Jr., New Jersey       Officio)
Bennie G. Thompson, Mississippi (Ex 
Officio)

                   Jacob Olcott, Director and Counsel

       Dr. Chris Beck, Senior Advisor for Science and Technology

                       Carla Zamudio-Dolan, Clerk

           Kevin Gronberg, Minority Professional Staff Member

                                  (II)


                            C O N T E N T S

                              ----------                              
                                                                   Page

                               Statements

The Honorable James R. Langevin, a Representative in Congress 
  From the State of Rhode Island, and Chairman, Subcommittee on 
  Emerging Threats, Cybersecurity, and Science and Technology....     1
The Honorable Michael T. McCaul, a Representative in Congress 
  From the State of Texas, and Ranking Member, Subcommittee on 
  Emerging Threats, Cybersecurity, and Science and Technology....     3

                               Witnesses

Ms. Elaine C. Duke, Deputy Under Secretary for Management, 
  Department of Homeland Security:
  Oral Statement.................................................     4
  Prepared Statement.............................................     6
Mr. Vayl S. Oxford, Director, Domestic Nuclear Detection Office, 
  Department of Homeland Security:
  Oral Statement.................................................     8
  Prepared Statement.............................................    10
Mr. George E. Thompson, Deputy Director, Programs, Homeland 
  Security Institute:
  Oral Statement.................................................    18
  Prepared Statement.............................................    19


  NUCLEAR SMUGGLING DETECTION: RECENT TESTS OF ADVANCED SPECTROSCOPIC 
                            PORTAL MONITORS

                              ----------                              


                        Wednesday, March 5, 2008

             U.S. House of Representatives,
                    Committee on Homeland Security,
      Subcommittee on Emerging Threats, Cybersecurity, and 
                                    Science and Technology,
                                                    Washington, DC.
    The subcommittee met, pursuant to call, at 2:10 p.m., in 
Room 311, Longworth House Office Building, Hon. James R. 
Langevin [chairman of the subcommittee] presiding.
    Present: Representatives Langevin, Christensen, Green, 
Pascrell, and McCaul.
    Mr. Langevin. The subcommittee will come to order.
    The subcommittee is meeting today to receive testimony on 
two recent reports: The Independent Review Panel Report and 
DNDO's Phase III Test Report which details a test of advanced 
spectroscopic portal monitors.
    Before I go into the hearing itself and start with my 
opening statement, we are expecting votes around 2:15, 
unfortunately. It is my intention to try to get as far through 
the opening statements as possible; then we will recess for 
what is I understand one vote. Then we will come back and 
continue with statements if necessary, and then go into 
questions.
    Good afternoon. I want to thank the witnesses for being 
here at this very important hearing.
    Today we are discussing a very important project for the 
Domestic Nuclear Detection Office, and that is the advanced 
spectroscopic portal monitor program.
    This subcommittee held its first hearing on this topic 1 
year ago. We will continue to provide robust oversight on this 
project until we are assured that we are deploying the best 
technology possible to detect radiological and nuclear 
materials coming across our borders.
    Given that we are holding a public hearing on this topic, I 
think it is fair to say that we are not quite there yet, though 
I do commend DNDO for improving our screening capabilities, 
along with both our southern and northern borders as well as 
our seaports.
    We are currently scanning 100 percent of all incoming cargo 
on the southern border, 98 percent at the Nation's seaports, 98 
percent at the northern border. Mr. Oxford assures me that we 
will be at 100 percent on the northern border by next year. I 
think these are important points of progress, and I want to 
acknowledge that we are making progress in these areas.
    While I applaud DNDO for its aggressive pursuit of new 
detection technologies, I still remain deeply concerned that 
the advanced spectroscopic portal monitors have not been 
properly tested and evaluated. In one of the reports that we 
will discuss today, the independent review team reiterates a 
recommendation made by GAO over a year ago: Deployment and 
testing should not be done by the same organization. This lack 
of rigorous and independent testing program can easily lead to 
the development and even the deployment of ineffective 
equipment.
    The two reports under consideration today both raise as 
many questions and concerns as they answer. It is my hope that 
the advanced spectroscopic portal monitors will ultimately 
function as intended, but at this point I expect Secretary 
Chertoff will require many more unequivocal test results before 
certifying to Congress that the ASP represents a significant 
increase in operational effectiveness over the currently 
deployed systems.
    Perhaps the field test underway by CBP will add to this 
needed level of clarity and if not, all the steps, I know, 
will.
    Not all the news presented in these reports is bad. There 
are several results that point to progress of the ASP program. 
However, both reports are extremely nuanced and do not seem to 
give clear, strong indications of whether this project is 
achieving its stated goals. In fact, there are several 
statements included in these reports that, if taken at face 
value and on their own, would cause most people to say that ASP 
has failed or, at the very least, is in serious jeopardy of 
doing so.
    For example, Phase III report states that when it comes to 
identifying mass sources, the polyvinyl toluene performed--PVT 
systems perform better than ASP. However, upon a more extensive 
review of the entire report, it becomes evident that this 
statement does not mean exactly what it seems to say, and I 
will explore this issue further in my questions.
    I don't want to see this program fail; none of us do. Let 
me say this again: I don't want to see the program fail; it is 
far too important. The advantages of an effective spectroscopic 
portal program along the current binary alarm/no alarm option 
under the current PVT monitors would represent a major increase 
in Homeland Security, but only if they work as intended.
    Current tests will still have not unequivocally 
demonstrated in an operational setting that the ASP represents 
a significant improvement over current technologies.
    This is an important point. While it may be easy to only 
focus on the simplistic big-picture items, this is a subtle and 
nuanced issue that we cannot afford to overlook. I hope that 
this hearing will allow us to delve into the details of these 
two reports and to make clear exactly what was said and what 
was not and where we go from here.
    I should note that the GAO has been a great help and a 
trusted source on this issue for Congress for various reasons. 
No GAO representatives were able to attend today's hearing for 
legitimate reasons, but we will continue to rely on their 
counsel in the future and implore those witnesses here today to 
cooperate with GAO and to fulfill their role as Congress' 
trusted auditor.
    With that, I just want to again thank the witnesses for 
being here today and look forward to your testimony.
    The Chair now recognizes the Ranking Member of the 
subcommittee, the gentleman from Texas, Mr. McCaul, for an 
opening statement.
    Mr. McCaul. I thank the Chairman. I agree with your 
assessment that we do not want to see this fail. The American 
people can't afford to see it fail. It is too important.
    I want to begin by thanking the witnesses for being here. 
Mr. Oxford, you are certainly no stranger to this committee. At 
this time last year you were here to describe DNDO's deployment 
strategy for radiation portal monitors. As we saw in your 
written testimony, you have assisted Customs and Border 
Protection, I should say, in deploying monitors that are now 
scanning incoming cargo for radiological and nuclear materials 
at a volume of 91 percent on our northern border and 98 percent 
at our Nation's seaports. Coming from a border State myself, I 
was very pleased to see the 100 percent screening on our 
southern, southwest border.
    I commend you and your staff for your hard work and service 
to the Nation. I am sure the Department will agree that while 
progress has been made in scanning incoming cargo, the current 
generation of radiation portal monitors is far from perfect. 
Frequent nuisance alarms due to radioactive, but legitimate 
material such as cat litter or medical therapeutics require 
manpower-intensive secondary screening by Customs officers. It 
is very time-consuming but not as efficient as new technology 
hopefully will provide.
    To improve the efficiency of scanning people and cargo at 
our ports of entry, DNDO has led a development program to 
provide the next generation of radiation portal monitors, 
detectors that can discriminate between threat materials that 
can be used in a nuclear weapon or dirty bomb and other 
materials that pose no threat to the Nation at all. The 
advanced spectroscopic portal monitor also known as ASP could 
provide this capability.
    I commend the Chairman on holding this hearing today as 
part of our continued oversight of the ASP program, a program 
that resides solely within the jurisdiction of this committee.
    It has been said that we have to get it right all the time, 
while the terrorists only have to get it right once. However, 
if we can establish a system of layered defenses, that would 
turn that argument on its head; now the terrorist has to get it 
right at each layer in the system, and we only have to get it 
right once.
    I see the ASP program as a critical layer in our system of 
defense against the transport of radiological and nuclear 
threat material. The delays in the certification and deployment 
of the ASP system, a technology that could accurately identify 
threat materials without impeding commerce, are of grave 
concern to this committee.
    That is why today Ranking Member King of the full committee 
and I have introduced legislation that will assist the 
Department in certifying this technology by clarifying 
congressional intent on the requirements for certification and 
the metrics to be considered in evaluating ASP's system 
performance. This legislation should help the Department keep 
their intended goal of making a decision on ASP certification 
by the end of this fiscal year.
    I hope we can achieve that goal within the end of this 
fiscal year, and I thank the witnesses for being here.
    Thank you, Mr. Chairman.
    Mr. Langevin. I thank the Ranking Member. Given the fact 
that the vote has just been called, we will recess right now 
for about 10 minutes for what I believe is just one vote and 
then return for the witnesses' statements.
    The committee stands in recess.
    [Recess.]
    Mr. Langevin. The committee will come to order. The Ranking 
Member has been detained, but he has given his indication to go 
ahead without him, and he will return momentarily.
    I want to welcome, again, our first panel of witnesses. Our 
first witness is Mr. Vayl Oxford, Director of the Domestic 
Nuclear Detection Office at the Department of Homeland 
Security. He has appeared before this panel many, many times to 
discuss this and other topics. We welcome him back here today.
    The next witness is Dr. George Thompson, who is the Deputy 
Director of Programs at the Homeland Security Institute. After 
several other individuals stepped down from the position of 
Chair of the Independent Review Panel, Dr. Thompson took up the 
position, which allowed the independent review team to complete 
its work. Welcome.
    Our third witness is Ms. Elaine Duke, the Deputy Under 
Secretary for Management at the Department of Homeland 
Security. Her office originally called for an independent 
review panel to be done, and her office will be a key part of 
incorporating the findings of the panel's report into its 
recommendations to the Secretary on whether to ultimately 
certify the ASP program for full procurement.
    So I want to thank the witnesses for being here. Without 
objection, the witnesses' full statements will be inserted into 
the record.
    I now ask each witness to summarize his or her statement 
for 5 minutes, beginning with Secretary Duke.

    STATEMENT OF ELAINE C. DUKE, DEPUTY UNDER SECRETARY FOR 
          MANAGEMENT, DEPARTMENT OF HOMELAND SECURITY

    Ms. Duke. Good afternoon. Thank you, Mr. Chairman, Ranking 
Member McCaul, and members of the committee. It is a pleasure 
to appear before you today to talk about the advanced 
spectroscopic portal system. Today is my first time before you 
as the Deputy Under Secretary for Management. I have been in 
this position for about 5 months but have spent most of my 25 
years of public service in the procurement profession, most 
recently as the Department's chief procurement officer.
    The Deputy Under Secretary of Management position was 
created as a part of the Department's transition planning 
efforts to ensure operational continuity during the change of 
administration in January 2009. My position currently holds the 
authority of the Under Secretary for Management, as the current 
Under Secretary, Mr. Paul Schneider, is serving as the Acting 
Deputy Secretary.
    To start I would like to convey my top priorities which are 
essential elements to achieving the DHS mission and practicing 
sound stewardship of the taxpayers' money.
    First is preparing the Department for the first 
administration transition; second, improving acquisition and 
procurement; and third, strengthening the requirements process 
and integrating it into the planning, programming, budgeting 
and execution process in the Department.
    My goal as the Deputy Under Secretary for Management as it 
relates to transition is to focus on three areas: internal 
processes, knowledge management and relationship building.
    In addition to transition planning and focusing on 
transforming the procurement office into a full-fledged 
acquisition office, often procurement and acquisition are 
incorrectly used interchangeably. Procurement is just one 
element of the acquisition management. Today we are talking 
about another one, test and evaluation.
    Acquisition includes the full operation life cycle 
requirements process using sound business strategies, financial 
management and managing program risks. We are making progress 
toward this goal. In August 2007 we established the Acquisition 
Program Management Division to provide oversight and support 
the acquisition programs.
    Today we have performed assessments of 37 of our largest 
level one programs and have provided advice and guidance, 
particularly in the area of cost/benefit analysis. To that end, 
I am here today to discuss one of our major acquisition 
programs, the ASP program.
    The acquisition of ASP systems is of national importance 
and vital priority for the Department to continue toward this 
mission of protecting the country from dangerous goods. The 
acquisition develops the next-generation radiation portal 
monitor and has the ability to rapidly identify the presence 
and type of radioactive materials present in the cargo entering 
the United States. It will allow us to distinguish from 
harmless sources, thereby decreasing the rate of false alarms 
resulting in unhindered flow of commerce.
    Before DHS uses its appropriated funds to deploy this new 
technology, Congress has directed Secretary Chertoff to certify 
that a significant increase in operational effectiveness will 
be achieved. In July 2007, the Secretary announced his intent 
to perform an independent review of the ASP test procedures, 
test results and associated technology assessments.
    My role as the Deputy Under Secretary in this process is to 
understand the outcome of this independent review and be in a 
position to advise the Secretary in making his certification 
decision. In my opinion, as DHS considers the best way forward, 
this independent review provides valuable assistance to the 
Secretary and me in moving toward conclusion of this program.
    The independent review is not an unusual exercise as it is 
in line with reviews we have conducted in other major programs. 
It is a standard practice we are developing, modeled after the 
Department of Defense, and will help us in our decisionmaking 
process and our acquisition programs.
    There are several things we have already learned from the 
independent review. First, to define up-front increased 
operational effectiveness so we can appropriately test ASP's 
capabilities, which is critical to our ability to certify the 
system.
    Second, to develop a new test and evaluation master plan 
that will clearly demonstrate the test results and whether ASP 
does indeed provide the increased operational effectiveness.
    Third, to include all the ownership costs in updating the 
cost/benefit analysis for this new phase of the program.
    I have asked Mr. John Higbee to lead this effort from an 
acquisition perspective. He is the director of our Acquisition 
Program Manager Provision; and Mr. George Ryan, from our Under 
Secretary of Science of Technology, who is our director of Test 
and Evaluation Standards. They provide valuable insight as we 
move forward in the next test phase of this program.
    Mr. Chairman, I thank you for the opportunity to testify 
before you and am happy to answer any of yours and the 
committee's questions. Thank you.
    Mr. Langevin. Thank you, Ms. Duke, I appreciate your 
testimony.
    [The statement of Ms. Duke follows:]

                  Prepared Statement of Elaine C. Duke
                             March 5, 2008

    Thank you Mr. Chairman, Representative McCaul and Members of the 
committee. It is a pleasure to appear before you today to talk about 
the Advanced Spectroscopic Portal (ASP).
    This is my first time before you as the Deputy Under Secretary for 
Management (DUSM). I have been in this position for over 5 months but 
have spent most of my 25 years of public service in the procurement 
profession, most recently as the Department's Chief Procurement 
Officer.
    The Deputy Under Secretary for Management position was created as 
part of the Department's 2009 Administration Transition Planning 
efforts. By having a senior career civil servant in this capacity, 
rather than a political appointee, the Department can ensure 
operational continuity during the change in administration. As the 
current Under Secretary for Management, Mr. Paul Schneider is serving 
as the Acting Deputy Secretary, my position holds the authorities of 
the Under Secretary for Management.
    At present, the most significant management challenge the 
Department has is continuing the effort that was mandated at the 
Department's creation: merging 22 agencies with approximately 208,000 
people and turning it into the most effective force to protect our 
country. This effort requires effective and efficient use of financial 
and human resources; enabling technology, strong processes and superb 
management. It is toward this effort that I devote my time, energy, and 
contributions.
    As the Deputy Under Secretary for Management, it is my duty to lead 
the Management Directorate's efforts in the development of the 
Department, with a focused, well-thought strategy.
    The major elements of our strategy are:
   Improving acquisition and procurement throughout the 
        Department.
   Strengthening the requirements and investment review 
        processes.
   Acquiring and maintaining human capital.
   Seeking efficiencies across the enterprise in operations and 
        the use of resources.
   Making the key management systems, such as financial and 
        information technology, world class.
   Acquire funding for DHS' consolidation at St. Elizabeths 
        West Campus and the efficient realignment of all Department of 
        Homeland Security (DHS) off-campus locations
    To start, I would like to convey my top priorities, which are 
essential elements to achieving the DHS mission and practicing sound 
stewardship of taxpayers' money:
   First: Preparing for the Department's first ever 
        administration transition;
   Second: Improving acquisition and procurement;
   Third: Strengthening the requirements process and 
        integrating it into the Planning, Programming, Budgeting, and 
        Execution (PPBE).
    My goal as the DUSM as it relates to Transition is to focus on 
three areas: Internal Processes, Knowledge Management, and Relationship 
Building. The Internal Processes initiative will review our Directives, 
strengthen records management and our processes for incoming and 
exiting employees. The Knowledge Management initiative will produce 
briefing materials, but more importantly, convey to career executives 
and incoming appointees the requisite knowledge to keep the Department 
running. The Relationship Building initiative will facilitate direct 
interactions among Federal, State, local and tribal officials with 
homeland security responsibility.
    In addition to transition planning, my focus is to transform the 
Office of Chief Procurement Officer (CPO) into an Acquisition Office. 
Often, Procurement and Acquisition are incorrectly used 
interchangeably. Procurement, however, is only one element of 
acquisition management. Acquisition also includes understanding 
operational and life-cycle requirements, such as formulating concepts 
of operations, developing sound business strategies, exercising prudent 
financial management, assessing tradeoffs, and managing program risks. 
Best practice acquisition management is executed by teams of 
professionals who understand and are able to manage the entire life-
cycle of a major program effort. We are making progress toward this 
goal.
    The Acquisition Program Management Division (APMD) of CPO began 
operations in August 2007. The division was established to provide 
oversight and support for acquisition programs. To date APMD has 
performed Quick Look assessments of 37 level 1 programs and has 
overseen Deep Dive reviews of the SBInet and ASP programs. APMD has 
provided advice and guidance to a number of programs, particularly in 
the area of cost benefit analysis. Currently the APMD team is focused 
on an aggressive Investment & Acquisition process re-engineering 
effort. The effort includes replacing Directive 1400, establishing 
revised investment and acquisition decision procedures, as well as 
processes for, acquisition program baselining, periodic reporting, 
acquisition of services, and other initiatives.
    DHS' $17 billion procurement spend plan provides for the 
development, fielding and support of significant homeland security 
capabilities. For example, U.S. Coast Guard contracts are providing 
aircraft and ships from the Integrated Deepwater System (IDS) and 
search and rescue capability from the Rescue 21 program. Transportation 
Security Administration (TSA) contracts are providing additional 
capabilities via the Electronic Baggage Screening Program (EBSP). 
Consistent with the SBI Strategy, U.S. Customs and Border Protection 
(CBP) is developing and fielding the capabilities at and between our 
Nation's ports of entry to gain effective control of our borders. The 
Domestic Nuclear Detection Office is developing and testing a new type 
of radiation portal monitor known as the Advanced Spectroscopic Portal 
(ASP) to improve the Nation's defense against the threat of nuclear 
smuggling.
    I am here today to discuss the Advanced Spectroscopic Portal (ASP). 
The acquisition of ASP systems is of national importance and vital 
priority for the Department to continue toward its mission of 
protecting the country from dangerous goods. This acquisition develops 
the next generation Radiation Portal Monitor (RPM) and has the ability 
to not only detect the presence of radiation in cargo entering the 
United States, but also to rapidly identify the type of radioactive 
material(s) present.
    In 2005, the Domestic Nuclear Detection Office (DNDO) took on the 
responsibility to develop a second generation RPM prototype, now known 
as ASP. The intent of this initiative was to decrease the rate of false 
alarms and close the gaps in coverage. Or in other words, increase 
operational effectiveness and rapidly detect the presence and the type 
of radioactive material present. This system would allow us to 
distinguish harmless sources, such as kitty litter from those that 
might pose a threat. As a result of these improved detection 
capabilities, the flow of commerce would proceed unhindered.
    Before Congress would appropriate the funds to deploy this new 
technology, it included restrictive language in the Homeland Security 
Appropriations Act for Fiscal Year 2007 requiring the Secretary to 
certify that a significant increase in operational effectiveness will 
be achieved before funds will be appropriated. In July 2007, Secretary 
Chertoff announced his intent to ``assemble a highly experienced team'' 
to perform ``an independent review of the [ASP] test procedures, test 
results, [and] associated technology assessments''.
    My role as the DUSM in this process is to understand the outcome of 
this independent review and be in a position to advise the Secretary in 
determining whether he should certify that there will be a significant 
increase in operational effectiveness with the procurement of ASP 
systems. In my opinion, this independent review provides valuable 
assistance to the Secretary, to the Department Acquisition Executive, 
Chair of the DHS Investment Review Board and me, as DHS considers the 
best way forward.
    This is not an unusual exercise as it is in line with the reviews 
we conduct for our all our major programs within the Department. 
Furthermore, this is a standard practice of other Departments within 
the U.S. Government, such as Defense, to improve their decisionmaking 
processes regarding major programs.
    The Department appreciates the need for rigorous review to ensure 
that the Department acquires the crucial capability to preventing the 
smuggling of nuclear materials across our borders. It is entirely 
appropriate for DHS to leverage the resources of the executive branch 
to gather information to make an informed decision on a critical 
program. We consider the independent review of this system to be 
complementary to GAO's investigation of ASP. As an agent of Congress, 
GAO provides information to Congress in support of its oversight 
function. We intend to review and consider these reports from both 
sources in determining a way forward.
    There are several things we have already learned from the 
Independent Review. First, ASP operational testing is critical to our 
ability to certify the system. I have asked Mr. John Higbee, Director, 
Acquisition Program Management to oversee ASP operational testing, 
working with independent Operational Test and Evaluation experts both 
internal and external to the Department. By conducting ASP operational 
testing, we will improve our ability to make an informed decision on 
this program. Testing of this type will also allow us to continue to 
exercise more oversight over the Department's acquisition programs as 
well as strengthen our requirements and investment review process.
    Second: We are mindful of the need for the ASP program to 
demonstrate ``increased operational effectiveness'', and the interest 
Congress has in this criterion. We are working with DNDO and CBP to 
ensure that the ASP testing program is structured to conclusively 
determine this critical point.
    Third: We learned the ``cost'' portion of the Cost Benefit Analysis 
(CBA) should include all ownership costs, including maintenance and 
support; and the ``benefits'' portion of the CBA should be consistent 
with the logic used to define ``operational effectiveness.'' Therefore, 
we will carefully review the CBA to ensure that the alternatives are 
well-defined and that the assumptions, data inputs, and calculations 
are sound.
    Finally: After reviewing and addressing the Independent Review 
Team's, and GAO's findings; and after considering the test results and 
all information provided within this process, the Secretary will be 
prepared to make a decision on whether he should certify that there 
will be a significant increase in operational effectiveness with the 
procurement of ASP systems.
    Mr. Chairman, thank you for the opportunity to testify before the 
committee on this very important topic. I would be glad to answer any 
questions you or the Members of the committee may have for me.

    Mr. Langevin. The Chair now recognizes Mr. Oxford for 5 
minutes. Welcome.

    STATEMENT OF VAYL S. OXFORD, DIRECTOR, DOMESTIC NUCLEAR 
       DETECTION OFFICE, DEPARTMENT OF HOMELAND SECURITY

    Mr. Oxford. Good afternoon, Mr. Chairman, Ranking Member 
McCaul and other Members of the subcommittee. I would like to 
thank the subcommittee for the opportunity to discuss recent 
progress in ASP development and the recently released Phase III 
test report and the final report of the ASP independent review 
team.
    Before addressing these reports, I would like to update the 
committee on progress since I last appeared before you. Since 
that time as has been previously mentioned, we have reached 
several milestones.
    First of all, in December we met the congressionally 
mandated goal of scanning 98 percent of the cargo coming in 
through our seaports. This is significant improvement over 
where we were 3 years ago when we were only scanning 22 percent 
of that cargo. When we couple that with 100 percent of the 
cargo we are scanning across the southern border, we are now 
scanning 96 percent of all cargo entering the United States on 
a daily basis. This is real and measurable progress. In working 
with CBP, we have a plan to complete the northern border in 
2009.
    Also in December we completed delivery of radiation 
detection equipment for all U.S. Coast Guard boarding teams. To 
address other threat pathways, we also delivered additional 
hand-held detection equipment to Customs and Border Protection; 
and as of December, Customs and Border Protection is now 
scanning all international general aviation airplanes arriving 
in the United States.
    Finally, DNDO has developed plans and is in the early 
stages of implementing a program of interest to this 
subcommittee to enhance physical security of high-risk 
radioactive sources in U.S. medical facilities.
    With respect to the IRT report I would like to highlight 
several key points:
    First it is important to recognize that DNDO is developing 
and evaluating detection systems in response to established 
threat guidance originally established by the Department of 
Energy for Customs and Border Protection. This approach 
establishes requirements to detect and identify quantities of 
plutonium or uranium that are actually a fraction of the likely 
amount or the amount likely required by a terrorist to 
construct nuclear weapons.
    Second and most importantly, the IRT reports that it could 
not find any signs of biased testing or manipulation of data. 
We thank the IRT for its efforts and intend to adopt several 
aspects of the recommendations. However, we find some 
shortfalls in the report.
    DNDO and CBP believe that ASP's systems provide significant 
advantages in both primary and secondary scanning roles, and 
there is a possibility that the greatest benefit will actually 
be realized in primary applications.
    The IRT report focused on secondary applications and as a 
result, did not include data showing that current rpms may miss 
some critical threats resulting in threats never being referred 
to secondary applications.
    By not conducting a full system, the system comparison, the 
ASP benefit of reducing the number of secondary referrals, was 
also ignored. Based on current data, ASP systems have the 
potential to reduce secondary referral rates by a factor of 5 
to 10 resulting in up to 150,000 less secondary inspections a 
year.
    For secondary applications, ASP systems provide a more 
consistent scan, as acknowledged by the IRT, and avoid the 
localization issues associated with using a hand-held detector 
in secondary scanning applications. As a result of being able 
to quickly scan the entire container and using time slicing to 
sample regions of interest and reduce background, ASP systems 
will be significantly more efficient and effective than the 
current hand-held device.
    Regarding the Phase III report, Phase III testing served 
many purposes, which provided an initial assessment of the 
range of ASP capabilities. As such, the tests subjected ASP 
systems to a variety of sources, masking cases and shielding 
configurations, and included scenarios more challenging than 
threat guidance that I previously mentioned.
    For example, we tested five shielding thicknesses that 
varied in difficulty, and above and below threat guidance 
level. We tested six types of masking materials of a 
representative cargo entering the United States, as well as 
challenging combinations involving industrial and medical 
sources that may potentially mask the presence of a threat.
    The bottom line is that we have explored ASP performance 
around the threat guidance to determine the extent of current 
ASP performance and enable algorithm improvement. We are using 
this data in the current system upgrades.
    Going forward, DNDO and CBP are planning additional tests 
to verify these capabilities leading to secretarial 
certification later this fiscal year.
    In the interest of time, Mr. Chairman, I will use a chart 
showing the upcoming test series that are available to you on 
the screens during the question-and-answer period if that is 
okay.
    With that, I will conclude my testimony and be glad to 
answer any questions.
    Mr. Langevin. Thank you, Director Oxford.
    [The statement of Mr. Oxford follows:]

                  Prepared Statement of Vayl S. Oxford
                             March 5, 2008

                              INTRODUCTION

    Good morning, Chairman Langevin, Ranking Member McCaul, and 
distinguished members of the subcommittee. As Director of the Domestic 
Nuclear Detection Office (DNDO), my office is responsible for 
developing new technologies and also ensuring that we deploy detection 
systems properly across the domestic nuclear detection architecture. I 
would like to thank the committee for the opportunity to discuss recent 
progress in the development of the next generation of radiation portal 
monitors (RPMs), or Advanced Spectroscopic Portal (ASP) systems. My 
testimony today will focus principally on the recently released Phase 
III Test Report, the Final Report of the ASP Independent Review Team 
(ASP-IRT), and the steps we will take to make a certification and 
production recommendation to the Secretary.

                            HISTORIC CONTEXT

    Countering the threat of nuclear terrorism is one of the top 
priorities for the Department of Homeland Security (DHS). DNDO is the 
lead agency responsible for the development, acquisition, and 
deployment of radiation detection equipment to support this mission 
within the Department. The ASP program is one of the programs DNDO has 
begun to improve radiation detection tools for operators, in this case 
Customs and Border Protection (CBP) Officers.
    Considerable progress has already been made using currently 
available technology. At ports of entry (POEs), RPMs are typically 
installed in a primary scanning location to detect the presence of 
radiation in cargo and vehicles. CBP operates additional RPMs and 
handheld radioisotopic identification devices (RIIDs) in secondary 
scanning locations to further investigate alarms originating in primary 
and identify the specific source of the radiation detected. As of 
February 8, 2008, 100 percent of all incoming cargo on the southern 
border is being scanned for the presence of radiological or nuclear 
material, as well as 98 percent at the Nation's seaports, and 91 
percent on the northern border. However, much work remains to close 
enduring gaps at many small border crossings along the northern border, 
as well as at small seaports. In addition, limits in the capabilities 
of current systems continue to present technical and operational 
challenges to those using the equipment.
    Unlike current systems which detect and identify radiation sources 
in a two-step process, ASP technology uses the radiation spectrum from 
the inspected material to make a single detection and identification 
decision. DNDO has maintained that this ability to differentiate 
between threat material and naturally occurring radioactive material 
(NORM) will reduce the number of alarms due to non-threat sources, 
reduce the number of containers and vehicles sent to secondary 
inspection, and dramatically improve the probability of correctly 
identifying and interdicting smuggled nuclear material during secondary 
inspections.
    In 2006, the Congress requested that DNDO complete a cost-benefit 
analysis of ASP systems, which DNDO subsequently issued in June 2006. 
In a later report (GAO 07-133R), the Government Accountability Office 
(GAO) raised concerns about performance and cost assumptions included 
in the DNDO cost-benefit analysis, and the Congress included further 
restrictions in the fiscal year 2007 Appropriations Act (Pub. L. 109-
295), directing that, ``none of the funds appropriated under this 
heading shall be obligated for full scale procurement of [ASP] monitors 
until the Secretary of Homeland Security has certified . . . that a 
significant increase in operational effectiveness will be achieved.''
    In order to provide the Secretary with all necessary information 
prior to a certification decision, DNDO launched a substantial test 
campaign from February 2007 through September 2007. This included three 
separate test series conducted at the Nevada Test Site (NTS), including 
the Phase III testing captured in the report that we are discussing 
today, as well as contractor verification testing, stream of commerce 
testing at the New York Container Terminal (NYCT), integration testing 
at the Pacific Northwest National Laboratory (PNNL), and field 
validation at eight operational sites. In addition, in late July 2007, 
Secretary Chertoff notified the Congress of his intent to ``assemble a 
highly experienced team'' to perform ``an independent review of the 
[ASP] test procedures, test results, [and] associated technology 
assessments.'' This group, known as the ASP-IRT, delivered a report to 
Elaine Duke, the DHS Deputy Under Secretary for Management, on February 
20, 2008. The ASP-IRT Report is the second document under discussion 
today.

                  PHASE III TEST REPORT--INTRODUCTION

    The Phase III Test Campaign, conducted at NTS in March 2007, was 
part of a larger series of tests conducted throughout 2007, designed to 
evaluate ASP performance. Specifically, Phase III testing was intended 
to collect data from challenging detection or identification cases, 
beyond those included in Phase I testing at NTS earlier in the year. In 
addition, Phase III testing was to support the development of concepts 
of operations, and provide an additional data collection opportunity 
for continued vendor development of improved detection and 
identification algorithms. Phase III testing was conducted in 
accordance with test plans developed by DNDO, in partnership with the 
National Institute of Standards and Technology (NIST), CBP and, to a 
limited extent, the Department of Energy (DOE) and its labs. The test 
plan included the incorporation of a variety of source and shielding 
configurations, and, in particular, configurations that were often more 
difficult than ``guidance'' detection goals. This point is particularly 
important when analyzing the results of the test.

             PHASE III TEST REPORT--OBJECTIVES AND RESULTS

    The Phase III Test Campaign was designed to evaluate several 
aspects of ASP system performance, with five primary objectives. Before 
discussing the objectives and results, it is necessary to provide 
several caveats that relate to the way the tests were designed, and how 
the specific objectives of the test affected the interpretation of the 
results obtained. First, it is important to reiterate that results 
indicating that ASP systems did not detect or identify some specific 
cases do not indicate that ASP systems did not work as designed. ASP 
systems are designed to operate to certain design thresholds. In some 
instances, Phase III test sources intentionally exceeded those 
thresholds to evaluate how far ASP performance continues. For instance, 
a number of test sources were selected that were shielded beyond 
amounts identified in government requirements, for the express purpose 
of understanding where ASP capabilities begin to ``fall off.'' The fact 
that ASP systems functioned beyond specified requirements should be 
considered a positive sign, rather than a sign of inherent flaws.
    Second, Phase III tests were intended to help DNDO and CBP better 
understand the full range of ASP system performance, and results will 
continue to guide further development efforts. Since testing in early 
2007, DNDO has provided results as feedback to the ASP vendors, and 
they have incorporated this data into subsequent design improvements. 
These improvements will be evaluated through additional test campaigns 
scheduled for this year.
    Finally, detailed Phase III results are classified at the SECRET 
level. Because individual results reveal vulnerabilities to both 
systems that are and will be deployed, performance of systems against 
specific sources cannot be discussed in an open setting. The results 
that follow have been intentionally generalized to avoid discussion of 
specific performance capabilities for systems. DNDO has previously 
provided classified test results to the committee staff. DNDO would be 
happy to provide the same information to committee Members in an 
appropriate environment.
    Detection sensitivity for plutonium surrogate.--The detection 
sensitivity of ASP systems was measured against a plutonium 
``surrogate,'' or a source that was designed to mimic the detectable 
signature of plutonium. This objective was specifically focused on 
ensuring that ASP systems met the CBP requirement that they be at least 
as sensitive as current-generation polyvinyl toluene (PVT)-based 
systems. ASP detection sensitivities were measured against PVT-based 
systems set at existing operational thresholds.
    For this representative source, ASP systems were more sensitive 
than PVT-based systems when operating at existing operational 
thresholds. As such, testing met the objective of assessing that ASP 
systems did not degrade detection performance, as compared to current 
systems.
    Relative performance as a function of source categories.--Phase III 
testing sought to compare relative performance of various ASP systems 
as a function of source categories. Source categories included bare, 
shielded, and ``masked'' special nuclear materials, bare, shielded and 
``masked'' industrial sources, and medical isotopes. In particular, 
this objective sought to identify any significant variation in 
performance between each ASP vendor design. Additionally, detection 
performance was also compared to PVT-based systems, and identification 
performance was compared to current-generation sodium iodide (NaI)-
based handheld detectors that are currently used to conduct secondary 
inspections.
    Due to the number of source categories evaluated as part of this 
objective, results are more complicated than those associated with 
other objectives. While on average no ASP system significantly 
outperformed the others with regard to detecting sources passing 
through portals at either 5 or 2 miles per hour, there were differences 
in system performance when evaluated against each source.
    With regard to comparisons between PVT-based and ASP systems, ASP 
systems outperformed PVT-based systems in detecting bare special 
nuclear materials, and both types of systems performed similarly 
against shielded special nuclear materials. ``Masked'' special nuclear 
materials resulted in higher alarm rates for PVT-based systems than ASP 
systems. Similarly, PVT-based systems demonstrated higher alarm rates 
for medical isotopes and industrial sources, though this was due to ASP 
decision software that categorized smaller sources as ``non-threats.'' 
Based on requests from CBP, revisions have since been made to ASP 
algorithms so that industrial sources will be referred for secondary 
scanning. Finally, and significantly, ASP systems outperformed handheld 
RIIDs in identifying all source categories, with the exception of bare 
industrial sources. Due to the extremely high signal strengths 
associated with industrial sources, performance between ASP systems and 
RIIDs was comparable in that instance.
    Effects of shielding on system performance.--This test campaign 
sought to provide preliminary measurements of the effects of shielding 
materials on system performance. In particular, tests evaluated the 
difference in ASP system response when different types of sources, 
including special nuclear materials, were placed inside varying amounts 
of shielding. For the purposes of this objective, system ``response'' 
included both detection and identification performance.
    As expected, all systems experienced difficulty in detecting and 
identifying certain heavily shielded materials, which results in signal 
strengths significantly below current ``guidance'' levels and 
requirements. This is consistent with performance of all passive 
detection systems. However, ASP systems were able to identify sources 
when placed inside almost all but the thickest shielding configuration 
tested.
    Relative performance for combined sources.--Phase III testing 
evaluated the relative performance of ASP systems against ``combined 
sources,'' where more than one emitting isotope was present. This 
portion of Phase III testing was designed to provide additional data 
collection opportunities for ASP vendors, in support of algorithm 
improvements. This testing was not designed to provide conclusive data 
as to the performance of ASP systems against ``masked'' sources.
    Phase III testing highlighted several areas where further study and 
algorithm development are required to reduce vulnerabilities. This data 
was provided to the ASP vendors, and software improvements are being 
incorporated into ASP revisions. In addition, the use of high-purity 
germanium-based systems, when operated in a ``wait-in mode,'' showed 
slightly better performance than other systems. However, these initial 
results are an indicator of potential capabilities, rather than proof 
of superior performance.
    Secondary screening for concepts of operations development.--
Additional evaluations were completed to assess varying concepts of 
operations for secondary scanning. ASP system performance was evaluated 
as a function of varying speeds and ``dwell times,'' or the amount of 
time that a source was present within the portal. Specifically, 
measurements were conducted as sources moved at several pre-set speeds 
through the portals, as well as instances where sources were stopped 
within the portal for a defined amount of time.
    The evaluations of concepts of operations demonstrated that 
scanning at 2 miles per hour, the current concept of operations for 
secondary scanning, could be sufficient for many source configurations. 
Results also indicated that longer dwell times for measurements may add 
value for the more challenging cases. However, it was not obvious that 
``wait-in mode'' concepts of operations provide advantages for certain 
threats.

                   PHASE III TEST REPORT--CONCLUSION

    The Phase III Test Campaign was a critical piece of a larger effort 
to evaluate the performance of ASP systems. Phase III testing was 
focused on testing the ASP systems with a significantly expanded 
variety of sources, shielding and ``masking'' configurations, and 
concepts of operations. The results have provided an additional data 
set in the on-going comparison of ASP performance to current systems. 
At the same time, the data gained from Phase III testing has limits, 
and it is critical that results are interpreted in the context of the 
original test objectives.

                      ASP-IRT REPORT--INTRODUCTION

    The ASP-IRT was tasked with providing two elements of assessment: 
(1) ``the testing approach, from contractor testing through operational 
testing, processes employed, specifications, test procedures, and 
analysis methods;'' and (2) ``the probability of success to detect and 
identify radiation and nuclear threats and assess the performance of 
the ASP [systems] compared to the first generation systems.'' The ASP-
IRT Report was a culmination of analyses aimed at assessing these two 
elements, conducted from August 2007 through February 2008. These 
analyses were based on information provided by DNDO and CBP, as well as 
other outside sources. This information included DNDO test plans, test 
reports from several ASP evaluations, and numerous discussions with 
officials from both DNDO and CBP. However, the analysis and conclusions 
reached were completely independent of either DNDO or CBP, and the 
resulting conclusions reflected the assessment of the ASP-IRT members.

                    ASP-IRT REPORT--INITIAL FINDINGS

    The ASP-IRT Report includes an Executive Summary which highlights 
conclusions of the document.
    While the ASP-IRT did not concur with assertions that the GAO made 
in September 2007 (GAO-07-1247T) that ASP testing in February through 
September 2007 ``used biased test methods that enhanced the performance 
of the ASPs,'' it did agree with other claims, including the fact that 
tests were ``not designed to measure the range of ASP system 
performance.'' In addition, the ASP-IRT indicated concern that test 
results and measures of effectiveness were not properly linked to 
operational outcomes, which led to difficulties in developing 
conclusions from the results. Fundamentally, the ASP-IRT asserted that 
testing to date was ``properly characterized as Developmental Test and 
Evaluation. Independent Operational Testing has not been conducted.''
    In evaluating the performance of ASP systems directly, as compared 
to first generation systems, the ASP-IRT focused solely on ASP 
secondary scanning operations. Based on initial independent analysis, 
the ASP-IRT concluded that for the 13 objects used in Phase I testing, 
using ASP systems ``did not affect the probability of a missed 
threat,'' when compared to current generation RIIDs. The ASP-IRT stated 
that this conclusion was based on the assumption that all RIID results 
of ``unknown'' where resolved by CBP Laboratory and Scientific Services 
(LSS), which provides technical support to CBP Officers at POEs. Yet, 
the ASP-IRT did allow that, based on an alternate assumption in which 
many RIID ``unknown'' alarms were resolved in the field, ``it appears 
that ASP could substantially reduce the probability of entry for nine 
of the 13 test objects--for most, by at least 20 to 30 percent and 
possibly by 30 to 50 percent.'' The ASPIRT was not able to draw any 
conclusion regarding the affect of ASP for the remaining four test 
objects.
    In addition, based on first principles calculations, the ASP-IRT 
asserted that ``the relative performance of the ASP [systems] and the 
RIID depends on several factors.'' The ASP-IRT argued that sample 
spectra from both systems would indicate comparable performance if a 
RIID is optimally placed. However, the ASP-IRT also acknowledged 
challenges associated with localizing radiation sources within a 
container, and the likelihood that operators may target the wrong ``hot 
spot'' for secondary inspection. The ASP-IRT also stated that ``ASP 
performance could be improved in all cases by slowing the passage of 
the truck through the portal, though there would be increased costs.'' 
The ASP-IRT noted the benefit of improved consistency in scanning 
provided by ASP systems, as compared to RIIDS, especially by ``reducing 
the impacts of operator inattention, fatigue, and variability of the 
placement of the RIID.'' Finally, the ASP-IRT also noted that 
``substituting the ASP for RIID in secondary screening would reduce the 
number of cases that qualify for referral to LSS under the current CBP 
CONOPS.''
    Additionally, the ASP-IRT made several additional observations 
based on their evaluation of the ASP program. These included a 
potential need for a more disciplined acquisition process to guide 
large DHS programs, an independent operational test and evaluation 
process, and a more well-defined requirements process to ensure that 
mission needs are properly accounted for in operational requirements.

       ASP-IRT REPORT--DNDO AND CBP RESPONSE TO INITIAL FINDINGS

    DNDO recognizes the thoughtful evaluation that the ASP-IRT provided 
to the Department, and values the critiques that were included in the 
Final Report. Several of the concerns that were raised are valid and 
the Department is taking steps to address these issues.
    Unfortunately, in some instances, analysis was limited to 
information immediately available, which was not in all cases a 
complete and accurate representation of events. In addition, due to the 
short time in which the ASP-IRT was tasked to produce a final report, 
subsequent iterations of information exchange that may have normally 
been performed were not feasible. Subsequently, staff from DNDO has met 
with ASP-IRT members, and many of the concerns that are outlined below 
were discussed. In many instances, it was acknowledged that as 
additional information that was provided to the ASP-IRT during their 
analysis, alternate conclusions emerged. In other instances, it appears 
that the ASP-IRT stands by its initial conclusions.
    Limitation of analysis to secondary scanning operations.--In 
Section I.C of the Report the ASP-IRT states that they ``sought to 
determine the extent to which the use of the ASP would impact the 
frequency of nuisance alarms and the probability of illicit radioactive 
materials passing through [POEs]--taking into account other equipment 
with which it might be used, as well as other means of detecting the 
illicit materials.'' However, in the same section, the ASP-IRT 
explained that they solely ``focused its analysis on the use of the ASP 
in the Secondary screening role.'' This approach to the analysis 
discounted the economic and time impacts of scanning delays due to high 
nuisance alarm rates in primary scanning. In addition, it also 
discounted the possibility that certain threats may never be referred 
to secondary screening. In the long term, DNDO and CBP expect that the 
greatest benefits of ASP technologies will be in these primary scanning 
operations, where DNDO testing at NYCT has already shown that ASP 
systems may reduce nuisance alarm rates by more than a factor of 10 
(1.70 percent for PVT systems and 0.11 percent and 0.12 percent for two 
ASP systems). A reduction of secondary referral rates of this 
magnitude, when averaged over the entire volume of cargo containers 
entering the United States annually, would potentially result in 
hundreds of thousands fewer secondary inspections required each year. 
The savings that the elimination of these inspections would have in the 
efficient processing of trade and manpower resources of CBP should not 
be ignored in what is argued to be a ``system-of-systems'' analysis.
    False dismissal rates in secondary inspections.--In its ``system-
of-systems'' approach, the ASP-IRT initially questioned the decision of 
DNDO to omit LSS analysis of RIID data from comparisons of system 
performance. DNDO has cited evidence that RIIDs produced ``unknown'' 
alarms in up to 60 percent of cases, leading to either increased 
requirements for physical inspections, or the potential for an 
inadvertent release of a threat. The ASP-IRT analysis instead assumed 
that all of these alarms would be sent to LSS for further analysis. The 
ASP-IRT raised questions about the validity of this assumption, based 
on contrary evidence from operational data, which indicated that actual 
LSS referral rates were less than one-tenth of the expected rates, 
based on evaluations of RIIDs. However, while acknowledging this 
discrepancy, the ASP-IRT only asserted that, ``If the ASP were to 
replace the RIID in Secondary screening, it seems likely that some 
fraction of these `unknown' cases would be properly resolved as NORM or 
else referred to LSS for resolution. However, based on the available 
data, we were not able to determine what that fraction would be.''
    The reality is that it is not operationally feasible to send all 
``unknown'' alarms to LSS for additional analysis. Operational data 
indicates that only 3,000 alarms are sent to LSS annually--far less 
than the minimum of 40,000 annually predicted by the ASP-IRT. DNDO 
estimates indicate that using ASP systems in secondary scanning 
operations would reduce the number of alarms requiring LSS analysis to 
approximately 3,000-4,000 per year, a number manageable with current 
LSS resources. More importantly, this would allow all alarms that 
should be referred to LSS to be actually referred to LSS, ensuring that 
no threats are mistakenly released into the Nation under an ``unknown'' 
alarm, even when CBP CONOPs are followed.
    First principles calculations of comparative performance.--In 
addition to providing analyses of available test data, the ASP-IRT 
performed a series of ``first principles'' calculations which attempted 
to predict the performance of hypothetical ASP systems and RIIDs. These 
calculations focused on the theoretical signal-to-noise ratios of the 
two systems, based upon distances from source to detector, the size of 
the detector, and the time interval of the scan. The ASP-IRT argues 
that the advantages provided by the additional detector size of ASP 
systems is, in some cases, outweighed by the shorter distance and 
longer scanning intervals provided by RIID systems. However, in initial 
calculations, the ASP-IRT assumed that RIIDs would be able to 
successfully locate the source ``hot spot'' and a lengthier (1 to 3 
minute) scan could be focused on that location, with a source to 
detector distance of 1 foot. The incorrect assumption that a RIID would 
be able to effectively localize and scan any source within 1 foot of a 
detector drastically affected the outcome, and significantly reduced 
the perceived improvements provided by ASP systems.
    The reality is that the height of containers (up to 13.5 feet) and 
the requirement that an operator hold the RIID (limiting effective 
detector height to 6 to 7 feet) make scanning the entire container 
surface with the RIID difficult. Additional calculations done by DNDO, 
and provided to the ASP-IRT, show that for sources located near the 
center of a loaded container the RIID is approximately as sensitive as 
the ASP but only over a 2-foot radius circle on the surface of the 
container. Outside of that radius, the sensitivity falls off 
drastically. This means that a single RIID measurement can only 
effectively scan approximately 2 percent of the area of the container. 
Test data indicate that it is difficult to accurately locate the 
correct ``hotspot'' at which to place the RIID, which further erodes 
the effectiveness of the unit. ASP systems, unlike RIIDs, stand 14 feet 
tall, and provide the ability to uniformly scan the entire contents of 
a container. In addition, this scan is performed in 15 seconds, as 
opposed to the 1 to 3 minutes per ``hot spot'' measurement by the RIID. 
To effectively scan the ``entire'' container with a RIID to the same 
consistency as an ASP would take approximately 1 hour, assuming only 1 
minute per scan. While the ASP-IRT acknowledges some of these 
challenges in the Report, they also propose alternative solutions, such 
as improved RIID software, or gantry systems for consistent scanning of 
containers. However, these calculations show that ASP is the solution 
that will effectively scan an entire container quickly, because even 
RIIDs with improved software (one recommendation of the ASP-IRT) would 
still be limited in effective detection ranges based on the smaller 
detector size and probability of localization error.
    Other effects that differentiate ASP Systems and RIIDs.--Finally, 
in addition to the issues highlighted above, DNDO has noted several 
other issues which affect the comparison of ASP systems and RIIDs that 
were not accounted for by the ASP-IRT. First, the ASP-IRT Report fails 
to account for the possibility of multiple ``hot spots'' in a single 
cargo container. Because CBP protocols require Officers to scan the 
entire container and then focus on the regions of highest detected 
radiation, threat materials with lower emissions could be missed and 
more intensive scans instead focused on other ``hotter'' locations 
within a container. Again, the ability of ASP systems to scan an entire 
container in a uniform fashion provides the ability to identify threats 
throughout a container, rather than just those that emit the most 
radiation. Second, while the ASP-IRT highlighted the importance of 
background radiation and the confounding effects that it has on 
radiation identification, their analysis does not account for the fact 
that ASP systems are designed to shield background radiation from 
interfering with the detection of sources in the containers being 
scanned. While ASP systems are shielded by one inch of steel on the 
back and sides of the detector, RIIDs have no similar shielding to 
focus the detection containers. While difficult to quantify, this 
shielding provides measurable improvement in the ASP signal-to-noise 
ratio when compared to RIIDs. While the ASP-IRT acknowledged these 
additional effects, it did not adopt firm positions as to the 
associated benefits.

                       ASP-IRT REPORT--CONCLUSION

    The ASP-IRT Report provides a valuable independent assessment of 
the ASP program, and will serve as an important source of information 
in the eventual decision to certify ASP systems. However, their 
analysis was limited in the scope of information made available, which 
in some instances may have resulted in conclusions contrary to those 
that would have been reached if more information were available. Since 
that time, DNDO has met with the ASP-IRT and provided additional 
information and discussed differences in conclusions reached by each 
party. In some instances, DNDO analysis has shown that ASP-IRT 
conclusions provided unnecessarily limited appreciation for the 
improvements that ASP systems offer. These issues include the ASP-IRT 
decision to focus solely on ASP improvements to secondary scanning 
operations, assumptions made by the ASP-IRT concerning referrals of 
``unknown'' secondary alarms to LSS, the probability of localization 
error on first principles analysis of RIID performance, and other 
issues which differentiate ASP systems and RIIDs. DNDO believes that 
when these issues are considered, ASP systems clearly provide an 
improvement in operational effectiveness when compared to current 
systems.
    DNDO looks forward to continuing to work with CBP and other 
partners within and beyond DHS to improve the Nation's ability to 
detect radiological and nuclear threats at our ports and borders. DHS 
is facing an enormous challenge at our ports and borders as it 
struggles to balance the flow of goods and commerce with the need to 
sufficiently scan cargo for radiological or nuclear threats as it 
enters our Nation. The technologies that DNDO is pursuing in the ASP 
program are a critical component in addressing that challenge.

                               CONCLUSION

    I am confident that our plan for the development and evaluation of 
ASP systems is sound. The Phase III test results show promise from ASP 
systems, and the ASP-IRT has provided a valuable assessment of the 
program to date.
    I welcome and appreciate the committees' active engagement with 
this program, and look forward to continuing our cooperation as we move 
forward together. This concludes my prepared statement. Chairman 
Langevin, Ranking Member McCaul, and Members of the subcommittee, I 
thank you for your attention and will be happy to answer any questions 
that you may have.


    Mr. Langevin. The Chair now recognizes Mr. Thompson for 5 
minutes.

  STATEMENT OF GEORGE E. THOMPSON, DEPUTY DIRECTOR, PROGRAMS, 
                  HOMELAND SECURITY INSTITUTE

    Mr. Thompson. Mr. Chairman, Representative McCaul and 
distinguished members, thank you for the opportunity to address 
the subcommittee on the subject of the advanced spectroscopic 
portal.
    My name is George Thompson, and I am the Chair of the ASP 
Independent Review Team or IRT. Our final report was delivered 
to you and your staff last week.
    An independent expert review is a valuable source of 
advice, but only if the experts are truly expert and only if 
the advice is truly objective. So before talking about the 
findings of our report, I would like to take a moment to talk 
to you about the IRT itself and the process that was used to 
conduct the review.
    Beginning with myself as Chair, my role was to help frame 
the issues, integrate the team's efforts and contribute 
substantively. I should mention that I am currently a deputy 
director of the Homeland Security Institute, which is an FFRDC 
or federally Funded Research and Development Center. It was 
established in 2004 under the Homeland Security Act. Thus, it 
was the Congress that wisely foresaw the need for DHS to have 
access to an independent objective source of expert technical 
advice on complex Homeland Security problems such as the one we 
are discussing today.
    In September 2007, Mr. Paul Schneider, then Under Secretary 
for Management, asked me to serve as the IRT Chair. The review 
had been underway for several weeks at the time.
    Upon accepting the role, I reviewed the quality of the team 
members selected by my predecessor. They were, in a word, 
outstanding: Dr. Alan Berman of the Penn State Applied Research 
Lab; Dr. Dennis Slaughter, formerly of the Lawrence Livermore 
Lab; Dr. Peter Vanier of Brookhaven National Laboratory; Dr. 
Michael Wright of Oak Ridge National Lab and Dr. Kaus-Peter 
Ziock of Oak Ridge. All recognized experts in the basic and 
applied science of nuclear detection.
    However, I also recognize the need for experts in 
acquisition management and testing. Therefore, I asked three 
other distinguished individuals to serve as reviewers: Mr. 
Thomas Christie, former DOD director of Operational Test and 
Evaluations; Mr. William Houley, first director of Defense 
Acquisition Reform; and Dr. Marion Williams, formerly chief 
scientist at the Air Force Test and Operational Center or 
AFOTEC. These and all participants executed strict conflict-of-
interest and nondisclosure agreements.
    Mr. Schneider identified the specific questions that he 
wanted the team to answer, but he gave the IRT free rein in 
answering them. He also made it very clear that the team was 
free to offer any other observations we chose.
    From late August to late October the team reviewed over 120 
documents, interviewed dozens of key staff, both internal and 
external at DHS, and traveled to four ports of entry to observe 
both first-generation systems and ASP units in operation. At 
the time our goal was to complete the report by mid-November to 
inform the certification decision by the Secretary.
    However, in early November as we were drafting our report, 
the IRT learned that the Secretary had chosen to defer that 
decision. Nevertheless, Ms. Duke asked us to complete the 
report, and we delivered an interim version on November 19.
    During the remainder of November, December, January and 
early February, the DNDO and CBP reviewed the interim report. 
They discussed its contents with the team, provided some 
additional data and delivered a written response on February 
15. The IRT carefully considered the additional data and 
comments, made some important revisions and delivered the final 
report to DHS on February 20.
    With my remaining time, I would like to provide an overview 
of the final report. Our findings are organized around the two 
questions posed by Mr. Schneider last year to assess the ASP 
testing approach and to assess ASP performance.
    Because the Department intended an initial deployment of 
ASP in the so-called secondary screening role and also because 
of the limitations of the test approach, we focused the 
performance assessment on secondary screening.
    With respect to the ASP testing approach, the IRT 
identified several aspects of the overall testing approach that 
we believe could and should be improved. In general, these 
include a broader characterization of system performance and a 
stronger linkage between test results and operational outcomes.
    We also looked at the specific test procedures used in 
2007. Although they were not ideal, we did not find evidence 
that the test results had thereby been biased or manipulated.
    Second, ASP performance. We considered both security, 
minimizing the chance of a threat entering the United States, 
and commerce, minimizing unnecessary inspection of innocent 
cargo. In general, we found that the hand-held systems 
currently used to identify radioisotopes in cargo are 
characterized by wide variations in performance. The ASP could, 
if it performs in the field as intended--and if appropriate 
standard operating procedures are developed--could 
substantially reduce these variations in performance and thus 
reduce some key uncertainties in the Nation's ability to 
counter the threat of nuclear smuggling.
    Finally, the report also offered a number of observations 
concerning the need for greater discipline in requirements in 
test and evaluation oversight.
    In closing, I am grateful for the opportunity to be in 
service. I will do my best to answer any questions you might 
have and I will gladly make myself available to you and your 
staff for more detailed discussions if you wish.
    Thank you very much.
    Mr. Langevin. Thank you, Mr. Thompson.
    [The statement of Mr. Thompson follows:]

                Prepared Statement of George E. Thompson
                             March 5, 2008

                          INTRODUCTORY REMARKS

    Mr. Chairman, Representative McCaul, and distinguished Members: 
Thank you for the opportunity to address the subcommittee on the 
subject of the Advanced Spectroscopic Portal. My name is George 
Thompson, and I am the Chair of the ASP Independent Review Team (IRT). 
Our Final Report was delivered to you and your staff last week. The 
report is considered For Official Use Only, so I will be providing a 
general overview rather than a detailed description of the team's 
findings.
    An independent review is a valuable source of advice for 
decisionmakers--but only if the experts on the review team are truly 
expert, and only if the advice they provide is truly objective. So 
before I talk about our findings, I'd like to describe the IRT itself 
and process that was used to conduct the review.

                THE INDEPENDENT REVIEW TEAM (IRT) CHAIR

    As Chair of the IRT, my own role was to help frame the issues and 
integrate the contributions of the IRT members into a coherent report. 
I also contributed substantively in those areas in which I am 
personally knowledgeable. My formal background is in Applied 
Mathematics. I have spent the last 30 years as a practitioner of 
Operations Analysis, which is really just disciplined problem-solving, 
using the tools of mathematics, probability and statistics, simulation 
modeling, and systems analysis--with a healthy measure of critical 
thinking and common sense thrown in. I am currently a Deputy Director 
of the Homeland Security Institute. The Institute is what is known as a 
federally Funded Research and Development Center (FFRDC). It was 
established in 2004, pursuant to section 312 of the Homeland Security 
Act, which specified that the Institute was to be administered by the 
Science and Technology Directorate on behalf of the entire Department 
of Homeland Security (DHS). Thus, at the same time it established DHS, 
the Congress wisely foresaw the need for the Department to have a 
knowledgeable, independent and objective source of expert technical 
advice on complex homeland security problems--and that is the mission 
of the Homeland Security Institute.
    In September 2007, Mr. Paul Schneider, then Under Secretary for 
Management, asked me to serve as the IRT Chair. At that time, the 
review had already been underway for several weeks. Two other 
individuals, Dr. Pete Nanos of the Defense Threat Reduction Agency, and 
Mr. John Higbee of the Defense Acquisition University had, in turn, 
served briefly in this role but had to withdraw their services.

                            THE IRT MEMBERS

    When I accepted the role of IRT Chair, one of my first actions was 
to review the qualifications of the team members that had already been 
selected by my predecessor. They were, in a word, outstanding.
   Dr. Alan Berman of the Penn State Applied Research Lab is a 
        renowned expert in signal processing who has served on numerous 
        advisory panels for the U.S. Navy and Office of the Secretary 
        of Defense.
   Dr. Dennis Slaughter, formerly of the Lawrence Livermore 
        National Laboratory, is an expert in low energy nuclear physics 
        and cargo security.
   Dr. Peter Vanier of Brookhaven National Laboratory is an 
        expert in the detection of nuclear weapons. He is also a member 
        of the so-called Regional Reachback team that analyzes gamma-
        ray spectra submitted by State and local law enforcement 
        organizations.
   Dr. Michael Wright of Oak Ridge National Laboratory is 
        another Reachback analyst who is also expert in instrument 
        development and systems integration.
   Dr. Klaus-Peter Ziock of Oak Ridge is a recognized authority 
        on the subject of systematic noise and its impacts on radiation 
        detection.
    These individuals are certainly well-qualified in the basic science 
of nuclear detection.
    However, it was clear to me that the review would also need to 
consider important issues involving acquisition management, systems 
engineering and the basic principles of test and evaluation. 
Accordingly, I asked three other individuals with distinguished careers 
in the Department of Defense (DoD) to serve as reviewers of the draft 
report.
   Mr. Thomas Christie was formerly the DoD Director of 
        Operational Test and Evaluation.
   Mr. William Houley was the first Director of Defense 
        Acquisition Reform and a former Director of Test and Evaluation 
        on the staff of the Chief of Naval Operations.
   Dr. Marion Williams was formerly Chief Scientist and 
        Technical Director of the Air Force Operational Test and 
        Evaluation Center.
    In addition, a small group of technical support analysts--Mr. James 
Hurd, Mr. Bruce Shelton, and Ms. Georganne John--provided valuable 
assistance in areas such as systems engineering, process modeling, and 
program management.

                 ENSURING OBJECTIVITY AND INDEPENDENCE

    All these individuals--and indeed, all individuals who had access 
to the study in progress--were required to execute strict conflict of 
interest and nondisclosure agreements. As IRT chair, I had full 
visibility into the team's deliberations, and at no time did I observe 
anything less than an intellectually honest and open discussion of the 
issues.
    The DHS Under Secretary for Management provided the team a Terms of 
Reference memorandum, which spelled out the specific questions to be 
answered. However, the IRT had free reign in answering those questions, 
and the Under Secretary made it clear that the team was free to offer 
any other observations we saw fit to provide.

                           REPORT CHRONOLOGY

    During the roughly 2-month period from late August to late October, 
the team reviewed over 120 documents including test plans, test 
reports, directives, technical reports, briefings, and spreadsheets. 
(Further details are contained in Section II.B of the report, and a 
complete listing is at Appendix 4.) We conducted interviews and 
technical discussions with key staff from the Domestic Nuclear 
Detection Office (DNDO), Customs and Border Protection (CBP), the 
Pacific Northwest National Laboratory, the National Institute of 
Standards and Technology, the Department of Energy's National Nuclear 
Security Administration, and others. (The report lists the dates of the 
key meetings.) Team members traveled to four ports of entry to observe 
both first-generation systems and ASP units in operation. (Again, for 
full details, see section II.B of the report).
    At the time, our goal was to complete the report by mid-November, 
to inform a certification decision by Secretary Chertoff. (As you know, 
the Fiscal Year 2007 Appropriations Act contained language requiring 
the Secretary to certify to the appropriations committees that the ASP 
represents a ``significant increase in operational effectiveness'' 
compared to first-generation radiation detection and identification 
systems.) However, in early November, as we were drafting our report, 
the IRT learned that the Secretary had chosen to defer that decision. 
Nonetheless, Ms. Elaine Duke, Deputy Under Secretary for Management, 
asked the team to complete the report, since its findings could still 
be used--for example, to improve a new round of ASP testing. We 
delivered an interim report on November 19, 2007.
    During the remainder of November, December, January, and early 
February, DNDO and CBP reviewed the interim report. They discussed its 
contents with the team, and provided some additional data. DNDO and CBP 
delivered a written response on February 15, 2008. (That response is 
included in the report as Appendix 8.) The team carefully considered 
each statement in the response and decided whether to make changes as a 
result. (See Appendix 9 of the report.) In some cases, the team agreed 
with DNDO and CBP, and revised the report accordingly. In other cases, 
we disagreed with a DNDO and CBP statement; however, we could see the 
need do a better job in explaining our ideas. In all cases, we were 
careful to explain why we agreed or disagreed, and what changes (if 
any) we made as a result.
    We delivered the Final Report to DHS on February 20, 2008.

            SCOPE OF REVIEW--PRIMARY AND SECONDARY SCREENING

    Section I.C of the report describes which topics were studied, 
which were not, and why. It is important to understand that our 
assessment of ASP performance concentrated on the use of the ASP in the 
so-called Secondary screening role: Primary screening detects the 
presence of radiation in cargo; Secondary screening identifies the 
isotopes to determine whether or not there is a threat.
    One reason we focused on Secondary screening was DHS's intent, as 
of last Fall, to make an initial deployment to Secondary in order to 
gain greater operating experience with the ASP. We were charged with 
informing that decision. Another reason is that, in our judgment, the 
existing test data are insufficient to assess the operational impact of 
using the ASP in the Primary role. Section V.C of the report discusses 
the potential benefits and risks associated with using ASP in the 
Primary role, and the reasons why we believe that additional testing 
and analysis is needed.

                           OVERVIEW OF REPORT

    The report includes a chronology of events associated with the 
review itself, a description of the process used to ensure quality and 
objectivity, a summary of our technical approach, the system-of-systems 
framework that we developed in order to assess the operational 
significance of improved detection and/or identification capability, 
and, of course, our independent assessment of the ASP test procedures 
and the test results.
    A series of appendices provides additional technical detail, as 
well a list of source documents, biographies of the team members, and a 
copy of the Conflict-of-Interest/Non-Disclosure Agreement (COI/NDA) 
form that each of them was required to complete. As mentioned 
previously, the DNDO and CBP Response to the interim report is included 
as an appendix, as is the IRT's assessment of that Response.
    The report findings are organized around the Terms of Reference 
(TOR). The TOR asked the team to do two things: first, assess the ASP 
testing approach; and second, compare the performance of the ASP to 
first-generation radiation detection and identification systems.

                 REPORT FINDINGS--ASP TESTING APPROACH

    The IRT identified several aspects of the overall testing approach 
that we believe could and should be improved. In general, these include 
a broader characterization of system performance and a stronger linkage 
between test results and operational outcomes. We developed an 
operational process flow and proposed scoring schema that we believe 
could help DHS do a better job in assessing the operational impact of 
the ASP. We also looked at the test procedures that were used in 2007. 
Although those procedures were not ideal, we did not find any evidence 
that the test results were thereby biased or manipulated.

                    REPORT FINDINGS--ASP PERFORMANCE

    In assessing ASP performance, the IRT considered both security 
(minimizing the chance that a threat would be allowed to enter the 
United States) and commerce (minimizing the unnecessary screening and 
inspection of innocent cargo). We identified the key variables and made 
an independent estimate of ASP impacts on security and commerce based 
on test data, operating experience with first-generation systems, 
physical first-principles, and other factors. As noted earlier, our 
assessment of performance assumes that the ASP is used in the Secondary 
screening role, to replace the hand-held systems that are currently 
used.
    In general, we found that the hand-held systems currently used to 
identify radioisotopes in cargo are characterized by wide variations in 
performance. These variations derive from the degree to which these 
systems rely on the judgment of the CBP Officer in adjudicating 
radiation alarms, the degree to which their performance depends on 
source-detector geometry and the ability to localize the source within 
the container, and the degree to which their performance can be 
degraded by operator inattention or fatigue.
    The ASP could--if it performs in the field as intended, and if 
appropriate standard operating procedures are developed--substantially 
reduce these variations in performance and thus reduce some key 
uncertainties in the Nation's ability to counter the threat of nuclear 
smuggling.

                         REPORT FINDINGS--OTHER

    Many of the issues associated with the ASP test program are rooted 
in a larger set of issues having to do with the processes by which DHS 
manages large and/or complex acquisition programs. Accordingly, the IRT 
also offered a number of observations concerning the need for greater 
discipline in DHS acquisition management, requirements, and test and 
evaluation oversight.

                           CONCLUDING REMARKS

    I am grateful for the opportunity to be of service and to help 
inform important decisions on homeland security issues such as nuclear 
smuggling. I will do my best to answer any questions you may have, and 
I will gladly make myself available to you and your staff for more 
detailed discussions if you wish. I respectfully request that my formal 
statement be submitted for the record. Thank you.

    Mr. Langevin. The Chair now recognizes myself for 5 minutes 
for purpose of questions.
    Mr. Oxford, I would like to begin with you.
    As I stated in my opening statements, several findings in 
both reports caused me concern, and I would like to discuss one 
that I found particularly troubling, and perhaps you can help 
clarify. The unclassified executive summary phase of your 
report states that when it comes to identifying mass sources, 
PVT system performance appears to be better than ASP systems, 
because the PVT systems are on Naturally Occurring Radioactive 
Material, or NORM. As I understand it, the whole reason for 
pursuing ASP is because PVT makes no distinction between threat 
material or NORM. So the ASP is supposed to be able to 
distinguish between these two types of material.
    However, the report indicates that when a threat object is 
masked by NORM, the ASP is only seeing the masking material and 
not the threat object. This, in effect, makes it no better than 
the PVT. In some cases, it could even be more dangerous because 
if the ASP indicates that threat material masked by NORM is 
NORM only, then we have a false negative which could let 
dangerous material in.
    However, it seems to me that because PVT cannot distinguish 
between the two at all, that you have the same problem with 
that system. I don't understand how you would say that the PVT 
performed better, and I will get to explain that in some more 
detail.
    Mr. Oxford. Thank you, Mr. Chairman. In fact in your 
opening statement, you mentioned the nuances that go on in this 
business. You are right on in that regard.
    PVT will alarm on all NORM. In the report it says that it 
has higher alarm rates, which isn't necessarily the proper 
metric, as you tried to point out. When it does alarm, again, 
we have a lot of nuisance alarms. That gets sent to secondary.
    The problem in secondary, and if you refer to the IRT 
report--this is where they become closely coupled--you are now 
dependent on this small hand-held device to be able to try to 
find the threat in the middle of the masking sources.
    We have done analysis that shows to actually scan an entire 
container would take upwards of an hour with a hand-held device 
to be able to localize a potential threat. The hand-held 
devices will actually lock onto the highest output from the 
container and would likely pick up the NORM material regardless 
of what was embedded in it. So you are right in saying that if 
you refer to secondary, it doesn't necessarily get better.
    Now, on the other side with ASP, you have to couple the 
Phase III report along with the Phase I report, where we tested 
against the actual threat basis that I referred to in my 
opening statement. In that case where we looked at masking 
cases with the size of sources that are representative of the 
actual threat that we are designing against, we were getting 
probability of detection and ID, both in primary and secondary, 
of greater than 95 percent.
    So when you see the cases in the Phase III report, what you 
are seeing is a reflection of having now extended the size of 
the sources to, in some cases, a source size of less than 
approximately 15 percent of the threat basis, so we actually 
tried to look at how far we can extend the performance of ASP 
under those conditions. Even in those cases, we are getting 
answers of 50 to 60 percent, probability of detection ID for 
ASP against a much broader threat than what you see. So you 
have to really combine the Phase I and Phase III test reports 
to see the broad range of the outcomes.
    Mr. Langevin. To go back to the original question, does PVT 
perform better against masked material than ASP?
    Mr. Oxford. Again, it is not a fair comparison to do PVT 
and ASP. What you have to do is actually distinguish between 
the material, so you actually have to compare the overall alarm 
rates in primary to the ability in secondary to be able to find 
whether you can identify the threat source or not.
    So in this case you have to compare ASP and primary and 
secondary to a PVT and a hand-held device; and secondary, 
because it is the hand-held device that gives you the 
identification and secondary. If you look at the IRT report, 
they spent a lot of time looking at the comparison of ASP and 
the RIID, the hand-held device and secondary. We think there 
are severe limitations of the hand-held device and secondary. 
So in case if you look at the system-to-system comparison, PVT 
does not outperform ASP.
    Mr. Langevin. In its report, the ASP Independent Review 
Team found that DHS could benefit from an independent 
operational test and evaluation process and organization 
structure to ensure that testing measures the operational 
performance and reliability of the new system.
    Mr. Thompson, what were the factors with ASP testing that 
led to the IRT to make this recommendation? For Director 
Oxford, does the DNDO plan, too, use such an independent 
process or organization in the ASP testing plan for the next 
several months?
    Beyond just ASP, do you agree with this general finding, 
and will you use an independent organization to conduct testing 
on other technologies developed by DNDO?
    We will start with Mr. Thompson.
    Mr. Thompson. Thank you, Mr. Chairman.
    The factors that were behind that statement I would 
characterize as general principles or best practices, if you 
will, of test and evaluation. It is not uncommon in other 
organizations such as DOD to make a careful distinction between 
development tests conducted by the developing entity for the 
purpose of trying to improve the system and refine the design, 
make it better, prove that it does what it can do, versus 
operational test, where you put it in the operational 
environment, you let the actual operators operate it.
    Since those are too clearly difference types of tests with 
two clearly different kinds of objectives, most organizations 
will assign two different entities to be responsible for those.
    Mr. Langevin. Thank you. Director Oxford.
    Mr. Oxford. Mr. Chairman, we agree with the 
recommendations.
    As I noted in my opening statement there are several things 
in that report that we said we were going to adopt. I would 
also like to allow Ms. Duke to address this from the 
Department's perspective as well, but we are already taking 
steps to bring in independent testing on behalf of the 
operational basis Dr. Thompson mentioned, where we will 
essentially rotate the lead responsibilities over the course of 
the testing that will take place this year.
    For those purely developmental tests, DNDO will still have 
the lead, although we will have an external oversight function 
that we will review all test plans, assign those test plans and 
then also report to the Under Secretary for Management about 
their adequacy.
    When it gets into the operational testing, DNDO will 
provide mainly a support role as opposed to a lead role in the 
future because we did not have this independent capability.
    Again, I would like Ms. Duke to have the opportunity to 
address how the Department is going to do this.
    Mr. Langevin. Ms. Duke.
    Ms. Duke. Yes. I also agree with the recommendation, and we 
are clearly looking at how we can build the independent 
operational test and evaluation capability for the Department.
    Under Secretary for Science and Technology Jay Cohen has 
the lead on test and evaluation for the Department, and his 
executive agent is George Ryan, who has about 50 years of 
experience in the tests and evaluation. So we do believe that 
Mr. Ryan, as I stated earlier, will serve in what would be 
equivalent to the director of Operational Test and Evaluation 
role that DOD has for the continued testing of the ASP program.
    Mr. Langevin. Thank you. The Chair now recognizes the 
Ranking Member for 5 minutes.
    Mr. McCaul. Thank you, Mr. Chairman.
    Again, thank you for holding this hearing. I think the 
timing couldn't be more relevant. As we discovered yesterday 
from a computer seized from the FARC, a terrorist organization 
in Latin America, revealed that there were many communications 
with Venezuela and Hugo Chavez who is aligned with Iran, 
discussions about nuclear material, discussions about dirty 
bombs.
    I have always been concerned about that alliance to the 
Middle East in our own hemisphere, and this puts it in our own 
backyard. Coming from Texas I am always concerned, without 
engaging in a lot of hyperbole or irresponsible rhetoric, that 
the border does face some challenges in the post-9/11 world 
because that is obviously where this stuff is going to come 
from and cross over into. That, I think, demonstrates the need 
for this.
    I commend all of you for your work in this area because it 
is so important.
    Mr. Thompson, your great work in reviewing this ASP system, 
making these outside independent recommendations is very 
helpful.
    I tend to agree with you that the ASP when perfected will 
reduce the uncertainty in the current process. I think it will 
provide a more accurate system that will be more efficient. 
They can help the flow of commerce at the border, and yet still 
enter our ports of entry and also, at the same time, provide 
more accurate readings in terms of threat material and to 
distinguish between real threat material and nonthreat 
materials.
    So, having said all of that, I think we can all--I think I 
am safe in saying this--I think we can all agree on the goal 
here, and that is to provide the best technology possible in 
this screening process. I happen to believe the ASP is a way to 
go in this.
    I will throw this out to all three of you. What do you see 
as the current deficiencies in the ASP in terms of what needs 
to be improved upon? Perhaps more for you, Mr. Oxford, what is 
your timetable here in terms of your last round of testing and 
the ability to get this out? Then how, assuming we do perfect 
this, how soon can this be operational in the field?
    Mr. Oxford. Thank you for those questions.
    First of all, I will use a combination of the charts that 
are available to you and try to speak to this in a logical way. 
When we did the last round of field evaluation systems, we 
found what a key--what I call operational functionalities that 
CBP required that weren't quite ready for deployment, and this 
is what led to the Secretary's decision to delay the program. 
Those functionalities are not its ability to distinguish and 
detect and identify the threat.
    Let me just give you some examples of what this means. Once 
you put these systems into this stream of commerce, as you have 
already mentioned, we can't allow commerce to stop. We are 
having trouble with what I call system stability with some of 
the systems, where if they powered down they would take hours 
to reboot, which means you are essentially cutting down the 
traffic in that lane.
    The specification that the current ASP contractors are 
working to is they have got to be able to reboot within 1 
minute. If they use the natural background readings that were 
already in the system, they have to be able to reboot within 7 
minutes if they want to collect new background.
    That is a stability function. It is not performance against 
the threat. It is the operability of the system that CBP 
requires. In a similar way, they want their supervisory 
computer to be able to control four traffic lanes at any port 
of entry. We focused on one lane. This is a broadening of the 
requirement the CBP asks for.
    The contractors are now building to that. What will be the 
results of that--if you look at the chart that's in front of 
you--will be the first round of testing that we call systems 
qualifications test. That will be done at the vendor's location 
where they will start to demonstrate to ourselves, to the 
independent test office, as well as to the Customs and Border 
Protection, that these functionalities have now been built into 
the system and that they are stable.
    That will allow us to mature down the path of taking these 
out to Pacific Northwest National Laboratory, the CBP and the 
independent test organization we now use, to make sure, once 
again in a field-like environment, that these systems are ready 
for fielding. That will then be followed by deployment to CBP 
locations at CBP's choice. That is where DNDO will play a 
support role as opposed to a lead role.
    Independent of that, we will go back and verify that the 
performance against a threat has not been degraded in any means 
by the system upgrades. We always worry that if you upgrade a 
system that somehow you have lost performance elsewhere. So it 
is parallel to what CBP and the independent test organization 
do. We will take these back out to Nevada to make sure that 
they have not been degraded against the threat. That should 
lead us to an August timeframe, late fiscal year 2008, to a 
recommendation for the Secretary.
    Mr. McCaul. So if we are fortunate, by August we could have 
the new system?
    Mr. Oxford. We will be able to make an immediate production 
and deployment decision if the Secretary chooses to. We have 
low-rate initial production systems sitting in the warehouse 
floors with our current vendors, and we can immediately provide 
the upgrades to those systems, and those would be the initial 
deployment units.
    So that would be rather rapid while we go into the 
production buy, which is a 4- to 6-month time period to place 
the order to begin delivery. So we will be able to field 
roughly 45 systems within a very short period of time after the 
decision is made.
    Mr. McCaul. That is very good.
    Mr. Thompson, do you have any comments?
    Mr. Thompson. Yes, sir. In terms of improvements, I will 
mention four things: One area just has to do with the normal 
things you do as you take a technology or a system and you 
mature it, so it has to do with things like does the system 
operate reliably in the field. Those things you test and you 
prove them out over time.
    The second area has to do with some of the details of how 
you set thresholds for detection and identification. That is a 
pretty complicated thing that goes on within the algorithms of 
these systems. I think it is fair to say that DNDO is still 
doing some learning on that, again, as you normally do when you 
develop a system.
    The third area concerns what I call standard operating 
procedures, again, something that you do iteratively when you 
field a new system. It is a marriage of the technology and the 
procedures that you used to implement it.
    So, for example, is 2 miles per hour the right speed for 
the truck to go? What exactly should the investigating officer 
do when he sees a certain alarm condition? Those kinds of 
things.
    My fourth comment is not about the ASP system or technology 
itself, but on the analysis of test data, and you saw this in 
the report. It has to do with how you slice and dice the test 
data. I think there are some things that probably could be done 
better in this round of testing, but again that concerns the 
analysis of data, not the ASP system per se.
    Mr. McCaul. Just for clarification again, Mr. Oxford, do 
you think by August you would have the last round of testing 
and then, if successful, it would take maybe 4 to 5 months of 
production and then all the systems could be upgraded in the 
field at that point?
    Mr. Oxford. The initial upgrades would actually be for the 
systems that were manufactured under what we call low rate of 
initial production. They have already been manufactured. We 
have been precluded by the appropriations language to doing--to 
spending any acquisition money to upgrade those right now.
    So as soon as we would get a decision from the Secretary, 
we would have the ability to upgrade those systems immediately. 
That is not the 4 to 6 months. The 4 to 6 months would be to 
place the orders and to begin receiving deliveries of the 
brand-new systems on top of the 45 that are sitting in the 
warehouses now.
    Mr. McCaul. Thank you.
    Mr. Langevin. I thank the Ranking Member.
    The Chair now recognizes the gentleman from New Jersey, Mr. 
Pascrell, for 5 minutes.
    Mr. Pascrell. Thank you, Mr. Chairman.
    Mr. Chairman, the Nuclear Detection Office, their cost-
benefit analysis, it would seem to me, the report says this, 
does not justify the recent decision to spend $1.2 billion to 
purchase and deploy the ASP technology.
    In particular, the Domestic Nuclear Detection Office used, 
the report says, ``incomplete and unreliable data to evaluate 
the cost-benefit.''
    Mr. Chairman, we just had an example of what is going on in 
Homeland Security last week when we discussed the fence along 
the south, the southern border, Solution 28, Project 28. We saw 
how money was spent there, and now they have to replace 
everything.
    I would hope, because I know the great work that Ms. Duke 
is doing since you have come on board, you have only been there 
a little while, but you are doing a good job and I think you 
are trying to get things in order. But there is a perception, 
you know, why are we--how did we ever get ourselves in the 
position to using incomplete and unreliable data?
    The enemy is not--I mean the enemy, the enemy is within. We 
can't get it right. We are spending money in a very foolish 
way. We are wasting the money.
    We know in this situation that we need a--you need a 
primary, and you need static screening. I think that is very 
obvious, it is very clear. You might have to spend more money 
up front, but you either pay now or you pay later, and that is 
what we are dealing with. I am very concerned about this 
general perception, about spending money and then going back 
and having to do a lot of redos again and again.
    So, Dr. Oxford, I would like to ask you a question. We need 
the most sophisticated technology at our ports to ensure that 
there is proper nuclear detection. I think we would agree on 
that.
    But we must strike a balance between the technology 
advancements, and, obviously, what it is going to cost. Those 
are factors that have to be taken into consideration.
    It is my belief that the DNDO should not simply invest in 
new technology--this is my opinion--should not simply invest in 
new technology if it does not make a significant advancement 
over the equipment that is already deployed in the field. If it 
is not going to be a significant improvement, we are wasting 
money.
    So my question to you is, do Customs and Border Protection 
members, officers in the field, do they regard advanced 
spectroscopic portal monitors as a significant leap forward in 
detection over the equipment that is presently used in the 
field?
    The follow-up question to both Dr. Thompson and Under 
Secretary Duke is, do you have any additional thoughts on this 
question?
    Let me start with Dr. Oxford.
    Mr. Oxford. Thank you. First of all, I need to set the 
record straight, because this $1.2 billion number is out there, 
and it is in the press, and it is wrong. But it is the total 
value of the contracts we signed.
    But right now, if we live with the deployment decisions 
that we have planned with CBP, the total acquisition of systems 
would result in about 350, not that that is a small number, but 
about $350 million. So we have a contract flexibility to buy up 
to that amount.
    Mr. Pascrell. The total project is $1.2 billion, what we 
are talking about specifically. But we can't get through the 
specific project unless we go through each of those, so the 
$1.2 billion----
    Mr. Oxford. It is a contract ceiling. Contract ceiling 
versus what we plan to spend.
    Mr. Pascrell. I understand that. I think the Secretary will 
ultimately make the decision, not CBP, as to whether this 
represents a significant increase in operational performance. 
We are not spending the acquisition money in the amount that 
you are citing until the Secretary makes that decision, so we 
have not gone out and started to spend that money.
    We are spending development money to get to the point where 
we can make that acquisition decision through the testing 
program that I have played out today.
    CBP is a partner in this. I will tell you if you look at 
the numbers at Los Angeles/Long Beach, they are getting 400 to 
500 nuclear alarms per day. They have dedicated almost 200 
officers to this mission to help resolve those alarms. Current 
projections of ASP performance would allow that number to come 
down to 20 to 25 alarms that they would have to take seriously, 
which is a tremendous improvement in their flexibility to 
manage all the missions that CBP----
    Mr. Pascrell. Detected in one example, in one specific area 
we are talking about, what have we detected? What have we 
concluded? What have we found? Are they all false alarms? Are 
they all relevant? None of them, all of them?
    Mr. Oxford. No. So far if you look at the test data we have 
available, we have been able to against the threat basis that I 
mentioned in my opening statement that we are getting 
probability of identification and detection against those 
threats of greater than 95 percent. That is a much higher 
number than anything that is in the field today.
    Second, when we looked at inserting these systems into an 
operational port, and we did this at the port of New York-New 
Jersey, we are able to show that the secondary referral rates 
right now would come down by at least a factor of 10. So the 
amount of secondary referrals that the operator would have to 
pay attention to is down by at least a factor of 10, maybe as 
much as 20.
    That is the test data we have available. We will evaluate 
that over the course of the test program that I have laid out 
for you today.
    Mr. Pascrell. Would you agree when I am saying that we need 
a primary and secondary screening process so that we have a 
backup system that works that may be more cost-effective even 
though you have to spend more money up front?
    Mr. Oxford. Absolutely. That is the current configuration 
at our ports, at our land-border crossings. That is the model 
that we will continue to follow and just make CBP more 
effective.
    Mr. Pascrell. Madam Under Secretary.
    Ms. Duke. I agree. I think the position of CBP which was 
considered in the Secretary's decision was that they see 
significant promise in the ASP technology, but that the amount 
of operational test and evaluation done introduced too much 
risk.
    In all the decisions we have to make, Vayl has to make in 
terms of being the program executive, are really risk 
management. You have to tradeoff the risk of schedule, of 
performance, of putting out a technology that is not ready to 
be done. So throughout this process we will continue to manage 
that risk so that we can get the technology out as quickly as 
possible if the test results continue to demonstrate it is an 
improvement in threat detection and identification, but not put 
it out so quickly that the risk is actually increased because 
the operators aren't ready to use it. That is what we are going 
forward with CBP and DNDO.
    Mr. Thompson. Sir, just three comments, if I may. First--
and I guess this is my own clarification of the record--I 
believe the language that you have read on incomplete and 
unreliable data for the cost-benefit analysis is actually from 
a GAO report that was either circa October 2006 or----
    Mr. Pascrell. I think it is. They did a report also.
    Mr. Thompson. That is correct.
    Mr. Pascrell. You don't identify with what they said? You 
don't agree with that?
    Mr. Thompson. We were not tasked to look at the cost-
benefit analysis or, I should say, to renew the cost-benefit 
analysis. I will say that since the time the cost-benefit 
analysis was done, there is much more test data. If the 
analysis were to be redone, it would use the new and updated 
data, I am sure.
    The second point I wanted to make is I agree with your 
statement that it is important to look at both the costs and 
the comparison to the existing equipment. Absolutely. That was 
the reason for the second question that Mr. Schneider asked us 
to compare the ASP to the existing fielded systems.
    I won't quote the numbers here in open session, but I think 
you can see that there are some potentially pretty significant 
improvements in terms of reducing the probability of a missed 
threat, even just with a deployment to secondary, assuming that 
the standard operating procedures and so forth can be gotten 
right.
    The third point I want to make about ASP, the operator, is 
just an antidote to illustrate that that can be a pretty tricky 
affair. When the team went out to Long Beach and they had both 
the first-generation systems and the ASP side by side, and we 
said to the operators how do you like this ASP thing, they 
said--one of them said, I don't like it. We said, why not? It 
disagrees with my first-generation system. I said, well how do 
you know the first-generation system is telling you the right 
thing?
    So it is--you don't always get the carefully considered 
result by just asking the operator.
    Mr. Pascrell. Of course the response from the operator is--
you know, has to be tested too. But it would seem to me that 
that information must be part of your conclusion. These are the 
people that are going to be using this equipment. Their input 
into the equipment, and how the equipment should be used and 
what they are looking for would seem to me to be essential if 
we are ever going to protect the country.
    Mr. Thompson. Absolutely, I agree with you, sir, so the 
operator is the expert when it comes to how this system will be 
used and whether it is useful to him in that way.
    Mr. Pascrell. Well, don't you ask the people who are going 
to use the equipment: What don't you have that you should have? 
Don't you kind of work that into the criteria of what new stuff 
is going to come on line?
    Mr. Thompson. Yes, sir, that is an important step in the 
process.
    Mr. Pascrell. Thank you, Mr. Chairman.
    Mr. Langevin. The Chair now recognizes the gentlelady from 
the Virgin Islands, Ms. Christensen, for 5 minutes.
    Mrs. Christensen. Thank you, Mr. Chairman, and thank you 
for holding this hearing.
    Let me just say that the whole thing, as my colleague 
before me said, we had the hearing on the fence and the 
technology on the fence and funding money that was spent, to 
only now find that we have to redo certain parts of the system. 
It really troubles me that a request for appropriation would be 
made to purchase a technology and we would have to say test it 
first.
    So I am hoping--and I think I heard from Ms. Duke that more 
independent review teams will be set up and a process will be 
in place so that we don't have to go through this over and over 
again.
    Some of my questions have to do with the process, and I 
guess I would start with Dr. Thompson.
    One of your findings was that in the Phase I test, the team 
didn't find a document that laid out the requirements of the 
operational performance requirements and, instead, just a 
system specification that ASP should provide 95 percent or 
greater probability of detection. You concluded that this very 
reliance on system specification versus operational performance 
requirements made it difficult to achieve the real purpose of 
the test, which was to measure the progress of the ASP program 
toward meeting the operational program objectives.
    Can you expand on that, explain what the concern was there?
    Mr. Thompson. Yes, ma'am, I will certainly try.
    Again, it has to do I think with what questions you ask 
when you are trying to gauge a program like this.
    You can ask the question: Does this machine here have the 
ability to detect such and such of a substance at 95 percent 
probabilities, or whatever probability, and that is one 
question you can ask and that's a good and useful question.
    But there is a larger question as well which is: What is 
the probability that a threat will actually get through at the 
ports? There has to be a way to relate the answer to the first 
question to the answer to the second question, and that was one 
step that we found difficult to do.
    The good news is that the team felt compelled to be as 
constructive as possible so we actually constructed an end-to-
end analysis framework and offered it to the Department as an 
example of how one might take the existing results and use them 
to answer that larger question.
    Mrs. Christensen. Dr. Oxford, have later tests followed 
that have addressed this problem? How would you specify the 
mission objectives, perhaps, in response to this concern from 
IRT?
    Mr. Oxford. Yes, ma'am. A couple of points that I would 
like to point out. First of all, we are going back and we are 
addressing the specific requirements of the customer in terms 
of the functionalities they need in the field--as I pointed out 
earlier--that we will make sure that the unique features of 
these systems that CBP requires are built into the system and 
adequately tested, so that when these are delivered they will 
function as CBP requires.
    The other factors that Dr. Thompson mentions in terms of 
how do we score, how do we connect the operational outcomes was 
again something--if we go back to the Chairman's comments, it 
is very nuanced. The fact that we weren't able to directly 
explain the test data in some cases and how we scored it made 
it hard for some people in the limited amount of time to 
connect that to the operational outcome.
    We accept that, and we are trying to go come up with a 
better scoring mechanism where people can look at the results 
and then can make a direct comparison of that. That is what we 
will use when we go to the Secretary for any decision.
    Mrs. Christensen. The processes that you are setting up for 
independent review and analysis of new technology goes beyond 
the technology related to nuclear material, correct?
    Ms. Duke. Yes.
    Mrs. Christensen. Because several issues have been brought 
to our attention, for example, on the new passport card and 
changes that are being made; the fact that the new one doesn't 
have these identifying things on the back. So technology like 
this also--has it been going through the same process, being 
tested for efficiency and effectiveness?
    Because it would just seem to me, just as a layperson in 
this field, that having this embedded information would make 
the card a better card.
    Ms. Duke. Yes, there are two pieces to it. One is the test-
and-evaluation piece, which we are building. We do think we 
need an independent test capability. Whether we rely on one 
that exists in the Department of Defense or somewhere else or 
build our own has not been decided. We do think we need that 
sustained independent operational test capability.
    The second piece is in terms of just as these investments, 
whether it SBInet, whether it is one of our credentialing, our 
identification cards, what we are building is an investment 
review process where these major programs--and we have about 
100 of what we consider major programs--would come before the 
Department at key milestone decisions.
    So in the case of ASP, we are at between low-rate initial 
production where they buy a small number of machines to test 
them, go into the production decision, so that is a system that 
is going to be put in for all our major acquisitions.
    What we are doing right now to kind of, since we are just 
building that capability, is we are looking at all the major 
programs as they stand currently. As I said in my opening 
statement, we have done almost 40, what we are calling quick 
looks, just looking at the existing programs that haven't had 
oversight sustained, and assessing them based on risk and then 
focusing our attention on the programs that appear to have the 
most acquisition risk.
    Mrs. Christensen. Thank you, Mr. Chairman.
    Mr. Langevin. I thank the gentlelady.
    The gentleman from Texas, Mr. Green is recognized.
    Mr. Green. I thank you, Mr. Chairman, and I thank the 
Ranking Member as well for hosting this hearing, and I thank 
the witnesses for appearing.
    I have some information indicating that GAO has had some 
concerns. Is this true?
    Mr. Oxford. They have looked at the program several times, 
and we continue to respond to their inquiries, yes, sir.
    Mr. Green. Is it true that they have used terms like 
``biased methods to enhance performance results'' having been 
used?
    Mr. Oxford. They have used those terms, yes.
    Mr. Green. How have you responded to their contention that 
the methods have not been, shall we say, using the best 
practices for methods?
    Mr. Oxford. We clearly have disagreed with that, and we 
were happy to see that the IRT, when they took a fresh look at 
this, found no evidence of biasing of the data or manipulation 
of the data.
    Mr. Green. So, do we still have a dispute between IRT and 
GAO?
    Mr. Oxford. I wouldn't want to speak on behalf of the GAO. 
I don't think they have had to even respond to the IRT report 
at this point.
    Mr. Green. The IRT report has gone to GAO?
    Mr. Oxford. Well, it is separate from DNDO, so I would have 
to ask the Chair or Ms. Duke, as to what the dispensation of 
that report is.
    Ms. Duke. I do not know if--I would have to take a get back 
on that. I do not know if the final report has gone to GAO.
    Mr. Green. How long has the report been one that we would 
label as final?
    Ms. Duke. The final report came--is dated February 20.
    Mr. Green. Of?
    Ms. Duke. Of 2008.
    Mr. Green. 2008. Is there in the process a procedure or 
requirement that the report eventually will get to GAO?
    Ms. Duke. There is no requirement to distribute the report 
at all, I don't believe. What we did within a week of receiving 
it, we briefed the Secretary on it and then immediately we 
released it to our authorizing and our appropriation 
committees, but that was because we knew the continued 
interest. So it was sent. But to my knowledge we have not sent 
it to GAO.
    Mr. Green. If GAO has the concerns regarding GAO not 
getting the reports and whether or not the concerns have been 
adequately dealt with----
    Ms. Duke. If GAO would like the report, I would have no 
objection to releasing it to them.
    Mr. Green. Mr. Chairman, is it appropriate for us to in 
some way expedite this process of having GAO review the report?
    Mr. Langevin. Well, I agree with Mr. Green's question and 
the suggestion that the GAO should look at the report, and I 
would urge the Department of Homeland Security to forward that 
report to GAO so they might take a look at it.
    Ms. Duke. Yes, Mr. Chairman. I don't have any objection to 
that.
    Mr. Langevin. Thank you.
    Mr. Green. Thank you, Mr. Chairman. I yield back the 
balance of my time.
    Mr. Langevin. I thank the gentleman. I thank the witnesses 
for the testimony today. It has been very enlightening. I want 
to close by saying I have been a member of the Homeland 
Security Committee since its inception when it was first a 
select committee, and of course it was right after 9/11 that it 
was formed. I can remember the great concern of the entire 
country that we could potentially be vulnerable to a nuclear 
attack, being a material--either a weapon or material being 
smuggled in across our borders and we were totally unprotected. 
We have come a long way since then. I commend the DNDO and 
Department of Homeland Security for moving as aggressively as 
they have to ensure that we have maximum coverage of radiation 
portal monitors at our ports and border crossings. This is a 
daunting task. Obviously the work is not yet done, but 
significant progress has been made at least with the deployment 
of these first generation radiation portal monitors. I know we 
are anxious to get to the newest and best technology available. 
We all hope that it will be as soon as possible, but progress 
has been made and we look forward to more of it in the future.
    Thank you all for the great work you are doing for the 
country. I appreciate it, and I do want to thank the witnesses 
for the valuable testimony, the Members for their questions, 
and the Members of the subcommittee, including myself, may have 
additional questions for the witnesses and would ask that you 
respond expeditiously in writing to those questions.
    Hearing no further business, the subcommittee stands 
adjourned.
    [Whereupon, at 3:41 p.m., the subcommittee was adjourned.]

                                 
