[House Hearing, 111th Congress]
[From the U.S. Government Publishing Office]
THE SCIENCE OF SCIENCE
AND INNOVATION POLICY
=======================================================================
HEARING
BEFORE THE
SUBCOMMITTEE ON RESEARCH AND SCIENCE EDUCATION
COMMITTEE ON SCIENCE AND TECHNOLOGY
HOUSE OF REPRESENTATIVES
ONE HUNDRED ELEVENTH CONGRESS
SECOND SESSION
__________
SEPTEMBER 23, 2010
__________
Serial No. 111-109
__________
Printed for the use of the Committee on Science and Technology
Available via the World Wide Web: http://www.science.house.gov
______
U.S. GOVERNMENT PRINTING OFFICE
58-486PDF WASHINGTON : 2010
-----------------------------------------------------------------------
For sale by the Superintendent of Documents, U.S. Government Printing
Office Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800; DC
area (202) 512-1800 Fax: (202) 512-2104 Mail: Stop IDCC, Washington, DC
20402-0001
COMMITTEE ON SCIENCE AND TECHNOLOGY
HON. BART GORDON, Tennessee, Chair
Majority:
JERRY F. COSTELLO, Illinois
EDDIE BERNICE JOHNSON, Texas
LYNN C. WOOLSEY, California
DAVID WU, Oregon
BRIAN BAIRD, Washington
BRAD MILLER, North Carolina
DANIEL LIPINSKI, Illinois
GABRIELLE GIFFORDS, Arizona
DONNA F. EDWARDS, Maryland
MARCIA L. FUDGE, Ohio
BEN R. LUJAN, New Mexico
PAUL D. TONKO, New York
STEVEN R. ROTHMAN, New Jersey
JIM MATHESON, Utah
LINCOLN DAVIS, Tennessee
BEN CHANDLER, Kentucky
RUSS CARNAHAN, Missouri
BARON P. HILL, Indiana
HARRY E. MITCHELL, Arizona
CHARLES A. WILSON, Ohio
KATHLEEN DAHLKEMPER, Pennsylvania
ALAN GRAYSON, Florida
SUZANNE M. KOSMAS, Florida
GARY C. PETERS, Michigan
JOHN GARAMENDI, California
VACANCY
Minority:
RALPH M. HALL, Texas
F. JAMES SENSENBRENNER JR., Wisconsin
LAMAR S. SMITH, Texas
DANA ROHRABACHER, California
ROSCOE G. BARTLETT, Maryland
VERNON J. EHLERS, Michigan
FRANK D. LUCAS, Oklahoma
JUDY BIGGERT, Illinois
W. TODD AKIN, Missouri
RANDY NEUGEBAUER, Texas
BOB INGLIS, South Carolina
MICHAEL T. McCAUL, Texas
MARIO DIAZ-BALART, Florida
BRIAN P. BILBRAY, California
ADRIAN SMITH, Nebraska
PAUL C. BROUN, Georgia
PETE OLSON, Texas
------
Subcommittee on Research and Science Education
HON. DANIEL LIPINSKI, Illinois, Chairman
Majority:
EDDIE BERNICE JOHNSON, Texas
BRIAN BAIRD, Washington
MARCIA L. FUDGE, Ohio
PAUL D. TONKO, New York
RUSS CARNAHAN, Missouri
VACANCY
BART GORDON, Tennessee
Minority:
VERNON J. EHLERS, Michigan
RANDY NEUGEBAUER, Texas
BOB INGLIS, South Carolina
BRIAN P. BILBRAY, California
RALPH M. HALL, Texas
DAHLIA SOKOLOV, Subcommittee Staff Director
MELE WILLIAMS, Republican Professional Staff Member
MARCY GALLO, Democratic Professional Staff Member
BESS CAUGHRAN, Democratic Professional Staff Member
MOLLY O'ROURKE, Research Assistant
C O N T E N T S
September 23, 2010
Witness List
Hearing Charter

Opening Statements

Statement by Representative Daniel Lipinski, Chairman, Subcommittee on
  Research and Science Education, Committee on Science and Technology,
  U.S. House of Representatives
    Written Statement
Statement by Representative Vernon J. Ehlers, Minority Ranking Member,
  Subcommittee on Research and Science Education, Committee on Science
  and Technology, U.S. House of Representatives
    Written Statement

Witnesses:

Dr. Julia Lane, Program Director of the Science of Science and
  Innovation Policy Program, National Science Foundation
    Oral Statement; Written Statement; Biography
Dr. Daniel Sarewitz, Co-Director of the Consortium for Science, Policy
  & Outcomes and Professor of Science and Society, Arizona State
  University
    Oral Statement; Written Statement; Biography
Dr. Fiona Murray, Associate Professor of Management, Technological
  Innovation & Entrepreneurship Group, MIT Sloan School of Management
    Oral Statement; Written Statement; Biography
Dr. Albert H. Teich, Director of Science & Policy Programs, American
  Association for the Advancement of Science
    Oral Statement; Written Statement; Biography

Appendix 1: Answers to Post-Hearing Questions

Dr. Julia Lane, Program Director of the Science of Science and
  Innovation Policy Program, National Science Foundation

Appendix 2: Additional Material for the Record

Statement of the California Healthcare Institute (CHI) submitted by
  Representative Brian P. Bilbray
THE SCIENCE OF SCIENCE AND INNOVATION POLICY
----------
THURSDAY, SEPTEMBER 23, 2010
House of Representatives,
Subcommittee on Research and Science Education
Committee on Science and Technology
Washington, DC.
The Subcommittee met, pursuant to call, at 2:07 p.m., in
Room 2325 of the Rayburn House Office Building, Hon. Daniel
Lipinski [Chairman of the Subcommittee] presiding.
HEARING CHARTER
COMMITTEE ON SCIENCE AND TECHNOLOGY
SUBCOMMITTEE ON RESEARCH AND SCIENCE EDUCATION
U.S. HOUSE OF REPRESENTATIVES
The Science of Science and Innovation Policy
THURSDAY, SEPTEMBER 23, 2010
2:00 p.m.-4:00 p.m.
2325 RAYBURN HOUSE OFFICE BUILDING
1. Purpose
On Thursday, September 23, 2010, the Research and Science Education
Subcommittee will hold a hearing to examine the current state of
science and technology policy research, how this research informs
policymaking, and the role of the federal government in fostering
academic research and education in this emerging interdisciplinary
field.
2. Witnesses
Dr. Julia Lane, Program Director of the Science of
Science and Innovation Policy program, National Science
Foundation.
Dr. Daniel Sarewitz, Co-Director of the Consortium
for Science, Policy & Outcomes, Arizona State University.
Dr. Fiona Murray, Associate Professor of Management
in the Technological Innovation & Entrepreneurship Group, MIT
Sloan School of Management.
Dr. Albert H. Teich, Director of Science & Policy
Programs, American Association for the Advancement of Science.
3. Overarching Questions
What is the ``science of science policy?'' How can
science and technology (S&T) policy research contribute to and
inform evidence-based local and national policy decisions? To
what extent are science and technology policies in the United
States being shaped by what has been learned from S&T policy
research?
What new and continuing areas of research in this
area could significantly improve our ability to design
effective programs and better target federal research
investments? What are the most promising research opportunities
and what are the biggest research gaps? Is the Federal
government, specifically the National Science Foundation,
playing an effective role in developing the science of science
policy?
What is the state of education in science and
technology policy at U.S. universities? What are the
backgrounds of students pursuing graduate degrees in S&T
policy? What career paths are sought by science and technology
policy program graduates? What are the fundamental skills and
content knowledge needed by science and technology policy
practitioners? Is the National Science Foundation playing an
effective role in fostering the development of science and
technology policy programs at U.S. universities?
4. Background
During his keynote address in 2005 at the American Association for
the Advancement of Science's Science and Technology Policy Forum, Dr.
John Marburger, then science advisor to President Bush, called for the
establishment of a ``science of science policy.'' The ``science of
science policy'' (SoSP) as described by Dr. Marburger and others
includes the development of scientific theories, analytical tools, and
rigorous datasets that will assist policymakers in science policy
decisions. The SoSP is an interdisciplinary field that draws together
researchers from economics, political science, and the social and
behavioral sciences to improve our understanding of the science and
engineering enterprise, including the process of innovation in an
effort to establish a more quantitative approach to science and
technology policy decisions.
While most believe that science, technology, and innovation are
critical to the competitiveness and prosperity of the United States, we
lack the rigorous tools to quantify that relationship. Therefore, it
remains difficult to actually measure the economic impact, social
benefits, and effectiveness of federal research and development (R&D)
investments. In addition to improving our ability to target federal R&D
investments, research in the area of SoSP holds the potential to
provide insight into the effect of globalization on the U.S. science
and engineering workforce, increase our understanding of technology
development and diffusion, communicate the social and economic benefits
of R&D spending to the general public, and shed light on the process of
creativity and innovation.
In 2006, in response to Dr. Marburger's call to action, an
interagency working group, co-chaired by the National Science
Foundation (NSF) and the Department of Energy, was formed within the
Subcommittee on Social, Behavioral, and Economic Sciences under the
National Science and Technology Council. The interagency working group
conducted an assessment of the state of SoSP research and surveyed the
Federal agencies about the tools, methods, and data they were using to
make investment decisions. This work resulted in the release of a
Federal SoSP research roadmap \1\ in 2008. The roadmap outlines three
broad themes and poses 10 research questions to be addressed by
federally-funded SoSP research.
---------------------------------------------------------------------------
\1\ The Science of Science Policy: A Federal Research Roadmap
http://www.whitehouse.gov/files/documents/ostp/NSTC%20Reports/
39924-PDF%20Proof.pdf
---------------------------------------------------------------------------
Theme 1: Understanding Science and Innovation
Question 1: What are the behavioral foundations of innovation?
Question 2: What explains technology development, adoption,
and diffusion?
Question 3: How and why do communities of science and
innovation form and evolve?
Theme 2: Investing in Science and Innovation
Question 4: What is the value of the Nation's public
investment in science?
Question 5: Is it possible to ``predict discovery''?
Question 6: Is it possible to describe the impact of discovery
on innovation?
Question 7: What are the determinants of investment
effectiveness?
Theme 3: Using the Science of Science Policy to Address National
Priorities
Question 8: What impact does science have on innovation and
competitiveness?
Question 9: How competitive is the U.S. scientific workforce?
Question 10: What is the relative importance of different
policy instruments in science policy?
Role of the National Science Foundation
In 2006, NSF's Directorate for Social, Behavioral and Economic
Sciences (SBE) held three workshops to ask for recommendations and
guidance from the research community about the breadth of activities
that should be supported under an NSF-funded SoSP program. In 2007, NSF
allocated $6.8 million for a new Science of Science and Innovation
Policy (SciSIP) program. SciSIP supports both single investigators and
collaborations in two areas. First, the program supports research on
data and the improvement of science metrics, including research to
improve our ability to identify, characterize, and measure returns on
federal R&D investments. Second, the program supports research directed
toward the development of models and other statistical tools as well as
qualitative studies that will improve our understanding of the process
of innovation and science outcomes, both societal and economic. In
addition to supporting research, the program supports workshops,
conferences, and symposia to help foster a community of researchers in
the SciSIP area.
NSF's SciSIP budget request for fiscal year 2011 was $14.25
million, of which $8.05 million would be devoted to SciSIP research and
community-building activities through SBE's Office of Multidisciplinary
Activities and $6.2 million to the development of data survey tools
through SBE's Division of Science Resources Statistics (SRS). The data
compiled by SRS for the biennial Science and Engineering Indicators
report serve a vital role in the SoSP as a long-term source of unbiased
information about the science and engineering enterprise.
NSF's current efforts in SciSIP are not its first. From the 1970s
through the early 1990s, NSF had a modest-sized staff carrying out
policy research and analysis. These analysts worked in the Office of
Research and Development Assessment, later the Division of Policy
Research and Analysis (PRA), on specific tasks requested by the Office
of Management and Budget, the Office of Science and Technology Policy,
the Congressional Office of Technology Assessment, and other federal
agencies. Additionally, PRA had a small budget to support academic
research in areas directly relevant to their policy analysis tasks. In
1992, PRA was involved in a scandal over the faulty assumptions used to
predict a looming shortage in engineers. The scandal led to an
investigation by the Committee on Science & Technology and the
dismantling of PRA.
STAR METRICS
The National Science Foundation and the National Institutes of
Health are currently collaborating on a project known as STAR METRICS
(Science and Technology for America's Reinvestment: Measuring the
Effect of Research on Innovation, Competitiveness and Science), which
is the first federal-university partnership to develop a data
infrastructure that documents the outcomes of science investments for
the public. An initial pilot project was recently completed with a
handful of regionally and otherwise diverse institutions of higher
education through the National Academies' Federal Demonstration
Partnership. The pilot project validated the initiative's concept and
its ability to collect relevant data from existing university
databases. The full-scale project will proceed in two phases: Phase I
will develop uniform, auditable and standardized measures of job
creation resulting from science spending included in the American
Recovery and Reinvestment Act; Phase II will develop measures of the
impact of federal R&D spending on economic growth, workforce
development, scientific knowledge, and social outcomes.
International Efforts
The Organization for Economic Cooperation and Development (OECD)
has been developing and collecting science and technology indicators
from its member nations for nearly 50 years. In 2004, the Science &
Technology Ministerial called for a ``new generation of indicators
which can measure innovative performance and other related output of a
knowledge-based economy'' emphasizing ``the data required for the
assessment, monitoring and policy making purposes.'' \2\ Since that
time the OECD has continued to refine its science and technology
indicators and improve the tools it uses for analyzing the impact of
science and technology. Earlier this year the OECD released a report
entitled ``Measuring Innovation: A New Perspective.'' \3\ The report
identifies five areas for which international action is needed: the
development of innovation metrics that can be linked to aggregate
measures of economic performance; investment in a high-quality and
comprehensive statistical infrastructure to analyze innovation at the
firm level; the promotion of innovation metrics in the public sector
and for public policy evaluation; the identification of new approaches
to understanding knowledge creation and flow; and the promotion of the
measurement of innovation's impact on social goals.
---------------------------------------------------------------------------
\2\ What Indicators for Science, Technology and Innovation Policies
in the 21st Century? Blue Sky II Forum--Background http://www.oecd.org/
dataoecd/9/48/37082579.pdf
\3\ http://www.oecd.org/document/22/0,3343,en-41462537-41454856-44979734-1-1-1-1,00.html
---------------------------------------------------------------------------
On April 14, Dr. Julia Lane spoke to the European Parliament about
the STAR METRICS effort, emphasizing the global nature of science and
engineering and the common need for better tools to assess and predict
the impact of science, technology, and innovation. During her speech,
Dr. Lane indicated that creating a universal researcher identification
system could be an important first step in a global effort to
understand and measure the return on scientific investment. Niki
Tzavela, a Greek Member of the European Parliament, who serves as Vice-
Chair of the European Parliament Delegation to the United States, and
sits on the Parliament's Industry, Research, and Energy Committee
(ITRE), has been a leader on the issue of improved science metrics in
the European Union. Having indicated that the EU 8th Framework Program
represents an opportunity to evaluate and improve science policy, Mrs.
Tzavela introduced an initiative to the ITRE Committee proposing that
the EU collaborate on this topic with the United States. The EU is now
considering initiatives that would complement the STAR metrics project,
and the Scientific Technology Options Assessment Panel within the EU
has been designated to provide an in-depth analysis on Science
Metrics.\4\
---------------------------------------------------------------------------
\4\ http://www.euractiv.com/en/science/eu-looks-to-us-model-for-
measuring-rd-impact-news-448950
---------------------------------------------------------------------------
Education in Science & Technology Public Policy
According to the AAAS Guide to Graduate Education in Science,
Engineering and Public Policy,\5\ more than 25 U.S. universities offer
a graduate degree in the interdisciplinary field of science and
technology public policy. These degree programs draw from a number of
fields, including economics, sociology, political science, and
engineering; however, the coursework associated with each program
varies and depends upon the academic department or school that houses
the program.
---------------------------------------------------------------------------
\5\ http://www.aaas.org/spp/sepp/
---------------------------------------------------------------------------
5. Questions for Witnesses
Dr. Julia Lane
1. Please describe NSF's Science of Science and Innovation
Policy program, including a description of the Foundation's
overall vision and strategy for research and education in this
area.
Specifically,
How is NSF fostering collaboration between
social and behavioral scientists and
researchers from other disciplines, including
computer scientists, engineers, and physical
scientists, in science and technology policy
research?
How is NSF fostering the development of science
and technology policy degree programs and
courses of study at colleges and universities?
What is the current scope and level of support
for such programs?
How is NSF encouraging the development of a
community of practice in science of science
policy and the dissemination of research
results to policy makers?
2. As a Co-Chair of the Science of Science Policy Interagency
Group under the National Science and Technology Council, please
briefly describe the work of that group and how the various
federal science agencies are collaborating on the development
and implementation of science of science policy tools to
improve the management and effectiveness of their R&D
portfolios and other science and technology-related programs.
3. Please provide a brief description and update on the status
of the OSTP-led project on science metrics, known as STAR
METRICS, including a description of international engagement
and interest in this effort.
Dr. Albert Teich
1. How can research on innovation and the scientific
enterprise, also known as the science of science and innovation
policy (SciSIP), be used to inform the design of effective
federal programs and the management of federal research
investments? Do you believe the results of science and
technology policy research are being effectively incorporated
into national policy decisions?
2. What are the challenges to the incorporation of science and
technology research in the decision making process? What is
AAAS's role in mitigating those barriers? Specifically, how is
AAAS helping to build a community of practice in the SciSIP?
What recommendations, if any, do you have for the National
Science Foundation's SciSIP program? Do you believe SciSIP
research is being effectively coordinated across the federal
agencies? If not, what if any recommendations do you have
regarding interagency coordination?
3. As you know there are more than 25 U.S. universities that
offer graduate degrees in science, engineering and public
policy. In your opinion, are these programs having the intended
effect of producing graduates with the skills necessary to
shape science and technology policies? What type of education
and training should science and technology policy practitioners
receive? Is the National Science Foundation playing an
effective role in fostering the development of science and
technology policy programs at U.S. universities? If not, what
recommendations, if any, do you have for NSF and/or the
universities with such programs?
Dr. Daniel Sarewitz
1. Please provide an overview of the research activities of
the Consortium for Science, Policy, and Outcomes. How are you
facilitating interdisciplinary collaborations within the
Consortium? What new and continuing areas of research in the
science of science and innovation policy (SciSIP) could
significantly improve our ability to design effective programs
and better target federal research investments? What are the
most promising research opportunities and what are the biggest
research gaps?
2. Is the Federal government, specifically the National
Science Foundation, playing an effective role in fostering
SciSIP research and the development of a community of practice
in SciSIP? What recommendations, if any, do you have for the
National Science Foundation's SciSIP program?
3. Please describe the education and outreach activities of
the Consortium for Science, Policy, and Outcomes.
4. How can the dissemination of SciSIP research findings be
improved so that policymakers are better informed of the
current state of research? Are there best practices that can be
implemented by the Federal government and/or the research
community to improve the incorporation of science and
technology policy research into the decision making process?
5. What are the fundamental skills and content knowledge
needed by SciSIP researchers and practitioners? What are the
backgrounds of students pursuing graduate degrees in science
and technology policy and what career paths are sought by
these graduates? Is the National Science Foundation playing an
effective role in fostering the development of science and
technology policy degree programs at U.S. universities? If not,
what recommendations, if any, do you have for NSF and/or the
universities with such programs?
Dr. Fiona Murray
1. Please provide an overview of your research. What new and
continuing areas of research in the science of science and
innovation policy (SciSIP) could significantly improve our
ability to design effective programs and better target federal
research investments? What are the most promising research
opportunities and what are the biggest research gaps?
2. Is the Federal government, specifically the National
Science Foundation, playing an effective role in fostering
SciSIP research and the development of a community of practice
in SciSIP? What recommendations, if any, do you have for the
National Science Foundation's SciSIP program?
3. What are the fundamental skills and content knowledge
needed by SciSIP researchers and practitioners? What are the
backgrounds of students pursuing graduate degrees in science
and technology policy and what career paths are sought by
these graduates? Is the National Science Foundation playing an
effective role in fostering the development of science and
technology policy degree programs at U.S. universities? If not,
what recommendations, if any, do you have for NSF and/or the
universities with such programs?
Chairman Lipinski. This hearing will now come to order.
Good afternoon and welcome to today's Research and Science
Education Subcommittee hearing on the Science of Science and
Innovation Policy, also known as SciSIP. For those of you who
may not be familiar with the phrase, the Science of Science
Policy is a field of interdisciplinary research that focuses on
understanding how our policy decisions impact innovation and
science and engineering research. Given the magnitude of the
federal investment in science and technology, there is a need
for objective analysis and evaluation of federally funded R&D
programs. And given the size of the budget deficit,
Congressional decision makers need the best information
possible to make sure we are spending taxpayer dollars
optimally.
Today we will be hearing from a diverse panel of witnesses
about the current state of research and education in this
emerging field. This topic is of particular interest to me
since it goes to the core of why I joined the Science and
Technology Committee when I first came to Congress. Like most
members of this committee, I believe that science and
engineering research and education have driven long-term
economic growth and improved the quality of life for all
Americans. I have viewed science and innovation policy as
critical for maintaining our international competitiveness and
creating jobs.
But the best policies are not self-evident. As someone who
was trained as an engineer and a social scientist, I believe we
need data and proper analysis of this data to be able to
determine as best we can the optimal policy. We are going to
hear today about some of the research that is being done on
science policy. I am eager to hear the panel's thoughts on
what is being found, how well these findings are being
disseminated, and whether research in this area is actually
helping policymakers.
While many of us would agree that science has had a
positive impact on our lives, I think we know very little about
how the process of innovation works. What kinds of research
programs or institutional structures are most effective? How do
investments in R&D translate to more jobs, improved health, and
overall societal well-being? How should we balance investments
in basic and applied research? With millions of Americans out
of work it becomes more critical than ever that we find answers
to these questions.
We will also take a closer look at the state of education
in science and technology policy and how these degree programs
and courses of study are contributing by educating the next
generation of researchers and science policy practitioners.
There are a variety of science and technology policy programs that are
popping up across the country. They can be found in public
policy schools, economics departments, business schools, and
other places, even philosophy departments. I am looking forward
to hearing more about these programs, including what kinds of
students they attract and where those students go upon
graduation.
Finally, I hope to hear recommendations from today's
witnesses about how the Federal Government, particularly the
National Science Foundation, can foster interdisciplinary
research in this area, and how it can contribute to improved
education and training for students who want to pursue a career
at the intersection of science, technology, and public policy.
I thank the witnesses for being here this afternoon, especially
as we have had to move this hearing back from the morning. I
look forward to your testimony.
Now before I recognize Dr. Ehlers, I--this will likely be
the last hearing of this subcommittee, the last meeting of this
subcommittee. It may not be, but just in case it is the last
for this Congress, I wanted to say that I think we should all
recognize Dr. Ehlers for his contributions in Congress and
especially on this committee through the years. It has been
certainly--I have had a great partner working on this as I have
chaired the Subcommittee for the last two years. He is someone
who really, truly is dedicated to the issues that we are facing
here and we deal with here in the Committee. Too many things
right now are becoming partisan footballs, and Dr. Ehlers
really has kept his eye on what is best and trying to find what
is best for our country. And I want to thank you for the years
that you have put in here and wish you the best in your next
endeavors, but it has been a pleasure to work with you,
especially over these last few years.
[The prepared statement of Chairman Lipinski follows:]
Prepared Statement of Chairman Daniel Lipinski
Good afternoon and welcome to today's Research and Science
Education Subcommittee hearing on the science of science and innovation
policy, also known as SciSIP. For those of you who might not be familiar
with the phrase, the ``science of science policy'' is a field of
interdisciplinary research that focuses on understanding how our policy
decisions impact innovation and science and engineering research. Given
the magnitude of the federal investment in science and technology,
there is a need for objective analysis and evaluation of federally
funded R&D programs. And given the size of the budget deficit,
Congressional decision makers need the best information possible to
make sure we are spending taxpayer dollars optimally. Today we'll be
hearing from a diverse panel of witnesses about the current state of
research and education in this emerging field.
This topic is of particular interest to me since it goes to the
core of why I joined the Science and Technology Committee when I came
to Congress. Like most Members of this committee, I believe that
science and engineering research and education have driven long-term
economic growth and improved the quality of life for all Americans. I
view science and innovation policy as critical for maintaining our
international competitiveness and creating jobs.
But the best policies are not self-evident. As someone who was
trained as an engineer and a social scientist, I believe we need data
and proper analysis of this data to be able to determine--as best we
can--the optimal policy to implement. We are going to hear today about
some of the research that is being done on science policy, and I am
eager to hear the panel's thoughts on what is being found, how well the
findings of this research are being disseminated, and whether research
in this area is actually helping policy makers.
While many of us would agree that science has had a positive impact
on our lives, I think we actually know very little about how the
process of innovation works. What kinds of research programs or
institutional structures are most effective? How do investments in R&D
translate to more jobs, improved health, and overall societal
wellbeing? How should we balance investments in basic and applied
research? With millions of Americans out of work, it becomes more
critical than ever that we find answers to these questions.
We'll also take a closer look at the state of education in science
and technology policy and how these degree programs and courses of
study are contributing by educating the next generation of researchers
and science policy practitioners. There are a variety of science and
technology policy programs that are popping up across the country. They
can be found in public policy schools, economics departments, business
schools, and other places, even philosophy departments. I'm looking
forward to hearing more about these programs, including what kind of
students they attract and where those students go upon graduation.
Finally, I hope to hear recommendations from today's witnesses
about how the Federal government, particularly the National Science
Foundation, can foster interdisciplinary research in this area and how
it can contribute to improved education and training for students who
want to pursue a career at the intersection of science, technology, and
public policy.
I thank the witnesses for being here this afternoon and look
forward to their testimony.
Chairman Lipinski. And with that I will now recognize Dr.
Ehlers for an opening statement.
Mr. Ehlers. Thank you for those kind words, Mr. Chairman. I
think my biggest challenge will be learning how to sleep in,
but I very much appreciate those comments. I always just try to
do a good job wherever I am. It is a trade I learned from my
parents and I never, never, never, ever expected to be in the
Congress or in politics. My mother never quite got over it. As
she put it, what are you doing with all those nasty people? But
it turns out my colleagues are not nasty people, and I
appreciate your leadership on the Subcommittee. And you have
done a great job of leading us in the right direction, and it
has been a pleasure to work with you. Thank you.
With that I will proceed to the opening statement. Today we
will explore the current state of science and technology policy
research and the role it plays in informing our policy
decisions. And I have to insert a little comment in here, that
is, when I first arrived here and was assigned to the Science
Committee, which made obvious sense since I was at that time I
think one of the very few, if not the only, scientists in the
Congress, at least on the Republican side. And at the first
meeting of the Science Committee I asked the Chair, how many
scientists do you have on staff? And the answer was none. And I
said really? How can you function without--and he said well, we
don't really need people who understand science. We need people
who understand science policy.
Well, as a scientist I had never thought much about science
policy and little did I know that in conversation with Newt
Gingrich where I commented that I thought it a bit strange that
the science policy we were operating under in the government
and in the Congress was by Vannevar Bush's 1945 book, and I
said that is a little out of date. Things change rapidly in
science. It is a great book, ``The Endless Frontier''. Vannevar
Bush was a great man. He had done a lot of good works
especially during World War II. But I talked to Newt Gingrich
about that, that that was the latest science policy book that
was guiding the Government, so he did as Newt Gingrich always
did and said hey, it is time to get another one. Why don't you
do it? So I learned after a couple of years never to suggest
anything to Newt because he always dumped the burden on me.
In any event, I did proceed to work on a book, which just
walked in the door with my aide, and some of you have seen it
already. It is called ``Unlocking Our Future''. Now this is not
a great science policy book. I knew absolutely nothing about
science policy when we proceeded to write it but it seemed to
me that there were certain things that were obvious and we put
them in here. And I deliberately said ``Unlocking Our Future''
because I felt we had so much to do and I was not able to do it
in this thin little volume. It did get some notice, and it
inspired some science policy individuals to engage more
seriously in this. And some of them, many of them are
represented here. But it was a real education to me. You should
try that sometime, writing a book about something you know
nothing about. It is a great way to learn and fortunately as a
child I was homeschooled because of illness, so all the
learning I did was things that I learned and learned on my own.
So that was good preparation for this.
We clearly, badly need something like this again and it is
one case where the author is delighted to say this is too old
now. It is time to get busy. Someone else better start writing
a better thing.
Let me continue with my opening statement. When Dr. John
Marburger, who was science advisor to President Bush, called
for the establishment of a Science of Science Policy in 2005,
we embarked on a new journey into this emerging field of
interdisciplinary research by establishing an interagency
working group, the Science of Science and Innovation Policy
program at the National Science Foundation--the shorthand for
it was SciSIP--and most recently, the Science and Technology
for America's Reinvestment: Measuring the Effect of Research
on Innovation, Competitiveness and Science, which has come to
be called STAR METRICS. Measuring those effects is the
emphasis that I and others, including the authors of ``The
Gathering Storm,'' have been urging. I hope this hearing
will provide us with a detailed measurement of how far we have
come on that journey as well as an encouraging picture of the
progress we have made.
I have spent many years on this Committee working to
strengthen U.S. innovation and science education, and I have
been a long time advocate of increased federal funding for
basic research. I wish the entire Congress was receptive to
that notion as they--but this funding produces the
technological innovations that will keep America competitive in
the global market and it is essential for us to educate
American workers in the skills needed for 21st century jobs.
As with any program, sustained Congressional oversight is
required to ensure that the Science of Science Policy Programs
are effective and that they progress in a timely and fiscally
responsible manner. I am encouraged by efforts which seek to
maximize our current investments in scientific research, and I
believe it is very important that those are the investments
that provide us with measurable returns. And that is why I
have worked so hard to try and make the research and
development tax credit permanent, because that is one good way
to encourage industry to work on these issues. We must be
mindful of that fact as Congress deliberates the best ways to
use American taxpayer funds in this difficult economic climate.
To that end I am very interested in learning more about the
progress and potential of the STAR METRICS program and its
recently completed project. I hope--I look forward to learning
more about the status of science affecting science policy and
the advancements which have been made since 2005. And I wanted
to thank our panel of witnesses for being here today, for
accommodating our last second scheduling change, and I look
forward to hearing their insights on this topic. There is much
work to be done to help our nation recover its lead in
technological development, and in manufacturing, and in science
in general.
And so I am looking forward to the testimony today and I
hope you can enlighten us, and out of this will come first of
all a new version of this, and secondly there is some
improvement in our judgments about science and also science
education in this Nation. Thank you very much.
[Statement of Mr. Ehlers follows:]
Prepared Statement of Representative Vernon J. Ehlers
Today, we will explore the current state of science and technology
policy research, and the role it plays in informing our policy
decisions. When Dr. John Marburger, then science advisor to President
Bush, called for the establishment of a ``science of science policy''
(SoSP) in 2005, we embarked on a new journey into this emerging field
of interdisciplinary research by establishing an interagency working
group, the Science of Science and Innovation Policy (SciSIP) program at
the National Science Foundation (NSF), and, most recently, the Science
and Technology for America's Reinvestment: Measuring the Effect of
Research on Innovation, Competitiveness and Science (STAR METRICS)
project. I hope this hearing will provide us with a detailed
measurement of how far we have come on that journey, as well as an
encouraging picture of the progress we have made.
I have spent many years on this committee working to strengthen
U.S. innovation and science education, and I have been a long time
advocate of increased federal funding for basic research. This funding
produces the technological innovations that will keep America
competitive in the global market, and it is essential for us to educate
American workers in the skills needed for 21st-century jobs.
As with any program, sustained Congressional oversight is required
to ensure the SoSP programs are effective, and that they progress in a
timely and fiscally responsible manner. I am encouraged by efforts
which seek to maximize our current investments in scientific research,
and I believe it is very important that those R&D investments provide
us with measurable returns. We must be mindful of that fact as Congress
deliberates the best ways to use American taxpayer funds in this
difficult economic climate. To that end, I am very interested in
learning more about the progress and potential of the STAR METRICS
program and its recently completed pilot project.
I look forward to learning more about the status of science
affecting science policy and the advancements which have been made
since 2005. I want to thank our panel of witnesses for being here
today, for accommodating our last-second scheduling change, and I look
forward to hearing their insights on this topic.
Chairman Lipinski. Thank you, Dr. Ehlers. Maybe we can make
that a best seller now. Now if there are Members who wish to
submit additional opening statements, their statements will be
added to the record at this point. And at this time I want to
introduce our witnesses. Dr. Julia Lane is the Program Director
of the Science of Science and Innovation Policy program at the
National Science Foundation. Dr. Daniel Sarewitz is the Co-
Director of the Consortium for Science, Policy & Outcomes and
Professor of Science and Society at Arizona State University.
Dr. Fiona Murray is an Associate Professor of Management in the
Technological Innovation & Entrepreneur Group at MIT Sloan
School of Management. And Dr. Albert H. Teich is the Director
of Science & Policy Programs at the American Association for
the Advancement of Science.
As our witnesses should know, you will each have five
minutes for your spoken testimony. Your written testimony will
be included in the record for the hearing. When you all have
completed your spoken testimony we will begin with questions.
Each Member will have five minutes to question the panel. And
before we begin I just want to mention that we will be having
votes coming up soon, so probably what the most important thing
is if everyone could hold to their five minutes it will help us
so we don't have--hopefully won't have an interruption in the--
at least in the testimony part here. But with that, we will
start with Dr. Lane.
STATEMENT OF JULIA LANE, PROGRAM DIRECTOR OF THE SCIENCE OF
SCIENCE AND INNOVATION POLICY PROGRAM, NATIONAL SCIENCE
FOUNDATION
Dr. Lane. Chairman Lipinski, Ranking Member Ehlers, Members
of the Subcommittee, it is my distinct pleasure to be with you
here today to discuss NSF's Science of Science and Innovation
Policy Program, the activities of the Science of Science Policy
Inter-Agency Group, and the STAR METRICS program, the last of
which is a new federal and university partnership to document
the scientific, social, economic, and work force outcomes of
science investments to the public. I am Dr. Julia Lane. I am
the Program Director of the SciSIP program at NSF and Co-Chair
of NSTC working group on the Science of Science Policy.
I submitted a written statement to supplement or to
accompany my oral testimony. So the focus of these three
efforts is to provide, as you noted, better methods and data to
inform science--federal science investment decisions. They
represent the first efforts to construct a scientific framework
that is supported by multiple agencies and multiple
institutions, all jointly engaged. It represents a true all-out
effort to provide an evidence basis for U.S. science policy.
Its success is important for policy makers because, in a
nutshell, you can't manage what you can't measure, and what you
measure is what you get.
Developing a scientific framework involves several things.
It requires the engagement of scientists from many disciplines
to address science policy issues. The NSF does this through the
SciSIP program. The overarching goal of this effort is to
conduct basic research that creates new objective models,
analytical tools, and data sets to inform our nation's public
and private sectors about the
processes through which investments in science and engineering
work their way through to the outcomes we have mentioned. It funds
researchers from a wide range of disciplines and it funds
students to study science policy issues in a scientific manner.
As you also know, it supports the redesign of surveys
undertaken by the National Science Foundation's Science
Resources Statistics. It is the statistical agency charged with
describing the science and innovation enterprise. For example,
the new business and innovation survey, the BRDIS [Business R&D
and Innovation Survey] survey, has been completely redesigned
from a 1950s structure to something that captures the new R&D
innovation activities.
So it is not just the academic community that is advancing
the Science of Science Policy. It is also policymakers in the
Executive and Legislative branches who recognize that we need
these better approaches. That is why the National Science and
Technology Council established the Science of Science Policy
Inter-Agency Task Group. That task group--the science policy
agencies represented on it--created a road map that
characterized our current system of measuring the science and
engineering enterprise as inadequate. We can do better. There
is enormous potential to do better. The first step to doing
better is to get better data. Just as good bricks need straw,
good research in an empirical field like science and innovation
policy requires good data. So to that end, the SciSIP Program
and the Science of Science Policy Inter-Agency Group initiated
the development of the STAR METRICS program to which you have
already alluded. The benefit of this program is that, rather
than having organized data sets that different agencies and
different institutions use to answer the types of questions
that the American people are asking, we can develop a common,
bottom-up, empirical infrastructure that will be available to
all recipients of federal funding and to science agencies to
quickly respond to state, congressional, and OMB requests. It
is critical that we take a bottom-up approach in order to develop
these approaches--one that is domain specific, generalized, and
replicable.
Phase one started in March, jointly sponsored by NIH, the
lead agency, NSF, and OSTP. And that is collecting the data
required to, with no burden, respond to questions about the
jobs associated with science funding. Phase two, which is
trying to collect broader data on a wide range of outcomes--not
just jobs, but social, scientific, economic, and work force
outcomes--is beginning this fall with formal consultations with
research institutions.
Furthermore, science is fundamentally an international
endeavor. We have engaged with the European Union, with the
Japanese, with the Brazilians, with many countries in order to
document the impact of science investments. In fact, the
Japanese Government has recently set aside funding for a
Japanese equivalent of a SciSIP program. The European Union has
also shown considerable interest in what we have been up to,
and is considering emulating the bottom-up, no burden endeavor
that the SciSIP, the Science of Science Policy, and the STAR
METRICS Program have pushed forward. And the Brazilian
Government has also requested briefings on the SciSIP, Science
of Science Policy, and STAR METRICS Program.
This concludes my testimony, Mr. Chairman. I look forward
to answering any questions you or the Members of the Committee
might have.
[The prepared statement of Dr. Lane follows:]
Prepared Statement of Julia I. Lane
Chairman Lipinski, Ranking Member Ehlers, and Members of the
Subcommittee, it is my distinct privilege to be here with you today to
discuss NSF's Science of Science and Innovation Policy (SciSIP), the
activities of the Science of Science Policy Interagency Group, and the
STAR METRICS program--a new federal effort designed to create a
scientific quantifiable measurement of the economic and social impacts
of federal research spending. I am Dr. Julia Lane, the program Director
of the SciSIP program at the National Science Foundation, and co-chair
of the NSTC working group on Science of Science Policy (SoSP).
At the outset, I would like to express my appreciation to all the
Members on the House Committee on Science and Technology for their
unstinting support to advance both the cause, and the frontiers of
science. This Committee has long held steadfast in the knowledge that
America's present and future strength, prosperity and global
preeminence depend directly on fundamental research.
The National Science Foundation has always believed that optimal
use of limited Federal funds relies on two conditions: Ensuring that
research is aimed--and continuously re-aimed--at the frontiers of
understanding; and certifying that every dollar goes to competitive,
merit-reviewed, and time-limited awards with clear criteria for
success. When these two conditions are met, the nation gets the most
intellectual and economic leverage from its research investments.
Yet our portfolio keeps changing. We have a minimal vested interest
in maintaining the status quo, and pride ourselves on an ability to
shift resources quickly to the most exciting subjects and most
ingenious researchers.
Moreover, we regard it as an essential part of our mission to
constantly re-think old categories and traditional perspectives. This
ability is crucial now, because conventional boundaries are
disappearing--boundaries between nations, boundaries between
disciplines, boundaries between science and engineering, and boundaries
between what is fundamental and what is application. At the border
where research meets the unknown, the knowledge structures and
techniques of life science, physical science, and information science
are merging.
Additionally, our scope is extremely wide, extending across all the
traditional mathematics, science and engineering disciplines. That is a
major advantage in today's research climate, where advances in one
field frequently have immediate and important applications to another.
The same mathematics used to describe the physics of turbulent air
masses may suddenly explain a phenomenon in ecology or in the stock
market, or the changes in brain waves preceding an epileptic seizure.
The same algorithms used by astronomers to discern patterns in the
distant heavens can aid radiologists to understand a mammogram, or
intelligence systems to identify a threat. Only an agency that sees the
``big picture'' can assure this kind of interdisciplinary synergy.
For all of these reasons, the National Science Foundation is
fostering the development of the knowledge, theories, data, tools and
human capital needed to cultivate a Science of Science and Innovation
Policy program. The program has three major aims: advancing evidence-
based science and innovation policy decision making; developing and
building a scientific community to study science and innovation policy;
and developing new and improved datasets.
The overarching goal in this effort, however, is to conduct basic
research that creates new objective models, analytic tools, and
datasets to inform our nation's public and private sectors about the
processes through which investments in science and engineering research
may be transformed into scientific, social and economic outcomes.
We need to better understand the contexts, structures, and
processes of scientific and engineering research, to evaluate reliably
the tangible and intangible returns from investments in research and
development (R&D), and to predict the likely returns from future R&D
investments.
It is not only leaders in the scientific and engineering fields,
but policymakers as well in the Executive and Legislative Branches who
recognize that we need better approaches for developing science policy,
which is why the National Science and Technology Council established
the Science of Science Policy Interagency Task Group. That task group's
roadmap characterized our current systems of measurement of the science
and engineering enterprise as inadequate. There is enormous potential
to do better.
To begin to create a scientific, quantifiable measurement of the
economic and social impacts of our federal research investments, this
Administration has initiated an innovative new program, STAR METRICS
(Science and Technology in America's Reinvestment--Measuring the
EffecTs of Research on Innovation, Competitiveness and Science). This
initiative is led by the National Institutes of Health and the National
Science Foundation under the auspices of the White House Office of
Science and Technology Policy. The goal is to develop a system that can
be used to track the impact of federal science investments. I will
return to the topic of STAR METRICS later in my testimony.
1) The overall vision and strategy for research and education in the
``Science of Science and Innovation Policy''
Federally funded basic and applied scientific research has had a
significant impact on innovation, economic growth and America's social
well-being. We know this in the broad sense from numerous economic
analyses but it is difficult to disentangle the impact of Federal
investment versus private, state and industrial investments. We have
little information about the impact of individual projects and
programs, whether federally or privately funded. We have little
information about the impact of science agencies. Thus, although
determining which federally funded research projects yield solid
results and which do not is a subject of high national interest, since
American taxpayers invest more than $140 billion annually in research
and development (R&D), there is little evidence to support such
analysis. In short, Congressional and Executive Branch policy
decisions are strongly influenced by past practices or data trends
that may be dated or have limited relevance to today's economic
situation.
A deeper understanding of the changing framework in which scientific
and technical innovation occurs would help policymakers decide how best
to make and manage limited public R&D investments to exploit the most
promising and important opportunities.
The lack of analytical capacity in science policy is in sharp
contrast to other policy fields that focus on workforce, health and
education. Debate in these fields is informed by the rich availability
of data, high quality analysis of the relative impact of different
interventions and computational models that often allow for forward-
looking analyses with impressive results. For example, in workforce
policy, the evaluation of the impact of distinct education and training
programs has been transformed by careful attention to issues such as
selection bias and the development of appropriate comparison groups.
The analysis of data about geographic differences in health care costs
and health care outcomes has featured prominently in guiding health
policy debates. And education policy has moved from an ``invest more
money'' and ``launch a thousand pilot projects'' imperative to a more
systematic analysis of programs that actually work and that promote
local and national reform efforts.
Each of those efforts, however, has benefited from an understanding
of the systems that are being analyzed. In the case of science policy,
no such agreement currently exists. NSF's Science of Science &
Innovation Policy (SciSIP) program is designed to advance the
scientific basis of science and innovation policy.
Vision
The principal goal is to advance the scientific basis of making
science policy decisions, particularly those involving budgets, through
the development of improved data collection, theoretical frameworks,
computational models and new analytic tools.
A major component of the SciSIP program is the funding of
investigator initiated research. Through direct engagement of the
federal policy community with the research community, it is hoped that
future policy decisions can be informed by empirically validated
hypotheses and informed judgment. Our aim, as noted in the program's
description, is to ``engage researchers from all of the social,
behavioral and economic sciences as well as those working in domain-
specific applications such as chemistry, biology, physics, or
nanotechnology in the study of science and innovation policy. The
program welcomes proposals for individual or multi-investigator
research projects, doctoral dissertation improvement awards,
conferences, workshops, symposia, experimental research, data
collection and dissemination, computer equipment and other
instrumentation, and research experience for undergraduates. The
program places a high priority on interdisciplinary research as well as
international collaboration.''
The program explicitly fosters a multi-level science (in addition
to more obviously being an interdisciplinary science) that spans from
the study of cognitive phenomena in individual scientists (e.g., the
study of fixation, insight, reasoning, and decision making) to the
study of whole industries and policies at the industry level. What
makes the overall effort a potentially transformative effort is the
support of research at multiple levels: industry-level policies are
only successful if they have individual-level effects (i.e., engineers
and scientists change their behavior), and individual-level effects
are only important if they scale to produce industry-level
differences.
Another focus of the SciSIP program is the redesign of the surveys
undertaken by NSF's Science Resources Statistics, the federal
statistical agency responsible for collecting and disseminating data on
the U.S. science and engineering enterprise. The most visible activity
has been the redesign of the Business R&D and Innovation Survey (BRDIS)
which collects information from a nationally representative sample of
about 40,000 companies, including companies in both manufacturing and
nonmanufacturing industries. This survey is the primary source of
information on business, domestic and global R&D expenditures, and
workforce. The new structure enables respondents to provide detailed
data on the following:
    - How much is a company investing in its domestic and
      worldwide R&D relationships, including R&D agreements, R&D
      ``outsourcing,'' and R&D paid for by others?
    - What is the strategic purpose of a company's worldwide R&D
      activities, and what are their technology applications?
    - What are the details of a company's patenting, licensing,
      and technology transfer activities, and companies'
      innovative activities?
In addition, a limited number of questions are asked about
activities related to new or improved products or processes. These are
intended to serve as a basis for collecting an expanded set of innovation
metrics in the future. The results of this data collection are now
being published as part of SRS's ongoing reporting activity.
Strategy
The focus of the program's strategy has been to convince the
academic community that the study of science policy is a worthwhile
academic endeavor. This has taken three main forms. The first has been
to engage in a substantial amount of outreach through presenting at
professional workshops and conferences (an average of five or six a
year), through supporting specific workshops on various science policy
topics (two or three a year), through establishing a very active
listserv (which has grown to over 720 members in less than two years)
and through supporting a Science of Science Policy website
(http://scienceofsciencepolicy.net).
The second part of the strategy has been to invest in high quality
research datasets. Good bricks need straw, and good research in an
empirical field like science and innovation policy requires good data.
Fields as disparate as biotechnology, geosciences, and astronomy have
been transformed by both data and knowledge access. NSF hopes to
similarly transform the study of science policy by improving science
data. Such a transformation will occur in three ways. First, the
scientific challenge is compelling: the way in which scientists create,
disseminate and adopt knowledge in cyberspace is changing in new and
exciting ways. Collaborations between computer scientists and social
scientists, fostered by SciSIP, can capture these activities using new
cybertools. Second, new and exciting data attract new researchers to
the field. This in turn attracts new graduate students, who see new
ground being broken and exciting opportunities for research. Finally,
we aim to actively engage the federal science policy community through
a variety of workshops, as well as direct engagement through the
Science of Science Policy Interagency Group.
The program has made a total of 99 awards: 19 in 2007, 23 in 2008,
31 in 2009, and 26 in 2010. The program began to accept doctoral
dissertation proposals in 2010; five of those were funded. The success
rate for standard proposals is currently about 25%; higher for doctoral
dissertation proposals. A total of 182 principal investigators have
been supported--of those 147 are scientists from Social, Behavioral and
Economic Science domains and the balance are from areas as diverse as
Computer and Information Sciences, Education, Physics, Biology and Law.
Results
The program is beginning to achieve some of its ambitious goals. A
SciSIP Principal Investigator (PI), who is a Business School Dean at a
university with a strong focus on publicly-funded research, has noted
``I know full well that this new program provides unique grant
opportunities for faculty members in management, information systems,
and other fields of business administration.'' He cites the following
from his personal experience: `` . . . in the field of business
research, and business management, the Science of Science & Innovation
Policy papers are featured in some of the best sessions at the Academy
of Management Meetings. This innovative program has sparked
considerable interest in public policy among management scholars, and
particularly in business schools. The impact of the research you are
funding struck home when I read the latest issue of BizEd, the magazine
of The Association to Advance Collegiate Schools of Business (AACSB, an
association of educational institutions, businesses, and other
organizations devoted to the advancement of higher education in
management education; it is also the premier accrediting agency of
collegiate business schools and accounting programs worldwide). The
research you have funded was prominently featured in their magazine,
which is circulated to thousands of business schools worldwide.''
Additionally, the SciSIP program has influenced
several National Research Council studies, and thus impacted public
policy with respect to technology commercialization and academic and
public sector entrepreneurship. One is the Congressionally-mandated
evaluation of the Small Business Innovation Research Program. Another
is a committee entitled ``Best Practices in National Innovation
Programs for Flexible Electronics'' and a third is ``Management of
University Intellectual Property: Lessons from a Generation of
Experience, Research, and Dialogue.''
In another example, a major part of the science and innovation
policy debate has been the role of R&D and research tax credits whose
budgetary cost is about $15 billion each year \1\. The obvious policy
question is: how effective are these tax credits at stimulating
innovation? SciSIP-funded PIs have examined changes in R&D tax credit
generosity across countries and US states over time to evaluate
business firms' response. They estimate that for every $1 of tax
credits firms spend about $1 more on R&D. However, the research also
extends to the impact of firms' response to the uncertainty about the
duration of the federal Research and Experimentation (R&E) tax credit,
which is not currently permanent and in fact expired at the end of
2009. The uncertainty about renewal has offsetting effects--one is to
increase short-term expenditures because firms think they need to do
R&D now to get the credit. The reverse is to reduce overall R&D
expenditures since uncertainty is detrimental to the expected payoffs
from long-term investments such as fundamental R&D. The sign of the net
effect is an empirical question, and again something the SciSIP PI has
been working on--he finds a strong negative effect of uncertainty on
general investment and employment, and is currently extending this work
to R&D. The same PI presented in September 2009 to the Federal Reserve
Board of Governors, including Chairman Bernanke; the Fed was
trying to understand why the IT ``productivity miracle'', which was a
major driver of US economic growth in the late 1990s, had slowed down
by the late 2000s. One possible reason is that better use of IT is
associated with organizational change, and the rate of organizational
change has potentially slowed down; a major SciSIP-funded grant
supports collecting a large national survey to try to examine why and
how that change has occurred.
---------------------------------------------------------------------------
\1\ http://www.ncseonline.org/NLE/CRSreports/08Aug/RL31181.pdf,
CRS-3
---------------------------------------------------------------------------
We can also learn from history. Another SciSIP PI has looked at two
case studies in depth: the invention of the airplane and Edison's
invention of the electric light. In both cases, the invention took a
long period of time--110 years and 80 years, respectively. In both
cases even the earliest attempts were based on many years of work on
mathematics and technology and hundreds of years of work in science. To
illustrate, Sir Humphry Davy first demonstrated incandescence of
materials in 1808. His work drew on the Voltaic pile (battery) invented
in 1800, the Leyden jar developed in 1744, and carbon produced as
charcoal during the Roman Empire no later than 25 A.D. Leyden jars
depended on work by the ancient Greeks in 600 B.C. Thus, the foundation
of the science behind electric light dates back 2400 years before
incandescence, after which it took 80 more years of R&D to develop an
effective electric light. The airplane also has a similarly long
foundational period and duration of invention. In looking at various
inventions, research has shown that there are several different weak
methods but also some powerful strategies that vastly speed things
along. Edison succeeded simply because he had enormous resources (the
Edison Electric Light Company was capitalized at $300,000--about $30
million today). The Wright Brothers were far more efficient at
developing the airplane than Edison was in developing the electric
light.
How is the NSF fostering collaboration between social and behavioral
scientists and researchers from other disciplines,
including computer scientists, engineers and
physical scientists, in science and technology
policy research?
This is being done in a number of ways: through the program call,
through workshops, and through successful and visible interdisciplinary
projects.
Program Description
The SciSIP program explicitly encourages interdisciplinary
cooperation in the program description. In particular, the program
states
``The SciSIP program invites the participation of researchers
from all of the social, behavioral and economic sciences as
well as those working in domain-specific applications such as
chemistry, biology, physics, or nanotechnology. The program
welcomes proposals for individual or multi-investigator
research projects, doctoral dissertation improvement awards,
conferences, workshops, symposia, experimental research, data
collection and dissemination, computer equipment and other
instrumentation, and research experience for undergraduates.
The program places a high priority on interdisciplinary
research as well as international collaboration.''
Workshops
Most of the workshops that have been hosted have been explicitly
interdisciplinary in nature, bringing together domain scientists and
social, behavioral and economic scientists, and have resulted in calls
for proposals (called Dear Colleague Letters) supported by multiple NSF
programs.
Examples include:
A two-day workshop to advance the scientific study of
federally funded centers and institutes as key elements in the
innovation ecosystem. The workshop brought together engineers
and natural, physical, and social scientists to address central
questions relating to the role of NSF-funded centers and
institutes in science and innovation policy.
Two separate workshops studying innovation in
organizations. One of these, hosted by the Conference Board and
supported by four Social, Behavioral and Economic (SBE)
Sciences and three Computer and Information Science and
Engineering (CISE) programs, was attended by computer
scientists, SBE scientists and representatives from the
business community to examine the potential for cyber data to
better inform our understanding of innovation. A second
conference brought together 20 leading computer scientists
(from the fields of data management, data mining, security/
privacy, social networks) and social/organizational scientists
(that included economists, sociologists, psychologists,
anthropologists) to identify emerging major challenges in the
collection and use of confidential data for the
study of innovation in organizations. SciSIP led the resulting
development of a Dear Colleague Letter whose purpose was to
gather and create new Cyber-enabled data on innovation in
organizations, supported by six SBE and four CISE programs as
well as the Office of Cyber Infrastructure.
A workshop in conjunction with the NSF's Chemistry
Division that examined the impact of science R&D in the United
States, focusing on chemical sciences and related industries.
This led to a Dear Colleague Letter from SciSIP and the
Chemistry Division reaching out to the chemistry and the social
science communities advising them of funding opportunities
related to assessing and enhancing the impact of R&D in the
chemical sciences in the United States.
An interdisciplinary workshop which examined the
potential for new visualization tools to track the impact of
investments in science. These possibilities include tracing the
impact of basic research on innovation, examining the changing
structure of scientific disciplines, studying the role of
social networks in the dispersion of scientific innovations, as
well as comparing how the U.S. performs
internationally in science. That workshop brought together
researchers from a broad range of disciplines to examine such
key questions, and to engage the federal science community in a
discussion about whether and how the tools could be used in the
federal context.
Three workshops have directly engaged CISE and SBE
researchers in enhancing NSF's ability to describe its research
portfolio. The SciSIP program worked with the CISE directorate
to form an Advisory subcommittee to provide advice on
approaches to improving the way NSF interacts with its proposal
and award portfolio. Although NSF staff still rely on
traditional methods to do their jobs, such methods are becoming
less practical given the rapidly changing nature of science,
the increased recognition of the importance of funding
interdisciplinary and potentially transformative research, and
the significant increase in the number of proposals submitted.
Individuals with research expertise in machine learning, data
mining, information visualization, human-centered computing,
science policy, and visual analytics were recruited for this
effort. Nine teams were put together and charged with providing
advice to NSF on identifying and demonstrating techniques and
tools that characterize a specific set of proposal and award
portfolios. Their report, in turn, will advise NSF on how to
better structure existing data; improve the use of existing machine
learning, analysis, and visualization techniques to complement
human expertise; and better characterize its programmatic data.
The results should help NSF identify tools that will help
fulfill its mission, including identifying broader impacts, as
well as funding transformative and interdisciplinary research.
NSF has also engaged program managers across the federal
government so that our collective approaches can inform not
only us, but other science agencies.
A workshop responding to Congressman Holt's request
for better ways to measure the economic impact of federal
research investments. SciSIP, together with NIH and other
agencies, is supporting the National Academy of Sciences' Board
on Science, Technology, and Economic Policy (STEP) and
Committee on Science, Engineering, and Public Policy (COSEPUP)
2011 workshop on science measurement. This workshop is aimed at
discussing new methodologies and metrics that can be developed
and utilized for assessing returns on research across a wide
range of fields (biomedical, information technology, energy,
and other biological and physical sciences, etc.), while using
background papers that review the methodologies used to date as
a starting point.
As one SciSIP PI has noted, ``SciSIP . . . creates a domain around
which researchers from a variety of disciplines--biology and physics
and economics as well as information science and public policy--can
coalesce to pursue research topics in this domain for their own sake,
rather than in the interstices of other projects in their home
disciplines. As such, it acts as an attractor for top researchers
across the natural and social sciences, allowing them to pursue their
interests in SciSIP topics.''
Successful Examples
There are a number of examples of the fruits of these activities.
For example, SciSIP funded research supports a University of Michigan
research team consisting of a sociologist, a bioethicist specializing
in informed consent and stem cell regulation, a bioethicist trained as
a molecular biologist who is working on cell banking, and a post-doc in
stem cell biology. The combination is a powerful one as it matches
expertise with social scientific data and analysis methods, with deep
knowledge about both the policy and the science.
Similarly, the interdisciplinary work of two SciSIP PIs has helped
develop new metrics of the transmission of knowledge. These metrics
go beyond citation metrics to usage metrics and help us better
understand the impact that federal investment in research is having on
research results. By mapping the structure of science and looking at
how this structure changes over time, we can see the shifting landscape
of scientific collaboration and understand the new emerging
disciplines. That will enable us to anticipate these changes and
properly target research funding to new and vibrant areas. For
instance, their work provides a striking example of the emergence of
neuroscience over the past decade--changing from an interdisciplinary
specialty to a large and influential stand-alone discipline on a par
with physics, chemistry, or molecular biology.
How is NSF fostering the development of science and technology policy
degree programs and courses of study at colleges
and universities? What is the current scope and
level of support for such programs?
As with many NSF programs, the SciSIP program explicitly encourages
submissions that support graduate student development. While there is
no direct targeting of funds to policy programs, SciSIP has supported
28 researchers from science and technology policy programs. In an example
of the type of support that has been provided to expand the course of
study, over 250 undergraduate students from Economics (behavioral
economics), Cognitive Science, Electrical and Computer Engineering, and
Industrial Engineering have participated in a project at Purdue
University, which is an interdisciplinary collaboration linking social
scientists and computer scientists and engineers.
A further example is the work done by Marcus Ynalvez at Texas A&M
International University, which has the explicit goal of mentoring
TAMIU Graduate Students (Students from Historically Underrepresented
Populations): The hands-on training and mentoring of TAMIU graduate
students represents an attempt to engage Hispanic students in
international scientific research activities with the intention of
introducing them to the possibilities of developing professional
careers in science and technology. These students are currently
gathering, synthesizing and reviewing literature materials for the
project's manuscripts, publications, and reports. With the data from
the Japan, Singapore, and Taiwan surveys, these students will be
analyzing data using a number of statistical software packages, such
as the Statistical Package for the Social Sciences (SPSS), Stata, and
the Statistical Analysis System (SAS). They will learn how to interpret
statistical results associated with the family of generalized linear
regression models, namely: linear, logistic, and negative binomial
regression models, analysis of variance, and path analysis. Not only
have the TAMIU graduates gained actual research experience, they have
also developed professional relationships with students and professors
from the prestigious National University of Singapore.
How is NSF encouraging a community of practice in science of science
policy and the dissemination of policy to policy
makers?
A major avenue has been the linkage with the Science of Science
Policy Interagency group, which is discussed in more detail below. In
addition, the listserv and the website have been very important
dissemination vehicles.
However, the most important vehicle has been two PI workshops with
the explicit goal of fostering further collaboration among the PIs
actively engaged in the study of Science of Science & Innovation Policy
and the link to the federal community. The 2009 workshop had three
overarching goals:
to provide NSF with an early opportunity to organize
a collegial discussion of work in progress under SciSIP's two
rounds of awards well before this work will begin to appear in
professional forums and publications;
to begin to develop from among the purposefully
diverse set of disciplinary perspectives reflected in SciSIP's
two solicitations and subsequent awards, a ``community of
experts across academic institutions and disciplines focused on
SciSIP;'' and
to identify new areas of emphasis for support in
future SciSIP solicitations.
The 2010 workshop, scheduled for October 19, 2010, seeks to focus on
two objectives that flow from the National Science and Technology
Council's 2008 report: The Science of Science Policy: A Federal
Research Roadmap. The first task, as called for in the Roadmap report,
is ``to advance the scientific basis of science policy so that limited
Federal resources are invested wisely.'' The second is to build a
``community of practice'' between Federal science and technology
policymakers and researchers engaged in the development of new
theories, tools of analysis, and methods for collecting and analyzing
data.
This October 2010 workshop will consist of brief presentations by a
number of SciSIP grantees who have been invited to participate via a
competitive peer review of abstracts previously submitted based on
their ongoing research. These presentations will be followed by
roundtable discussions led by federal policymakers who will comment on
the relevance of the research, followed then by open discussions among
all participants. A networking session will be scheduled at the close
of the formal sessions to allow for continued discussion.
2) As a Co-chair of the Science of Science Policy Interagency Group
under the NSTC, please briefly describe the work of
that group and how the various federal science
agencies are collaborating on the development and
implementation of science of science policy tools
to improve the management and efficiency of their
R&D portfolios and other science and technology
related programs.
In 2006, the National Science and Technology Council's
Subcommittee on Social, Behavioral and Economic Sciences (SBE)
established an Interagency Task Group on Science of Science Policy
(ITG) to serve as part of the internal deliberative process of the
Subcommittee. In 2008, this group developed and published The Science
of Science Policy: A Federal Research Roadmap, which outlined the
Federal efforts necessary for the long-term development of a science of
science policy, and presented this Roadmap to the SoSP Community in a
workshop held in December 2008. The ITG's subsequent work has been
guided by the questions outlined in the Roadmap and the action steps
developed at the workshop.
The development of the STAR METRICS (Science and Technology for
America's Reinvestment: Measuring the EffecT of Research on Innovation,
Competitiveness and Science) program is the number one priority of the
interagency group. The initiative is a multi-agency venture led by the
National Institutes of Health, the National Science Foundation (NSF),
and the White House Office of Science and Technology Policy (OSTP).
Another major activity is sponsoring a series of workshops to bring
the science agencies together to share what is already established in
the field, identify gap areas and outline steps forward for the
creation of better tools, methods, and data infrastructure.
The first of these workshops was held in October 2009 to delve
into the issues surrounding performance management of federal research
and development portfolios. The focus was on sharing current practices
in federal R&D prioritization, management, and evaluation. Over 200
agency representatives attended. The conference featured 27 speakers
and panelists, representing 20 federal agencies, offices, and
institutions, and over 30 poster presenters, representing more than 25
agencies and institutions. Topics that were discussed included:
Methods to set federal research priorities and
strategic directions;
The use of metrics to improve federal R&D efficiency;
and
Ways in which research evaluations can inform current
and future R&D decisions.
It addressed the following key questions:
How do federal science and technology agencies
systematically identify and prioritize research and development
alternatives? How can these processes be strengthened?
How can research-performance metrics be used to
improve research efficiency? How can these metrics be improved?
How do research-performance evaluations inform and
improve R&D investment decisions? How can these feedback loops
be reinforced?
While the 2009 workshop developed a dialogue within the federal
science policy community, the ITG has a workshop planned for December
2010 that engages the federal community with the academic community in
advancing the ``Science of Science Measurement''. The first goal is to
create a dialogue between the Federal S&T agencies and the research
community about relevant models, tools, and data that advance
scientific measurement in key areas of national S&T interest. The
second objective is to identify a joint Science of Science Policy
(SoSP) research agenda for the Federal S&T agencies and the research
community. The workshop has four modules intended to advance
measurement in: 1) Economic benefits; 2) Social, health and
environmental benefits; 3) S&T workforce development; and 4) Technology
development and deployment. Four academic researchers will be
presenting in each module, with a rapporteur synthesizing the
presentation at the end of each module.
The audience will be primarily science policy practitioners from
the Federal agencies who are interested in very practical issues, such
as: getting new ideas about how to manage their portfolios in a more
scientific manner; developing performance and outcomes metrics;
measuring the return on investment; and using science to identify
emerging trends in the U.S. scientific enterprise.
Another activity has been the establishment of a website to provide
information on best practices to Federal and non-Federal agencies. The
website (http://scienceofsciencepolicy.net) was launched in January
2010, and has become a model for other interagency groups (including
the Forensic Science interagency group). The web site serves as a
repository for data, documents, research papers, and communication
tools for the communities of users. The site receives over 2,000 hits
a month. The associated Listserv is the highest-visibility listserv in
science policy, and has over 720 members.
The interagency group meets monthly, and has active participation
by over 15 agencies. It is actively providing input to the Center of
Excellence on Science Policy being established by the State Department
in the Middle East.
3) Please provide a brief description and update on the status of the
OSTP led project on science metrics, known as STAR
METRICS, including a description of international
engagement and interest in this effort.
The STAR METRICS project is a federal and university partnership to
document the outcomes of science investments to the public. The
benefits of STAR METRICS are that a common empirical infrastructure
will be available to all recipients of federal funding and science
agencies to quickly respond to State, Congressional and OMB requests.
It is critical that this effort take a bottom up approach that is
domain-specific, generalizable and replicable.
Currently, the project is structured in two phases:
- Phase I: The development of uniform, auditable and
standardized measures of the initial impact of ARRA and base
budget science spending on job creation.
- Phase II: The development of broader measures of the impact
of federal science investment, grouped in four broad
categories:
Scientific knowledge (such as publications
and citations) and, later:
Social outcomes (such as health and
environment);
Economic growth (through patents, firm
start-ups and other measures); and
Workforce outcomes (through student mobility
and employment).
Phase I of the STAR METRICS project began in earnest in March of
2010 with funds formally designated for the project. The participation
agreement was signed in May 2010, and a press release was issued by the
three lead agencies: NIH, NSF and OSTP \2\. As noted in that press
release:
---------------------------------------------------------------------------
\2\ http://www.whitehouse.gov/sites/default/files/microsites/ostp/
STAR%20METRICS
%20FINAL.pdf
``A new initiative promises to monitor the impact of federal
science investments on employment, knowledge generation, and
health outcomes. The initiative--Science and Technology for
America's Reinvestment: Measuring the EffecT of Research on
Innovation, Competitiveness and Science, or STAR METRICS--is a
multi-agency venture led by the National Institutes of Health,
the National Science Foundation (NSF), and the White House
---------------------------------------------------------------------------
Office of Science and Technology Policy (OSTP).''
In Phase I, through a highly automated process, with essentially no
burden on scientists and minimal burden for administrators, STAR
METRICS collects longitudinal employment data from the participating
institutions to be able to assess the number of jobs created or
retained (or lost) through federal funding support. The system is set
up such that all jobs will be captured, not just those of principal
investigators and co-principal investigators. In addition, in Phase I,
STAR METRICS can provide estimates of jobs supported through facilities
and administration (F&A) costs and through various procurement
activities in the institutions.
STAR METRICS will also help the Federal government document the
value of its investments in research and development, to a degree not
previously possible. Together, NSF and NIH have agreed to provide $1
million in funding a year for the next five years.
More agencies are joining the STAR METRICS consortium. While
meetings of the Consortium are convened by OSTP, the lead agency is
NIH, which is hosting the data infrastructure. The official STAR
METRICS website will be available September 30, 2010. NSF is providing
key leadership in engaging the scientific community, particularly
through the SciSIP program.
Phase II of the project expands the data infrastructure to
incorporate the broader impact of science investments on scientific,
social, economic and workforce outcomes. In keeping with the bottom up
approach of the program, STAR METRICS is beginning a formal set of
consultations with the scientific community to understand what data
elements and metrics the community would find useful in
STAR METRICS. The first of these will occur October 22, 2010, with a
meeting with Vice Presidents for Research of interested institutions.
Other meetings will follow with research agencies and other interested
groups.
In a very short period of time since formalizing the project, over
100 research-intensive universities, mostly from the Federal
Demonstration Partnership (FDP), have expressed interest in
participating in STAR METRICS; about 20 are contributing data.
Universities have expressed enthusiasm and support for the project.
Science is fundamentally an international endeavor. And so must be
its evaluation. In fact, there has been substantial international
interest. Members of the STAR METRICS team have provided information or
directly briefed Brazilian and Japanese science and technology
agencies. The State Department is actively interested in learning about
the program to advance the science of science policy in the Middle
East.
Our most active international counterpart, however, is the European
Union. A major presentation was given to the European Parliament in
April \3\. A joint EU/US conference has been proposed for March 2011 in
the Rockefeller Foundation's Bellagio Center. The goal is to produce a
roadmap that will outline a path for creating a US/European
collaboration in developing a common theoretical and empirical
infrastructure to describe and assess the outcomes of science
investments. To achieve this, it will bring together key European and
US science policy experts, policy makers, administrators and academic
researchers. The group is carefully chosen to consist of the key
players from the US side who have the experience in developing such an
infrastructure in the US. The European attendees will consist of
individuals who have both the deep understanding of the issues and the
ability to effect change in Europe in a collaborative framework with
the US.
---------------------------------------------------------------------------
\3\ http://www.euractiv.com/en/science/eu-looks-to-us-model-for-
measuring-rd-impact-news-448950
---------------------------------------------------------------------------
The outcomes will include a roadmap that represents a combined
effort to build on and extend existing efforts in both regions: notably
the US investment in the STAR METRICS program and the European efforts
to build better assessments for their investments. It is hoped that the
roadmap will have the same success that the Science of Science Policy
Interagency Roadmap had in the United States and that in the EU the
roadmap will be the basis for including assessment measures in future
legislation implementing science programs.
Conclusion
The NSF's Science of Science and Innovation Policy program, the
NSTC's Interagency SoSP ITG, and STAR METRICS represent the
first efforts to construct a scientific framework that is supported by
multiple agencies and multiple institutions--all jointly engaged. They
represent a true bottom up approach to providing an evidence basis for
U.S. science policy. Their success is important for decision-makers: in
a nutshell, you can't manage what you can't measure, and what you
measure is what you get.
NSF's innovative Science of Science and Innovation Policy program,
and STAR METRICS, can help all of us do a better job in explaining this
essential symbiosis.
This concludes my testimony, Mr. Chairman. I look forward to
answering any questions you or Members may have.
Biography for Julia I. Lane
Dr. Julia I. Lane is the Program Director of the Science of Science
& Innovation Policy program at the National Science Foundation. Her
previous jobs included Senior Vice President and Director, Economics
Department at NORC/University of Chicago, Director of the Employment
Dynamics Program at the Urban Institute, Senior Research Fellow at the
U.S. Census Bureau and Assistant, Associate and Full Professor at
American University.
Julia has published over 60 articles in leading economics journals,
and authored or edited six books. She became an American Statistical
Association Fellow in 2009. She has been the recipient of over $20
million in grants: from foundations such as the National Science
Foundation, the Sloan Foundation, the MacArthur Foundation, the Russell
Sage Foundation, the Spencer Foundation, and the National Institutes of
Health; from government agencies such as the Departments of Commerce,
Labor, and Health and Human Services in the U.S., the ESRC in the U.K.,
and the Department of Labour and Statistics New Zealand;
as well as from international organizations such as the World Bank.
She has organized over 30 national and international conferences,
received several national awards, given keynote speeches all over the
world, and serves on a number of national and international advisory
boards. She is one of the founders of the LEHD program at the Census
Bureau, which is the first large scale linked employer-employee dataset
in the United States. A native of England who grew up in New Zealand,
Julia has worked in a variety of countries, including Australia,
Germany, Malaysia, Madagascar, Mexico, Morocco, Namibia, Sweden, and
Tunisia.
Her undergraduate degree was in Economics with a minor in Japanese
from Massey University in New Zealand; her M.A. in Statistics and Ph.D.
in Economics are from the University of Missouri in Columbia. She is
fluent in Swedish and German and speaks conversational French.
Chairman Lipinski. Thank you, Dr. Lane. Dr. Sarewitz.
STATEMENT OF DANIEL SAREWITZ, CO-DIRECTOR OF THE CONSORTIUM FOR
SCIENCE, POLICY & OUTCOMES AND PROFESSOR OF SCIENCE AND
SOCIETY, ARIZONA STATE UNIVERSITY
Dr. Sarewitz. Thank you, Chairman Lipinski and Ranking
Member Ehlers. I very much appreciate the invitation and the
opportunity to testify. So, my name is Daniel Sarewitz. I am a
Professor of Science and Society at Arizona State University
where I Co-Direct the Consortium for Science Policy and
Outcomes, which works to understand and improve the linkages
between science and technology and social outcomes. We are
located on ASU's Tempe campus. We also have a location here in
DC. We are a highly interdisciplinary and collaborative
organization involving researchers at dozens of other
institutions. We are also fortunate to receive generous grant
funding from NSF, including from the Science of Science and
Innovation Policy Program, so I declare my vested interest in
the outcomes of this hearing.
I would like to make three brief points in support of my
over-extensive written testimony. The first is about the
importance of the SciSIP Program itself. With shrinking
discretionary budgets, vibrant economic competitors, and
daunting challenges to our well-being, the Nation needs
effective tools for making better decisions about how to
design, assess, and set priorities for our science and
innovation enterprise. For the most part, we lack these tools,
as we have already heard. As former Presidential Science
Advisor Jack Marburger said in 2005, ``the nascent field of the
social science of science policy needs to grow up, and
quickly.''
With modest resources, SciSIP is mobilizing a community of
researchers to focus on the complex problem of how to get the
most out of our public investment in R&D. SciSIP reacted
quickly to support research assessing the impacts of stimulus
funding for R&D, and is beginning with NIH to take on the
incredibly complex problem of evaluating what the nation gets
for its enormous investment in biomedical R&D. These are
really difficult challenges and it is hard to see how this
committee and others at the helm of the R&D enterprise can
guide it effectively in the absence of such efforts.
A second point is that outputs are not outcomes. And SciSIP
needs to focus on outcomes. Outputs are immediate products of
R&D like publications, patents, and Ph.D.s. Outcomes are what
people care about: economic growth, of course, but also secure
and affordable food supplies
and energy supplies, high quality public health, a clean
environment, expanding job opportunities and strong national
defense. The 40-year war on cancer has yielded the output of
remarkable new scientific knowledge, yet very modest gains in
public health outcomes despite the tens of billions spent.
Thirty years of energy R&D output have done little to advance
the outcome of reducing our vulnerability to energy-based
threats to security, economy, and environment. Research on
science and innovation policy to date has given us a pretty
good idea how to design and assess science policies to advance
outputs. But we still have a lot to learn about how to
implement and assess successful outcome-based science and
innovation policies.
My final point is that research on outcome-based science
and innovation policies and the use of such research by
decision makers are not separate problems. While the SciSIP
Program is commendably serious about disseminating its research
results to policy makers, the dissemination problem is
also structural. That is, it is built into the way we organize
much research, including SciSIP, whose great strength in
supporting bottom-up inquiry on fundamental problems is also a
weakness when there is an urgent need for new knowledge, the
need that Dr. Marburger pointed out. Such cases require close
ties between those who do research and decision makers who
might use research results. We already heard from Julia Lane
about her efforts to create those ties. Now on the one hand,
these ties allow researchers to understand the needs of
decision makers and to recognize the types of information that
will be both usable and used. But at the same time, close ties
allow decision makers to understand what research can and
cannot do for them. Such mutual understanding breeds trust,
value, and usable science.
There are many examples of federal programs that link
research performance and research use, including USDA's
Agricultural Extension Service, the USGS Earthquake Hazards
Program, and NOAA's Regional Integrated Sciences and
Assessments Program. Similarly, DARPA is justifiably well
regarded for its capacity to connect the technology needs of
DOD to research groups in academia and the private sector.
These and other examples are discussed in the handbook ``Usable
Science'', which I just happened to have brought along with me,
which summarizes the results of CSPO's [Consortium for Science,
Policy & Outcomes] five-year NSF Decision Making Under
Uncertainty project, carried out jointly with researchers at
the University of Colorado.
These lessons can be applied to SciSIP. Let me mention
three possibilities. First, NSF could sponsor one or more large
centers for SciSIP research, education, and outreach with a
core requirement to build strong, ongoing collaborative links
between researchers and science policy decision makers. Second,
NSF could work with mission-oriented R&D agencies to integrate
SciSIP activities into a range of existing outcome-oriented
programs. Third, NSF could require all of its center-scale
awardees, such as Science and Technology Centers and
Engineering Research Centers, to be designed from the outset to
include integrated SciSIP components.
Through these sorts of approaches, SciSIP could enhance its
capacity to produce usable knowledge for the near to medium
term and help accelerate a convergence between science and
innovation policy research and policy decisions across a range
of R&D outcome priorities. Thank you for your attention. I look
forward to discussing these issues more.
[The prepared statement of Dr. Sarewitz follows:]
Prepared Statement of Daniel Sarewitz
Mr. Chairman, Members of the Committee, thank you for inviting me
to testify today. My name is Daniel Sarewitz, and I am co-founder and
co-director of the Consortium for Science, Policy, and Outcomes at
Arizona State University, as well as Professor of Science and Society
at ASU. My formal training was in geosciences, but for more than 20
years I have worked in science and technology policy, first as an AAAS
Congressional Science Fellow and then as a staffer on this Committee,
working for Chairman George E. Brown, Jr., and more recently as an
academic, at Columbia University and now at ASU. So I'm very pleased to
return to the place that launched me on a new and incredibly
interesting and exciting career path and intellectual journey, and
honored that you have asked for my input to the Committee's
deliberations on the status of the science of science and innovation
policy.
Introduction: Input-Output Science and Innovation Policy
Most people agree that government support of research and
development is an essential foundation of today's complex, knowledge-
based, high technology society. Yet the problem of how to make the most
out of the nation's investment in R&D remains amazingly poorly
understood. This problem has been actively debated in Congress since
World War II. In the interim the annual public investment in R&D has
grown from a few tens of millions of dollars to about 140 billion
dollars. Yet, throughout this period of remarkable growth--and, I
should say, remarkable bipartisan support for such growth, exemplified
by this Committee--the basic principles, terms of debate, and policy
tools for guiding investment and measuring its effects have changed
remarkably little.
For more than sixty years, the core of science policy has been the
belief that more money for R&D translates into more benefits for the
nation. Science policy has, above all else, been science budget policy.
The capacity of the nation to solve problems related to science and
technology has been measured by the incremental growth of the R&D
budget. The idea that the size of the R&D budget is a measure of the
social value of science and technology remains the bedrock of science
policy.
Three other powerful beliefs have dominated science policy decision
making. The first is that research becomes valuable for society as part
of a linear progression starting with basic discovery and leading to
application, either in the form of technological innovation, or
information to inform decision making. The second, related belief is
that there is a clear distinction between research activities aimed at
creating new knowledge, and research aimed at applying that knowledge
to solving problems. The third belief is that scientific excellence, as
defined and assessed by scientists themselves, typically through the
peer review process, is the best measure of the potential value of
science for society.
The result of these beliefs has been a national R&D enterprise that
is largely understood and discussed in terms of simple inputs--how much
money is being spent on which type of science?--and simple outputs--how
much scientific knowledge is being produced? That this simple input-
output way of understanding science and technology policy led to the
world's largest and most productive R&D enterprise is, however, much
more of a happy historical accident than an endorsement of this way of
looking at R&D policy.
Coming out of World War II, the U.S. simply had no serious
scientific or economic competitors, so we had a huge head start that
only began to be seriously eroded in the 1980s. Moreover, the U.S. R&D
enterprise as a whole was--and still is--so much bigger than that of
any other nation that simply as a function of scale it could--and still
does--outperform everyone else. An additional crucial point is that by
far the dominant player in translating the public R&D investment into
tangible societal outcomes was the Department of Defense. The core of
DOD's approach was the cultivation of very powerful linkages between
high-tech private sector firms, research universities, and the DoD
itself, an arrangement that was responsible for creating most of the
important technological systems that undergird our society and our
economy today.
I present this thumbnail sketch to explain how we have arrived at
the situation in which we find ourselves today. The limits of the post-
War input-output approach, as I have said, became increasingly clear
starting in the 1980s, with the rise of serious economic and
technological competitors, especially in East Asia; with the end of the
Cold War, and the decline of DoD's catalytic role in civilian
technological innovation; and with the increasing awareness of an array
of social challenges that seemed to demand scientific and technological
solutions--from cancer and emerging infectious diseases to energy
security and environmental quality. Yet if one looks at the endless
series of reports over the past decades sounding the alarm bells about
the nation's science and technology enterprise, one finds the problem
still discussed predominantly in terms of the same old input-output
measures: how much are we spending, how many scientists are we
producing, how many publications or patents are issued, and how do
these input-output numbers compare to our economic competitors?
The problem with the input-output model is that it can't tell us
very much about what actually matters: how the size, organization, and
productivity of the R&D enterprise itself relates to the achievement of
the societal outcomes that we desire and expect. Because pretty much
everyone assumed that these outcomes flowed automatically from the R&D
enterprise, as long as it was big and scientifically productive, there
seemed to be no reason to worry about how the enterprise worked. These
assumptions put a damper on research, as well as debate, about the
complex relations between scientific advance, technological innovation,
and the well-being of society. Why try to understand these issues if
the only thing that really mattered was the size of the budget?
But in an era of constrained resources and mounting challenges to
our well-being, the limits of the input-output approach have become
impossible to ignore. We cannot ignore them because we need to make
difficult choices about how to allocate scarce resources. We also
cannot ignore them because we are faced with strong prima facie
evidence that the input-output approach is leading to significant
science and innovation policy failures. For example, the National
Institutes of Health's forty-year War on Cancer has yielded remarkable
new scientific knowledge, yet remarkably modest public health benefits
for the tens of billions spent. The devastation of New Orleans by
Hurricane Katrina occurred despite the existence of comprehensive
scientific knowledge about the inevitability and precise consequences
of such an event. Thirty years of energy R&D has left the nation no
less vulnerable to energy-based security, economic, and environmental
threats than it was when the Department of Energy was created. These
are not input-output problems, but they are science and innovation
policy problems.
In 1992, this Committee issued a brief ``Chairman's Report''
entitled the ``Report of the Task Force on the Health of Research,''
which pointed at the need to re-think basic assumptions about science
and innovation policy. (As a Committee staffer at the time, I was
privileged to be one of the members of that Task Force.) While there
certainly were, at that time, small pockets of academic scholarship on
the links between science policy and societal outcomes, and while some
federal S&T programs had of course had great success in achieving the
outcomes that the public expected from them, the fact is that there
existed in the United States at the end of the 20th century an
extraordinarily modest capacity to develop knowledge, tools, and human
resources that would allow the nation to improve its capacity to turn
progress in S&T into progress toward desired societal outcomes.
A turning point in achieving high level attention and action came
in 2005, when President Bush's science advisor, John Marburger,
speaking at the Science and Technology Policy Colloquium of the
American Association for the Advancement of Science, declared that
``The nascent field of the social science of science policy needs to
grow up, and quickly.'' His point was that the nation could no longer
afford to set policy for one of its most important areas of public
investment on the basis of simplistic ideas that had arisen in a very
different world, half a century ago. The National Science Foundation
responded to the urgency of Dr. Marburger's call by creating the
Science of Science and Innovation Policy (SciSIP) program.
Committee Question 1. (A) Please provide an overview of the research
activities of the Consortium for Science, Policy,
and Outcomes. (B) How are you facilitating
interdisciplinary collaborations within the
Consortium? (C) What new and continuing areas of
research in the science of science and innovation
policy (SciSIP) could significantly improve our
ability to design effective programs and better
target federal research investments? (D) What are
the most promising research opportunities and what
are the biggest research gaps?
Background to CSPO:
The Consortium for Science, Policy, and Outcomes (CSPO) was
conceived in 1997 during discussions between myself and Michael M.
Crow, who was then Executive Vice Provost at Columbia University, and
formally launched in 1999. The decision to create CSPO was made for
much the same reasons that SciSIP was created: despite the overwhelming
importance of science and technology in our society, policy makers and
scholars almost completely lacked the knowledge and tools necessary to
make informed and effective decisions. CSPO was founded as one small
effort to begin to reverse this lack of capacity.
When Michael Crow became President of Arizona State University he
asked me to move to ASU as well, and gave me the opportunity to help
transform CSPO from a small research and policy center to a broader
consortium with expanded ambition and reach. Today this consortium
operates at three organizational levels: First, there is a core group
of fifty or so faculty, researchers, students, and staff who work
directly in CSPO, mostly in Arizona but with several of us located here
in Washington, DC. Second, there is a significantly expanded group of
collaborators throughout ASU as a whole, ranging from many of the
university's top scientists and engineers, to faculty and students in
ASU's programs on public policy, law, business, architecture and
design, communications, journalism and even the arts. Third, we have
deep and persistent collaborations with researchers and students at
other universities in the U.S. and around the world. Virtually all of
our major research thrusts are carried out in collaboration with
individuals or groups at other universities, and CSPO hosts a
continual stream of visiting scholars and students, many from foreign
universities and research institutions, for periods of up to two years.
In briefly describing CSPO's major research activities, I want to
emphasize a point that should be obvious but is often lost in
discussions of the Science of Science and Innovation Policy. Public
support for science and innovation is justified for a wide range of
reasons, many of which are non-economic. For example, we count on
science to provide a safe, abundant, and tasty food supply for a
growing population; ensure the protection of our natural environment
and the provision of reliable and affordable energy; protect and
improve our health; help ensure national security; and create new and
challenging work opportunities. The reason I belabor this obvious point
is that in fact we are particularly empty-handed when it comes to
understanding how best to design and assess S&T policies aimed at
advancing these non-economic outcomes. This is the arena where CSPO
focuses most of its efforts.
CSPO is engaged in a wide range of research activities that seek to
advance knowledge, real-world practice, and human resources in this
broad domain of science and innovation policy for social outcomes. And
I want to gratefully acknowledge the National Science Foundation's
generosity in providing peer-reviewed grant support for many of our
most important and I would say high-risk, high-pay-off ideas, through a
variety of its programs, including SciSIP.
At the core of all of our research is a commitment to looking at
S&T activities as part of larger social systems. Trying to understand
and assess the outcomes of science and innovation by studying and
measuring research and development activities alone is like analyzing a
family's home life by studying lumber mills and brick kilns. What makes
a given line of research valuable for society? Of course the science
itself must be of high quality, just like a fine home needs to be
constructed of quality materials. But for investments in science and
innovation to support desired social outcomes, many other elements will
come into play: the ways that scientists choose projects; the culture
and organization of research institutions; public-private interactions;
economic incentives and regulatory structures; public preferences and
behavioral norms--all this and more make up the process by which
knowledge, innovation, and social benefit are connected.
1. (A) Please provide an overview of the research activities of the
Consortium for Science, Policy, and Outcomes.
With this background, let me outline some of our efforts, in four
areas of direct relevance to the science of science and innovation
policy.
(1) CSPO's flagship research program is our Center for
Nanotechnology in Society (CNS), an NSF Nanoscale Science and
Engineering Center which has just been renewed for a second and final
five-year grant period, under the directorship of CSPO co-director
Professor David Guston. CNS takes a systems view of technological
innovation to ask: what are the factors that may influence whether an
emerging domain of technology, in this case nanotechnology, is able to
move toward areas of social need and desired outcomes? CNS involves
multiple universities and researchers from multiple disciplines
bringing numerous specialties to bear on what we call ``real-time
technology assessment,'' or a capacity to understand linkages between
new knowledge, emerging innovations, and societal outcomes--as they are
unfolding.
Among the many specific research activities encompassed by CNS are
relatively traditional tools for assessing scientific productivity such
as citation and patent analysis, as well as proven methods for tracking
public opinions and preferences. But we also bring social scientists
together with nanoscale scientists and engineers to reflect on the
choices available to them for advancing nanotechnology, and to develop
and discuss future scenarios of nanotechnology-enabled society. We
cultivate ongoing discussions with the public about potential benefits,
problems, and dilemmas of nanotechnology. We bring graduate students
working on nanotechnology into discussions of science policy and social
outcomes. We work with science and technology museums to create
programs and exhibits that go beyond technical explanations to help
people understand the ways that nanotechnology and society influence
each other.
In total, what we are trying to create with CNS is a test-bed for
developing a more holistic understanding of science, innovation, and
social outcomes, where the choices made about science, innovation, and
their application in society are brought out in the open and discussed
even at the earliest stages of the innovation process, to bring into
better alignment the directions of science and innovation, and the
aspirations and needs of society. I also hope it is clear from this
brief description that standard categories of ``basic research,''
``applied research,'' ``education,'' and ``outreach'' are not pursued
separately, but are part of an integrated approach at CNS.
I want to emphasize three elements of U.S. science policy that made
this research program possible. First was the explicit desire of this
Committee and the Congress in general, as expressed in the 21st Century
Nanotechnology Research and Development Act of 2003, to ensure that
nanotechnology advanced along with a capacity to understand unfolding
social implications. Second was the complementary recognition by the
National Nanotechnology Initiative, under Mihail Roco's early
leadership, and the National Science Foundation, that understanding the
social aspects of nanotechnology should be an important aspect of the
overall nanotech research agenda. And third was ASU itself, a
university that has made huge strides in reducing the barriers to true
interdisciplinary collaboration, and that is simultaneously committed
to connecting the work of its faculty and students to the needs of
society.
(2) A second project I want to mention is Science Policy Assessment
and Research on Climate (SPARC), funded through NSF's Decision Making
Under Uncertainty program. SPARC is a collaboration with the Center for
Science and Technology Policy Research at the University of Colorado,
and we are finishing the project up after a five-year funding period.
SPARC explores a question that lies at the heart of science and
innovation policy: what makes the results of a scientific research
project useful, and usable? While the broad context for this project
was the nation's considerable investment in research related to
climate, our research looked at science policy decision making aimed at
many different problems, including water management, weather and
natural hazards, nanotechnology, technological standards, agriculture,
and ecology.
SPARC results reinforce a major point: science policies tend to be
more successful when they are carried out through institutional
arrangements that allow scientists and decision makers to understand
each other's needs and capabilities. Fostering close, ongoing, trusting
relations between those who produce new knowledge and those who might
benefit from it seems to be an essential attribute of science policies
that lead to new knowledge quickly moving into society for public
benefit. Drawing on the lessons of this major project, we produced a
short handbook for science policy decision makers, called ``Usable
Science.'' We released this report last April at a meeting here in DC
that attracted about 100 participants, many from federal agencies. The
handbook is available at: http://cstpr.colorado.edu/sparc/outreach/
sparc-handbook/.
(3) A third project, called Public Value Mapping, or PVM, has been
supported by the SciSIP program, as well as the V. Kann Rasmussen
Foundation and the Rockefeller Foundation. The idea behind PVM draws on
my previous point that most publicly funded S&T activities aim to
advance a variety of social outcomes, not just economic ones. PVM finds
that these desired social outcomes--what we call ``public values''--are
clearly expressed at many levels across the science and innovation
policy endeavor--in legislation and laws; in the strategic plans and
budget documents of R&D agencies; in the websites and press releases of
individual R&D programs and even projects.
Because public values are harder to characterize, measure, and
assess than economic values, they are often given short shrift both in
debates about science and innovation policies, and in research to
evaluate the outcomes of such policies. Yet a key concept for PVM is
that the public values associated with science and innovation policies
may conflict with one another, and with economic values. For example, a
new medical technology may create profit for a corporation and benefit
those who have access to the technology, even as it contributes to
health care outcome disparities and over-diagnosis and unnecessary
treatment. PVM seeks to unravel and clarify such complexities, in order
to help view and assess the full range of social outcomes tied to
science and innovation policies.
In brief, our research aims first to identify public values across
a particular area of science and innovation policy. We then analyze how
various value statements actually relate to each other (for example,
are they complementary or contradictory?) and assess whether the
research activities are in fact organized in ways that may allow them
to achieve those values. Our work is still preliminary. During three
years of NSF-supported research, we have completed a set of detailed
case studies, looking at S&T policy issues such as technology transfer,
nanotechnology for cancer treatment, and environmental chemistry. One
intriguing, but still quite preliminary, result of our work is that we
think we can say something about the potential for a major research
program to achieve desired social outcomes based in part on how public
values are articulated across the program's various levels and
components. For example, our study of natural hazards research at the
U.S. Geological Survey shows a strong coherence among public values
expressed by scientists, the agency, legislative mandates, and various
stakeholders, whereas our analysis of Federal climate change research
shows considerable diversity and even conflict among values within and
across these various levels of activity. We are now working to test the
hypothesis that the relations among public values may in fact be
predictive of a program's performance. If this turns out to hold up
after further research, it could offer a powerful tool for assessing
the capabilities of science and innovation policies.
(4) As one final example, I want to mention CSPO's growing work on
energy technology innovation. This is a cross-cutting theme that works
its way into a number of our research projects, but I think it helps to
communicate our overall approach. Consider, for example, solar energy
technologies, which may have particular potential to serve energy needs
in a desert state like Arizona. Yet to understand the potential for
solar energy R&D to contribute to Arizona energy needs, one also needs
to understand issues of regulatory incentive, land use, water access
and availability, public lands management, agricultural policies,
transmission corridors, military bases, Indian reservations, even
immigration. Each of these variables may play a crucial role in
determining the outcomes of solar energy science and innovation
policies--and policies that do not attend to these variables run the
risk of failing to achieve their desired social outcomes, regardless of
levels of funding or scientific productivity.
1. (B) How are you facilitating interdisciplinary collaborations within
the Consortium?
CSPO facilitates interdisciplinary collaboration in three main
ways. First, we organize our activities around problems, not around
disciplines, and then we bring into our research teams the expertise
that we need to help us understand what's going on and how to make
progress.
Second, as an administrative matter, CSPO is located in ASU's
College of Liberal Arts and Sciences, so it does not have a
disciplinary affiliation. Our core faculty members have advanced
degrees in fields ranging from earth sciences and electrical
engineering to political science and philosophy. Core faculty are
jointly appointed between CSPO and a variety of academic units,
including the Schools of Life Sciences; Government, Politics, and
Global Studies; Human Evolution and Social Change; Geographical
Sciences and Urban Planning; Sustainability; Communications; and Social
Transformation. If these don't sound like familiar names for
traditional academic disciplines, that's because ASU itself has moved
to reorganize standard departments into interdisciplinary units in
order to bring appropriate intellectual force to bear on complex
problems.
Third, we have worked hard to cultivate long-term collaborations
with natural scientists and engineers across the university, many of
whom are affiliate faculty members at CSPO. We work with these
colleagues to design new educational and research projects and programs
that return value both to CSPO and to our science and engineering
partners. These activities create familiarity and trust that allow us
to engage in higher-stakes collaborations. For example, many of the
major science-and-engineering grant proposals submitted by ASU to
funding agencies now include an integrated set of activities aimed at
understanding and enhancing societal outcomes. We have even been funded
by NSF, partly with the support of the SciSIP program, to study the
impacts of natural science-social science collaborations in labs at ASU
and around the world.
1. (C) What new and continuing areas of research in the science of
science and innovation policy (SciSIP) could
significantly improve our ability to design
effective programs and better target federal
research investments? (D) What are the most
promising research opportunities and what are the
biggest research gaps?
CSPO faculty members have been brainstorming over the past few
weeks to develop a short list of ``foundational/transformative''
research challenges in response to a call for ideas issued by NSF's
directorate for Social, Behavioral, and Economic Sciences. Given CSPO's
orientation, our ideas, not surprisingly, are directly relevant to the
SciSIP program.
1. Science and innovation policies often aim to help transform
existing technological systems to achieve particular societal outcomes:
for example, to move the nation's energy system toward a more
economically, environmentally, and geopolitically secure technology
base; or to move the nation's health care system to achieve better
health outcomes at lower cost. New scientific and technological advances
are obviously going to be key drivers of such transitions. Yet modern
societies have very little understanding of how to catalyze and steer
these sorts of complex system changes, and well-intentioned efforts can
often lead to unanticipated consequences whose benefits are very
difficult to assess, as we have seen, for example, in efforts to
advance alternative biofuels. A key SciSIP research priority should be
to gain fundamental understanding about the drivers and dynamics of
transitions in complex socio-technical systems.
2. Science and innovation policies are, in one sense, a bet on the
future: that a certain type of knowledge or technology will prove
useful or valuable. Yet the future of social and technological change
is impossible to predict in detail. To try to deal with this
unpredictability, a relatively small number of forward-thinking
companies, academic units, and non-profit organizations employ a
variety of techniques and tools that can allow them to better
visualize, understand, and discuss a range of alternative possible
futures. Such activities can inform decision making by helping to make
clear the broad array and potential implications of scientific,
technical, and social options and pathways available for addressing
social challenges. SciSIP should support the study and assessment of
existing tools, and the development and testing of a range of new
tools, to bring future-visioning techniques to bear on science and
innovation policy making processes.
3. In general, SciSIP should emphasize support for research and
education programs that foster integration between natural sciences and
engineering, and social sciences. Such integration can help to ensure
that science and engineering activities are conceived and carried out
with a realistic understanding of the social context in which knowledge
and innovation are pursued and applied. In turn, social scientists will
gain a deeper, and earlier, understanding of the potential futures that
cutting edge R&D programs are making possible. The result should be a
growing capacity to design and conduct science and innovation
activities that are better able to contribute to desired social
outcomes.
4. SciSIP should consider supporting the development of a set of
case studies to identify and characterize the key attributes of S&T
institutions and programs that strongly link science and innovation
activities to desired social outcomes. Case studies should range across
the S&T enterprise, sampling a variety of sectors, scales, structures,
and desired outcomes. Such a program would need to be coordinated to
ensure comparability between the methods and organization of the cases.
Its institutional and programmatic focus would make it distinct from,
and complementary with, the STAR METRICS approach that NSF and sister
agencies are already taking. This case-based effort should focus on the
development of a set of key organizational principles that science and
innovation policy makers can use to guide investment strategies and
priorities.
Committee Question 2: Is the Federal Government, specifically the
National Science Foundation, playing an effective
role in fostering SciSIP research and the
development of a community of practice in SciSIP?
What recommendations, if any, do you have for the
National Science Foundation's SciSIP program?
Overall, I believe that NSF is doing a good job in building the
SciSIP program and community. But this is a very difficult task. The
community of researchers working in the SciSIP domain is rather small
and very diffuse. In fact, it does not really identify itself as a
single community, but rather as several independent communities, for
example, innovation economics, science and public policy, and science
and technology studies. So there's simply not a lot of capacity yet in
this domain, and what capacity there is needs to be better integrated.
Moreover, most of the quantitative data available for analysis of
science and innovation policy is input-output data--budget levels,
numbers of scientists and graduate students, publication numbers,
patents, citations, and so on. Such data can be subjected to highly
sophisticated data mining and analysis techniques using ever-improving
software packages designed for this purpose, so it is very attractive
to researchers. But this kind of input-output data can offer only an
incomplete and in many ways distorted view of the societal value of the
S&T enterprise, a view that does not allow us to escape the simplistic
beliefs of the past.
Now it's clear that those running the SciSIP program understand
these problems. They brought together a good cross section of the
community to help plan the program in the spring and summer of 2005;
they have sought to attract grant applications from a wide array of
researchers; they have organized or otherwise supported events to bring
together SciSIP researchers to build a sense of integrated community;
they have provided grant support to a diverse set of research
approaches and problems; and they are working through the STAR METRICS
program to try to build better quantitative data sets that can assist
certain types of analytical work. All this is very positive.
To some extent, however, NSF's institutional strength is also a
weakness here. The agency prides itself on its bottom-up approach to
setting its research agendas. While the SciSIP program does reflect a
top-down decision to create a new program area, in part as a response
to concerns repeatedly expressed by then-Presidential science advisor
Marburger, the shape and direction of SciSIP have been significantly
dictated by the existing research community. Much of that community
continues to work within the input-output model of science and
innovation policy, due, as I've said, to existing data sources and
tools. For similar reasons of measurement ease, the community also
tends to focus on economic outcomes to the significant exclusion of the
much broader range of societal outcomes that the nation seeks to derive
from its S&T investment. Because researchers and peer reviewers are
drawn from the same general communities, such tendencies can be
difficult to escape.
A range of tools are potentially available for building the
community and its coherence, and driving the intellectual agenda away
from an input-output framework, and toward a systems-oriented,
outcomes-focused approach. Not all of these tools require new money.
SciSIP should use program guidelines and requirements to transform and
build the research community; indeed, this year's program announcement
is notable for its openness to a wide range of approaches to SciSIP
research. SciSIP could also consider using some of its budget to
support training grants, similar in spirit, if not in scale, to NSF's
successful IGERT (Integrative Graduate Education and Research Traineeship)
program, as a way to more quickly build up capacity. However, if the
Committee, and NSF, believe that the science and innovation policy
research community needs to be significantly larger and more coherent,
this will probably require more resources. Consistent with my position
throughout this testimony (also see my comments on ``dissemination,''
below), any claim to a bigger budget must be matched by programmatic
design elements to help ensure that knowledge created by SciSIP is both
usable and used. This would likely require a commitment to fund
integrated Science and Technology Center-type science and innovation
policy research organizations that can create and support ongoing
interaction between SciSIP researchers and policy makers, perhaps
analogous to NOAA's Regional Integrated Science and Assessment program.
Committee Question 3. Please describe the education and outreach
activities of the Consortium for Science, Policy,
and Outcomes.
CSPO sponsors a wide variety of education and outreach activities,
ranging from formal degree programs and intensive, short-term training
activities, to public outreach events and products targeted at science
and innovation policy makers.
1. Graduate Degree Programs
The ASU Professional Science Masters in Science and Technology
Policy was initiated in 2009. It provides professional education for
students seeking advanced public, non-profit, or private sector careers
in science and technology policy and related fields in the United
States or abroad. Students learn essential skills, knowledge, and
methods for analyzing innovation, expertise, and large-scale
technological systems. Particular emphasis is placed on the political
and societal contexts and impacts of science and technology policy. The
program is a one-year, 30-credit cohort-based program designed to
attract students of the highest caliber in their early to mid-careers.
Key learning outcomes of the program include:
Understanding of the theoretical foundations of the
interactions among science, technology, and society.
Understanding of US and, where appropriate to a
student's career interests, international science and
technology policies and the policy processes that generate
them.
Analysis of knowledge systems supporting policy
decisions.
Analysis of the social and policy dimensions and
implications of large-scale technological systems.
Analysis of scientific and technological innovation
systems.
Skills in collaborative, team-based analysis of
science and technology policy problems.
Skills in effective professional communication.
Ph.D. Program in the Human and Social Dimensions (HSD) of Science
and Technology. Here CSPO collaborates with ASU's Center for Biology
and Society and Center for Law, Science, and Technology to offer a
highly interdisciplinary and integrative program of advanced study. We
aim at training scholars and practitioners to understand and inform the
conceptual and philosophical foundations of scientific research; to
analyze and assess the increasingly powerful roles of science and
technology as agents of change in society and the economy; and to
challenge universities to become leaders in fostering the new science
and technology policies necessary to meet the problems and
opportunities of the 21st century.
The HSD curriculum is flexible, combining a strong, integrated,
first-year experience, with substantial freedom for students, in
conjunction with their advisors, to design carefully crafted programs
of study relevant to their own areas of interest and expertise. The
curriculum trains researchers with the necessary skills and preparation
to analyze three key aspects of the study of the human and social
dimensions of science and technology: 1) the historical, philosophical,
and conceptual foundations of science and technology; 2) the social and
institutional foundations of scientific research and technological
systems; and 3) the political, ethical, and policy foundations of
science and technology.
Research projects of current HSD students supported by CSPO
include:
Social and ethical challenges of smart grid
development
Leadership training in graduate science and
engineering education
Comparative analysis of interdisciplinary research
fields in the US and China
The emergence and stabilization of legal regimes in
online communities
The role of non-governmental organizations in energy
siting decisions in the United States
Public values and public engagement in energy policy
in the United States
The organization and management of international
scientific assessment processes
Connecting knowledge to decision making in water
policy
            Information technology in learning and inequality
2. Non-Degree Programs and Training
Ph.D. plus. This integrative, non-degree program offers advanced
graduate students in science and engineering the chance to consider how
their research relates to the world of science policy and the
relationship between science, technology and societal outcomes. Science
and engineering students work with a CSPO faculty member to write an
additional chapter of their dissertation that explores the social
implications, political context, or ethical concerns of their work. The
Ph.D. plus process is informal, and is arranged by discussions between
the student, her or his dissertation advisor, and the CSPO advisor.
Most Ph.D. plus students take one or more classes offered by CSPO
faculty; attend seminars and other activities sponsored by CSPO; and in
general interact closely with the CSPO community for an academic year
or more.
In the annual DC Summer Disorientation, cohorts of about 15 science
and engineering graduate students spend two weeks in Washington, DC
interacting with the government officials, lobbyists, staffers,
regulators, journalists, academics, museum curators, and others who
fund, regulate, shape, critique, and study science and technology.
Students participate in interactive role-playing experiences where they
may testify at mock Congressional hearings; work under tight deadlines
to write briefing papers for senior officials; or write op-ed pieces
for a demanding editor. The goal is to help future scientists develop
an understanding of the political and social context of their research.
CSPO has recently expanded this program and now accepts graduate
students from outside of ASU.
The Next Generation of Science and Technology Policy Leaders. Here
we are seeking to catalyze a community of early-career science policy
scholars who can span the terrains of intellectual inquiry and real-
world practice, communicate effectively to general audiences, and
contribute to effective decision making on key issues of science,
technology, and society. We organized a national competition to select
a dozen early-career science policy researchers and practitioners (5
years or less since Ph.D.). This ``Next Gen'' group prepared draft
papers, and each scholar was then paired with an early career
``communicator'' (typically a writer working through new media). The
scholar and the communicator collaborated to craft a compelling, non-
scholarly description of the scholar's work--something that would
appeal to a general audience. Next Gen scholars also led a roundtable
discussion where each presented her/his research to a group of about 40
people at a major CSPO-sponsored conference, to allow the scholars to
hone the more technical aspects and presentation of their work, and to
interact intensively with an engaged audience. Next Gen scholars are
now working on two versions of their research papers, one for a policy-
making audience, and one for an academic audience. This project was
supported by grants from NSF's programs on Science, Technology, and
Society, and Informal Science Education.
3. Outreach
CSPO views outreach as an integral part of its operations at all
levels--not as a separate, add-on, or late-stage activity. As described
above, our research and education programs often involve policy makers,
members of the public, and scientists and engineers, and so also serve
an outreach function by creating and strengthening links and
communication between CSPO scholars and these other groups. Indeed, in
many cases it is difficult to know where research ends and outreach
begins. For example, much of our work on energy innovation policy is
presented to policy makers and the media in briefings and policy
reports at the same time as it is written up for academic audiences
(see: http://www.cspo.org/projects/eisbu/). Similarly, SPARC involved
numerous workshops that brought scientists and science policy makers
together in a way that enhanced both communication and learning.
The integration of outreach and education is apparent in CSPO's
growing collaboration with science museums and science centers. We view
these collaborations as ways of reaching wider audiences and increasing
the ability of our graduate students--social scientists as well as
natural scientists and engineers--to communicate to broader audiences.
Our Center for Nanotechnology in Society has fostered a national
strategic partnership with the NSF-funded Nano-Scale Informal Science
Education Network to develop programs and exhibition materials and
plans that incorporate societal interests and outcomes in communicating
about emerging technologies. CSPO opens science communication
opportunities for scientists and engineers through its monthly Science
Cafe series with the Arizona Science Center and incorporates museum-
floor experience into its integrated training of doctoral scientists
and engineers. CSPO is also working with the Museum of Science, Boston
and the National Academy of Engineering to plan a national educational
campaign to focus on climate change and engineered systems, to prepare
the next generation of engineers, citizens, and leaders to meet the
challenge of adapting the nation's technological infrastructure to
climate change.
Overall, we are continually engaged in a wide variety of efforts to
make our ideas accessible to the public and policy makers, through
informal and formal meetings and briefings in the Phoenix area and in
Washington, DC; through ``handbooks'' for decision makers; through
ongoing contact with the media; as well as by writing op-eds and
articles for non-technical magazines, websites, and blogs. We have just
received a small supplement to our SciSIP grant on Public Value Mapping
to produce engaging, instructional web-based videos for science policy
practitioners. New outreach products and activities are promoted via
CSPO's monthly electronic newsletter, which goes to over 3000 people in
academia, government, and industry. In all, I think it is fair to say
that CSPO views outreach, education, and research as equally necessary
foundations for pursuing its mission.
Committee Question 4. How can the dissemination of SciSIP research
findings be improved so that policymakers are
better informed of the current state of research?
Are there best practices that can be implemented by
the Federal government and/or the research
community to improve the incorporation of science
and technology policy research into the decision
making process?
SciSIP program officers, in collaboration with their grantees, with
organizations like the American Association for the Advancement of
Science, and with other federal agencies, have made an impressive effort
to ensure that research results are made available to science and
innovation policy makers, through the SciSIP website and listserve, and
through a variety of workshops, including one to be held this coming
December.
SciSIP and NSF more broadly face something of a dilemma here,
however. As I'm sure the Committee well appreciates, academic
researchers are generally not rewarded for communicating their work to
policy makers, or even for making the results of their work
comprehensible to non-experts. I'm extraordinarily fortunate to work at
a university where this is not the case. Moreover, given the
fundamental nature of much of the research supported by SciSIP, the
extent to which project results can translate into knowledge immediately
useful to decision makers may be highly variable. At the same time,
it's fair to say that science and innovation policy decision makers may
not always be either receptive to, or able to act on, the results of
research conducted under SciSIP.
In line with many of the comments I've already made, and consistent
with research done by CSPO and many other groups, the best way to
further improve the value of SciSIP research for decision makers would
be to increase the level of interaction between the researchers and
decision makers. This point should not be interpreted as a criticism of
the current SciSIP program, which as far as I can tell is effectively
pushing the boundaries of typical NSF practice, and working at the
limits of its human and fiscal resources, to try to maximize
dissemination.
Yet ensuring that researchers are providing knowledge that decision
makers can actually use is not only a matter of ``dissemination,'' it
is also structural. For SciSIP results to be both usable and used,
researchers and decision makers must each come to understand the needs,
capabilities, and languages of the other--a process that we have
termed, in our SPARC project, ``reconciling the supply and demand of
research.'' Such a reconciliation takes time and ongoing interaction.
It can certainly be pursued along multiple paths--through joint
committees, workshops, personnel exchanges, interviews and surveys, and
so on--but the key is ongoing and meaningful interaction leading to
mutual understanding. An NSF research program, even one advanced with
the creativity and vigor that characterizes SciSIP, is unlikely to be
able, by itself, to provide the sort of institutional infrastructure
that leads to the production of consistently usable knowledge. The idea
of integrated SciSIP centers, previously mentioned, could be one way to
create a greater capacity to move ideas into use. Federal agencies and
programs that sponsor mission-oriented research, and that have a proven
record of producing usable knowledge, might also be able to play a role
here to help achieve the necessary integration.
Committee Question 5: What are the fundamental skills and content
knowledge needed by SciSIP researchers and
practitioners? What are the backgrounds of students
pursuing graduate degrees in science and technology
policy, and what career paths are sought by these
graduates? Is the National Science Foundation
playing an effective role in fostering the
development of science and technology policy degree
programs at U.S. universities? If not, what
recommendations, if any, do you have for NSF and/or
the universities with such programs?
As I've suggested, the domain of SciSIP research and practice
cannot and should not be defined by any particular set of skills or
area of knowledge. In fact, given the complexity and diversity of the
challenges facing SciSIP policy makers, it will be important to keep
the field as open and flexible as possible, where the necessary skills
and knowledge are determined based on the problem at hand, and on the
evolution of the field itself, rather than some arbitrary boundary.
I've already mentioned the varied backgrounds of CSPO's core faculty
group, and our graduate students are if anything even more diverse,
coming to us with degrees in business, information systems, science and
technology studies, astrophysics, political science, law, English,
public policy, library and information science, philosophy, physics,
biology, environmental science, geology, anthropology, sociology, and
industrial management.
Graduate training in science and technology policy is also diverse,
occurring in many different types of programs, with many institutional
and administrative arrangements, in many U.S. universities. There is no
standard-model science and technology policy graduate degree, and given
the complexity of the field perhaps that is just as well, but it does
create challenges in terms of attracting resources, creating an
identity, and setting priorities. Similarly, while many career paths
are open to those who have advanced training and degrees in science and
technology policy, there is no formula for how to build or advance a
career in this field, as there is in, say, law, medicine, or
engineering. In CSPO's brief experience with graduate education, we do
see our students and post-docs progressing on traditional academic
paths, but they are also going into the private sector, working at
nongovernmental organizations, and taking up positions in government
agencies and think tanks. I also want to emphasize the importance that
we place on ``continuing education'' via our professional Masters
program, which we hope will reach mid-career professionals already
working in areas related to science and technology policy, and equip
them with tools to do their jobs more effectively, or to move into more
complex jobs, in the public, private, and nongovernmental sectors.
As I discussed in my response to Question 2, NSF's SciSIP program,
as well as its Science, Technology, and Society Program, are working
hard to build a sense of community and identity among science and
technology policy researchers, and to provide support for research
across a broad domain of problems and applications. However, as
discussed at length by about 75 members of the community at this
summer's Gordon Conference on Science and Technology Policy, the
traditional academic structure of universities remains a considerable
obstacle to building long-term capacity in the field, and most science
and technology policy programs exist in the margins and spaces of
standard disciplinary schools and departments. I am fortunate enough to
work at a university whose leadership has a strong commitment to
cultivating interdisciplinary, problem-based research that can link
knowledge creation to solutions for complex societal problems. Yet even
at ASU the long-term future of science and technology policy research
probably depends on finding a way to more closely knit CSPO into the
fabric of the formal academic units on campus.
One conclusion here is that NSF's ability to foster the development
of the field of science and technology policy is partly dependent on
incentivizing universities to recognize SciSIP as a field worth
cultivating. While the SciSIP program is certainly of a scale
sufficient to mobilize and motivate individual researchers working on
science and innovation policy, it is probably not big enough to get the
attention of university administrators. I have already emphasized the
potential value of applying an integrated Science and Technology Center
model to building the SciSIP community and moving its research results
into use. An NSF commitment to supporting one or more such centers
would also send a strong signal to university leaders that the science
of science and innovation policy is a national priority, deserving of
strong focused effort and investment from our universities.
Biography for Daniel Sarewitz
Daniel Sarewitz is Professor of Science and Society, and co-
director and co-founder of the Consortium for Science, Policy, and
Outcomes (CSPO), at Arizona State University (http://www.cspo.org). His
work focuses on revealing the connections between science policy
decisions, scientific research and social outcomes. How does the
distribution of the social benefits of science relate to the way that
we organize scientific inquiry? What accounts for the highly uneven
advance of know-how related to solving human problems? How do the
interactions between scientific uncertainty and human values influence
decision making? How does technological innovation influence politics?
And how can improved insight into such questions contribute to real-
world practice? He is the author of Frontiers of Illusion: Science,
Technology, and the Politics of Progress (Temple, 1996), an exploration
of the public myths that underlie decisions about science and
technology; the co-editor of three other books; and the author of many
articles about the interactions of science, technology, and society. In
addition to scholarly journals his work has appeared in The Atlantic
Monthly, The New Republic, and many newspapers; from December 2009
until September 2010 he wrote a monthly column on science policy for
the journal Nature. His work has also received featured coverage on
NPR's Morning Edition, in the New York Times and the Chronicle of
Higher Education. From 1989-1993 he worked on R&D policy issues for the
U.S. House of Representatives, first as a AAAS Fellow in the office of
Congressman George E. Brown, Jr., and then as a staffer on the
Committee on Science, Space, and Technology. He received a Ph.D. in
Geological Sciences from Cornell University in 1986. He now directs the
Washington, DC, office of CSPO, and focuses his efforts on a range of
activities to increase CSPO's impact on federal science and technology
policy processes. His new book, The Techno-Human Condition (co-authored
with Braden Allenby; MIT Press) will be published in March 2011.
Chairman Lipinski. Thank you, Dr. Sarewitz. And it is the
beginning of votes, but we should be able to get through the
testimony in here. Dr. Murray.
STATEMENT OF FIONA MURRAY, ASSOCIATE PROFESSOR OF MANAGEMENT,
TECHNOLOGICAL INNOVATION & ENTREPRENEURSHIP GROUP, MIT SLOAN SCHOOL
OF MANAGEMENT
Dr. Murray. Okay. Thank you very much. Thank you, Chairman
Lipinski and other Members of the Subcommittee, for the
opportunity to be here. My name is Fiona Murray; as you
heard before, I am a Professor of Innovation and
Entrepreneurship at the MIT Sloan School, and I am also
Associate Director of the MIT Entrepreneurship Center.
Now to start my remarks I thought I would just describe the
perspective I bring. Briefly, I am the grateful recipient of
two SciSIP grants. I have worked on what I think of as a
SciSIP-oriented research agenda for more than a decade,
although I really only discovered the SciSIP research community
in about 2006. As a faculty member of a business school I also
engage on these issues with managers and with scientists
themselves, who are also interested, sometimes at the lower
levels, in how to organize effectively to ensure the
productivity and impact of their research.
I should also just say something about my own training. I
have a background, a Bachelor's, Master's, and Ph.D., in
Chemistry. That is a very unusual training for somebody who
does SciSIP. I think it enables me to bring a unique
understanding of the bench science to this research, although,
as I do note in my written remarks, I am not sure that this is
an ideal path to learn the rigorous social science methods that
one really needs. I have had to rely, again, on self-education
and some very patient co-workers to get me over what I think is
a quite high bar to make a serious contribution to this
endeavor, and in particular to do it in a way that contributes
to both the policy and the scholarly debate.
I want to use my time to see if I can make three
points, or two if only time permits: something about
the vision of SciSIP and what I think that means about the
kinds of gaps there are in the research, and then how I think
the scientific community might more effectively be organized to
really have an impact in terms of research, links to the
community, and in particular education.
So I think that SciSIP is not simply about doing
science and technology analysis. I think there is already
excellent scholarship describing policy initiatives, the
government attitude toward science, and the politics of science
and innovation policy. But I think that what SciSIP brings,
which is unique, is this sort of scientific lens on the
problem. And what I mean by that is that it is a serious and I
think important attempt to undertake causal, evidence-based
analysis, asking whether and how particular policy
interventions actually have an impact, whether it is in the
short run or the long run. And so I think that good scientific
research defines impact richly: it is about the level, the
rate, and the direction of scientific progress and innovation,
but it is also about long-run impact on economic growth.
But I also want to emphasize this causal piece: what
kinds of policies do we think make a difference? I think that at
its best, SciSIP defines policy broadly but precisely in
particular research instances. And so it can mean everything
from high-level national policies and laws, to agency
implementation and selection processes, to, below that,
community behaviors, things like the Bermuda Rules and so
on, and even, at the micro level, lab-level choices around how
we choose to organize scientific research at the ground
level.
I think a key approach to SciSIP has been grounded in two
recent developments. One is the data development which has
already been discussed. But I would also say it has been
enabled by a massive scientific data infrastructure
investment; some of my own work has really been enabled by
investments in things like GenBank and the ability to
interrogate genetic data to then do science policy analysis.
A second piece, from social science methods, is
program evaluation. You are familiar with this
from the work that you do on evaluating education policy, and I
think that the ability to use experiments, causal analysis,
and so on from that policy evaluation tool kit is extremely
important to pushing SciSIP forward.
So I think that SciSIP has really been critical and has
attracted serious scholarship, but in my view there are still some
gaps. To pick up the ``straw and bricks'' analogy, it strikes
me that while we need bricks, if we want to cross from data to
understanding we actually have to build a bridge with those
bricks. And what does that mean? I think that does
mean more analysis as well as just measurement. I think at the
moment a lot of the scientific work, including my own, is
intrinsically focused on biologists and on funding at the
National Institutes of Health. That is critical, but not the
only arena, and I think that there is--we do need to understand
how other disciplines and other agencies are working.
I think that there has been a focus on national-level rules
and specific agencies, and less on these community-level choices
about how to organize structures, collaborations, and more
informal efforts. I think we also need to focus on
distributional issues. So not just how many more papers are
produced, but what kind: breakthrough or everyday science,
what kinds of research, whether American or foreign, and so
on. I don't think we have focused enough on that.
And let me in the last few seconds just say something about
the SciSIP community. I think that the community actually needs
to do more of its own bottom-up organizing. The NSF has done a
tremendous job in kind of structuring it in a top-down way, but
that is a huge amount of work for one agency to do. And I think
as a community we need to do more bottom-up organizing in order
both to engage in more knowledge exchange among ourselves and to focus
on education. And I think the educational imperative at the
Ph.D. level does need to be organized across a number of
campuses. And then, I think at the policy level, our links to
policy makers again have to be organized in a more
community-based way. So I would suggest that that needs to be
done through a consortium of universities but with this
tripartite mission of research, education, and then links to
policy. And I will leave my remarks there. Thank you very much.
[The prepared statement of Dr. Murray follows:]
Prepared Statement of Fiona Murray
According to the National Science Foundation (NSF), the Science of
Science & Innovation Policy (SciSIP) program ``supports research
designed to advance the scientific basis of science and innovation
policy'' \1\. The program is an important and bold attempt to build a
strong intellectual foundation for science and technology policy making
regarding the laws and rules that shape the institutional environment
in which scientific research and innovation takes place. It does so by
adopting recently developed, leading-edge methodological approaches
based on both large scale empirical data analyses and complementary
qualitative analyses. The explicit goals of the program are to fund
research that ``develops, improves and expands models, analytical
tools, data and metrics that can be applied in the science policy
decision making process''. From my perspective as a SciSIP scholar, I
conceptualize the SciSIP agenda as the systematic, evidence-based and
causal analysis of the impact of policy interventions on the rate,
direction and impact of scientific knowledge production and innovation.
If successful in research and in coupling to policy decisions, then this
agenda will enable Federal and state policymakers, as well as others
engaged in shaping the production and translation of scientific
knowledge (including scientists themselves, universities, Foundations
and scientific communities), to design more effective policies and
practices that ensure that investments in science and innovation have
rapid and extensive scientific, social, and economic impact.
---------------------------------------------------------------------------
\1\ Accessed from http://www.nsf.gov/funding/
pgm-summ.jsp?pims-id=501084&org=sbe 9/16/2010.
---------------------------------------------------------------------------
In this testimony I lay out my personal views of the SciSIP program
from the perspective of an NSF-SciSIP scholar (and grant recipient),
and as a Faculty member in a leading School of Management who engages
routinely with scientists concerned with the impact of their research,
policy students as well as MBA students and executives hoping to work
effectively at the academic-commercial interface.
In what follows I examine some recent breakthroughs that have
enabled SciSIP research, outline some of the key research emerging from
SciSIP to date, as well as critical gaps. I then turn my attention to what I
observe as the need for greater community building and finally, the
potential for a significant educational initiative.
The notion that there can be a ``science'' of science and
innovation policy is relatively recent (Marburger 2005). There is a
long and distinguished tradition of science policy research;
nonetheless, the current focus on measuring the causal influence of
science and innovation policy levers at different levels (national
policy, agency interventions as well as community and lab-level
actions) can be linked to advances in economics and related fields in
the early 1990s. During this period, leading economic historians
including Paul David, Joel Mokyr and Nathan Rosenberg developed
critical conceptual breakthroughs in understanding the economics of
science and innovation as grounded both in institutions (policy levers)
but also in the micro-level behaviors and incentives of scientists and
engineers themselves. Building on economics as well as the sociology of
science, they came to view Science as a distinctive institution in
several ways: as a knowledge production system, as an input into
technological innovation, and as a reward system.
The empirical promise of this conceptual agenda was taken forward
by a group of economists and sociologists who aimed to evaluate the
impact of public policies on research behavior, research outputs, and
associated economic outcomes (Marburger, 2005; Jaffe, 2006). In
following this agenda, scholars confront a number of key challenges. In
particular is it possible to separate the influence of a particular
policy or institution from the underlying nature of the scientific
knowledge that is being developed? To put it more simply, in the policy
``whodunit'' it is often hard to say whether it is the policy that had
the effect of speeding up scientific progress in a particular area or a
change in our understanding of a scientific problem. Without a parallel
universe for policy experiments, when one observes the production or
diffusion of a piece of knowledge within a given policy environment,
one cannot directly observe the counterfactual production and diffusion
of that knowledge had it been produced and diffused under alternative
policy conditions. To resolve these challenges, SciSIP scholars have
combined methodological advances in program evaluation--particularly a
``natural experiments'' approach--with novel data techniques. The
experiments-based approach (with which the committee is likely familiar
from its work on education) relies upon methods pioneered in public
finance and program evaluation (Meyer, 1995; Bertrand, Duflo, and
Mullainathan, 2004; Angrist and Pischke, 2008). To complement these
methods, SciSIP scholars have made extensive use of novel datasets
including data on publications, patents and most recently citations.
This approach uses these ``documents'' as the core objects of analysis,
assuming that they represent ``pieces of scientific knowledge,'' and
citation analysis to investigate the impact of institutions on the
cumulativeness of discovery and innovation (Garfield, 1955; De Solla
Price, 1970; Jaffe et al., 1993; Griliches, 1990, 1998). When placed
within a framework to evaluate science and innovation policy, these
elements constitute a robust approach to analyzing and tracking the
causal impact of public policies on science and innovation inputs and
outputs.
The power of the emerging SciSIP agenda is to incorporate these
novel approaches and therefore move beyond description and observation
of science at work or particular policies towards the more systematic
analysis of particular institutional interventions. Thus pioneering
SciSIP research typically combines three elements:
i) Providing clear theoretical foundations for understanding
the ways in which institutional change (at any level) might
influence the behavior of scientists and therefore the rate and
direction of their knowledge production.
ii) Building careful empirical designs that enable causal
analysis, and undertaking these empirical studies using
systematic data gathering methods at different levels
(including quantitative data but also including qualitative
studies).
iii) Grounding the analysis in a deep understanding of the
phenomenon--the details of the particular policy changes or
organizational choices as well as the ways in which these shape
scientists' daily lives.
As a contributor to the broader SciSIP agenda and approach, my
research in the past few years has focused on the conflicts and
compromises shaping the boundary between academic science and the
commercial world--especially the impact of intellectual property (IP)
rights and IP licensing strategies over basic scientific research in
areas as diverse as human genetics, stem cells and cancer biology. More
recently I have expanded my research to examine the community and
organizational-level interventions that scientists can make including
understanding how research quality is governed (through retractions)
and how projects are selected and evaluated. In my own work, I have
found that my training as a scientist aids in the third
element of the SciSIP approach, but my work is strongly based on the
theories and methods of economics and sociology of science and
therefore links the three aspects outlined above.
A research project of mine, recently completed with a series of
co-authors, illustrates the SciSIP approach to the analysis of science
policy. It was designed to adjudicate one policy element of the
institutional complex--the impact of intellectual property rights over
research tools (and the licenses that shape access to such tools) on
scientific productivity and diversity. Rather than theorizing broadly,
it focuses specifically on one controversial episode in the genetics
community initiated by the discovery, patenting and then exclusive
licensing of mouse genetics technology (the Oncomouse approach and the
related Cre-lox approach) and the subsequent licensing agreement made
among DuPont, the Jackson Laboratories and the National Institutes of
Health to enable greater access to these key research tools.
In The Oncomouse that Roared (Murray 2010), I take a qualitative
approach to the question of whether and how the Oncomouse patent
influenced the scientific community. Rather than compare the mouse
genetics community to another scientific field (which may have any
number of inherent differences), I compare the periods before and after
the Oncomouse patent was granted and licensed. For 3-4 years, with no
intellectual property rights yet granted, the mice were subject only to
the informal norms that characterize a competitive, but collegial,
scientific community. After the grant of the patent, DuPont (exclusive
licensee) strongly enforced its property rights on scientists. Through
detailed interviews and documentary analysis comparing the pre- and
post-patent period, I follow the SciSIP approach and closely analyze
the impact of the Oncomouse patent on mouse geneticists. I find that
some scientists reluctantly acquiesced, dealing with complex contracts.
Others defied DuPont, sharing mice informally in the face of opposition
from their universities. Behind the scenes other more complex changes
were also taking place as scientists sought to reshape the role of
patents in their scientific lives. This is reflective of broader changes in the
scientific community in the face of higher levels of commercial
interest and engagement and the resistance to the encroachment of high-
powered commercial interests. Such a grounded perspective highlights
the importance of understanding how scientists respond to policy
interventions and has a number of policy implications. However it also
raises a more SciSIP-oriented question about the causal impact of the
compromise (when the NIH persuaded DuPont to sign a Memorandum of
Understanding making Oncomice open for experimentation) on the level
and type of research using these genetically modified mice, i.e., do such
policy interventions shape the rate and direction of science?
I examine the causal impact of these shifts to greater openness in
``Of Mice and Academics'' (Murray et al. 2010). The ``dependent
variable'' in this paper is the level and type of scientific research
publications that use genetically engineered mice in each year from
1990 until 2006--based on a dataset of over 20,000 publications that
are coded by their level of basicness, the rating of the journal in
which they are published, the rank of the school affiliations of the
authors, etc. The ``independent variable'' is the timing of the policy
shift in the openness of particular types of transgenic mice (Oncomice
and Cre-lox mice). To aid in the interpretation of the data we also
include a control group of papers that build on mice never influenced
by intellectual property rules. A central idea of this research design
is that while research discoveries (such as engineered mice) are
developed at a given point in time, their use by subsequent researchers
takes place over time. This insight motivates a differences-in-
differences approach to the analysis of follow-on scientific research:
If the policy environment governing the incentives and/or ability to
build on published discoveries changes over time (and affects only some
discoveries but not others), it is possible to identify the impact of
the policy change by examining how the pattern of follow-on research
(captured in published articles) changes after the policy intervention.
In other words, policy changes that impact one group of articles and
not another can constitute a natural experiment. This paper exemplifies
the SciSIP approach by linking (microeconomic) theory about the way
researchers respond to openness, with data/empirics that allow for
causal analysis, and a sufficiently detailed understanding of the
policies and practices of scientists to enable appropriate research
design. We find that the NIH MoU did indeed not only increase the level
of research using these mice but also spur a greater diversity of
researchers to move into the field, follow novel paths and take new
approaches.
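    The differences-in-differences logic described above can be sketched
in a few lines of code. The sketch below is purely illustrative: the
group sizes and citation-count rates are invented, not drawn from the
actual mouse-genetics dataset, and serve only to show how the estimator
nets out the fixed group difference and the common time trend.

```python
import numpy as np

# Simulated follow-on publication counts (all rates invented).
# "Treated" papers build on mice affected by the openness agreement;
# "control" papers build on mice never subject to IP restrictions.
rng = np.random.default_rng(0)

pre_treated = rng.poisson(5.0, 1000)    # treated group, pre-policy
post_treated = rng.poisson(7.0, 1000)   # treated group, post-policy
pre_control = rng.poisson(5.0, 1000)    # control group, pre-policy
post_control = rng.poisson(5.5, 1000)   # control group, post-policy

# The DiD estimate subtracts the control group's before/after change
# (the common time trend) from the treated group's before/after change,
# isolating the effect attributable to the policy shift.
did = (post_treated.mean() - pre_treated.mean()) \
    - (post_control.mean() - pre_control.mean())
print(f"differences-in-differences estimate: {did:.2f}")
```

In the published work this comparison is embedded in a regression with
controls; the arithmetic above is only the core identifying contrast.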
Taken together these two papers address questions of how
institutional and organizational changes shape the rate and direction
of scientific knowledge. They follow the three key elements of the
SciSIP approach by carefully and precisely focusing on the phenomenon
at hand, using that detailed understanding to link theories of
scientists' behavior to careful data, and building empirical strategies
in a way that enables causal analysis, normative conclusions and
theoretical contributions.
ASSESSING THE GAPS IN SciSIP KNOWLEDGE
As outlined above, the SciSIP agenda presents far reaching research
opportunities for scholars whose goal is to contribute to the social
sciences, to our understanding of science and innovation in the economy
and to have policy impact. A number of significant gaps in the current
state of knowledge remain and can be usefully considered around the
organizing framework laid out below. This describes SciSIP research
according to the level of analysis at which the policy interventions
are taking place: national rules and regulations, agency-level
interventions, community norms and practices and organizational
actions. I then propose three cross-cutting questions that apply to
each level (see below). To illustrate this perspective and the gaps it
reveals, I first describe research on high level rules and regulations
then move to more micro-level analysis of organizational interventions.
National rules and regulations: This includes
research on the effectiveness of national rules and regulations
on the rate and direction of scientific progress. A major area
of focus includes research on the influence of the Bayh-Dole
Act on university researchers (Owen-Smith and Powell 2003). In
my own recent research, we have examined the impact of US
regulations with regards to the funding of research in the area
of human embryonic stem cells (Furman, Murray and Stern 2010).
Gaps at this level of analysis remain with regards to the role
of international rules and regulations on science in the United
States, and the ability of U.S. researchers to remain highly
competitive and at the knowledge frontier in the light of
growing global spending on scientific research. In addition it
would be valuable to understand how the particular funding
levels, structure and incentives of university systems in
different countries impact downstream outcomes, such as
scientific production, firm founding, and health & welfare, and
how they contour the impact of government policies such as
those related to intellectual property rights.
Agency or University-level rules and norms: Funding
agencies, especially the Federal government, have a variety of
opportunities to shape the rate and direction of scientific
progress. Both dimensions have up to now been poorly understood.
Recent work funded by SciSIP has made significant progress
along these two dimensions, but gaps remain. In particular, the
influence of non-Federal funding sources--especially corporate
funding and the growing foundation funding--is poorly documented
and understood.
Shaping the Direction of Research: Funding agencies,
as they select among research projects and shape the
expectations and controls they place on researchers,
have a variety of opportunities to influence knowledge
production. This has often been thought of as a black
box, with the scientific community utilizing the peer
review system as the best mechanism to self-regulate
and shape direction. Pioneering analysis by my MIT
colleagues shows that exceptional scientists are much
more likely to produce innovative breakthrough science
when using long-term grants that allow them exceptional
freedom in the lab (Azoulay et al, 2010) \2\. This
study raises the question of how researchers are
encouraged to move into new and emerging research
areas, and how to encourage ideas at the high-quality
high-risk tail of the distribution.
---------------------------------------------------------------------------
\2\ They do this by comparing the research profiles of similar
biologists some of whom receive more open ended long-term funding from
Howard Hughes while others receive more traditional R01-style grants
from the NIH.
---------------------------------------------------------------------------
We must encourage more research to understand the
impact of funding choices and funding incentives on the
type of research outcomes. This agenda could also
benefit from the analysis of scientists outside the
U.S. in settings where different types of incentive
systems exist. In line with recent interest in
Challenges (prizes) as an alternative incentive
mechanism, we should also extend this analysis to
include other funding mechanisms or reward systems.
Shaping the Disclosure and Sharing of Knowledge and
Materials: Funding agencies have an opportunity to
shape the rate and effectiveness with which knowledge
that is generated as a result of grant-making is shared
among scientists and is diffused into the economy along
productive routes. Among the most important and
controversial rules shaping such impact of scientific
research are the rules around intellectual property
rights. This has been the topic of vigorous debate
particularly with regards to the increasing levels of
patenting within the scientific community. This is the
research arena in which SciSIP researchers have made
one of the greatest contributions, with their research
informing policy discussions at the National Academies
of Science, within the National Institutes of Health
(NIH) and elsewhere. In particular, research has
explored the impact of patenting on the rate at which
that research is diffused within the scientific
community and on the rate at which commercial or
socially-beneficial products are developed (Murray and
Stern 2007; Huang and Murray 2009; Walsh et al. 2003,
2005). Extensive research documents the impact of IP,
licensing and material sharing practices on scientists,
but gaps in our knowledge exist with regards to the
impact of these policies on both scientific knowledge
production and economic impact (few studies examine
both, with Williams (2010) a notable exception). We also
have a less systematic understanding of how to design
the ``intellectual commons'' in an efficient and
effective manner so as to promote rapid follow-on
research and commercialization. There is also a
significant opportunity to extend these studies beyond
the study of life scientists to explore differences
across research communities in a range of disciplines
such as chemistry, computer science, materials science
etc.
Community level activities: The policies and
practices that emerge from the scientific community also play a
critical role in scientific progress and impact. Thanks to more
systematic analysis of resource-sharing arrangements both
informally (see Haeussler et al. 2009; Walsh et al. 2005) and
through formal mechanisms such as Biological Resource Centers,
there is definitive evidence that investments in community-
based infrastructure such as materials repositories and data
repositories have a significant positive impact on the rate of
scientific progress by enabling access, certification and
sharing (Furman and Stern 2010). More recent analysis of the
self-governance of scientific communities through the system of
retractions has also pointed out the role of the community as a
crucial analytic lens (Furman and Murray 2009). In another
stream of research grounded in organizational theory and
sociology, scholars have examined whether and how different
community structures emerge in order to undertake the complex
task of horizontal collaboration (e.g. Powell et al. 2004,
O'Mahony and Bechky 2008) and collective work (Ferraro and
O'Mahony forthcoming).
At this level of analysis, critical questions remain
unanswered: how are scientific communities formed? How do they
coalesce around new research areas and what role might policy-
makers play in such community formation? For example, do
mechanisms such as those used at DARPA enable community
building, and how does this shape the long-run effectiveness of
scientific communities?
Organizational Interventions: Scientific research is
an activity increasingly undertaken by collections of
scientists organized into teams, networks, and collaborations.
Recent scholarship has highlighted the potential for
significant productivity benefits of specific organizational
choices (Cummings and Kiesler 2005, 2007; Wuchty et al. 2007;
Jones et al. 2008) \3\. Recent work on open source computer
science communities highlights the complex and sophisticated
nature of the organizational and governance choices that these
groups of scientists can make (Dahlander and O'Mahony
forthcoming) and their implications for the nature of the
knowledge production (MacCormack et al. 2006, 2008). However,
there remains only limited research that examines the
organizational choices of scientists for specific research
projects--their choice of collaborations, organization of tasks
in the lab, and governance of the laboratory. In part this gap
arises because of the historic perspective of the scientist as
``lone genius.'' Moreover, the strong sense of autonomy among
the scientific community has limited the research on choices
that scientists themselves make.
---------------------------------------------------------------------------
\3\ Ben Jones, a leading scholar in the SciSIP field and author of
several key papers in this area, is currently a senior economist at the
Council of Economic Advisers.
---------------------------------------------------------------------------
Opportunities for further research also cut across these levels of
analysis with three of key importance:
i) On what field has the SciSIP research been focused? In
other words, is the analysis focused on a particular scientific
discipline or sub-field e.g. biology, high-energy physics,
nanotechnology? In my opinion, too large a share of current
SciSIP research (including my own) highlights the biologists to
the exclusion of other arenas. For example we have little
knowledge of the influence of policies on materials scientists
who, like biologists, rely on complex materials, data, images
etc. Our knowledge of chemistry, computer science & engineering
remains fragmented.
ii) On what outcomes has the SciSIP research been focused? Is
the analysis focused on academic publications, patents or
marketed products? As noted above, these outcomes are now well
documented in the SciSIP literature. More emphasis however
should be placed on linking up different measures, i.e.,
publications and patents, and on finding metrics that capture
commercializable or commercialized products (see Williams 2010)
or measures that capture the broader knowledge landscape such
as recent analysis of the patenting of the entire human genome
(Jensen and Murray 2005). In this regard, data on licensing
would be more valuable than patenting data alone and yet such
information (for ideas developed using Federal funding) is not
available. I would strongly recommend that this be changed to
facilitate greater and more systematic analysis using measures
closer to the outcomes and impacts of economic interest.
iii) On what part of the outcome distribution are SciSIP
analyses focused? It is important that SciSIP researchers
evaluate which researchers and which institutions were most
affected by particular policy interventions rather than simply
highlighting the average impact of particular policies. How do
policy interventions impact the distribution of knowledge
outcomes? While there may be no impact on the mean, perhaps
interventions influence the distribution of outcomes--with more
high- and low-quality research. How might policy levers at all
levels influence different researchers? What is their marginal
impact on different groups of scientists: those at elite highly
funded schools versus elsewhere, or those with international
co-authorship ties? \4\ Studies that emphasize these
distributional outcomes should be encouraged by SciSIP because
it is from the richness and diversity of the scientific
community that novel breakthrough outcomes arise. Studies could
also fruitfully include analysis of the differential impact of
policies on male versus female scientists \5\.
---------------------------------------------------------------------------
\4\ A distributional approach would enable SciSIP scholars to
assess the impact of policies on numerous dimensions: researcher and
institution status, nature of the researchers' institution (university,
private firm, government lab, etc.); researcher cohort; collaboration
type (e.g., within vs. across institution, state, country, and/or
field); basic vs. applied research; journal status; article breadth
(multiple subjects vs. single subject); journal reputation (``impact
factor''); and network characteristics.
\5\ Some of my own work has examined the theme of gender in
scientific research. In ``An Empirical Study of Gender Differences in
Patenting among Academic Life Scientists'' (Ding, Murray & Stuart 2006)
we show that for over 4,000 life science faculty, after accounting for
the effects of productivity, networks, field, and employer attributes,
the net effect of gender remains: women patent at 40% the rate of
comparable men. Other research in this spirit includes Ding et al.
(2009).
SciSIP COMMUNITY
The SciSIP program, led by the National Science Foundation with critical
input from Program Officer Julia Lane, has made tremendous progress in
spurring a group of scholars to pioneer studies in the science of
science and innovation policy. For some of these scholars, this
represented an increase in their commitment to a field in which they
already had an interest. For others, SciSIP was a new departure and an
opportunity to move into a new and burgeoning field of great policy
relevance and with significant intellectual challenges. The time is now
ripe to move from funding of individual researchers to extending and
emphasizing the SciSIP community. A stronger scholarly community--once
established--will provide a number of critical benefits. It will be in
a position to design and implement its own common pool resources and
data sharing infrastructure to ensure that research methods, data and
analytic tools are widely and effectively shared among scholars. At the
moment there is only a limited data-sharing infrastructure: the STARS
program represents a key effort to gather new data, however many
studies rely on complex historical datasets that incorporate rich and
varied data sources but which are not shared across the community.
While issues of confidentiality do arise, it is imperative that we
follow the lead of the scientific communities we study and build a more
effective infrastructure, norms and rules for data exchange and reuse
\6\.
---------------------------------------------------------------------------
\6\ See Murray and O'Mahony (2007) for a detailed examination of
the need for incentives for disclosure, reuse and accumulation in
different knowledge communities and how these incentives are provided.
---------------------------------------------------------------------------
Community building will also enable a richer interchange across
scholars whose disciplinary training and identity lie in different
areas. At the present time, my perspective on SciSIP is that there
exist various sub-communities, largely within disciplinary silos, that
communicate internally but with little exchange across these traditional
boundaries. For example, those who take an economics-oriented approach
gather as a community under the rubric of the Innovation Policy Working
Group of the National Bureau of Economic Research Productivity Program
(including the Summer Institute Innovation Policy and the Economy
activities). Not surprisingly however, this is not a forum in which
sociologists, historians of science and technology or science and
technology studies (STS) scholars share their research. In sociology
there are few if any systematic gatherings of scholars with science
policy interests and SciSIP researchers from STS and organizational
behavior share similar concerns. One strong recommendation I have is
for the NSF SciSIP program to fund the establishment of a ``knowledge
hub'' that can orchestrate annual or biannual research meetings for
interested SciSIP scholars. As I outline below in my comments on
structuring SciSIP education, an effective cross-disciplinary hub (one
that could be modeled on the Consortium on Competitiveness and
Cooperation (CCC)) would have governance from faculty at a number of
key universities, rotating responsibility for cross-university research
meetings, and some (limited) cross-university doctoral training. Such a forum should
also enjoy strong input from the NSF but overall would be most
effective if it was organized with ``bottom-up'' support from faculty
rather than managed directly by the NSF or other agency.
Building stronger linkages between the SciSIP research community
and the community of science policymakers is another key pillar of the
broader SciSIP community that remains to be constructed. At the present
time, there is limited awareness of the key findings of SciSIP research
among policymakers, and SciSIP scholars have only been engaged in a
limited way in recent debates over key changes in science policy. For
example, in the recent discussions over the role of innovation Grand
Challenges, there was very little scholarly input from the SciSIP
community; many prize and challenge designers and implementers were
involved but there was little or no discussion of the tradeoffs
associated with the use of challenges and the characteristics of the
most effective problems that might be solved using challenges (and
those which are less likely to be tractable with this incentive
system). Building stronger links to the policy community is a long-term
task that starts with the education of a new generation of policy
makers to become critical consumers and co-producers of SciSIP
research. However, in the short run, links could be established with
different government research funding agencies through a series of
targeted workshops that bring policymakers, agency employees, and
SciSIP researchers together to focus either on the issues, problems,
and successes of a particular agency or on cross-cutting issues of
mutual interest. This is likely to require sustained engagement through a
series of regular meetings and dialogues in order to build up trust,
mutual respect and an appreciation of the problems and opportunities
that our nation's research agencies, researchers and policymakers
confront and the tools and insights that might guide them going
forward.
SciSIP EDUCATION
Education is a critical element of the SciSIP agenda and should be
a central pillar of SciSIP going forward. To date the program has
focused largely on research and establishing a community of scholars
among established academics. There is a pressing need to determine the
best mechanisms through which to build up the educational aspects of
SciSIP and to fund this education. The challenge of SciSIP education
can be considered along two dimensions--education of producers of
SciSIP research and education of consumers/practitioners of SciSIP
research.
PRODUCERS
The educational requirements of SciSIP researchers are intensive;
the approach requires strong disciplinary foundations in the social
sciences. These must then be complemented by three other elements:
theory, data/empirics, and phenomenon:
Theory: A perspective on the theoretical foundations
that ground our understanding of the behavior of scientists,
the scientific community, and scientific progress (these can
include a microeconomic approach based on understanding
incentives, the role of control rights etc. as well as a
sociological focus on norms and practices or a psychological
view)
Data/Empirics: Strong data and empirical skills
specific to science and science policy. SciSIP is grounded in a
belief that while every scientific research project is
different, systematic data gathering, the use of both large-
scale analysis (with publication, patent, citation,
collaboration data) and granular field-data, and careful
empirical design will enable scholars to draw causal inferences
regarding the impact of specific policy levers (at the
national, regional, agency, university and lab level) on
scientific productivity and impact. Therefore education must
give researchers the ability to identify, gather and analyze
such data.
Phenomenon: A deep appreciation for the nature of
scientific work and for the ways in which particular
interventions in scientific progress have shaped productivity,
impact or direction. This is challenging for scholars without a
scientific training but is essential if scholars are to find
the most effective research settings for their studies and if
they are to make their work relevant to scientists and to
science policy practitioners.
The education of the ``producers'' of SciSIP research is a critical
challenge that should be a high priority for the SciSIP community.
Specifically, we must strengthen the education of PhD students who will
become the leading scholars in the field developing the research
agenda, pushing forward and filling research gaps and pioneering new
methods for the scientific and rigorous analysis of science and
innovation policy. The skills needed to push this agenda forward are
two-fold--first, a strong disciplinary grounding in the ``home''
discipline (economics, sociology, social psychology, etc.) and second, an
in-depth understanding of the theories, data/empirics, and the
phenomenon (as outlined above).
Establishing PhD ``SciSIP field concentrations'' within traditional
disciplinary PhDs: In my opinion, it is not fruitful to try and
establish a new discipline within universities termed the ``science of
science and innovation policy''. Instead I believe that it would be
extremely valuable to establish a ``SciSIP field focus'' within a variety
of PhD programs within traditional disciplines including economics,
sociology, public policy etc. At the present time, Public Policy
schools are offering PhD degrees with an S&T policy focus. However, the
promise of building a ``science'' of S&T policy is to extend the
intellectual community well beyond the usual confines of policy
analysis and ground the empirical and theoretical study of scientific
productivity and impact in economics and sociology, as well as
psychology and other adjacent disciplines. Therefore, as a complement
to S&T Policy PhD education in Public Policy Schools it is critical to
establish the field of ``SciSIP'' within the traditional education of
PhD social scientists within their traditional departments. [It is
worth noting that this is not an effective educational path for those
from a scientific background to move into SciSIP. To do so requires a
switch into a social science program to learn the foundations of the
particular social science discipline followed by a SciSIP field focus].
Let me illustrate the proposal of a ``SciSIP field focus'' with the
case of economics: Building a ``SciSIP field focus'' within economics
would involve establishing a suite of courses and educational materials
at a small number of leading departments (who could share materials,
exercises, data etc.). This could then be complemented by educational
`bootcamps' which would bring these PhD students together (from across
schools) in a common forum to build their skills, build community and
hear from leading SciSIP scholars. Such an approach would mirror the
development of entrepreneurship as a field of study within economics--
an area that was pioneered by the ``Entrepreneurship Bootcamp'' funded
by the Kauffman Foundation and taught at the National Bureau of
Economic Research (NBER). The NBER has played an
important role in coalescing much of the activity around education in
the economic foundations of entrepreneurship through the
Entrepreneurship Working Group now part of the Productivity Program.
This has enabled vibrant cross-school collaboration not only on
research but also teaching. At the PhD level this has helped to build
up and educate a community of young scholars within economics
departments and management schools who now have additional training
allowing them to pursue this field within their discipline.
Hub and Spoke Approach: To build a strong and effective SciSIP-
oriented PhD educational program will require using Federal education
funding to actively seed a ``SciSIP'' field focus within at least 4 to
5 schools per disciplinary area (with at least two disciplines
represented)--the Spokes. This should be supplemented by funding to
develop an effective SciSIP `Hub' for PhD education. The SciSIP Hub
would coordinate these educational efforts, organize community-building
activities for the students involved, and run the community
``Bootcamp''. One model for developing an effective SciSIP `Hub' for
PhD education is the Consortium on
Cooperation and Competitiveness (CCC) which ``links together scholars
interested in long-run performance of U.S.-based companies and
institutions'' but with a recent focus on PhD-level education, training
and community building among PhD students from a number of programs
(based mainly within leading Business Schools) with the involvement of
academic faculty. As they described, ``No single U.S. university or
graduate school contains a ``critical mass'' of scholars from diverse
disciplinary backgrounds concerned with issues that are primary to CCC.
Accordingly, the network structure of the Consortium is a significant
source of strength.'' \7\ A similar argument can be made with regards
to the SciSIP agenda, suggesting that a similar consortium could be
invaluable in advancing the PhD education and the scholarly
community.\8\
---------------------------------------------------------------------------
\7\ http://businessinnovation.berkeley.edu/ccc.html
\8\ The CCC was funded by an initial endowment of $500,000 in May
1988 by the Walter and Elise Haas Fund. It has also been supported by
grants from the Alfred P. Sloan Foundation in New York, the Smith
Richardson Foundation, the Pew Foundation, the Ford Foundation, and
the Herrick Foundation. From 1990-1995, the Sloan Foundation was the
primary funding source for the Consortium. At the current time, the
funding for doctoral activities is largely provided by individual
schools supporting their students and hosting the event and by the
Kauffman Foundation.
CONSUMERS
The consumers of SciSIP research include Science and Technology
policy makers as well as scientists and engineers at different stages
in their education. Each of these groups could benefit from a deeper
understanding of the results of SciSIP research. In particular, it
should be a high priority of the SciSIP community to ensure that the
SciSIP agenda is well understood within S&T Policy education; S&T
Policy graduates are key stakeholders in the SciSIP research
community who will be leading consumers of our research and partners
in future research design and implementation.
Science & Technology Policy Masters Education: As students with
Masters-level education in science and technology policy move out into
the policy community, into research-based public policy organizations,
and into the funding agencies that are the subject of much SciSIP
analysis, they should be educated to be critical consumers of SciSIP
research and to be co-producers of that research in partnership with
academics. Much SciSIP research is relatively new and involves novel
methods that are highly technical in nature and are not always taught
to public policy researchers. Therefore, SciSIP has not yet been
incorporated as a central pillar into the S&T Policy curriculum. For
example, I supervise a number of MIT Technology and Public Policy
students each year and find that they do not have an extensive and
thorough grounding in the SciSIP approach, methods and results.
Nonetheless, the students are quick to learn and start to use this
approach in the course of their thesis work. However, it would be more
effective to do this in a more programmatic fashion.
NSF therefore has an important opportunity to work with a number of
leading S&T policy programs around the country to develop a curriculum
for education in the imperative, methods and results of the SciSIP
agenda. This will require training distinct from that provided to
PhD social scientists for a number of reasons. First, these students
can be expected to have less grounding in the data-oriented empirical
methods that are common in SciSIP research. The focus should be on
understanding the empirical approach and critiquing its validity and
the robustness of findings rather than on replicating studies. Second,
it is critical to share an understanding of the research design of
SciSIP projects particularly those that are based on careful analysis
of policy changes, policy experiments and other studies with a
thoughtful counterfactual basis. This is a methodological approach that
has been pioneered within SciSIP (as noted above) and is a critical
element in the education of S&T Policy students. A greater
understanding of the SciSIP approach will enable higher levels of
collaboration between researchers and policy makers in the future. In
particular, it has the potential to seed a higher willingness to work
collaboratively with scholars to design and analyze policy experiments
with the goal of increasing our understanding of the impact of specific
policy interventions on scientific progress.
Education of Scientists & Engineers: As has long been recognized in
our analysis of scientific productivity, faculty and students engaged
in leading-edge research in science and engineering play an important
and distinctive role in shaping the productivity and direction of their
laboratories. Indeed the organization and direction of large and
increasingly complex research laboratories with collaborators that
cross disciplines, cross universities, and often cross national
boundaries is a daunting task. Nonetheless, we provide limited
education to our science and engineering colleagues to guide them in
this challenging activity. Offerings for scientists and engineers
during undergraduate and graduate education are limited. As we develop
new knowledge regarding the factors shaping research group productivity
and the role of lab leaders in this productivity, it provides another
opportunity for the National Science Foundation together with other
leading funding agencies to work to provide such education. Effective
education for scientists and engineers would involve three elements:
Teach science and engineering undergraduates about
the role of science and technology in society and the economy,
and give them a broader perspective on their technical
education by highlighting the role of S&T policy. Focusing on
the results of SciSIP-oriented research will emphasize the
importance of systematic, rigorous, and data-driven approaches
to policy, institutions, and organizations. This will also
provide them with tools to guide them in their subsequent
careers, since they will run into problems of science and
technology policy at every stage of those careers.
Provide PhD students with short courses regarding the
ways in which their research can be more productive and have a
more rapid impact on society and the economy based on SciSIP
findings. Focus on the key interventions in the process of
knowledge production (according to the SciSIP framework)--
government policies, regulations etc., university policies and
practices, and organizational choices. Make this relevant through a
focus on the career choices they will face--to stay within
academia, move into business, or focus on policy. For those
staying at the bench (in academia or industry) examine how to
maximize productivity and impact using the results of SciSIP
research--organizational choices they have available, the role
of incentives in research teams, the most effective
collaborative processes they can use, etc. Highlight the key
processes involved in shaping commercial impact including
entrepreneurship and technology transfer and the SciSIP results
on how these are most effectively deployed. Finally, highlight
the key role of policy in shaping some of their choices. A
program of this type has not, to my knowledge, been developed
systematically for PhD students. This could be done in
conjunction with other career-oriented activities provided by
the NSF and other funding agencies to recipients of PhD grants
and Fellowships.
Educate science and engineering faculty to have a
deeper understanding of how they can achieve greater
productivity and impact, based on the systematic, evidence-
based results of SciSIP research by running short courses at
the university level (perhaps for new faculty), examining the
organizational and institutional activities that they could
undertake to increase the productivity and impact of their
laboratory. This could be incorporated into existing efforts on
grantsmanship, communications, etc. Possible topics could
span two dimensions: factors shaping productivity (lab
organizational choices, lab size choices, and collaborative
models) and factors shaping impact (patenting, technology
transfer, materials sharing, networking, communication, etc.).
Such an approach would provide a platform
for sharing the findings of SciSIP research with academic
researchers while at the same time having an on-the-ground
impact on the productivity of investments in research. Finding
a possible funder of such an initiative would allow for key
educational materials to be developed. The participation of key
scientific societies in this activity would also expand the set
of stakeholders in the SciSIP agenda.
REFERENCES
Azoulay, P., G. Manso, and J. Graff Zivin (2010) Incentives and Creativity:
Evidence from the Academic Life Sciences. MIT Sloan Working
Paper.
Angrist J. and J.S. Pischke (2008) Mostly Harmless Econometrics,
Princeton, NJ: Princeton University Press.
Arrow, K. and R. Nelson (1962). The Rate and Direction of Inventive
Activity: Economic and Social Factors. UMI Press.
Bertrand M., Duflo E. and Mullainathan S. (2004) ``How Much Should We
Trust Differences-in-Differences Estimates?'' Quarterly Journal
of Economics, 119(1) 249-275.
Cummings, J. N., & Kiesler, S. (2005). Collaborative research across
disciplinary and organizational boundaries. Social Studies of
Science, 35(5), 703-722.
Cummings, J. N., & Kiesler, S. (2007). Coordination costs and project
outcomes in multi-university collaborations. Research Policy,
36(10), 1620-1634.
Dahlander, Linus and Siobhan O'Mahony. Forthcoming. ``Progressing to
the Center: Coordinating Knowledge Work'' Organization Science.
De Solla Price, D. (1970), ``Citation Measures of Hard Science, Soft
Science, Technology, and Nonscience'', in C. Nelson and D.
Pollock, eds. Communication among Scientists and Engineers,
(Cambridge: Heath Lexington Books).
Ding, W., F. Murray & T. Stuart (2006). ``An Empirical Study of Gender
Differences in Patenting among Academic Life Scientists.''
Science, Vol. 313, pp. 665-667.
Ding, W., S. Levine, P. Stephan & A. Winkler (2009). The Impact of
Information Technology on Scientists' Productivity, Quality and
Collaboration Patterns. Berkeley Institute for Research on
Labor and Employment Working Paper.
Ferraro, Fabrizio and Siobhan O'Mahony. Forthcoming. ``Managing the
Boundary of an Open Project,'' in Market Emergence and
Transformation, W. Powell and J. Padgett, Eds.
Furman, J. and S. Stern. (2010) ``Climbing Atop the Shoulders of
Giants: The Impact of Institutions on Cumulative Knowledge
Production,'' American Economic Review, forthcoming.
Furman, J. and F. Murray (2009). ``Governing knowledge production in the
scientific community: Quantifying the impact of retractions''.
MIT Sloan Working Paper (under review).
Furman, J., F. Murray, and S. Stern. (2010) ``Growing Stem Cells: The
Impact of U.S. Policy on the Organization of Scientific
Research,'' working paper.
Garfield E. (1955) ``Citation Indexes for Science,'' Science 122: 108-
11.
Griliches, Z. (1990) ``Patent Statistics as Economic Indicators: A
Survey,'' Journal of Economic Literature, 28(4): 1661-1707.
Griliches, Z. (1998) R&D and Productivity: The Econometric Evidence,
Chicago, IL: University of Chicago Press.
Haeussler, C., L. Jiang, J. Thursby & M. Thursby, 2009. ``Specific and
General Information Sharing Among Academic Scientists,'' NBER
Working Papers 15315.
Huang, K., F. Murray (2009). ``Does Patent Strategy Shape the Long-Run
Supply of Public Knowledge: Evidence from Human Genetics.''
Academy of Management Journal, 52(6).
Jaffe, A. (2006) ``The `Science of Science Policy': Reflections on the
Important Questions and the Challenges They Present,'' Keynote
Address at the NSF Workshop on Advancing Measures of
Innovation: Knowledge Flows, Business Metrics, and Measurement
Strategies.
Jaffe, A., M. Trajtenberg, and R. Henderson (1993) ``Geographic
Localization of Knowledge Spillovers as Evidenced by Patent
Citations,'' The Quarterly Journal of Economics, 577-598.
Jensen, K., F. Murray (2005). ``The Intellectual Property Landscape of
the Human Genome''. Science, Vol. 310, 14 October 2005, pp.
239-240.
Jones, B.F., Wuchty, S., Uzzi, B., 2008. Multi-university research
teams: shifting impact, geography, and stratification in
science. Science 322 (5905), 1259-1262.
MacCormack, A., M. J. LaMantia, Y. Cai and J. Rusnak (2008). Analyzing
the Evolution of Large-Scale Software Systems using Design
Structure Matrices and Design Rule Theory: Two Exploratory
Cases, Proceedings of the 7th Working IEEE/IFIP Conference on
Software Architecture (WICSA), Feb 2008.
MacCormack, A., J. Rusnak and C. Baldwin (2006). Exploring the
Structure of Complex Software Designs: An Empirical Study of
Open Source and Proprietary Code, Management Science, Jul 2006.
Marburger, J. (2005). ``Wanted: Better benchmarks,'' Science,
308(5725): 1087.
Meyer, B. (1995). ``Natural and Quasi-Natural Experiments in
Economics,'' Journal of Business and Economic Statistics, 13,
151-162.
Murray, F. (2010). ``The Oncomouse that Roared: Hybrid Exchange Strategies
as a Source of Productive Tension At The Boundary Of
Overlapping Institutions''. American Journal of Sociology,
forthcoming.
Murray, F. and S. Stern (2007) ``Do Formal Intellectual Property Rights
Hinder the Free Flow of Scientific Knowledge? An Empirical Test
of the Anti-Commons Hypothesis,'' Journal of Economic Behavior
and Organization.
Murray, F., P. Aghion, M. Dewatripont, J. Kolev and S. Stern (2010).
``Of Mice and Academics: The Role of Openness in Science''.
NBER Working Paper.
Murray, F and S. O'Mahony (2007). ``Exploring the Foundations of
Cumulative Innovation: Implications for Organization Science,''
Organization Science 18 (6): 1006-1021.
O'Mahony, Siobhan and Beth Bechky. 2008. ``Boundary Organizations:
Enabling Collaboration Among Unexpected Allies,''
Administrative Science Quarterly 53: 422-459.
Owen-Smith, Jason, and Walter W. Powell. 2003. ``The Expanding Role of
University Patenting in the Life Sciences: Assessing the
Importance of Experience and Connectivity.'' Research Policy
32(9): 1695-1711.
Powell, Walter W., Douglas R. White, Kenneth W. Koput, and Jason Owen-
Smith. 2004. ``Network Dynamics and Field Evolution: The Growth
of Inter-organizational Collaboration in the Life Sciences.''
American Journal of Sociology. 110(4): 1132-1205.
Walsh, J., C. Cho and W.M. Cohen, ``View from the Bench: Patents,
Research and Material Transfers,'' Science. September 23, 2005,
pp. 2002-2003.
Walsh, J, A. Arora and W. M. Cohen, ``Effects of Research Tool Patents
and Licensing on Biomedical Innovation,'' in W.M. Cohen and S.
Merrill, eds., Patents in the Knowledge-Based Economy, National
Academy Press, 2003.
Williams, H. (2010). ``Intellectual Property Rights and Innovation:
Evidence from the Human Genome'', MIT Working Paper.
Wuchty, S., Jones, B.F. & Uzzi, B., 2007. The Increasing Dominance of
Teams in Production of Knowledge. Science, 316(5827), 1036.
Biography for Fiona Murray
Fiona Murray is an Associate Professor of Management in the
Technological Innovation and Entrepreneurship Group at the MIT Sloan
School of Management and Faculty Director of the MIT Entrepreneurship
Center. She received BA and MA degrees in Chemistry from the University
of Oxford before coming to the United States where she received her
doctoral degree from Harvard University's School of Engineering and
Applied Sciences. Her research interests moved away from the bench to
the study of science-based entrepreneurship, the organization of
scientific research and the role of science in national
competitiveness. After a lectureship at Oxford's Said Business School,
Fiona joined the MIT Sloan School of Management where she studies and
teaches innovation and entrepreneurship with an emphasis on the life
sciences, chemicals and materials sectors. Fiona is well-known for her
work on how growing economic incentives, particularly intellectual
property (IP), influence the rate and direction of scientific progress.
Fiona works with a range of firms designing global organizations
working with a wide range of internal and external innovators (through
traditional contracts and ``Open Innovation'' mechanisms) that are both
commercially successful and at the forefront of science. She is also
actively involved in policy debates over the appropriate use of IP and
licensing in universities and more recently debates on when and when
not to use patents to promote discovery research in neglected
diseases. She is also interested in the most effective organizational
arrangements for the rapid commercialization of science including
start-ups, public-private partnerships, the role of venture
philanthropy, and university-initiated seed funding. Her research has
been widely published in a diverse range of scientific and social
science journals including Science, New England Journal of Medicine,
Nature Biotechnology, Research Policy, Organization Science and the
Journal of Economic Behavior & Organization.
Chairman Lipinski. Thank you, Dr. Murray. Dr. Teich.
STATEMENT OF ALBERT H. TEICH, DIRECTOR OF SCIENCE & POLICY
PROGRAMS, AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE
Dr. Teich. Thank you, Chairman Lipinski, Ranking Member
Ehlers, other Members of the Subcommittee. Thank you for the
opportunity to testify at this hearing today. I am Al Teich,
and I am the Director of Science and Policy Programs at the
American Association for the Advancement of Science.
As you know, AAAS and I myself have been deeply involved in
science and innovation policy for many years. Although this has
been an active field of research at least since the 1960s, and
it has produced a large body of literature and a substantial
number of researchers, there is a feeling that the results of
this work are not widely known or used among those who actually
make science and innovation policy.
This was behind the frustration of Dr. Marburger in his
speech, which led to the establishment of the NSF SciSIP
Program. The SciSIP Program has a unique mandate to couple
advances in fundamental knowledge about processes of scientific
discovery and technological innovation with issues of relevance
to policy makers. Among the features that differentiate the
SciSIP Program from its predecessors is the fact that it is not
just supporting individual research grants, but it is
attempting to build a community of practice among researchers
and to connect that community with potential users of the
research, practitioners in the Federal Government.
AAAS has played an active role in building this community
of practice through a workshop in 2009 that brought researchers
together to learn from one another. In that workshop, we saw
how SciSIP researchers reflect distinct disciplinary traditions
that can inhibit productive interdisciplinary dialogue. Even in
this not-very-large field, they can't always talk to one
another. They may ask different questions, use different
theoretical frameworks, and employ different methodologies even
when they address seemingly similar topics.
At the same time, because of the academic reward system,
SciSIP researchers, like many other researchers, seldom speak
in terms that policy makers find directly useful. And as one
speaker said at the 2009 AAAS workshop, policymakers are
confronted with a Babel of tongues which leads them to ignore
the experts and turn to other sources of information and
advice. Next month AAAS will convene another workshop with NSF
support. In that one we will try to connect researchers with
customers in the government. We hope that that workshop will
serve to allow the two communities to better understand each
other's needs and expectations.
While projects like the AAAS-NSF-SciSIP workshops are an
important step in building a community of practice, there is
more that can be done. Here are a couple of ideas just as food
for thought. Regarding research, researchers tend to
communicate directly with their peers through journals and
conference presentations in order to gain recognition in their
fields. But few policy makers read those journals or attend
those conferences. We need to find ways to encourage SciSIP
researchers to communicate with policy makers, either directly
or through the media, and to be rewarded and not penalized by
less policy-oriented peers in their fields for doing so.
On the teaching side, although many of the university
programs that provide graduate training in science and
innovation policy are interdisciplinary, the training they
provide is not always responsive to the needs and priorities of
policy makers. It might be useful to strengthen ties between
researchers and policy makers by engaging policy makers in
helping to develop and review curricula as well as engaging
them in teaching as adjuncts or guest lecturers. Some schools
already do this. Others would do well to follow their lead.
Beyond education, there is another mechanism for promoting
greater mutual understanding between researchers and policy
makers. It is people transfer. One approach might be to create
a program to give SciSIP researchers the opportunity to work in
government for perhaps a year. Providing SciSIP researchers an
opportunity to work in a policy making setting for a while, as
we do for scientists and engineers in our Congressional Fellows
Program, would allow them to gain firsthand knowledge regarding
the needs, priorities, and modes of operation of the potential
users of their work. Like our workshops, this hearing is an
opportunity for the science policy community to hear from you,
as policy makers, what research questions you believe SciSIP
researchers should be addressing. I look forward to the Q&A as
an opportunity to exchange ideas on that subject.
[The prepared statement of Dr. Teich follows:]
Prepared Statement of Albert H. Teich
Chairman Lipinski, Ranking Member Ehlers and members of the
Subcommittee, thank you for the opportunity to testify today on the
evolving subject of the Science of Science and Innovation Policy.
The American Association for the Advancement of Science (AAAS) is
the world's largest multidisciplinary scientific society and publisher
of the journal Science. The association, which celebrated its 162nd
birthday earlier this week, encompasses all fields of science,
engineering, mathematics, biomedicine and their applications. For more
than thirty-five years, AAAS has demonstrated its commitment to and
involvement in science policy issues with projects and activities such
as the annual AAAS Science and Technology Policy Forum, the Science and
Technology Policy Fellows Program, more recently with our Leadership
Seminar in S&T Policy, and--most directly relevant to this hearing--our
joint project with the National Science Foundation on the Science of
Science and Innovation Policy (SciSIP). We have served the academic
science policy community by publishing the first Guide to Graduate
Education in Science, Engineering and Public Policy (known as the SEPP
Guide) in 1985 and maintaining it as an online resource to the present
day.
Background
From one perspective the Science of Science and Innovation Policy
is not entirely a new field. Since the 1950s--and probably earlier--
economists, sociologists, political scientists and others interested in
public policies for science and technology have sought ways of
measuring the value of research investments. Research articles on
topics such as measuring Return on Investment (ROI) from research and
development (R&D), national innovation systems, and comparisons of
state and international standings have been published for many years.
Government tools such as the 1993 Government Performance and Results
Act (GPRA) and the more recent Program Assessment Rating Tool (PART) as
well as their programmatic forebears, have attempted to quantify the
value of government investment in various programs, although they have
found R&D programs more difficult to assess than others.
In the 1960s, the National Science Foundation (NSF) supported the
development of research and graduate education programs in science and
technology policy in a number of universities. During the 1970s, it
created the R&D Assessment and R&D Incentives programs, which funded
research on some aspects of S&T policy in universities and non-profit
institutions. In addition to the SciSIP program, the Foundation
currently funds research in science policy and related areas through a
number of programs, including the Science, Technology, and Society
Program in the Directorate for Social, Behavioral, and Economic
Sciences, and the Division of Science Resources Statistics, which has
long provided data and analysis of importance to science policymaking.
The current Science of Science and Innovation Policy endeavor is
unique, however, in its focus on drawing this research together into a
systematic, coherent body of knowledge that can be brought to bear
directly on national policy decisions. The National Science
Foundation's SciSIP program is engaging the science policy community in
research in theory, methods, models, and data development along four
broad themes--workforce issues, innovation ecosystems, outcome
measures, and data infrastructure. The program has an explicit mandate
to couple advances in fundamental knowledge about processes of
scientific discovery and technological innovation with issues of
relevance to policymakers. As a field of research, the Science of
Science and Innovation Policy has essentially been raised in relevance
from a largely academic discourse to a field with a potential national
impact.
Science and technology policy research can have and has had a
positive effect on national policy decisions. For example, R&D data
analyzed and reported by NSF, as well as by AAAS, have for decades
provided policymakers such as the Members of this Committee with a
roadmap for crafting the federal R&D portfolio.
As the NSF SciSIP program is still quite young and has been
awarding grants for only a few years, we believe that it is premature
to expect the results of that program's research to be incorporated
into national policy decisions. Furthermore, the results of any science
and technology policy research--whether within or outside SciSIP--must
still run the gauntlet of the policy process.
In other words, simply because the research has been done does not
mean that it will be used. As helpful as the AAAS R&D budget analysis
may be to its users, policymakers still make decisions based not only
on research and analysis, but also on constituent needs, economic and
political considerations, public opinion, and their own perspectives on
national priorities. The same goes for studies that measure the
effectiveness of federal programs. Politics is not a contaminant in the
policymaking process. It is, after all, the essence of a democracy.
One way that policymakers can increase the likelihood that SciSIP
research be used to inform the design of effective federal programs and
the management of federal research investments is to conceptualize and
design research that both advances knowledge in a discipline and
answers specific questions relevant to policy. Some examples of such
research topics are given in the NSF SciSIP program solicitation:
examinations of the ways in which the contexts,
structures and processes of science and engineering research
are affected by policy decisions,
the evaluation of the tangible and intangible returns
from investments in science and from investments in research
and development,
It should be pointed out that science and technology policy
research is just as unpredictable as basic research in physics,
chemistry, or life sciences, and decision makers must take into
consideration the fact that some studies may yield unanticipated
results and that some may serve long-term rather than short-term needs.
It is important to ensure that an effective SciSIP portfolio balances
research that reflects short-term and long-term policy interests.
Among the features that differentiate the SciSIP program from
similar past efforts are its focus on building a community of practice
among researchers in the many disciplines engaged in the study of
science and innovation policy and its conscious effort to build bridges
between this community and the practitioners in the federal government.
Previous programs to support science and technology policy research
have always focused primarily on providing grants to individual
principal investigators.
AAAS has played an active role in building this community of
practice. We organized a workshop of the grantees from SciSIP's first
and second rounds (FY 2007 and 2008) of awards to further construct
this community. The outcome of that workshop was a report, titled,
Toward a Community of Practice.\1\ Next month we will convene a second
workshop to continue building a community of practice by connecting the
researchers with potential users of their results in the federal policy
community.
---------------------------------------------------------------------------
\1\ Albert H. Teich and Irwin Feller, Toward a Community of
Practice: Report on the AAAS-NSF Grantees Workshop, March 24-25, 2009
(Washington, DC: American Association for the Advancement of Science,
August 2009). Available online at http://www.aaas.org/spp/scisip/
scisip-report.pdf.
---------------------------------------------------------------------------
There are challenges to building this SciSIP community of practice.
A sizable group of researchers working on current projects as well as a
large body of literature already exists. To an important degree, these
individuals and this literature reflect distinct disciplinary
traditions that can inhibit a productive interdisciplinary dialogue.
These disciplinary clusters may ask different questions, draw upon
different theoretical frameworks, and employ different methodologies
and analytical models even when they may address seemingly similar
topics (e.g., diffusion of innovation). Sometimes it seems they even
speak different languages.
At the same time, as these researchers speak to an audience of
their peers--albeit within their disciplines--they often do not speak a
language that policymakers understand or find useful. A concern
expressed at the first AAAS SciSIP workshop was that policymakers would
be confronted with a ``Babel of tongues'' which would lead them to
ignore the experts and turn to other sources of information and advice.
Another challenge is the fact that not all SciSIP researchers have
experience working at the interface between academic research and
federal policymaking. Some lack an understanding of the user community
and who the policymakers are, what information or datasets they might
require, or what other information they might need to know in order to
effectively address national policy priorities. This is not to imply
that these researchers are not familiar with the organization of
government or the legislative process. Rather, it has more to do with
the subtleties and nuances of the ``game'' and having an insider's
perspective on the complex policy questions that decision-makers face
and the interplay of interests that often shapes the debate over
science and innovation policy.
The AAAS project is an effort to build these necessary
relationships and to help SciSIP researchers and policymakers speak
each other's language and better understand each other's needs and
expectations. The goal is not to build a grand over-arching theory of
science and innovation policy, but to seek convergences among findings
and a higher degree of understanding within the community about new
perspectives and paradigms regarding science and innovation policy. It
is to build a more interdisciplinary approach with an eye towards
practical application by practitioners.
This community of practice is intended to assist individual
researchers or teams of researchers by enlarging the set of variables
and/or relationships that they consider in their work. It provides an
opportunity to expose research findings to a wider set of critical
perspectives and allows researchers to consider how their findings may
relate to other disciplines and research findings in other areas.
As you know, the NSF initiative in the science of science and
innovation policy stemmed from a sense that the body of science and
innovation policy research does not seem to be very widely known or
used among those who actually make policy in these areas. The AAAS
SciSIP project is intended to facilitate interaction between relevant
federal agency representatives and the growing community of SciSIP
researchers, to help the agency representatives learn about emerging
theories and models, and to connect research results with policy
issues. At the same time, SciSIP researchers should be able to learn
from the user community about their policy priorities and needs, which
can help shape the direction of future projects.
While the SciSIP program and projects like the AAAS-NSF SciSIP
workshop are an important step in building a community of knowledge and
a strong foundation between research practitioners and policymakers,
there is more that can be done.
Communication: As noted earlier, researchers addressing questions
of science and innovation policy have tended to direct their work to
colleagues, peers and others within their core discipline. This
includes presentations at professional associations and conferences,
and publishing in specialized journals (e.g., Research Policy, Social
Studies of Science). This is quite understandable in view of the reward
structure of academia and desire on the part of scholars in this field
and others to gain recognition from their peers. Relatively few
policymakers read such journals or attend academic conferences with any
regularity. One could approach this problem in two ways: One approach
would be to encourage policymakers to read these journals and/or attend
more academic conferences. Given the constraints of time and energy
they face, this seems unlikely to work. Alternatively, SciSIP
researchers might seek, in addition to their regular publication
outlets, opportunities to reach out to the policymaking community,
either by writing themselves for the publications that policymakers do
read or by cultivating opportunities to have their work reported in
the popular media.
Education and Training: This ``clustering'' within narrow core
disciplines has worked its way not only into the presentation of
information but also into the education and training of students
studying science and innovation policy, which further encourages the
self-organization of the research area. Although the AAAS SciSIP
program may help encourage the development of a more interdisciplinary
curriculum, that is not the central goal of the project.
As the committee has noted, there are about 25 U.S. universities
that offer graduate education in science, engineering and public
policy. There is no central organization for these programs, and they
do not share a common curriculum or even a common nomenclature. The
AAAS Guide
to Graduate Education in Science, Engineering, and Public Policy
mentioned earlier lists programs such as Science Policy; Technology
Policy; Science and Technology Policy; Science, Technology, and Public
Policy; and Engineering and Public Policy. In addition, many programs
in Science and Technology Studies (STS) include a policy component, and
some programs in public administration and public policy provide for a
science and technology concentration. Furthermore, these graduate
programs can be administered within different academic departments:
Schools of Engineering, Public Administration, International Affairs,
etc. Some programs allow students to take coursework outside the
traditional curriculum in other tangential fields (e.g., law), while
other schools do not.
Many of the graduates of these programs have gone on to very
successful careers. Nevertheless, it might be useful to have people
from the policy community--the potential users--involved in reviewing
the curricula of these programs as well as engaging in teaching as
adjuncts or guest lecturers. This is obviously easier for universities
in the Washington, DC, area to do than for those in other regions and
some institutions in this area do it regularly to good effect. But it
is worth the effort and expense for all.
Fellowships: Another potential mechanism for promoting cross-
fertilization of ideas and greater understanding of the policymaking
community's needs is to create a Fellowship program for SciSIP
researchers to work in government for one year, similar to the AAAS
Science and Engineering Policy Fellowship that allows scientists an
opportunity to work at a federal agency or in a congressional office or
committee. Intergovernmental Personnel Act appointments could also be
used for this purpose. Providing science and policy researchers and/or
graduate students an opportunity to work in a policy office of the
federal government would allow them an opportunity to learn first-hand
the language, needs, and priorities of an agency, department, or
congressional committee.
Conclusion
I would like to thank the Members of the Subcommittee for holding
this hearing and for their interest in the SciSIP program and the area
of science and innovation policy research. I look forward to working
with your staff as we prepare for the next AAAS workshop. Like our
workshops, this hearing is an opportunity for the science policy
community to hear from you, as policymakers, what research questions
you believe SciSIP researchers should be addressing. I look forward to
the Q&A as an opportunity to exchange ideas on that subject.
Biography for Albert H. Teich
Al Teich is Director of Science & Policy Programs at AAAS, a
position he has held since 1990. He is responsible for the
Association's activities in science and technology policy and serves as
a key spokesperson on science policy issues. Science and Policy
Programs, which includes activities in ethics, law, science and
religion, and human rights, as well as science policy, has a staff of
40 and an annual budget exceeding $13 million. He also serves as
director of the AAAS Archives.
Teich received his bachelor's degree in physics and his Ph.D. in
political science, both from M.I.T. Prior to joining the AAAS staff in
1980, he held positions at George Washington University, the State
University of New York, and Syracuse University. Al is the author of
numerous articles and editor of several books, including Technology and
the Future, the most widely used college textbook on technology and
society, the twelfth edition of which will be published in 2011 by
Cengage Learning.
He is a Fellow of AAAS and the recipient of the 2004 Award for
Scientific Achievement in Science Policy from the Washington Academy of
Sciences. He is a member of the editorial advisory boards of the
journals Science Communication; Science, Technology, and Human Values;
Review of Policy Research; and Renewable Resources and has been a
consultant to government agencies, national laboratories, industrial
firms, and international organizations. He is past president of the
Washington Academy of Sciences; former chair of the Board of Governors
of the U.S.-Israel Binational Science Foundation, where he remains a
member of the executive committee; a member of the Technical Advisory
Committee of the Maine Space Grant Consortium; the Norwegian Research
and Technology Forum in the United States; the Advisory Board of the
University of Virginia's Department of Science, Technology and Society;
the Program Committee for the 5th EuroScience Open Forum (to be held in
Dublin, Ireland, in 2012) and the Council of Advisors for Research and
Innovation Strategy of the National University of Singapore.
Teich speaks frequently before audiences in the U.S., as well as
Europe and Asia. He has appeared on National Public Radio, CNN, C-SPAN,
as well as various other electronic media and has been quoted in
numerous print media, including The New York Times, The Washington
Post, National Journal, The Chronicle of Higher Education, and CQ
Weekly Report.
Chairman Lipinski. Thank you Dr. Teich. I am going to start
questions. If you want to leave to get over to vote I think we
have about four minutes left, probably, in the first vote. But
I think this vote is going to last a long time. But we will--if
the witnesses can come back afterwards--it is probably going to
be about an hour though. I am not sure if any of you have--we
will have to leave at that point, but--yes, Dr. Ehlers, yes.
You have a suggestion?
Mr. Ehlers. No, just a quick comment which shows the
importance of this topic and why we should come back if we can
depending on the votes. But I would simply observe that the
current process in the Congress is that science policy is set
by the Appropriations Subcommittees. Money controls everything
and when they decide to give a certain amount of money to a
certain project, that basically ends up being the decision.
That totally ignores the input of other scientists and SciSIP
folks who have a much greater interest. So something you can
think about in the meantime is how that could be addressed
without throwing out the Appropriations Committee entirely
which is probably impossible. So I just wanted to mention that,
and I hope you will have some brilliant ideas on how we could
practically address that particular problem. My staff just
informed me that 300 people have not yet voted, so we could
probably walk over instead of running over. But I hope the
votes don't run too long. And I would be delighted if any of
you would take on the challenge too and follow this.
Let me add one quick last comment. When Newt Gingrich was
here he wanted to double the funding in NIH, which did happen
in the Appropriations Committee. I argued that we should have
equal funding increases for NSF, treat all the sciences
equally. He said we will do that one next. Well, unfortunately
we lost the majority and so the next--they were happy. But
today I have heard Newt say in numerous speeches that one big
mistake he regrets is not having increased NSF and the other
hard sciences at the same time when he increased NIH. So let
that be a moral note for all of you who hope someday to be the
Speaker of the House of Representatives. Thank you very much.
Chairman Lipinski. I am hoping that you can make sure you
spread that word amongst your colleagues before you leave.
Mr. Ehlers. Yeah.
Chairman Lipinski. So you can help us--who really wants to
make sure we get that done, get that done in the future. Well,
you now have your homework assignment. You will have probably
about an hour and we will be back. Hopefully sooner than that,
but it is going to be at least 45 minutes I would say and I
look forward to hearing your answers. I am most interested in
how we really make these connections. Dr. Teich, I appreciate
some of your comments. I would like to delve maybe some more
into how we can, having been a--now as a political scientist,
and talk about not--policymakers don't read the journals.
Political scientists weren't reading the journals because it
didn't really speak to them, much less the policymakers. But I
would like to delve into that also some more, how we can
improve that. But the Subcommittee will be in recess.
[Recess.]
Chairman Lipinski. I call this hearing back in. I will now
start the questioning. I understand Dr. Sarewitz has to leave
at four o'clock so we will--each of us will get the chance to
ask some questions before you have to leave. So I will now
recognize myself for five minutes and will begin with Dr.
Sarewitz.
You mentioned in your testimony that most of the data
available for SciSIP analysis are input/output data, level of
funding, number of graduate students, patents, et cetera,
publications. And you said these data offer an incomplete view of the
societal value of S&T investments. So what would you suggest
that we do to better characterize and measure the social
outcomes of R&D?
Dr. Sarewitz. Okay. Thanks for asking that. It actually--
see, how should I put this--it--my answer will reflect a
diversity of perspectives here. I think we can all agree that
the process--and Julia actually wrote about this wonderfully in
Science Magazine--that the process that leads from R&D to a
particular desired social outcome, for example more employment
or better health, is extremely complex, with many different
inputs into the process. But I think that measuring is one way
to understand things but also very close case-based and textual
analysis is another way to understand things. And my view is
that the system is so complex that we are probably not going to
come up with a big theory of how you can predict social
outcomes from science and technology inputs. But we are going
to be able to develop a number of principles that reflect our
understanding of particular examples.
So I think the kind of data that--and I wrote about this a
little bit in my testimony--that we really need a kind of--data
that we are lacking that would be very important is very
granular case studies of both successes and failures in this
full range of linkages from laboratories to social outcomes for
a particular range of scientific priorities. And I think by
doing that we will be able to elicit a set of general guiding
principles that can help you guys distinguish between policy
decisions that make sense and policy decisions that don't make
sense. I guess I am a little skeptical of the idea that we will
ever be able to actually predict with precision. But I think we
can be a lot smarter about the basic set of assumptions if we
can develop some really close case studies from end to end,
case studies that show great detail.
Let me just quickly say, one, we are looking, for example,
at Arizona State we are looking at the development of the solar
power industry in Arizona, because obviously we have a lot of
sun there. And so it is not--one of the important inputs of
course is R&D into the solar power industry, but there are all
sorts of local dynamics, from water availability, land use,
obviously regulatory frameworks, all of those things are
important and they are not generalizable.
So while I think we can develop a very rich case study
around solar in Arizona, I don't think we should necessarily
worry about a grand theory. So we should develop best practice
case studies looking very closely at the full process of
leading from the R&D activities themselves to the societal
outcomes.
Chairman Lipinski. Did anyone else want to--any of the
witnesses want to comment on that? Have anything to add to
that? If not I am--now I think about this in your answer, Dr.
Sarewitz and I--do we have the data available right now? Would
we need to do a better job of collecting data so that we can do
this kind of research? The whole generalized ability of this is
when you look at almost anything that is really a social
process. I always go back to my days as a political scientist
in trying to put together these theories that will predict
outcomes and the struggle with doing that and trying to make
political science into physics. How much can we do here when we
are talking about doing the SciSIP research, and what we can
really glean from the data that we have?
Dr. Sarewitz. So let me just say this. A diversity of
perspectives is here and that is good. I mean, I think it is a
rich field and it needs to bring lots of perspectives together,
from the highly quantitative model to the more case-based
qualitative work. We need all of that. I think we know a lot. I
think Dr. Teich's point about the problems of communicating
what we know is really important, and that thinking about how
to communicate more effectively the things that we already
know, for your benefit, is an essential part of it. And so two
things need to go on simultaneously. They are--this field is
really only just beginning to kind of get its legs.
Dr. Murray talked about how she's been doing it for a long
time, didn't know there was a field out there. I have been
doing it for a long time as well, but more or less in small
groups. So Dr. Lane's, you know, efforts to create a community
does two things. One is, it creates--it has created the
intellectual momentum that we are going to need to move the
field forward, but it also allows us to really collect what we
know already, which I think is considerable, and present that,
if we can figure out how to communicate effectively. I would be
glad to talk about that a little bit, too, if you would like.
Chairman Lipinski. Well, let us come back. Right now I am
going to yield back my time. I assume my time is up and I want
to yield now to recognize Dr. Ehlers for five minutes.
Mr. Ehlers. Thank you very much and I don't have any
questions for you Dr. Sarewitz, other than to note that we
produce weather today that is very close to what you have back
home. We did put a little moisture in the air as well, so that
is a little different.
Dr. Sarewitz. I wouldn't be dressed like this either.
Mr. Ehlers. That is true. But I appreciate you coming. I
don't have any questions for you that have not been either
answered or explained already. But I would like to ask on the
two ends of the panel, Dr. Lane, Dr. Teich, you both are quite
familiar with the Congress and how it operates. Do you have any
suggestions on what someone in the Congress could do to help
educate our Members about the importance of science policy and
what it should be, what it can do, what it cannot do, and any
wisdom you could give us I think would be very helpful as we go
forward in the Science Committee and try to--I hate to use the
word modernize, but you know what I am talking about. Just try
to get the workings of the House of Representatives and the
Senate to reflect reality, and what should be done about the
Science of Science Policy and in particular, what role science
policy should have in guiding the Congress on the very
difficult issues we have, particularly those relating to
funding. So we will start with you, Dr. Lane, and go to Dr.
Teich, and also Dr. Murray if you have any comments on that.
Dr. Lane. Well, thank you very much for that thoughtful
question. I am not as wise in the ways of Congress as you,
obviously, so this is very much in the spirit of the suggestion
rather than an expert approach. One of the things that I think
is most important, that will get Congress to understand the
value of science investments, is evidence. If there is clear
evidence of the impact of science investments, on the four sets
of dimensions--social, scientific, economic, and work force--
that both has a qualitative aspect, that is, that there are
real people affected, and there are real advances that are made
in the quality of life, but also quantitative. That is, when
you can unambiguously say there were--this amount of investment
led to a whole variety of different sets of outcomes, and that
tracer is clear. I think that is what gets people in Congress's
attention, because obviously they are serving the American
taxpayer, and that is what the American taxpayer is interested
in finding out.
Mr. Ehlers. Okay. Dr. Teich.
Dr. Teich. Yeah, I think I would turn that around a bit and
point out that it is really very much up to us in the SciSIP
and science policy communities to communicate effectively with
you in the Congress. You have so many messages coming at you
from so many different directions that somehow, what we need to
do is differentiate the kinds of information that we have,
hopefully evidence-based. And we have to recognize that it is
not the only influence, the only thing that you have to take
into account in making decisions.
That decision--I was struck by something Chairman Lipinski
said about making political science into physics. I started
out, I got an undergraduate degree in physics and my Ph.D. in
political science, and you know physics; in some respects
physics is a lot easier, you know. You start, you can--my
freshman physics, you know, assume a frictionless plane. Okay,
well you can assume a frictionless plane and it works in some
respects. Assume a frictionless Congress and you know you have
got nothing. It doesn't make any sense. So there is a--politics
isn't neat. It is not, and data doesn't always trump a lot of
other factors that go into decisions. We have to understand
that we have to communicate within that framework, and then I
think it is up to you in the policy community to make use of
that. Best I could do on short notice.
Mr. Ehlers. If we had a frictionless Congress, things might
go better. Dr. Murray, do you have any wisdom to add to this?
Dr. Murray. I am not sure it is wisdom. It is certainly a
thought I have, is that--I think it is important for us to
provide data that is meaningful. I think it is also important
for us to think about studies that really show, again, sort of
causal impact. So I think that there is some new work that has
been funded by SciSIP and in other places where we can say,
look, you know we did have a quite big shock to the system in
terms of additional funding going in quite rapidly through the
Recovery Act, and some of the spending in other countries that
means very big shifts in research funding allocation that have
happened relatively quickly. And so I think we have a lot of
opportunities to both study those things and also to marshal
that evidence--because I think you could always go in and just
say, we want more for science and technology, and everybody has
heard that and of course we are going to say that. And so I
think that coming in with evidence that says--when you get
these shifts, both in level and distribution, real things
happen, real differences, and outcomes happen. I think if we
can marshal that evidence in a persuasive way, then I think we
can be much more informed and are much more likely to be
listened to.
Mr. Ehlers. Okay. Well, those are very good comments. I
worry a little bit about the Congress requiring a lot of
evidence because you know many experiments don't come out that
well and the Congress would say, now--next time you come
around, say, well, you know, you sold me on this project and
nothing really good came out of it. And that is pretty hard to
overcome.
I really appreciate the ideas you have presented and the
comments you have made today. And it has given me some new
insight. And I really do think that we need more concentration
on this not only in the Congress, but among the science policy
community. And what I said several times earlier on about this
was--I deliberately said ``Unlocking the Future'' because I
wanted someone in the future to write better, something better
about science policy and something along the line of Vannevar
Bush's book which was probably--it could have been what we want
today, but nevertheless he addressed a lot of issues that had
to be taken into account. He himself was very different but
very concerned about the fact that Congress did not pick up on
a lot of his suggestions, and particularly one creating a
different version of the National Science Foundation, but yet
out of his work and his arguments, eventually, I think some ten
years after he wrote the book, they did start establishing the
National Science Foundation. So even though he regarded his
work as a failure because the Congress didn't pick up on it,
eventually it did happen.
So I encourage the science policy community to become very
active, and frankly, also very aggressive in your addressing
Members of Congress. It would not hurt at all if a few people
from the science policy community ran for Congress and got
elected. I just had an experience on the Floor not 10, 15
minutes ago. Someone came up to me and had been present this
morning at the Science Committee meeting and said Vern, what in
the world are we going to do without you, because I had used my
scientific knowledge in a number of statements. And I say well,
I think, you know I don't think I do that much. They will get
along. But the matter of fact is there won't be any scientists
left on the Science Committee. And it is just helpful to be
inside all the side discussions that are held. It is good to
have someone there.
So I repeat, as I have done with every speech I have given
to every engineering or scientific group: run for Congress. We
need more scientists in the Congress, and incidentally not just
for the benefit of science, but most scientists are fairly
clear thinkers on issues and frictionless or not, and they have
a lot to contribute to the operation of the governing bodies of
this country.
I would actually say I probably got--had much more impact
at the state level because I was truly a rarity there. And most
state governments don't have the resources to have scientists
on staff. And I had endless amounts of work to do trying to
resolve things, such as resolving difficulties between
optometrists and ophthalmologists, or dealing with questions
such as the foam insulation that was the rage for a while,
pumped into homes, and now people are sick from formaldehyde
fumes from that. These are issues no one in the state
legislature was equipped to deal with, and I resented all the
time I had to spend on it, but at the same time it was very
useful to society. So spread the word, please, and thank you
again for being here. I appreciate it very much.
Chairman Lipinski. Thank you, Dr. Ehlers.
Mr. Ehlers. Chairman, I beg your pardon but I have a bill
on the floor that has just been called up and I have to rush
down there to speak on it, so my apologies.
Dr. Teich. Mr. Ehlers, before you go I just want to say on
behalf of AAAS, the science community and the SciSIP Community,
we are going to miss you. Thank you for everything that you
have done.
Mr. Ehlers. Thank you very much. I appreciate that.
Mr. Lipinski. Dr. Ehlers, we will assume I have your
permission to continue here and wrap up. Seeing as there was no
objection from Dr. Ehlers I will--I was asking for your
permission to continue on and to wrap up here. Thank you. Okay.
You are still here, and frictionless. I will now recognize
myself for five minutes. I--it is funny, the--talking about the
assumptions and comparing physics to--or trying to make
political science into physics. I had a colleague of mine in
grad school in political science who, like myself, had
a background in engineering before going to get a Ph.D. in
political science, and he always would say that political
science had physics envy and we were trying to be physics. It
did not stop there. Political scientists--even Congressional
scholars focusing specifically on Congress--were not afraid to
make assumptions that wound up where they were talking about
something that was supposed to be Congress but was very much
not Congress anymore after all the assumptions that were made.
With all these assumptions we were dealing with an imaginary
legislature, but then we would pretend like it is Congress. Hopefully
that is not the type of work that is going on here in SciSIP.
I want to make sure--one thing I wanted to ask--Dr. Murray
talked about this, and I want to ask everyone if they
have any more comments on it. Because I know, Dr. Murray, in
your testimony you talked extensively about training, you know,
more people to be able to do this research and having programs
that produce the type of people she is talking about--Ph.D.
programs--but we need, in general, to produce people who can
do this work. And I
know that people who are doing this, researchers in this field,
are located--as I mentioned in my opening statement--in all
kinds of different places.
Dr. Murray, I know you are in the business school. Are
there any suggestions--I don't know if there is anything else
you wanted to add, Dr. Murray, or Dr. Teich, or if Dr. Lane
would want to say anything about where we are right now in
terms of programs that are producing researchers that can do
SciSIP--where that is going. Are there programs such as this
that are out there? If not, where are they coming from? Is this
something that you really think we need to do more
of and concentrate on--founding a specific field like
this--or can we get by with people coming from different
fields? Is that the way to do it? So I just want to throw that
question out there. As a former political science academic I
am interested, you know, in questions like this in terms of
what we are doing out there in higher education.
Dr. Lane. So I think that is a very interesting question.
It is an important question. If you are going to
train people in doing this kind of research, they are going to
go into the field and develop the kind of skill
sets that we need. You want them to be able to get tenure. You
want there to be a career ladder. And the SciSIP program by
itself isn't sufficient to support an interdisciplinary field
in its own right. Nor is it, I think,
possible to develop career paths for such a narrow set of
skills.
So I think what is important is to convince very smart
people in economics, and sociology, and psychology and many of
the other areas that science policy is a really
important and interesting field, that they can bring their
skill sets to bear to answer important science policy
questions, and that they can publish and advance within
their own disciplines. So I think that is what is critical
rather than trying to establish a separate field in its own
right. I don't think that is feasible given budget constraints
and so on. So that is what we have explicitly been trying
to foster, to make it an intellectually challenging, exciting,
and publishable type of field.
Chairman Lipinski. Are you going to comment, Dr. Murray?
Dr. Murray. Yes, I think it is--I think that there are
three different constituencies for education. One is the
Ph.D.'s, who are probably the producers of research. The
other--and then there are the science and technology policy
students, typically Master's students, who tend to go into
policy roles, whom I think we need to educate to be better
consumers [of this research] and also people who really
understand what we do and
can help do it with us. And then finally there are, probably,
the scientists and engineers themselves who could benefit from
understanding some of this. They then become a sort of bottom-up
constituency who can shape agencies and so on. I think on the
Ph.D. side, I think Julia is exactly right. I don't think you
can have a new discipline of SciSIP. I think it is both too
small, and, in fact, one of the great values of SciSIP is
indeed the fact that people come from these other strong
disciplinary foundations. I think what we do need to emphasize,
though, is a serious sort of field focus.
If you think of a Ph.D. in political science or economics,
mostly there is a field. At the moment I don't think many
places already have a field focus in something that we would
recognize as SciSIP. And I think that we could go a long way
towards funding things that would help establish that. You
know, Ph.D. education requires, especially in something like
this, you know, significant evidence, and teaching materials,
and data sets, and things so that students can work on this,
and so that we can effectively collaborate across a set of
schools to really begin to develop material, share expertise.
And then potentially bring the Ph.D. students together as a
community so that they recognize one another even though they
are always going home, and we know we are educating them to be
hired by business schools, economics departments and so on.
So I think that there is an opportunity there as long as we
make sure we know what we're trying to accomplish, which is not
a new discipline. I think on the science and technology
Master's side, I am less familiar with that because even though
I do SciSIP research I don't teach in the technology and policy
program at MIT. But that in and of itself tells you something,
which is that there is, I think, a little bit still of a
disconnect--that the traditional science and technology policy
programs have not necessarily sort of incorporated SciSIP
research into their teaching material.
And so again I think that there is an opportunity to do
something about that. Not to insist that people do it, but to
provide opportunities to develop a really effective curriculum
so that as people go into different--into their careers as
policy makers, they understand what we are trying to do, some
of the methods, they know good SciSIP research from less good
SciSIP research and they themselves can say oh, you know, we
are doing something in our agency. We could actually run that
as an experiment that could be studied. We could try two
different ways of allocating funding and we could really do the
analysis with real data. And I think if we could educate people
to that level, we would have a much better interchange in the
long run, and it would be a really--it would be a very vibrant
community.
Chairman Lipinski. And Dr. Teich.
Dr. Teich. Yes, well, a couple of things worth noting in
response. First of all, there are and have existed for some
time about 25 programs in universities around the United
States, and some outside the U.S. in addition to those 25. They
have provided graduate education in science--in what we have
called science, engineering and public policy, and which
overlaps quite substantially with what we now call SciSIP.
Many years ago we published a guide, an old-fashioned paper
type guide. We now have a website on the AAAS website that
links to all of these programs which could help people find
them.
I don't see this as a discipline either. As was said a
moment ago, my analogy is that it is more like a field
of, let us say, area studies--like Latin American
studies, for example, or African studies--a field in
which many different disciplines contribute to an understanding
of what is going on in this business. So that is one thing that
I wanted to mention.
Another thing is that there is--there are a lot of young
people who are very interested in this, and we need to
encourage them. There is an organization--an international non-
profit, incorporated as a 501(c)(3), called Triple Helix,
Inc., which has about 500 students from many universities,
prestigious universities around the world, which provides an
opportunity for students to educate themselves about the
relationships between science and society, in ethics, business,
and law. They actually publish an undergraduate journal,
which--a couple of people from AAAS's staff serve on their
Board of Advisors. They also have a poster session at the AAAS
Annual Meeting.
And then there is a group called the Science and
Technology, or STGlobal Consortium, which is an association of
graduate students in these programs that I mentioned, which
also brings together people. They have a conference usually
here in Washington in collaboration between the AAAS and the
National Academies to provide an opportunity for younger people
to explore this field, get into it if they're interested, and
some of them do. We at AAAS have hired on our staff several
people who have been graduates of these programs--master's
degree graduates from these programs and some have been highly
successful and are really leaders, young leaders in the field.
So I am an advocate for this kind of education and I think
we are doing it. I think it would be useful for Congress, and
for Members of Congress, if they were aware of this, to provide,
I would say, moral support by speaking at their meetings and
having staff attend and so on. So I am--I will leave it at
that.
Chairman Lipinski. Well, thank you, Dr. Teich, and I had
a--when we were going out for votes, I was getting in the
elevator and someone who had been sitting in the audience
came up and thanked me for having this hearing on SciSIP,
and said, how do Members really become educated? How do you
have the time? And I said, it is very, very difficult and what
it really takes is a dedication to, you know, being educated,
because the incentives, other than really wanting to do a good
job and being interested in this topic, aren't there. It is
unfortunate.
But the good thing is that we do have staff who are well
educated in these things, and it leads me to thank the staff
for all the work that you do. All the staff on the
Science Committee do an excellent job, which helps us to do
a better job here, hopefully, on the Science and Technology
Committee--helps the Members do a better job. So I thank the
staff for all the work that they do.
With that I want to thank the witnesses for their
testimony. The record will remain open for two weeks for
additional statements from the Members and for answers to any
follow up questions the Committee may ask of the witnesses.
With that the witnesses are excused and the hearing is now
adjourned.
[Whereupon, at 4:23 p.m. the Subcommittee was adjourned.]
Appendix 1:
----------
Answers to Post-Hearing Questions
Responses by Dr. Julia Lane, Program Director of the Science of Science
and Innovation Policy Program, National Science Foundation
Questions submitted by Chairman Daniel Lipinski
Q1. You describe in your testimony an effort by NSF to improve upon
the way in which NSF interacts with its proposal and award portfolio.
Can you please elaborate on this effort? How might the new tools you
are developing be utilized in the development of future NSF budget
proposals, new programs or other aspects of policy development at NSF?
Also, can you please elaborate on the relevance of this effort to the
broader impact criterion?
A1. The SBE and CISE directorates have established a joint subcommittee
of their directorate advisory committees that is exploring new ways to
analyze and oversee NSF's portfolio of proposals and awards. The
subcommittee is developing a report that will be available to NSF
leadership and the broader community in November 2010. A particular
focus of the report will be identifying tools to help NSF program staff
better identify and support transformative and interdisciplinary
research and gauge the broader impacts of NSF's investments. The report
will also advise NSF on how to better structure existing data, improve
use of existing technologies to complement human expertise, and
characterize its programmatic data.
These new tools and resources have many potential uses in
establishing, justifying, and implementing budgetary priorities for
NSF. A principal aim is to assess the alignment of NSF's priorities
with emerging trends and opportunities in science and engineering
research and education and to assess NSF's impact on areas of national
priority. Other potential benefits include improving the efficiency of
NSF's core business processes by providing program officers with new
resources for managing the merit review process.
Appendix 2:
----------
Additional Material for the Record
Statement of the California Healthcare Institute (CHI) submitted by
Representative Brian P. Bilbray