[House Hearing, 107 Congress]
[From the U.S. Government Publishing Office]
E-RATE AND FILTERING: A REVIEW OF THE CHILDREN'S INTERNET PROTECTION
ACT
=======================================================================
HEARING
before the
SUBCOMMITTEE ON TELECOMMUNICATIONS AND THE INTERNET
of the
COMMITTEE ON ENERGY AND COMMERCE
HOUSE OF REPRESENTATIVES
ONE HUNDRED SEVENTH CONGRESS
FIRST SESSION
__________
APRIL 4, 2001
__________
Serial No. 107-33
__________
Printed for the use of the Committee on Energy and Commerce
Available via the World Wide Web: http://www.access.gpo.gov/congress/
house
__________
U.S. GOVERNMENT PRINTING OFFICE
72-836CC WASHINGTON : 2001
_______________________________________________________________________
For Sale by the Superintendent of Documents, U.S. Government Printing Office
Internet: bookstore.gpo.gov Phone (202) 512-1800 Fax: (202) 512-2250
Mail: Stop SSOP, Washington, DC 20402-0001
COMMITTEE ON ENERGY AND COMMERCE
W.J. ``BILLY'' TAUZIN, Louisiana, Chairman
MICHAEL BILIRAKIS, Florida JOHN D. DINGELL, Michigan
JOE BARTON, Texas HENRY A. WAXMAN, California
FRED UPTON, Michigan EDWARD J. MARKEY, Massachusetts
CLIFF STEARNS, Florida RALPH M. HALL, Texas
PAUL E. GILLMOR, Ohio RICK BOUCHER, Virginia
JAMES C. GREENWOOD, Pennsylvania EDOLPHUS TOWNS, New York
CHRISTOPHER COX, California FRANK PALLONE, Jr., New Jersey
NATHAN DEAL, Georgia SHERROD BROWN, Ohio
STEVE LARGENT, Oklahoma BART GORDON, Tennessee
RICHARD BURR, North Carolina PETER DEUTSCH, Florida
ED WHITFIELD, Kentucky BOBBY L. RUSH, Illinois
GREG GANSKE, Iowa ANNA G. ESHOO, California
CHARLIE NORWOOD, Georgia BART STUPAK, Michigan
BARBARA CUBIN, Wyoming ELIOT L. ENGEL, New York
JOHN SHIMKUS, Illinois TOM SAWYER, Ohio
HEATHER WILSON, New Mexico ALBERT R. WYNN, Maryland
JOHN B. SHADEGG, Arizona GENE GREEN, Texas
CHARLES ``CHIP'' PICKERING, KAREN McCARTHY, Missouri
Mississippi TED STRICKLAND, Ohio
VITO FOSSELLA, New York DIANA DeGETTE, Colorado
ROY BLUNT, Missouri THOMAS M. BARRETT, Wisconsin
TOM DAVIS, Virginia BILL LUTHER, Minnesota
ED BRYANT, Tennessee LOIS CAPPS, California
ROBERT L. EHRLICH, Jr., Maryland MICHAEL F. DOYLE, Pennsylvania
STEVE BUYER, Indiana CHRISTOPHER JOHN, Louisiana
GEORGE RADANOVICH, California JANE HARMAN, California
CHARLES F. BASS, New Hampshire
JOSEPH R. PITTS, Pennsylvania
MARY BONO, California
GREG WALDEN, Oregon
LEE TERRY, Nebraska
David V. Marventano, Staff Director
James D. Barnette, General Counsel
Reid P.F. Stuntz, Minority Staff Director and Chief Counsel
______
Subcommittee on Telecommunications and the Internet
FRED UPTON, Michigan, Chairman
MICHAEL BILIRAKIS, Florida EDWARD J. MARKEY, Massachusetts
JOE BARTON, Texas BART GORDON, Tennessee
CLIFF STEARNS, Florida BOBBY L. RUSH, Illinois
Vice Chairman ANNA G. ESHOO, California
PAUL E. GILLMOR, Ohio ELIOT L. ENGEL, New York
CHRISTOPHER COX, California GENE GREEN, Texas
NATHAN DEAL, Georgia KAREN McCARTHY, Missouri
STEVE LARGENT, Oklahoma BILL LUTHER, Minnesota
BARBARA CUBIN, Wyoming BART STUPAK, Michigan
JOHN SHIMKUS, Illinois DIANA DeGETTE, Colorado
HEATHER WILSON, New Mexico JANE HARMAN, California
CHARLES ``CHIP'' PICKERING, RICK BOUCHER, Virginia
Mississippi SHERROD BROWN, Ohio
VITO FOSSELLA, New York TOM SAWYER, Ohio
TOM DAVIS, Virginia JOHN D. DINGELL, Michigan,
ROY BLUNT, Missouri (Ex Officio)
ROBERT L. EHRLICH, Jr., Maryland
LEE TERRY, Nebraska
W.J. ``BILLY'' TAUZIN, Louisiana
(Ex Officio)
(ii)
C O N T E N T S
__________
Page
Testimony of:
Caywood, Carolyn A., Librarian, Bayside Area Library,
Virginia Beach Public Library.............................. 33
Getgood, Susan J., Vice President, Education Market,
SurfControl................................................ 38
Johnson, Marvin J., Legislative Counsel, American Civil
Liberties Union............................................ 9
Morgan, Laura G., Librarian, Chicago Public Library.......... 25
Ophus, Christian, President, FamilyConnect, Inc.............. 44
Taylor, Bruce A., President and Chief Counsel, National Law
Center for Children and Families........................... 20
Material submitted for the record by:
Johnson, Marvin J., Legislative Counsel, American Civil
Liberties Union:
Letter dated April 5, 2001, to Hon. Fred Upton........... 94
Letter dated April 5, 2001, to Hon. Edward Markey........ 94
(iii)
E-RATE AND FILTERING: A REVIEW OF THE CHILDREN'S INTERNET PROTECTION
ACT
----------
WEDNESDAY, APRIL 4, 2001
House of Representatives,
Committee on Energy and Commerce,
Subcommittee on Telecommunications
and the Internet,
Washington, DC.
The subcommittee met, pursuant to notice, at 10:05 a.m., in
room 2322, Rayburn House Office Building, Hon. Fred Upton
(chairman) presiding.
Members present: Representatives Upton, Stearns, Largent,
Shimkus, Pickering, Blunt, Terry, Markey, Green, McCarthy,
Luther, Harman, and Sawyer.
Staff present: Mike O'Rielly, majority professional staff;
Brendan Kelsay, minority counsel; and Yong Choe, legislative
clerk.
Mr. Upton. All right. We will start. Good morning. The
subcommittee will now come to order. Today's hearing is on the
Children's Internet Protection Act, otherwise known as ``CIPA''
or ``CHIPA.''
On April 20, CHIPA is to be implemented by the FCC, and
there have been several recent lawsuits filed in Federal Court
challenging the constitutionality of the law, and seeking to
block its implementation as it pertains to public libraries.
I support the goal of CHIPA. In my view, the taxpayers
should not be required to fund obscenity or child pornography,
or any means of accessing it. Nobody should be able to use
publicly funded library computers to access obscene pictures or
child pornography.
And libraries should be responsible for protecting children
from this material and other material which is harmful to them,
period. Under CHIPA, E-rate funding to public libraries and
schools will be conditioned upon their deployment of
technology, which will prevent children from accessing visual
depictions that are obscene, child pornography, or other visual
depictions that are otherwise harmful to minors.
Libraries and schools which do not comply will lose their
E-rate funding. Moreover, CHIPA requires schools and libraries
which receive E-rate funding to adopt and implement broad
Internet safety policies, which should address access by minors
to inappropriate matter on the Internet; the safety and security
of minors when using E-mail and chat rooms; hacking by minors;
unauthorized disclosure of personal and identifying information
regarding minors; and measures designed to restrict minors'
access to material that is harmful to them.
As the parent of two young kids who use the Internet, I
know well the wonderful educational opportunities which the
Internet brings to them. However, I also know the fear that all
parents have about their kids being unwittingly exposed to smut
on the Internet, particularly when parents may not be around,
like at the library and at the school.
Primarily at issue in today's hearing is the use of the
Internet in public libraries. Our public libraries are among
our communities most valuable assets. Unlike movie theaters and
video arcades, public libraries are supposed to be where
parents can send their kids to learn in an environment where
they have access to only safe and appropriate materials.
By and large I believe that our Nation's libraries take
very seriously their responsibilities to protect kids. For
example, I recently visited the Kalamazoo Public Library in my
district, and I know that they have a terrific computer
facility, complete with Internet access.
Through a system of user identification cards, acceptable
use rules, and computer screens which are all in one place,
where they can be seen by an effective monitoring staff, there
are very few incidents of inappropriate material being accessed
by library users.
Those limited few who break the rules get caught and get
their privileges yanked. The system as I have watched it, I
know is working well in Kalamazoo. Nevertheless, CHIPA is the
law, and the practical question is whether filtering and
blocking technologies are able to provide an optimal level of
protection for all of our Nation's public libraries,
particularly where library systems and staff monitors are not
as effective as they certainly are in Kalamazoo.
Among others, the ACLU and the American Library Association
have filed suit in Federal Court challenging the
constitutionality of the law as it pertains to libraries. I am
not a lawyer and so I won't venture a guess as to how the court
might come down.
However, as a parent, and a taxpayer, and a believer in
public libraries, and a supporter of the E-rate system which
helps them provide computers and Internet access to those who
might not otherwise have it, I believe we need to better
understand the legal and practical arguments on both sides of
the litigation, not to mention the promises and shortcomings
that filtering and blocking technologies represent at this
time.
I look forward to hearing from today's panel of witnesses.
I appreciate their willingness to help us to get to the bottom
of the matter, and I appreciate them being on time. I would
note that we have a number of subcommittees that are also
meeting on this day at this time, and I would ask for unanimous
consent that all Members' statements be included as part of the
record in their entirety.
[The prepared statement of Hon. Fred Upton follows:]
Prepared Statement of Hon. Fred Upton, Chairman, Subcommittee on
Telecommunications and the Internet
Good morning. Today's hearing is on the Children's Internet
Protection Act, otherwise known as CHIPA. On April 20, CHIPA is to be
implemented by the FCC, but there have been several recent lawsuits
filed in federal court challenging the constitutionality of the law and
seeking to block its implementation as it pertains to public libraries.
I support the goal of CHIPA. In my view, the taxpayers should not
be required to fund obscenity or child pornography or any means of
accessing it; nobody should be able to use publicly funded library
computers to access obscene pictures or child pornography; and
libraries should be responsible for protecting children from this
material and other material which is harmful to them. Period.
Under CHIPA, e-rate funding to public libraries and schools will be
conditioned upon their deployment of technology which will prevent
children from accessing visual depictions that are obscene, child
pornography, or visual depictions that are otherwise harmful to minors.
Libraries and schools which do not comply will lose their e-rate
funding. Moreover, CHIPA requires schools and libraries which receive
e-rate funding to adopt and implement broad Internet safety policies,
which should address access by minors to inappropriate matter on the
Internet; the safety and security of minors when using e-mail and chat
rooms; hacking by minors; unauthorized disclosure of personal
identifying information regarding minors; and measures designed to
restrict minors' access to materials harmful to them.
As the parent of two young children who use the Internet, I know
well the wonderful educational opportunities which the Internet brings
to our kids. However, I also know the fear that all parents have about
their kids being unwittingly exposed to smut on the Internet--
particularly where parents might not be around, like at the library and
at school.
Primarily at issue in today's hearing is the use of the Internet in
public libraries. Our public libraries are among our communities' most
valuable assets. Unlike movie theaters and video arcades, public
libraries are supposed to be where parents can send their kids to learn
in an environment where they have access to only safe and appropriate
materials.
By and large, I believe that our nation's libraries take very
seriously their responsibilities to protect kids. For example, take the
Kalamazoo Public Library in my district. I recently visited and found
that they have a terrific computer facility, complete with Internet
access. Through a system of user identification cards, acceptable use
rules, and computer screens which are all in one place where they can
be seen by an effective monitoring staff, there are extremely few
incidents of inappropriate material being accessed by library users.
Those limited few who break the rules get caught and get their
privileges yanked. This system appears to be working well in Kalamazoo.
Nevertheless, CHIPA is the law, and the practical question is
whether filtering and blocking technologies are able to provide an
optimal level of protection for all of our nation's public libraries,
particularly where library systems and staff monitors are not as
effective as they appear to be in places like Kalamazoo.
Among others, the ACLU and the American Library Association have
filed suit in federal court, challenging the constitutionality of the
law as it pertains to libraries. I am not a lawyer, so I won't venture
a guess as to how the court might come down. However, as a parent, a
believer in public libraries, and a supporter of the e-rate system
which helps them provide computers and Internet access to those who
might not otherwise have it--I believe we need to better understand the
legal and practical arguments on both sides of the litigation, not to
mention what promises and shortcomings filtering and blocking
technologies represent at this time.
I look forward to hearing from today's witnesses, and I appreciate
their willingness to help us get to the bottom of this matter.
Mr. Upton. Without objection, I recognize my friend and
colleague from California, Ms. Harman, for an opening
statement.
Ms. Harman. Thank you, Mr. Chairman. I am proud to be on
time, and I am also pleased that you are having this hearing,
because I think that this is a difficult and important subject
for us to address. I should tell the panel and this
Subcommittee that several Congresses ago I voted for the V-
Chip.
I voted for the V-Chip because as a parent of four children
myself, an overworked parent of four children myself--and that
probably applies to most of the people about to testify, and to
you, too, Mr. Chairman--I wanted to have technology that
enabled me as a parent to make better choices for my minor
children.
That's why I voted for the V-Chip, and I think that is the
opportunity the V-Chip gives us. On the other hand, I did not
serve in the last Congress, and so I believe I have never voted
on CHIPA. I know that I didn't in the last Congress, but I
don't think it came up in any other form before that.
I may be wrong, but at any rate, I would have had more
doubts about CHIPA than I did about the V-Chip. I would doubt
both its constitutionality and its wisdom. As for its contrast
with the V-Chip, the V-Chip gives parents choice. CHIPA does
not.
CHIPA mandates. It is a government mandate that librarians
must do things or forego Federal funds. That is not giving
parents choice. That is the government choosing. So in that
sense, there is a contrast.
Second, in constitutional terms, I think as many are
arguing about the Campaign Finance Reform Bill that there are
serious issues when the government decides what expression will
be permitted, and what expression won't be permitted. So I
think there are constitutional issues there.
I would note further that it is not just the ACLU that is
suing. As much respect as I have for the ACLU, and I do, it is
also the American Library Association that is bringing suit
here because I know that librarians--I have heard from many in
my district--have serious concerns again about the government
telling them how to handle minor access to pornographic
materials in their libraries.
My conclusion, at least at the start of this hearing, is
that there are serious constitutional issues here, that
government should be more careful, I believe, in striking the
balance that we need to strike, and that my goal is to give
parents choice about what their minor children view on the Net,
and, in loco parentis, to give librarians who serve local
communities choice about how to administer the Internet sites
that our children are seeing in the public libraries.
So I approach this material in a dubious fashion. I am very
interested to see what our witnesses say. I share your goal,
Mr. Chairman, that we as parents, and that we as
representatives of our districts, should do everything that we
can to provide tools for responsible adults to help our
children make wise choices.
But I am not sure that those tools should be mandated by
government. Thank you very much. I yield back the balance of my
time.
Mr. Upton. Thank you. I would note that I was also a
supporter of the V-Chip on the House floor several years ago,
and as I understand it, we never had a separate vote on the
CHIPA amendment, either in Committee or on the House floor.
It was rolled in as part of the Labor-HHS Appropriation
Bill and signed by President Clinton last year.
I recognize for an opening statement Mr. Blunt from
Missouri.
Mr. Blunt. Thank you, Mr. Chairman, and thank you for
having this hearing. On this issue, like you and Ms. Harman, I
feel that there are certainly some good points on both sides.
We need to be sure that we don't either
solve the wrong problem, or come up with the wrong solution, or
create a bigger problem than we solve here.
But I think that is the reason that we have these hearings.
This is not the final--for our witnesses, this is not the final
committee action on a bill. This is truly an opportunity to ask
questions and get information on a topic that we all have
concerns about, and that we all want to see solved in the right
way.
And we respect the individuals here who have different
points of view on the way that we need to address this as a
Committee, and as a Congress, and I look forward to being part
of the hearing, and reading the transcript on the hearing if I
am not able to stay for all of it. So thank you, Mr. Chairman,
for having this hearing today.
Mr. Upton. Mr. Sawyer from Ohio.
Mr. Sawyer. Thank you, Mr. Chairman. I would associate
myself with the comments of my colleagues. Thank you for having
this hearing. We are on the threshold of a time when libraries
and schools are changing their role in a way that elevates the
skill level of an entire Nation, and exposes Americans to a
breadth of the world that is just breathtaking.
Making sure that we do that in a way that does not stand in
the way of that access is enormously important, and with that,
I will yield back the balance of my time, and look forward to
the comments of our witnesses.
Mr. Upton. Well, thank you. Our panel today includes Mr.
Bruce Taylor, President and Chief Counsel of the National Law
Center for Children and Families; Mr. Marvin Johnson,
Legislative Counsel, of the American Civil Liberties Union,
ACLU; Ms. Laura Morgan, a Librarian, from the Chicago Public
Library; Ms. Carolyn Caywood, a Librarian from the Virginia
Beach Public Library, Bayside Area Library; Ms. Susan Getgood,
Vice President of the Education Market, SurfControl; and Mr.
Chris Ophus, President of FamilyConnect, Inc.
I appreciate all of you getting your statements, which are
made part of the record in their entirety, in advance. And
since the vote has not started as the Cloakroom promised, we
will start with Mr. Taylor's testimony.
We are going to have a clock on you up here for about 5
minutes. So you will notice these little lights and buzzers,
and everything else. You have got 5 minutes to proceed, and all
of your statements are made as part of the record in their
entirety.
Mr. Taylor. Thank you, Mr. Chairman, and Members of the
Committee. The National Law Center----
Mr. Upton. Since the vote has started, we are going to have
to break this up anyway, I think we may adjourn. Is this going
to be one vote or two?
Mr. Blunt. I do not know.
Mr. Upton. Is it going to be two votes? My guess is that it
is going to be two. Well, at this point, since Mr. Pickering
came, we will allow Mr. Pickering, who is one of the architects
of the CHIPA bill, to make an opening statement.
At that point, we will adjourn for about 15 minutes, and I
will do my best to round up some Members to come back and we
will start with you, Mr. Taylor, if that is okay. Maybe Mr.
Luther has an opening statement as well. Mr. Pickering.
Mr. Pickering. Mr. Chairman, I would just like to thank you
for holding this hearing. I look forward to hearing from the
panel as we ask questions and as we see CHIPA, the Children's
Internet Protection Act, implemented, and it is soon to be
implemented.
And hopefully we can find some common ground, but if not,
hopefully we can establish the record that this is a common
sense, mainstream, constitutional way to protect our children
from child predators, from obscenity, from child pornography,
that which is already illegal.
We believe that the language and the legislation was very
well crafted, taking lessons from recent communications efforts
to restrict this type of material that were ruled
unconstitutional in the Courts.
We believe that we avoided those pitfalls and those
problems by the way that we crafted the language. This is an
issue of funding, and it is an issue of child safety. And just
as we give incentives to States to have blood alcohol limits or
seat belt restraints for the safety of the public, we believe
that for the safety of our children, we should prevent that
which is already illegal--child pornography and obscenity--from
being accessible through our schools and through our libraries
with Federal subsidies.
And we believe that this is a very mainstream, common
sense, approach, and that the agenda of the other side who
opposes is out of the mainstream, and it is extreme. It would
put our children at risk. So I look forward to the testimony
today and the questions as we establish a record in this
regard.
[The prepared statement of Hon. Chip Pickering follows:]
Prepared Statement of Hon. Chip Pickering, a Representative in Congress
from the State of Mississippi
Mr. Chairman, I appreciate your holding this hearing today. I
believe you have given us the opportunity to expose the myths and
distortions of this legislation that it has been subjected to by its
opponents.
Throughout this hearing today we will hear several common arguments
by those who support federally funded access to child pornography and
obscenity, and let there be no mistake that this is the bottom line in
this debate.
Opponents of CIPA have made 5 basic arguments and I would like to
take a minute to refute their charges.
1. CIPA is constitutional because the conditions imposed on public
libraries for receiving federal funds for Internet access are
``reasonably calculated to promote the general welfare'' and are
``related to a national concern.'' Congress has the authority and
responsibility to ensure that federal funds are not used by government
agencies (public schools and libraries) to provide access to pornography
that is illegal under federal law, i.e., obscenity (18 U.S.C.
Sec. Sec. 1462, 1465), child pornography (18 U.S.C. Sec. 2252 et seq.)
and that which is illegal under most state laws, material harmful to
minors displayed or distributed to minors. CIPA also promotes the
national interest by encouraging advancements in software filtering
technology.
The Supreme Court upheld a federal regulation that directed the
Secretary of Transportation to withhold a percentage of otherwise
allocable federal highway funds from States ``in which the purchase or
public possession . . . of any alcoholic beverage by a person who is
less than 21 years of age is lawful.'' The Court held: ``Incident to
the spending power, Congress may attach conditions on the receipt of
federal funds. However, exercise of the power is subject to certain
restrictions, including that it must be in pursuit of ``the general
welfare.'' Sec. 158 is consistent with such restriction, since the
means chosen by Congress to address a dangerous situation--the
interstate problem resulting from the incentive, created by differing
state drinking ages, for young persons to combine drinking and
driving--were reasonably calculated to advance the general welfare.''
South Dakota v. Dole, 483 U.S. 203 (1987).
CIPA does not require all public libraries and schools to use
filtering software, only those that accept particular federal funds for
Internet access. The government has no duty to fund access to illegal
pornography on the Internet, especially in government agencies (public
schools and libraries). In Kreimer v. Bureau of Police for Town of
Morristown, 958 F.2d 1242, 1256 (3rd Cir. 1992), the court held: ``The
State, no less than a private owner of property, has the power to
preserve the property under its control for the use to which it is
lawfully dedicated.'' [Citing Perry Education Association v. Perry
Local Educators' Association, 460 U.S. 37, 44 (1983)].
CIPA is not viewpoint discrimination; it has nothing to do with
disagreement with the speaker's view. Rosenberger v. Rector & Visitors
of Univ. of Virginia, 515 U.S. 819, 829 (1995). The Supreme Court has
consistently recognized that the government may allocate funding
according to criteria that would not be permissible in enacting a
direct regulation.
In National Endowment for the Arts v. Finley, 118 S. Ct. 2168
(1998), the Court held that, ``the Government may allocate competitive
funding according to criteria that would be impermissible were direct
regulation of speech or a criminal penalty at stake.'' Id. at 2179.
``It is preposterous to equate the denial of taxpayer subsidy with
measures aimed at the suppression of dangerous ideas.'' Regan v.
Taxation with Representation, 461 U.S. 540, 550 (1983). ``The
Government can, without violating the Constitution, selectively fund a
program to encourage certain activities it believes to be in the public
interest, without at the same time funding an alternative program.''
Rust v. Sullivan, 500 U.S. 173, 193 (1991).
2. CIPA advances legitimate local library decisions. CIPA permits
local library officials to determine which software filter they will
use, and to set their own Internet policy. Federal funds may be used to
cover costs of filtering. CIPA permits a public library official to
disable the filter for bona fide research or other legal use by an
adult. Local officials have the right to oversee the filtering
technology to make certain that it complies with CIPA and their policy.
CIPA will assist local libraries to avoid sexual harassment and hostile
work environment complaints caused by the presence of Internet
pornography, such as has occurred in the Minneapolis and Chicago public
libraries. ``A school library, no less than any other public library,
is a ``place dedicated to quiet, to knowledge, and to beauty.'' Brown
v. Louisiana, 383 U.S. 131, 142 (1966) (J. Fortas). It is inconsistent
with the purpose of a public library to provide a peep show open to
children and funded by Congress.
3. CIPA will assist parents in poor communities to protect their
children from pornography while permitting safe and rewarding Internet
access in public libraries. It is much more likely that most parents
will not permit their children to use unfiltered Internet access.
Furthermore, parents who are able to provide filtered Internet access
in their home will be able to protect their children, while poor
children, dependent upon library Internet access, will not have the
same protection. The true ``digital divide'' is between protected
children and unprotected children who are exposed to pornography and
pedophiles in libraries with unfiltered Internet access. In Ginsberg v.
New York, 390 U.S. 629 (1968), the Court recognized that parents have a
right to expect the government to aid them in protecting their children
from pornography: ``While the supervision of children's reading may
best be left to their parents, the knowledge that parental control or
guidance cannot always be provided and society's transcendent interest
in protecting the welfare of children justify reasonable regulation of
the sale of material to them. It is, therefore, altogether fitting and
proper for a state to include in a statute designed to regulate the
sale of pornography to children special standards, broader than those
embodied in legislation aimed at controlling dissemination of such
materials to adults.''
4. CIPA provides security--not a false sense of security. A library
should inform the public whether the Internet access provided is
filtered or unfiltered. If filtered, the library should also inform
users that filters are not 100 percent effective in blocking
pornography. Filters are like the safety equipment on cars, e.g., the
brakes, seat belts, and headlights. We do not require 100 percent
effectiveness by any safety equipment before we use it. While we
provide children with driver's education and adult supervision, we do
not permit children to drive cars without safety equipment and expect
them to navigate safely on roads without traffic controls, speed limits
and law enforcement officers.
In the past two years, use of software filtering by public
libraries has increased 121 percent. A survey published in School
Library Journal, April-May 2000, reveals that 90 percent of public
school librarians and public librarians are either ``very well'' or
``somewhat well satisfied'' with filtering software. A February 2000
survey conducted by National Public Radio, the Kaiser Foundation and
the Kennedy School of Government revealed that 84 percent of Americans
are worried about children online accessing pornography. Seventy-five
percent want government to do something about it. Congress did so in
CIPA. Once again, I thank you for holding this hearing and look forward
to hearing from the witnesses.
Mr. Upton. Thank you. Mr. Luther, from Minnesota.
Mr. Luther. Mr. Chairman, thank you. I will submit my
opening statement for the record.
[Additional statements submitted for the record follow:]
Prepared Statement of Hon. W.J. ``Billy'' Tauzin, Chairman, Committee
on Energy and Commerce
I thank Subcommittee Chairman Upton for calling this hearing. This
is a timely hearing given the upcoming FCC final rules and the recently
filed court cases.
Today's hearing focuses on the Children's Internet Protection Act
(``CIPA'' or ``CHIPA'') that was enacted as part of the final spending
bill at the closing days of the 106th Congress. It is an effort to
address one of the downsides of the Internet--the availability of
obscene and illegal material over the Internet. For all of the benefits
of the Internet, and we know there are many, it is clear that some
depraved individuals are using the new technologies in a harmful
and corrupting manner. CHIPA is designed as a condition on receiving
federal funds. This is unlike past attempts by Congress to address the
availability of such material, which enacted straight bans or imposed
access requirements.
I think most people agree that the Internet is an amazing
technological innovation. It has essentially created a whole new medium
for communicating and conducting business. We can see vast benefits of
the Internet almost every day. The Internet has essentially turned
everyone and every computer into their own printing press. It has also
dramatically lowered the cost of doing business and reaching new
markets.
We, as policymakers, should ensure that we cause the Internet no
harm as it develops from its infancy to adulthood. We have an
obligation to shepherd the medium as it grows in age and maturity.
Recently, Internet stocks have behaved like a child going through the
terrible twos. While it seems rough now, this will pass, and
experienced, well thought-out business plans can and will succeed in
the marketplace.
However, just because an activity is occurring over the Internet
does not necessarily mean that it is untouchable. Clearly, there is
also a dark side to the Internet. Some people are using the medium to
illegally transport material including child pornography and material
that is harmful to minors. This type of material is not protected by
the First Amendment and traffickers should be prosecuted. Last
Congress, we held a hearing on enforcement, or lack of enforcement,
efforts by the Department of Justice. I am hopeful that the new
Administration will actively pursue violators. I want to acknowledge
the leadership of Congressman Pickering and Congressman Largent on this
important matter.
In terms of CHIPA, while I understand the complaints filed by the
ACLU and the American Library Association, I think it best not to
comment on these court cases. CHIPA does include an accelerated court
review process of the law, including an automatic referral to the
Supreme Court. This should help minimize uncertainty for parents,
schools, libraries and others. I also note that the cases focus on the
funding restrictions on libraries contained in CHIPA and not the
restrictions on funding for schools. Let me repeat, the schools portion
of the E-rate program is not being challenged at this time. America's
schools should proceed with the process of preparing to comply with the
parameters of the law.
CHIPA also includes a provision requiring NTIA to conduct a study
of filtering and blocking technologies to determine whether they meet
the needs of educational institutions. The findings of this study are
not due for some time but I am hopeful that NTIA can provide a
preliminary report on its findings and recommendations. We could use a
clearer picture of the effectiveness of filtering or blocking
technologies.
Furthermore, America's libraries are clearly not doing enough.
Unsupervised Internet access has the potential to turn schools and
libraries into modern day pornography shops. Many libraries and
supporting communities have taken positive steps to protect the
education and community setting of their libraries. I commend these
libraries for having the foresight to understand the need to protect
their members, especially the children. I call on all libraries
to follow suit and address a prevalent problem, which accompanies the
low cost of Internet access.
Again, I thank the Subcommittee Chairman and look forward to the
testimony of the witnesses.
______
Prepared Statement of Hon. Gene Green, a Representative in Congress
from the State of Texas
Mr. Chairman: I want to commend you for holding this important
hearing today to get a better understanding of the recently
passed Children's Internet Protection Act (CHIP Act).
Last year's decision by our colleague in the Senate to include this
legislation in the Consolidated Appropriations Act of 2001 was ill-
timed and unwise.
This legislation was enacted without any significant hearings or
public input and has now placed our schools and public libraries in a
delicate legal position.
Once again Congress, in its rush to protect children from online
smut, has over regulated the issue.
Although I support the principles of the Chip Act as it applies to
schools, my support is based on the fact that it is illegal under just
about any circumstances for a minor of any age to access any type of
pornographic material.
Schools can exercise a greater level of control over student
viewing habits because most of the students are minors.
Trying to regulate content available over the Internet to adults at
taxpayer funded public libraries once again sets up a new round of
litigation covering the First Amendment.
In addition, it forces librarians into the role of judging what
material is simply pornographic and what is obscene.
Although I do not differentiate between pornographic and obscene
material (I think it is all disgusting), clearly the courts do see a
difference.
Under the CHIP Act, schools and libraries that receive federal E-Rate
monies or Library Services Act funding face the daunting challenge of
trying to filter Internet sites for content.
Nowhere in the legislation did I see any funding increases to
schools or school districts to hire the additional technical personnel
needed to manage the Internet filtering or to fill out the new reports
required under the legislation.
Aside from the lack of funding, if the legislation had stuck to
schools and not libraries, we might not be facing the current round of
litigation over whether the legislation violates the First Amendment.
I do not want children of any age to have access to pornographic or
obscene material whether at school or the library.
But when we start trying to regulate what adults can view at a
publicly funded library, I question the wisdom of the legislation.
We are now asking our librarians to police the Internet and to make
subjective content decisions that only a court can determine.
On top of that, we have imposed what I consider draconian reporting
and compliance measures that will discourage use of the E-rate.
In reviewing the witness testimony, I can see a lot of the same
concerns being echoed by the panelists.
I was encouraged to see that Ms. Caywood has what appears to be a
compromise solution to this problem.
Breaking Internet access into layers of filtering, but retaining
computers that have no filtering software seems to me to be a workable
solution.
In addition, providing a physical privacy shield to the unfiltered
computers prevents anyone other than the user from seeing the material
being viewed.
These steps do not limit free speech or place librarians in the
position of having to judge content.
Every time Congress tries to legislate morality, no matter how
worthy the issue, it seems we take it one step too far.
This legislation has strapped our schools and libraries with a huge
unfunded mandate and has made teachers and librarians cops of the
Internet.
I am sure this legislation is going to be litigated extensively,
but I am equally sure that the states will be coming to us to pay for
the related compliance and reporting requirements.
Mr. Chairman, I look forward to questioning the witnesses and I
yield back the balance of my time.
Mr. Upton. Okay. Since the vote is on, we will adjourn
until about 10:35 or 10:40.
[Brief recess]
Mr. Upton. We have about an hour until the next vote on the
floor. That will be two votes in a row. So at this point, we
will start with Mr. Johnson, and we will come back to Mr.
Taylor when he comes back. Mr. Johnson, welcome.
STATEMENT OF MARVIN J. JOHNSON, LEGISLATIVE COUNSEL, AMERICAN
CIVIL LIBERTIES UNION
Mr. Johnson. Thank you, Mr. Chairman. Mr. Chairman, and
Members of the Committee, I thank you for this opportunity to
testify regarding the effectiveness of the Children's Internet
Protection Act or CHIPA.
CHIPA requires that public libraries and schools implement
mandatory blocking of obscenity, child pornography, and
material harmful to minors, in those facilities receiving
specified Federal funds.
CHIPA does not just block information for children,
however. It also blocks information for adults. Adults can only
get unblocked access if they ask for permission from a
librarian, and they convince the librarian that they have a
bona fide research purpose or other lawful purpose, whatever
that may mean.
Anyone who may want to research something that is going to
be sensitive--for instance, health information--may be deterred
from seeking this permission, or they will be forced to lie.
The end result is a dumbing down of the Internet and the
information available through the Internet in public libraries.
Now, we all want to protect our children. However, in doing
so, we have to be careful not to throw out the baby with the
bathwater. Unfortunately, CHIPA not only throws out the baby
and the bathwater, but it throws out the bathtub and the house
as well.
CHIPA makes about as much sense as a law requiring a
stranger to randomly pull books off shelves and refuse to tell
librarians or patrons which books are gone. CHIPA is anomalous
given the fact that Congress appointed a panel of experts to
study ways to protect children on the Internet, and then
pointedly ignored those findings in enacting CHIPA.
In October 1998, Congress appointed the Child On-Line
Protection Act Commission, or COPA Commission, and charged it
with identifying technological or other methods that would help
reduce access by minors to material that is harmful to minors
on the Internet.
In October of 2000, the Commission reported that blocking
technology raises First Amendment concerns because of its
potential to be over-inclusive in blocking content. Concerns
are increased because the extent of blocking is often unclear
and not disclosed, and may not be based on parental choices.
The Commission specifically did not recommend any mandatory
blocking technologies. Congress, nonetheless, chose to ignore
those recommendations and adopted CHIPA. Now, CHIPA is
destined to be ineffective when it is implemented because
technology protection measures do not work.
First of all, there is just too much information available
to be able to index it and retrieve it. The web is estimated to
have over 1.5 billion pages, and by the end of 2001, to have
between 3 and 5 billion pages of information available.
They grow at a rate of approximately 200 million pages, or
2 million pages, excuse me, per day. The sheer amount of
information and the fact that that information constantly
changes makes it impossible to review and index all of that
information.
Second, the problem is under-blocking, and under-blocking
means that it does not block all of the so-called objectionable
material that it is intended to block.
For example, one software package was tested for under-
blocking, and hundreds of pornographic websites were not
blocked by the software.
Examples included 069Palace.com, HotAsianFoxes.com, and
Organism.com. Blocking therefore just provides a false sense of
security for parents who believe that their children are being
protected when in fact they are not.
The third problem with blocking is that it over-blocks, and
that means that it blocks information that is not
objectionable. Last year during the election cycle, numerous
political websites were blocked, including those of Representative Lloyd
Doggett of Texas; Representative Jim Ryun of Kansas; and House
Majority Leader Dick Armey.
From this subcommittee, Ranking Member Markey found his
site blocked because it was characterized as hate, illegal
pornography, and/or violence. In March of this year, Consumer
Reports found that the blocking software is generally
ineffective, both because of the under-blocking and the
overblocking.
The fourth reason is that technology is inexact, and so
what it leads to is a significant constitutional problem
because of both the under and the overblocking. Thus, not only
will this technology not work, but the Act will be stricken as
unconstitutional.
There are less restrictive ways for Congress and libraries,
and particularly libraries, to be able to protect children when
they use the Internet, and many libraries are using these now.
For example, one is to use library web pages. They have
their own web pages where they have reviewed the information,
and they review the accuracy and adequacy of that information,
and then they put that on their web pages, and that helps guide
people away from possibly objectionable material.
And it makes sure that they get the best information
possible on the Internet. Second, educational programs also are
useful to educate parents and children, and last, Internet use
policies are also useful as well. In conclusion, Mr. Chairman,
we can find ways to protect our kids and honor the Constitution
at the same time. We don't cherish our children by destroying
the First Amendment Rights that are their legacy.
[The prepared statement of Marvin J. Johnson follows:]
Prepared Statement of Marvin J. Johnson, Legislative Counsel, American
Civil Liberties Union
Mr. Chairman, and members of the Committee: I am Marvin J. Johnson,
Legislative Counsel for the American Civil Liberties Union.
I appreciate the opportunity to testify before you today about the
Children's Internet Protection Act (CHIPA) on behalf of the American
Civil Liberties Union. The ACLU is a nation-wide, non-partisan
organization of more than 275,000 members devoted to protecting the
principles of freedom set forth in the Bill of Rights and the
Constitution.
The hearing today is to determine the effectiveness of the
Children's Internet Protection Act. CHIPA was signed into law on
December 21, 2000. It will become effective on April 20, 2001.
Sec. 1712(b) (to be codified at 20 U.S.C. Sec. 9134); Sec. 1721(h) (to
be codified at 47 U.S.C. Sec. 254(h)). CHIPA requires that public
libraries receiving e-rate discounts or funds under the Library
Services Technology Act (LSTA) implement and enforce technology
protection measures to block obscenity, child pornography and material
harmful to minors.
Under the e-rate provisions, libraries that do not timely certify
their compliance become ineligible for further e-rate discounts. Where
the library knowingly fails to ensure compliance, it may be required to
reimburse any discounts received for the period covered by the
certification. Libraries receiving LSTA funds are not required to
reimburse the government in the event they fail to comply with CHIPA.
CHIPA's restrictions are not limited to library Internet access
supported only by the federal e-rate and LSTA programs. Both the e-rate
restrictions in Section 1721(b) and the LSTA restrictions in Section
1712 require libraries to certify that technology protection measures
are in place on ``any of its computers with Internet access'' and
``during any use of such computers.'' Sec. 1721(b) (to be codified at
47 U.S.C. Sec. 254(h)(6)(C)(i)-(ii)); Sec. 1712.15 (to be codified at
20 U.S.C. Sec. 9134(f)(1)(B)(i)-(ii)) [Emphasis added]. A library
subject to CHIPA must install and enforce the operation of technology
protection measures on all of its computers with Internet access even
if the library purchased the computers or paid for Internet access with
money that is not from federal programs.
While CHIPA is not yet in effect, it will be ineffective. There is
no reliable way to block out all objectionable material, so any
technological protection measure will be ineffective in removing that
material from view. Furthermore, all of the current technological
protection measures block significant amounts of material that deserve
constitutional protection. This overbreadth is one of the reasons CHIPA
is unconstitutional.
technology protection measures do not work
CHIPA will be ineffective because no available technology can
implement its mandate.
CHIPA defines a ``technology protection measure'' as ``a specific
technology that blocks or filters Internet access to the material
covered by a certification.'' 47 U.S.C. Sec. 254(h)(6)(H). CHIPA
requires blocking of material that is obscene, child pornography, or
harmful to minors. It is not possible to create a technology protection
measure that blocks access only to material that is ``obscene,''
``child pornography,'' or ``harmful to minors'' as defined by CHIPA, or
that blocks access to all material that meets those definitions.
In order to understand the reason these technological protection
measures are destined to fail, one must understand the nature of the
technology.
The World Wide Web is now estimated to contain over 1.5 billion
pages. It continues to grow and change at a geometric rate. Thus, there
is a massive amount of information to catalog, and that information
continues to change and grow every day.
Private companies produce technology that is designed to block
access to particular content on the web. The technology is commonly
referred to as ``blocking software'' or ``blocking programs.'' These
programs are computer software that is designed to block content on the
Internet that would otherwise be available to all Internet users.
Vendors of this software establish criteria to identify specific
categories of speech on the Internet. They then configure the software
to block web pages containing those categories of speech. Some programs
block as few as six categories, while others block up to twenty-nine or
more categories. These categories may include hate speech, criminal
activity, sexually explicit speech, ``adult'' speech, violent speech or
speech using specific disfavored words. Some of the blocked categories
express disapproval of a particular viewpoint, such as a category that
blocks all information about ``alternative'' lifestyles including
homosexuality.
The terms ``obscenity,'' ``child pornography'' and ``harmful to
minors'' as used in CHIPA are legal terms. None of the current vendors
of blocking technology claim to block categories that meet these legal
definitions, nor do they employ attorneys or judges to make those
determinations. Leaving decisions of what constitutes obscenity, child
pornography and material harmful to minors up to legally untrained
persons leads to more information being blocked than is legally
permissible.
Once blocking program vendors establish the criteria for
information they intend to block, they establish a method of
identifying the web pages that meet those criteria. Generally, they
conduct automated searches based on words or strings of words, similar
to searches done by standard search engines. Web pages are usually
blocked in their entirety if any content on the web page fits the
vendors' content categories, regardless of whether the content on the
page is textual, visual, or both.
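The page-level mechanics just described can be illustrated with a minimal sketch. The keyword list below is hypothetical (as noted later in this statement, vendors treat their actual lists as trade secrets), but the logic is the same: a page is blocked in its entirety if any word of its text matches a category keyword, while pages whose objectionable content is purely visual pass through unexamined.

```python
# Hypothetical category keyword list; real vendors keep theirs secret.
BLOCKED_KEYWORDS = {"sex", "breast", "xxx"}

def is_blocked(page_text):
    # Block the whole page if any keyword appears within any word of its
    # text, regardless of context. Images are never examined at all.
    words = page_text.lower().split()
    return any(key in word for word in words for key in BLOCKED_KEYWORDS)

# Over-blocking: a medical page is swept up with pornography...
is_blocked("Early detection of breast cancer saves lives")  # True
# ...while under-blocking lets an image-only page through.
is_blocked("An image-only page with no matching caption")   # False
```

This toy version makes both failure modes visible at once: the test is too broad for text (context is ignored) and too narrow for everything else (only text is tested).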
No technology currently available allows vendors to conduct
automated searches for visual images that fit their content categories,
or that are communicated through email, chat, or online discussion
groups. As a result, any implementation of this technology is under-
inclusive, allowing access to material that CHIPA intends to block.
After using this technology to identify web sites to block, the
blocking program vendors add these pages to a master list of web pages
to block (``blocked sites list''). Some vendors claim to have employees
review individual web sites before adding them to the blocked site
list. These employees, however, are not lawyers or judges, and receive
no legal training. There is a great deal of employee turnover in these
jobs. As a result, untrained employees are making what are essentially
legal decisions and excluding constitutionally protected material.
An operational blocking program then blocks users from accessing
web pages on the program's blocked sites list. Vendors normally treat
their blocked sites list as a trade secret, and refuse to reveal this
information to their customers, prospective customers, or to the
public.
Two blocking techniques can be used by program vendors to block
access to email, chat, and online discussion groups. First, the
blocking programs may block access to all email, chat, and online
discussion groups. Second, the programs may selectively block out
particular words communicated through email, chat, or discussion
groups. For example, the programs may replace supposedly objectionable
words with ``xxx'' regardless of the context in which the word was
used. Hence Marc Rotenberg's 1 blocked version of the First
Amendment: ``Congress shall make no law abridging the freedom of
sXXXch, or the right of the people peaceably to XXXemble, and to
peXXXion the government for a redress of grievances.''
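Mr. Rotenberg's example corresponds to a simple substring substitution of the kind described above. The following sketch is illustrative only; the three blocked strings are inferred from the example and are hypothetical, not taken from any actual product.

```python
# Hypothetical blocked-string list inferred from the example above.
BLOCKED_STRINGS = ["pee", "ass", "tit"]

def censor(text):
    # Replace each blocked string with "XXX" wherever it occurs,
    # with no regard for the surrounding word or its context.
    for s in BLOCKED_STRINGS:
        text = text.replace(s, "XXX")
    return text

censor("the freedom of speech")    # 'the freedom of sXXXch'
censor("peaceably to assemble")    # 'peaceably to XXXemble'
censor("petition the government")  # 'peXXXion the government'
```

Because the substitution operates on raw character strings rather than words or meanings, entirely innocent words are mangled whenever they happen to contain a blocked string.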
Because of the way these blocking programs work, they inherently
rely upon the exercise of subjective human judgment by the vendor to
decide what is objectionable and what is not. The vendor, rather than
librarians, other government officials, adult patrons, or parents
decide what gets placed on the ``blocked sites'' list.
Furthermore, because of the massive amounts of information
available on the web, and its constantly changing content, no company
can keep up with all the information or changes. It is estimated that
even the most sophisticated search techniques find less than 20% of the
web. Therefore, the idea that blocking technology will block out all of
the objectionable information on the web is an impossibility. Although
blocking program vendors provide updates to their blocked sites list,
it is impossible for them to find all of the content on the Internet
that meets their criteria, or to keep up with the rapidly increasing
and changing content available.
In March, 2001, Consumer Reports tested blocking software, and
found that most failed to block at least 20% of objectionable material.
Consumer Reports, March 1, 2001, ``Digital Chaperones for Kids,'' found
at http://www.consumerreports.org/Special/ConsumerInterest/Reports/
0103fil0.html
Not only does blocking software fail to block all material meeting
the legal definitions of ``obscenity,'' ``child pornography'' and
material ``harmful to minors,'' it also blocks much material which is
not objectionable, and protected under the First Amendment. Because of
this overbreadth, CHIPA will be found unconstitutional, and therefore,
ineffective.
The federal government and others have repeatedly documented the
failures and flaws of blocking programs. The United States Attorney
General has said that blocking programs inescapably fail to block
objectionable speech because they are unable to screen for images.
Brief for the Appellants, Reno v. ACLU, No. 96-511 (January 1997) at
40-41. Congress itself has repeatedly noted these flaws. A House report
found that such software is ``not the preferred solution'' because of
the risk that ``protected, harmless, or innocent speech would be
accidentally or inappropriately blocked.'' H.R. Rep. No. 105-775 (1998)
at 19.
In October 1998, Congress appointed the Child Online Protection Act
Commission (``COPA Commission''), and charged it with ``identify[ing]
technological or other methods that will help reduce access by minors
to material that is harmful to minors on the Internet.'' In October
2000, the Commission reported that blocking ``technology raises First
Amendment concerns because of its potential to be over-inclusive in
blocking content. Concerns are increased because the extent of blocking
is often unclear and not disclosed, and may not be based on parental
choices.'' The Commission specifically did not recommend any
government-imposed mandatory use of blocking technologies.
On October 23, 2000, Peacefire 2 issued a report on
blocking technology that found error rates anywhere from 20% to 80%.
Error rates were based on sites being blocked as ``pornography'' when
they were, in fact, not pornographic. Study of Average Error Rates for
Censorware Programs, October 23, 2000, found at http://
www.peacefire.org/error-rates/
On November 7, 2000, Peacefire issued its report Blind Ballots: Web
Sites of U.S. Political Candidates Censored by Censorware. (http://
www.peacefire.org/blind-ballots/). The report found numerous political
candidates' sites were blocked by this software. Jeffery Pollock,
Republican candidate for Congress in Oregon's Third Congressional
District, had originally favored blocking software. After hearing that
his site was one of those blocked, he reversed his position. The site
of Congressman Markey, the Ranking Minority Member of this subcommittee,
was also blocked by one of the programs that characterized his site as
``Hate, Illegal, Pornography, and/or Violence.''
Proponents of blocking often claim that even if some web sites are
blocked, there are others available on the topic that may be unblocked
so the information will ultimately be available. This position makes
little sense, particularly when discussing candidate web sites. Should
a Republican candidate be soothed by the fact that his blocked views
may be found and discussed at his Democratic opponent's unblocked web
site?
On December 12, 2000, Peacefire published a report demonstrating
that sites of human rights groups were being blocked by this software.
Amnesty Intercepted: Global human rights groups blocked by Web
censoring software, December 12, 2000, found at: http://
www.peacefire.org/amnesty-intercepted/
Consumer Reports in March 2001 found that blocking software varied
from 20% to 63% in its over-blocking.
Despite protestations from blocking software supporters that
instances of over-blocking are all ``old'' examples remedied by newer
versions, these examples are all recent. The flaws of blocking programs
are not a matter of individual flaws in individual products. These
flaws are inevitable given the task and the limitations of the
technology.
As a result of these problems, blocking software fails to protect
because it cannot block all material that meets the CHIPA criteria.
Furthermore, it blocks a huge amount of information that should not be
considered objectionable, and is clearly protected under the First
Amendment.
chipa restricts adult access as well as minors
While CHIPA purports to protect minors by blocking their access to
the Internet, it also blocks adult access. By sweeping so broadly,
CHIPA violates the Constitution.
Section 1721(b) of CHIPA requires public libraries that participate
in the federal e-rate program to certify to the FCC that they are ``(i)
enforcing a policy of Internet safety that includes the operation of a
technology protection measure with respect to computers with Internet
access that protects against access through such computers to visual
depictions that are (I) obscene; or (II) child pornography; and (ii) is
enforcing the operation of such technology measure during any use of
such computers.'' Sec. 1721 (to be codified at 47 U.S.C. Sec. 254
(h)(6)(C)). [Emphasis added.]
Section 1712 of CHIPA applies to libraries that do not receive the
e-rate discount but receive funds pursuant to 20 U.S.C. Sec. 9134(b),
the Library Services and Technology Act (LSTA), ``to purchase computers
used to access the Internet, or to pay for direct costs associated with
accessing the Internet.'' Sec. 1712 (to be codified at 20 U.S.C.
9134(f)(1)). Section 1712 requires the same installation and
enforcement of technology protection measures as is required by Section
1721(b). Sec. 1712 (to be codified at 20 U.S.C. 9134(f)(1)(A) and (B)).
CHIPA's restrictions are not limited to library Internet access
supported only by the federal e-rate and LSTA programs. Both the e-rate
restrictions in Section 1721(b) and the LSTA restrictions in Section
1712 require libraries to certify that technology protection measures
are in place on ``any of its computers with Internet access'' and
``during any use of such computers.'' Sec. 1721(b) (to be codified at
47 U.S.C. Sec. 254(h)(6)(C)(i)-(ii)); Sec. 1712.15 (to be codified at
20 U.S.C. Sec. 9134(f)(1)(B)(i)-(ii)) [Emphasis added].
Thus, while CHIPA is commonly referred to as a ``child protection
measure,'' it goes further and operates to block adult access as well.
In doing so, CHIPA will follow the CDA and COPA along the trail of
unconstitutional attempts to censor the Internet.
chipa further accentuates the digital divide
CHIPA will have little effect on the rich. They can afford their
own computers with unfiltered access. The poor who have to rely upon
library access to perform job searches, school homework, and general
research are the ones who will be penalized by CHIPA.
Public libraries play a crucial role in affording access to the
economic and social benefits of the Internet to those who do not have
computers at home. Libraries assure that advanced information services
are universally available to all segments of the American population on
an equitable basis.
For many people who cannot afford a personal computer or network
connections, Internet access at public libraries may be their only
means of accessing the Internet. Minorities, low-income persons, the
less educated, children of single-parent households, and persons who
reside in rural areas are less likely than others to have home Internet
access. For example, Whites are more likely to have access to the
Internet from home than Blacks or Latinos have from any location. Black
and Latino households are less than half as likely to have home
Internet access as White households. 3 According to the
National Telecommunications and Information Administration, this
``digital divide'' is growing. CHIPA will only worsen the situation
with these unintended consequences.
chipa overrides local control and decision-making
Many communities spent a lot of time studying the issue of Internet
access and how to deal with it in their public libraries. Kalamazoo,
Michigan, Holland, Michigan, and Multnomah County Public Library are a
few such examples. In each case, they decided blocking software was
inappropriate for their libraries, and they opted for other, less
restrictive measures to protect their children.
CHIPA ignores and overrides those local decisions, instead opting
for a ``one size fits all'' scheme that is unworkable and
unconstitutional.
chipa is unconstitutional because it limits free speech
CHIPA will further be ineffective to protect children because it
will be stricken as unconstitutional.
As you know, on March 20, 2001, the ACLU and the American Library
Association each filed a lawsuit in the Eastern District of
Pennsylvania against the Children's Internet Protection Act (CHIPA).
Under the Act, any challenge will be heard by a panel of three judges,
and appeals from any decision of the panel will go directly to the
United States Supreme Court. The three judges were just recently
appointed.
The First Amendment Applies to the Internet
In Reno I,4 a unanimous Supreme Court held that the
First Amendment applies to the Internet. The Court found the Internet
should be afforded the highest protection under the First Amendment,
equivalent to that provided books, newspapers, and magazines.
5 Therefore, any attempted regulation of Internet speech
such as CHIPA is constitutionally suspect.
The First Amendment includes the right to receive information as well
as to speak.
While the First Amendment discusses the freedom of speech, the
Supreme Court has made it clear that it also encompasses the
fundamental right to receive information.6 In Reno I, the
Supreme Court confirmed that the right to receive information applies
without qualification to expression on the Internet. 7 Thus,
attempts such as CHIPA to restrict information affect the
constitutional rights not only of the speaker, but the recipient as
well. For example, blocking a web site on safe sex violates the rights
of the web site operator (the speaker) but also the rights of the one
who wishes to review that material (the recipient).
CHIPA Is a Content-Based Restriction on Speech That Fails the Strict
Scrutiny Test
CHIPA purports to restrict speech based on its content (obscenity,
child pornography, and material harmful to minors). Additionally, many
blocking software vendors block sites they find politically
objectionable, for example, sites that discuss or condemn
homosexuality. ``Content-based regulations are presumptively invalid.''
8 In order to overcome the presumption of
unconstitutionality, content-based restrictions must meet the strict
scrutiny standard 9 and survive an exacting test. The strict
scrutiny test requires that the challenged statute or regulation is
necessary to serve a compelling governmental interest, and is narrowly
drawn to achieve that end. 10 ``It is not enough to show
that the Government's ends are compelling; the means must be carefully
tailored to achieve those ends.'' 11
Narrow Tailoring and Least Restrictive Means
Under the strict scrutiny analysis, the government has the burden
of establishing that a regulation is the least restrictive means and
narrowly tailored to its objective. 12 In other words, the
Government is not allowed to use a nuclear bomb when a small side arm
would suffice.
Government regulation of the Internet often fails because it
attempts to ``burn the house to roast the pig.'' 13 For
example, in Reno, the Court noted
``[we] are persuaded that the CDA lacks the precision that the
First Amendment requires when a statute regulates the content
of speech. In order to deny minors access to potentially
harmful speech, the CDA effectively suppresses a large amount
of speech that adults have a constitutional right to receive
and to address to one another. That burden on adult speech is
unacceptable if less restrictive alternatives would be at least
as effective in achieving the legitimate purpose that the
statute was enacted to serve.''
Because there were less restrictive alternatives available that
would be at least as effective as the CDA, the Court found the act
unconstitutional.
Like the CDA, CHIPA restricts far more speech than is targeted. As
noted above, no technology available today reliably blocks only
obscenity, child pornography and material harmful to minors. Thus, a
broad range of speech protected under the First Amendment gets
sidelined, while the filters also allow objectionable speech to get
through.
In passing CHIPA, Congress failed to consider less restrictive
alternatives. It also failed to heed the report of the COPA Commission,
which declined to recommend mandatory blocking programs and instead
recommended various less restrictive alternatives.
CHIPA Is Overbroad
Overbreadth is a test that is used when an otherwise legitimate
regulation also affects speech that may not be lawfully restricted.
An example of an overbroad statute appears in Reno I, where the
Court reviewed the constitutionality of the Communications Decency Act
(CDA) 14, Congress' first attempt to regulate content on the
Internet. In invalidating the CDA, the Court noted the act's breadth
was unprecedented, 15 and that it suppressed a large amount
of speech that adults have a constitutional right to send and receive.
Therefore, even though the intent may be to protect children, a law or
regulation that burdens speech which adults have a constitutional right
to receive is unconstitutional ``if less restrictive alternatives would
be at least as effective in achieving the Act's legitimate purposes.''
16
Because the effect of CHIPA is to suppress more speech than is
necessary to achieve the government's objective, it is fatally
overbroad.
CHIPA Is An Unconstitutional Prior Restraint
Under the prior restraint doctrine, the government may not restrain
protected speech without the benefit of clear objective standards or
adequate procedural safeguards, including provisions for administrative
review, time limitations on the review process, and provisions for
prompt judicial review. 17
CHIPA implicitly assumes, for example, that a blocking software
vendor can legitimately determine whether expression is unprotected by
the Constitution. From a legal standpoint, that assumption is
incorrect.
In 1973, the Supreme Court in Miller v. California,18
crafted the definition of obscenity still used today. Known as the
Miller test, it requires that a trier of fact (a judge or jury) examine
the work and determine:
1. Whether ``the average person, applying contemporary community
standards'' would find that the work taken as a whole, appeals
to the prurient interest;
2. Whether the work depicts or describes, in a patently offensive way,
sexual conduct specifically defined in the applicable state
law; and
3. Whether the work, taken as a whole, lacks serious literary,
artistic, political or scientific value.
Only if the answer to all of these questions is ``yes'' can a work be
judged ``obscene'' and only then does it lose its protection under the
First Amendment.
In order to place certain speech into the category of obscenity,
the government must initially provide a series of procedural
safeguards. First, there must be a statute specifically defining the
sexual conduct that may not be depicted or displayed. This requirement
helps guarantee that speakers have fair notice of what is prohibited.
19 Second, the material cannot legitimately be banned
without a full adversarial trial. Finally, a jury must be available to
apply the relevant ``community standards'' for obscenity to the
challenged material.
The fact that a school or library uses third-party software that
decides what is ``obscene'' material exacerbates the policy's
unconstitutionality. ``[A] defendant cannot avoid its constitutional
obligation by contracting out its decisionmaking to a private entity.''
20
Mandatory blocking policies that rely on commercial blocking
software constitute prior restraints because they ``entrust all . . .
blocking decisions . . . to a private vendor'' whose standards and
practices cannot be monitored by the blocking library. 21
All substantive blocking decisions by commercial suppliers necessarily
lie outside the control of the government; consequently, each blocking
decision inherently lacks the requisite procedural safeguards. In fact,
in Mainstream Loudoun, the blocking software provider refused to
provide the defendants with the criteria it used to block sites, let
alone the names of the actual sites blocked. 22 Mandatory
blocking policies like CHIPA thus confer unbridled discretion on
commercial software providers, allowing them to restrict access
indiscriminately and without any administrative or judicial review.
In short, no speech is unprotected by the Constitution until a
court determines it to be so. CHIPA attempts to bypass legal
requirements and thus runs afoul of the Constitution.
CHIPA Is Unconstitutionally Vague
It is a general principle of law that ``laws [must] give the person
of ordinary intelligence a reasonable opportunity to know what is
prohibited, so that he may act accordingly.'' 23 If a law is
too vague to give this ``reasonable opportunity,'' it is deemed void
for vagueness. When a law interferes with the right of free speech, the
courts apply a more stringent variation of the vagueness test.
24 The Supreme Court has recognized that First Amendment
``freedoms are delicate and vulnerable, as well as supremely precious
in our society. The threat of sanctions may deter their exercise almost
as potently as the actual application of sanctions. Because First
Amendment freedoms need breathing space to survive, government may
regulate in the area only with narrow specificity.'' 25
In order to avoid the vice of vagueness, the law or regulation
``must provide explicit standards for those who apply them. A vague law
impermissibly delegates basic policy matters to policemen, judges, and
juries for resolution on an ad hoc and subjective basis, with the
attendant dangers of arbitrary and discriminatory application.''
26 Therefore, the law must provide an ``ascertainable
standard for inclusion and exclusion.'' 27 When that
standard is missing, the law unconstitutionally produces a chilling
effect on speech, inducing speakers to ``steer far wider of the
unlawful zone'' than if the boundaries were clearly marked.
28 It forces people to conform their speech to ``that which
is unquestionably safe.'' 29
CHIPA provides that ``[a]n administrator, supervisor, or other
person authorized by the certifying authority . . . may disable the
technology protection measure concerned, during use by an adult, to
enable access for bona fide research or other lawful purpose.'' No
definition of ``bona fide research or other lawful purpose'' is
provided. Sec. 1721 (to be codified at 47 U.S.C. Sec. 254 (h)(6)(D)).
Section 1712 provides that ``[a]n administrator, supervisor, or other
authority may disable a technology protection measure . . . to enable
access for bona fide research or other lawful purposes.'' Sec. 1712 (to
be codified at 20 U.S.C. Sec. 9134(f)(3)). Unlike the comparable e-rate
section, this provision appears to apply to minors as well as adults.
Again, no definition is provided for ``bona fide research or other
lawful purpose.'' The phrase is left to the interpretation of each
librarian or staff person tasked with making that determination.
CHIPA Violates Constitutionally Protected Anonymity and Privacy
CHIPA requires adults (and perhaps minors in the case of LSTA
funds) to seek permission from a government official in order to obtain
unblocked access. In doing so, a patron requesting such access loses
his or her anonymity and privacy. The Constitution protects anonymity
and privacy in communications and the ability to receive information
anonymously. 30
CHIPA Violates the Unconstitutional Conditions Doctrine
Broadly speaking, the unconstitutional conditions doctrine holds
that Congress may not condition receipt of federal funds upon the
waiver of a constitutional right. Under CHIPA, Congress conditions
receipt of federal money (except in the case of the e-rate) on the
condition that libraries violate the First Amendment.
During debates on the Children's Internet Protection Act (CHIPA),
some proponents claimed there was no constitutional infirmity in
conditioning receipt of federal money on acquiring and using blocking
software. Even if mandatory blocking itself violated the First
Amendment, it was claimed this was circumvented because schools and
libraries only had to block if they received federal funds. Since they
were under no obligation to receive those funds, there was no
violation.
The Supreme Court's decision in Legal Services Corporation v.
Velasquez 31 reaffirms the long-standing principle that the
government may not require the sacrifice of constitutional rights as a
condition for receiving a government benefit. 32 In
Velasquez, Congress required that funds distributed to the Legal
Services Corporation not be used to challenge existing welfare laws.
Legal Services attorneys therefore could not represent clients in
welfare benefits cases if the constitutionality of the welfare laws
became an issue. Thus, both the attorney and the client were prohibited
from challenging these laws; the attorney because of the funding
restrictions, and the client because they could not afford another
attorney. The Court thus had to decide ``whether one of the conditions
imposed by Congress on the use of LSC funds violates the First
Amendment rights of LSC grantees and their clients.'' The majority of
the Court concluded that it did.
While concluding that the government may, in certain circumstances,
use funding as a tool to mold speech, the Court noted `` `[i]t does not
follow . . . that viewpoint-based restrictions are proper when the
[government] does not itself speak or subsidize transmittal of a
message it favors but instead expends funds to encourage a diversity of
views from private speakers.' ''
The subsidies involved in CHIPA are made to encourage schools and
libraries to connect to the Internet. The funds thus are not intended
to facilitate a specific message, but rather to encourage the populace
to engage in the diversity of views that is the Internet. Also, as in
Velasquez, the money was given to one entity for the benefit of a third
party. In Velasquez, the money was given to LSC for the benefit of the
clients. In CHIPA, the money is given to schools and libraries for the
benefit of the patrons and students.
The situation in Velasquez and CHIPA is different from that in
National Endowment for the Arts v. Finley.33 In Finley, the
Court found the challenged provision only required that the NEA take
into account ``decency and respect'' in making its grants. It was not a
determinative factor, but one of several considerations. Thus, Congress
had not disallowed any particular viewpoints in subsidizing the arts.
The Court specifically noted the situation might be different if
the NEA engaged in viewpoint discrimination:
If the NEA were to leverage its power to award subsidies on the
basis of subjective criteria into a penalty on disfavored
viewpoints, then we would confront a different case. We have
stated that, even in the provision of subsidies, the Government
may not ``aim at the suppression of dangerous ideas,'' [citation
omitted] and if a subsidy were ``manipulated'' to have a
``coercive effect,'' then relief could be appropriate.
[citation omitted] 34
Velasquez is the latest pronouncement in this area of the law.
Because Congress is using federal money to force libraries to violate
the First Amendment, under Velasquez CHIPA is unconstitutional in these
circumstances.
Constitutional Alternatives That Are Less Restrictive Means of
Accomplishing Congress' Goal
Congress passed CHIPA with the intent to protect children. For all
the reasons noted above, CHIPA is unconstitutional and will be
stricken, in addition to being ineffective.
As noted above, many libraries have already implemented options
that do not involve blocking software and are at least as effective as
blocking. These options include library web sites, educational
programs, and Internet Use Policies.
Many libraries have implemented their own ``home'' pages to help
patrons identify high-quality and useful sites. In addition to
providing its own content, a library may provide indexes of other links
it has evaluated and can recommend. Cataloging and organizing this
information helps lead users to resources in the subject areas of
interest and consequently helps them avoid unwanted resources.
Descriptions on the pages can assist users in deciding whether to visit
a particular site.
The same philosophy can be applied to library sites designed
specifically for children. Such a site can provide children with a safe
Internet experience by pointing them to sites reviewed by the librarian.
Many libraries educate patrons about Internet use. Through
education, librarians assist patrons in finding useful information and
avoiding unwanted information. Many public libraries offer classes on
the use of the library, the catalog, indexes and systems. In many
libraries, patrons are required to take such classes before they can
use public connections. These classes cover the library's use policies.
Topics for Internet classes often include: kinds of information and
subjects which are likely to be found on the Internet; how to construct
effective, high-quality search strategies taking advantage of features
of directories and search engines (truncation, Boolean searching,
searching on phrases); when to use various kinds of search aids; how to
evaluate resources found; and the advantages of using library-approved
Web sites and other sites known to collect quality resources.
Education was one of the recommendations made by the COPA
Commission in its report of October 20, 2000.
Libraries also may offer classes and resources to help parents
assist their children in using the Internet safely and productively.
Most reinforce the importance of parental supervision and involvement
with children when using the Internet. Parents should teach children to
be educated consumers of information and to talk to their parents about
what they find online. Parents may be advised to consider setting
boundaries on how much time children can be on the Net, and on the
kinds of information they look at. Children may also be instructed
about the importance of not giving out their names, passwords, credit
card numbers, or other personally identifying information, and of not
arranging to meet anyone they talk to online without discussing it with
their parents. A good example of these guides is the Librarian's Guide to
Cyberspace for Parents and Kids, from the American Library Association.
(www.ala.org/parentspage/greatsites/safe.html)
Another method libraries use to educate patrons about Internet use
is the development of Internet Use Policies. These policies can remind
users about expected use of the library and of library resources in
general. The American Library Association has established general
guidelines for the development of library policies.
Many libraries require patrons to sign an Internet Use Policy
before they can access the Internet. These policies may explain the
diversity of information on the Internet, and point patrons to the
library-approved resources on the library web page. A substantial
number of policies discuss the decentralized, uncontrolled nature of
the Internet and warn patrons that they may encounter material they
find objectionable. The policy may explain that beyond the library web
page, the library does not monitor or control the information on the
Internet, and that patrons use it at their own risk. The policy may
inform parents that they are responsible for deciding what library
resources are appropriate for their children. The policy may also set
rules for Internet use, and can impose sanctions for violations,
including losing Internet access privileges, and reporting illegal
conduct to law enforcement authorities. In many cases, these policies
are tied together with educational programs.
There are numerous ways libraries can and do work with parents and
children to protect children while they use the Internet. These methods
are at least as effective as blocking technology without the side
effect of blocking much material that is constitutionally protected.
Conclusion
Protecting children is a laudable goal. CHIPA, however, fails to
protect children. No blocking mechanism or software is completely
effective. At the same time, CHIPA results in blocking a large segment
of constitutionally protected speech to adults as well as minors. Since
there are less restrictive alternatives, CHIPA is constitutionally
infirm.
The First Amendment is part of the foundation of our society and a
bedrock of our principles. Emasculating the First Amendment in the name
of protecting children only teaches our children that principles are
elastic and suggests to them that when those principles become
inconvenient, they should be discarded. Such a lesson leaves a child's
moral compass spinning. ``Indeed, perhaps we do the minors in this
country harm if First Amendment protections, which they will with age
inherit fully, are chipped away in the name of their protection.''
35
We can, and must, protect our founding principles as well as our
children. It is not an ``either-or'' situation. With thoughtful
consideration, both can be achieved.
Additional Materials:
ACLU Complaint http://www.aclu.org/court/multnomah.pdf
Blind Ballots: Web Sites of U.S. Political Candidates Censored by
Censorware, November 7, 2000 http://www.peacefire.org/blind-ballots/
Filtering Programs Block Candidate Sites, November 8, 2000
(verifying results of ``Blind Ballots'' report on CyberPatrol) http://
www.zdnet.com/zdnn/stories/news/0,4586,2651471,00.html
Amnesty Intercepted: Global human rights groups blocked by Web
censoring software, December 12, 2000 http://www.peacefire.org/amnesty-
intercepted/
Study of Average Error Rates for Censorware Programs, October 23,
2000 http://www.peacefire.org/error-rates/
COPA Commission Report, October 20, 2000 http://
www.copacommission.org/report/
Consumer Reports, March 1, 2001, Digital Chaperones http://
www.consumerreports.org/Special/ConsumerInterest/Reports/0103fil0.html
Footnotes
1 Marc Rotenberg is the Director of the Electronic
Privacy Information Center (EPIC) in Washington, D.C. The quote is
found at: http://www.peacefire.org/info/about-peacefire.shtml
2 Peacefire.org was created in August 1996 to represent
the interests of people under 18 in the debate over freedom of speech
on the Internet. It has been an active opponent of mandatory blocking
software.
3 National Telecommunications and Information
Administration, Falling Through the Net: Toward Digital Inclusion,
October 2000
4 Reno v. ACLU, 521 U.S. 844 (1997)(Reno I)
5 Id. at 871
6 See, e.g., Reno v. ACLU, 521 U.S. 844, 874 (1997)
(``Reno I''); Board of Education v. Pico, 457 U.S. 853, 867-68 (1982);
Tinker v. Des Moines Indep. Community Sch. Dist., 393 U.S. 503, 511
(1969); Campbell v. St. Tammany Parish Sch. Bd., 64 F.3d 184, 190 (5th
Cir. 1995).
7 521 U.S. 844 (1997) (``Reno I'')
8 R.A.V. v. City of St. Paul, 505 U.S. 377, 382 (1992);
see also Regan v. Time, Inc., 468 U.S. 641 (1984)
9 Turner Broadcasting System v. Federal Communications
Commission, 114 S.Ct. 2445 (1994).
10 Arkansas Writers' Project, Inc. v. Ragland, 481 U.S.
221, 231 (1987)
11 Sable Communications of California, Inc. v. FCC, 492
U.S. 115, 126 (1989)
12 Elrod v. Burns, 427 U.S. 347, 362 (1976)
13 Butler v Michigan, 352 U.S. 380, 383 (1957)
14 Communications Decency Act of 1996, 47 U.S.C.
Sec. 223
15 Reno I, at 878
16 Id. at 875
17 See, e.g., FW/PBS, Inc. v. City of Dallas, 493 U.S.
215, 225-29 (1990); Freedman v. Maryland, 380 U.S. 51, 58 (1965).
18 Miller v. California, 413 U.S. 15 (1973).
19 This is an important requirement the Government
overlooked in its enactment of the Communications Decency Act (CDA). In
Reno v. ACLU, 521 U.S. 844 (1997)(Reno I), the Government argued the
statute was not vague because it parroted one of the Miller prongs (the
material ``in context, depicts or describes, in terms patently
offensive as measured by contemporary community standards, sexual or
excretory activities or organs.''). The Court disagreed, noting that
the second prong of Miller contained a critical element omitted from
the CDA: that the proscribed material be ``specifically defined by the
applicable state law.'' The Court also noted the CDA went beyond
Miller's application to sexual conduct to include ``excretory
activities'' as well as ``organs'' of both a sexual and excretory
nature. Finally, the Court concluded that ``just because a definition
including three limitations is not vague, it does not follow that one
of those limitations, standing by itself, is not vague.''
20 Mainstream Loudoun v. Loudoun County Library, 24
F.Supp. 2d 552 (E.D. Va. 1998) (Mainstream Loudoun II).
21 Mainstream Loudoun II, 24 F. Supp. 2d at 569.
22 Id.
23 Grayned v. City of Rockford, 408 U.S. 104, 108 (1972)
24 Village of Hoffman Estates v. Flipside, Hoffman
Estates, Inc., 455 U.S. 489, 499 (1982)
25 NAACP v. Button, 371 U.S. 415, 433 (1963)
26 Grayned, 408 U.S. at 108-109
27 Smith v. Goguen, 415 U.S. 566, 578 (1974)
28 Id.
29 Baggett v. Bullitt, 377 U.S. 360, 372 (1964)
30 See Talley v. California, 362 U.S. 60 (1960);
McIntyre v. Ohio Elections Commission, 115 S.Ct. 1511 (1995); ACLU v.
Johnson, 4 F.Supp.2d 1029 (D.N.M. 1998); ACLU v. Miller, 977 F.Supp.
1228 (N.D.Ga. 1997)
31 No. 99-603, February 28, 2001. The decision is
available at: http://laws.findlaw.com/us/000/99-603.html
32 See, for example, Rosenberger v. Rector and Visitors
of the University of Virginia, 515 U.S. 819 (1995). There, the Court
``reaffirmed the requirement of viewpoint neutrality in the
Government's provision of financial benefits.''
33 National Endowment for the Arts v. Finley, 524 U.S.
569 (1998)
34 Id. at 588
35 ACLU v. Reno, 31 F.Supp.2d 473 (E.D. Pa. 1999)
Mr. Upton. Mr. Taylor.
STATEMENT OF BRUCE A. TAYLOR, PRESIDENT AND CHIEF COUNSEL,
NATIONAL LAW CENTER FOR CHILDREN AND FAMILIES
Mr. Taylor. Mr. Chairman, thank you. My name is Bruce
Taylor, and I am President and Chief Counsel of the National
Law Center for Children and Families, and we have been involved
with helping and advising, and even actually writing some of
the briefs for the Members of Congress who supported both the
Communications Decency Act and the Child On-Line Protection
Act, CDA and COPA.
In both of those instances, the Members of Congress who
passed that legislation narrowed the scope of the law so that it would
more adequately apply to the Internet, as opposed to the way that
obscenity, child pornography, and indecency laws apply to broadcasting
or to street crimes for obscenity or child pornography.
The same was done here with CHIPA. Some of the objections
that have been raised against this Act, CHIPA, by the ACLU or the
American Library Association come down to the fact that they do not
want any regulation at all.
But consider the alternative. One of the things that everyone who
uses the Internet knows is that there is a lot of hardcore pornography
and child pornography on the worldwide web pages run by the pornography
syndicates, and on the UseNet news groups, where people all over the
world post both obscenity and child pornography.
And then there are the chat rooms, where the people who go in there,
many of whom are teenagers or pedophiles, post pictures for the rest of
the people in the chat room. So if no action is taken by the library to
try to filter out access to that, then all adults and children can go
into public libraries or use school terminals and get illegal, hardcore
pornography and child pornography, which is a felony even to possess.
So the question is what are we going to do about this free
availability of all of this kind of pornography, and what Congress said
with CHIPA is that we are going to ask two things of libraries and
schools.
One, you have to try to use a filter to block out whatever
you think--you as the administrators of the school or library
think--would fit within those categories of illegal child porn,
obscenity, or what is obscene for minors.
Now, the term, ``harmful to minors,'' is a legal term of
art, just like obscenity is, and just like child pornography
is. Child pornography is not just any picture of a kid that you think
is dirty, and obscenity is not just something that people think is
offensive.
And what is harmful to minors is not what somebody thinks
will hurt a child psychologically or morally. Those are legal
terms that are limited to a type of pornography. CHIPA gives the local
library or school the discretion to tell its filter to block certain
categories, and it can work with the filter company and decide for
itself.
And so many of the types of abuses that have been cited, whether
hypothetical or drawn from past examples of overblocking or
under-blocking, are problems that the filter company and the library or
the school can work together to keep from happening in reality.
Because even though a few sites may get through, one way or the
other--and like he said, there were 3, or 4, or 5 porn sites that came
through--that may be better than 100,000 known hard core porn sites.
But the Act itself gives total discretion to the school to decide with
its filter company what it is going to block.
And most of the examples that they use come from testing a filter,
just like Consumer Reports did, set at the parental control level that
you would use if you were a parent trying to put a filter on your home
terminal to protect a 7, 8, or 10 year old kid.
You don't have to use that setting on a filter, and filter
companies have various categories of material that they block;
from hard core pornography to soft core, to nudity, to hate
speech, to violence, to drugs, to gambling, to offensive speech,
and a lot of other categories.
And if you enable all of those categories on the most
conservative setting, sure it is going to block a lot of
material that might be sexually oriented, but not obscene for
children, and not obscene for adults, and not child porn.
Most of the filter companies also have settings that are
much more liberal that say that we are only going to block that
which we have reviewed to be hard core pornography, sexually
explicit pictures of children, and the kind of soft core
pornography that you would find in Playboy, Penthouse, and
those kinds of magazines that are a crime to sell to a child at
a local convenience store.
So the filters themselves have not been adequately tested
because they have not been evaluated at the settings that would be
appropriate for a library or a school. Certainly a
library or a school for grade school kids can be set more
conservatively than for high school, and a library may say I am
only going to block the most explicit, penetration visible,
hard core porn, and only those sexually explicit pictures of
kids, and only that kind of soft core porn that you could not
sell to the kid at the local corner drug store. That is the
kind of pornography that I am going to have my filter do.
The other purpose of CHIPA is to do what Congress has been
trying to do with all of the money that it has put into the
Internet for the past many years. We have put billions of
dollars into the development of the Internet, and we are
putting $3 billion into wiring up every school and public
library in the United States.
We want to see libraries and schools become the next
century's place for people to get information. The alternative is
having pedophiles go into libraries to download child porn, because
they know that when a search warrant comes, the police department will
find that it is a library instead of a pedophile's home; that is why
they go there.
Adult porn addicts can go to the public library instead of the
local adult bookstore, where they would have to buy it. And libraries
don't like that either, I'm sure, or they shouldn't, but at least
Congress and the State Legislatures don't have to subsidize that with
State monies.
But that kind of money being put into the development of
the Internet is hopefully going to help improve the quality of filters,
so that next year, when one library tells its filter to block a certain
kind of material and it does, and another library tells its filter to
block another kind of material and it does, the filter companies are
going to be able to develop the technology, with the help of this law,
so that it will carry out both functions and duties equally well.
So there are the criticisms of filters: oh, they don't work.
Well, filters use the same search technology that we use to find
information on the Internet. The Internet can do a lot of amazing
things, and I think it is absurd for them to say that the Internet can
do anything that you want, that it has all this information and
billions of web pages and you can find anything that you want, but that
the one thing it cannot do is block information out, because filtering
uses the same technology as searching. And that is one of the main
purposes of this bill.
This law is an experiment, and we are putting that experiment into
the hands of the toughest critics that the country could find, meaning
liberal, educated, techno-savvy librarians and school administrators.
They are going to be the best ones to say here is where the filters
worked, and here is where they failed.
And when they report back to Congress, we will have a better way
than just guessing at what bad things are going to happen. But one
thing that I think is for sure is that people will be able to use the
terminals.
[The prepared statement of Bruce A. Taylor follows:]
Prepared Statement of Bruce A. Taylor, President and Chief Counsel,
National Law Center for Children and Families
NLC Statement of Legal Arguments in Support of the Constitutionality of
CIPA, the Children's Internet Protection Act of 2000
1. As a funding incentive, CIPA can require schools and libraries
that accept federal subsidies for discount Internet services (i.e.,
``e-rate'' funds) to use filters to attempt to restrict access by
minors under 17 to that kind of pornography that is legally ``Harmful
To Minors'', as well as to restrict minors' access to visual
pornography that is legally ``Obscene'' or ``Child Pornography'', and
thus illegal even for adults.
A. CIPA only applies to grade schools and high schools, not colleges.
B. CIPA only applies to public libraries that accept federal Internet
subsidies, not college libraries or private libraries that do
not accept federal funds.
C. Internet subsidies are not an ``entitlement'' program for libraries
and schools. Rather, federal subsidies for free Internet
access in public schools and libraries are an important factor
in the intent of Congress to make Internet access safe and
educational for minor students in their schools and for minor
children who are entitled to use public libraries without being
exposed to illegal and harmful pornography or exposed to adults
who are viewing such pornography on publicly accessible
computer terminals in taxpayer supported libraries.
2. CIPA provides local determination of what the filter will
attempt to block by allowing the receiving school or library to decide
what could constitute the three types of pornography that their
filtering software attempts to block, guided by the scope of the legal
definitions used in federal law:
A. ``Harmful To Minors'' (as defined in CIPA to be ``obscene for
minors''); and
B. ``Obscenity'' (as limited to visual images in 18 U.S.C. Sec. 1460
and defined by the Supreme Court, see Miller v. California, 413
U.S. 15, at 24-25 (1973), Smith v. United States, 431 U.S. 291,
at 300-02, 309 (1977), Pope v. Illinois, 481 U.S. 497, at 500-
01 (1987), providing the constitutional criteria for federal
and state laws and courts); and
C. ``Child Pornography'' (as defined in 18 U.S.C. Sec. 2256 (8), i.e.,
visual depictions that are or appear to be of actual minors
under age 18 engaging in ``sexually explicit conduct'').
    3. These three classes of pornography are unprotected under the
First Amendment for minors, and obscenity and child pornography are
unprotected for adults, including on the Internet. The courts have
defined these categories of unprotected pornography as ``legal terms of
art'' so as to limit them to narrow classes of pornographic materials
that do not include serious works of literature, art, political speech,
or scientific or medical information. No adult has the right to gain
access to obscenity or child pornography in a school or public library,
no child has a right to access pornography that is ``obscene for
minors'' or ``harmful to minors'' in those settings, and no school or
library has any duty to provide access to such materials on Internet
terminals.
The three classes of pornography that Congress requires schools and
libraries to attempt to filter out of their Internet access in exchange
for the massive federal subsidies that make such Internet access
available to all students and members of the public in libraries are:
A. Child Pornography: Consists of an unprotected visual depiction
of a minor child (federal age is under 18) engaged in actual or
simulated sexual conduct, including a lewd or lascivious exhibition of
the genitals. See 18 U.S.C. Sec. 2256; New York v. Ferber, 458 U.S. 747
(1982), Osborne v. Ohio, 495 U.S. 103 (1990), United States v. X-
Citement Video, Inc., 115 S. Ct. 464 (1994). See also United States v.
Wiegand, 812 F.2d 1239 (9th Cir. 1987), cert. denied, 484 U.S. 856
(1987), United States v. Knox, 32 F.3d 733 (3rd Cir. 1994), cert.
denied, 115 S. Ct. 897 (1995). Note: In 1996, 18 U.S.C. Sec. 2252A was
enacted and Sec. 2256 was amended to include ``child pornography'' that
consists of a visual depiction that ``is or appears to be'' of an
actual minor engaging in ``sexually explicit conduct''. Section 2252A
was upheld in United States v. Hilton, 167 F.3d 61 (1st Cir. 1999), and
United States v. Acheson, 195 F.3d 645 (11th Cir. 1999). But see Free
Speech Coalition v. Reno, 198 F.3d 1083 (9th Cir. 1999) (declaring
statute invalid as applied to child pornography that is wholly
generated by means of computer), cert. granted, sub nom Ashcroft v.
Free Speech Coalition (2001).
B. Obscenity (hard-core adult pornography): ``This much has been
categorically settled by the Court, that obscene material is
unprotected by the First Amendment.'' Miller v. California, 413 U.S.
15, 23 (1973). This is true even for ``consenting adults.'' Paris Adult
Theatre v. Slaton, 413 U.S. 49, 57-59 (1973). ``Transmitting obscenity
and child pornography, whether via the Internet or other means, is
already illegal under federal law for both adults and juveniles.'' Reno
v. ACLU, 521 U.S. 844, 117 S.Ct. 2329, at 2347, n. 44 (1997). The
``Miller Test'' can apply to actual or simulated sexual acts and lewd
genital exhibitions. See Miller v. California, 413 U.S. 15, at 24-25
(1973); Smith v. United States, 431 U.S. 291, at 300-02, 309 (1977);
Pope v. Illinois, 481 U.S. 497, at 500-01 (1987), providing the three-
prong constitutional criteria for federal and state laws and court
adjudications:
(1) whether the average person, applying contemporary adult community
standards, would find that the material, taken as a whole,
appeals to a prurient interest in sex (i.e., an erotic,
lascivious, abnormal, unhealthy, degrading, shameful, or morbid
interest in nudity, sex, or excretion); and
(2) whether the average person, applying contemporary adult community
standards, would find that the work depicts or describes, in a
patently offensive way, sexual conduct (i.e., ``ultimate sexual
acts, normal or perverted, actual or simulated; . . .
masturbation, excretory functions, and lewd exhibition of the
genitals''; and sadomasochistic sexual abuse); and
(3) whether a reasonable person would find that the work, taken as a
whole, lacks serious literary, artistic, political, or
scientific value.
C. Pornography Harmful To Minors (soft-core and hard-core
pornography): Known as ``variable obscenity'' or the ``Millerized-
Ginsberg Test'' for what is ``obscene for minors''. See Ginsberg v. New
York, 390 U.S. 629 (1968); as modified by Miller, Smith, Pope, supra.
It is illegal to sell, exhibit, or display ``HTM/OFM'' pornography to
minor children, even if the material is not obscene or unlawful for
adults. See also Commonwealth v. American Booksellers Ass'n, 372 S.E.2d
618 (Va. 1988), followed, American Booksellers Ass'n v. Commonwealth of
Va., 882 F.2d 125 (4th Cir. 1989), Crawford v. Lungren, 96 F.3d 380
(9th Cir. 1996), cert. denied, 117 S. Ct. 1249 (1997). Under CIPA,
pornography that is ``Harmful To Minors'' or ``Obscene For Minors'' is
defined for Internet purposes to mean pornographic visual images
(``picture, image, graphic image file, or other visual depiction''),
judged in reference to the age group of minors in the intended and
probable recipient audience, that could meet the following three prong
test:
(1) taken as a whole and with respect to minors, appeals to a prurient
interest in nudity, sex, or excretion (as judged by the average
person, applying contemporary adult community standards with
respect to what prurient appeal it would have for minors in the
probable or recipient age group of minors); and
(2) depicts, describes, or represents, in a patently offensive way with
respect to what is suitable for minors, an actual or simulated
sexual act or sexual contact, actual or simulated normal or
perverted sexual acts, or a lewd exhibition of the genitals (as
judged by the average person, applying contemporary adult
community standards with respect to what would be patently
          offensive for minors in the probable or recipient age group of
minors); and
(3) taken as a whole, lacks serious literary, artistic, political, or
scientific value as to minors (as judged by a reasonable person
with respect to what would have serious value for minors in the
intended and probable recipient audience).
4. Congress can also require these federally subsidized schools and
libraries to use filters to attempt to restrict adult access to visual
images of Obscenity (hard-core pornography) and Child Pornography
(sexually explicit images of minors), especially since such pornography
is contraband and unprotected even for ``consenting adults'' and
because the transmission or transportation of which by phone lines or
common carriers is a felony under existing federal laws (see 18 U.S.C.
Sec. 1462, smuggling or any common carrier transport of obscenity, even
for private use; Sec. 1465, transportation, for sale or distribution,
of obscenity across state lines or by any means or facility of
interstate or foreign commerce; Sec. Sec. 2252 & 2252A, transporting,
receiving, or possessing child pornography within, into, or out of the
United States by any means, including computer; Sec. 1961, et seq.,
RICO crime for using an enterprise in a pattern of obscenity or child
exploitation offenses).
5. The power of Congress to act by tax subsidy incentive is greater
than its police power to criminalize or provide civil liability for
unprotected conduct. CIPA is not a criminal or civil law and places no
restrictions on the citizens or public.
6. Library patrons who are adults are not entitled to access any
particular materials of their own choice in a public library or via the
Internet and even ``consenting adults'' have no First Amendment right
to obtain Obscenity or Child Pornography, especially at taxpayer
expense in federally supported public libraries or schools. Students or
library patrons who are minor children under age 17 are not entitled to
access pornography that is ``obscene for minors'', ``obscene'' for
adults, or child pornography.
7. Congress may encourage children to use Internet computers in
schools and libraries by subsidizing the use of pornography filtering
technology so that minors will be protected from exposure to such
illegal and unprotected images during their educational and
entertainment use of the Internet and computer services.
8. This Act requires K-12 schools and public libraries to provide
filtered Internet access to minors and patrons, but allows the
determinations and delegation of the filter process to be made by local
school and library administrative personnel, without federal
interference or federal judicial review.
9. CIPA allows for unfiltered Internet use for bona fide research
or other lawful purposes and makes those determinations totally within
the local administrators' discretion.
10. Congress already granted immunity to libraries and schools, as
providers of Internet access, for voluntary actions to restrict access
to illegal and objectionable materials, even if the materials are
constitutionally protected, as part of the ``Good Samaritan''
protections in the CDA, 47 U.S.C. Sec. 230(c), so they will be free to
accept the e-rate funds and use filters without fear of legal liability
or harassment by users, special interest advocacy groups, or even
pornographers.
11. CIPA has a future-looking, beneficial purpose of encouraging
the development of filter technologies, thus furthering the mass
communications and Internet development goals of Congress. By
subsidizing Internet facilities in schools and libraries and asking
them to employ filter devices to try to restrict pornography from
reaching their computer terminals, Congress can create a market for
filter programs, foster research & development in the private sector
Internet industry for better and more customizable filter devices, and
re-evaluate the safety, policies, and performance of such ``technology
protection measures'' in light of the extreme scrutiny and competent
review that could be gathered from school and library administrators
and Internet access professionals who will be directing and evaluating
the filters, even when they personally or philosophically disagree with
or oppose the use of such filtering technologies in their institutions.
The virulence of their opposition can be the strength of their
constructive criticism, as Congress intends.
12. Without CIPA, many libraries and schools would continue to
provide unrestricted access by minors and adults to Internet terminals
that regularly expose them to illegal and unprotected pornography,
though many others would continue to provide filtered Internet access to
minor children and reduce the exposure of their students and patrons to
harmful pornography. This Act seeks to make all tax supported school
and library terminals open, freely accessible, and safe.
    13. CIPA does not require subsidized schools or libraries to
restrict or filter any materials other than what they themselves
think is Obscene, Child Pornography, or Harmful To Minors. The Act
requires no more, but does not interfere, on the other hand, with the
local school or library's choice, if they so choose, to try to filter
out violence, hate speech, or other dangerous and inappropriate
materials under their right to be ``Good Samaritans'' under the CDA's
immunity protection, either for minor children or for adults.
Mr. Upton. Your time has expired.
Thank you.
Ms. Morgan.
STATEMENT OF LAURA G. MORGAN, LIBRARIAN, CHICAGO PUBLIC
LIBRARY
Ms. Morgan. Good morning. In a speech discussing the urgent
need for the Children's Internet Protection Act, Senator John
McCain stated the following, ``What is happening in schools and
libraries all over America in many cases is an unacceptable
situation.''
My name is Laura Morgan, and I am here today to tell you
that unfortunately the Senator is absolutely correct. As a
librarian in the Chicago Public Library's central branch, I am
well aware of the serious consequences of a completely
unrestricted Internet access policy.
I sincerely thank the Committee for giving me the
opportunity to submit testimony in support of the Children's
Internet Protection Act. I also wish to commend the United
States Senators and Representatives who have supported this
important legislation.
    I should also tell you that, of course, since I am
criticizing the library's policy, I am not representing the
library here. I also want to say that my criticism should
not diminish the many wonderful things that libraries do in
this country, particularly the Chicago Public Library.
    But I feel that the problems that are happening cannot
be ignored, and need to be talked about. I am concerned about
this issue from four different perspectives; as a parent of two
children, as a woman, as a citizen, and as a member of the
library profession.
As a parent of two daughters, I am very concerned about the
children who are accessing pornography on library computers,
both intentionally and unintentionally. Due to our library
administration's adamant stance against filters, even in the
case of computers used by children, this happens far too often.
    One example from when I worked--I worked at a branch
library for a couple of weeks in the month of December to help
out--and there was a 9 year old girl who made a completely
unsolicited comment to me. She said, you know, it really
bothers me when the little boys here look at what she called
nasty pictures on the computers.
We supposedly have a policy where we can tell kids to get
off of these bad sites, but obviously this is happening.
Obviously there is no way that every staff person can watch
what every kid is doing, and this is happening definitely at
the Chicago Public Library and elsewhere.
    Again, I ask you, is this something that we want to have
happen in our public libraries, when a 9 year old child has to
be exposed to this type of material? And as we know, there is a
lot of extremely hard-core pornographic material. We are not
talking about very minor material. This is very extreme.
In fact, some of these kids I noticed are very adept at
changing or making the screen go blank when you walk by. At
this particular branch, I noticed after some of these boys
left--and it is usually young boys--I could check or go into
the bookmarks and the search history, and very extreme Triple X
porn sites had been accessed.
So this is definitely happening. I want to point that out
again. As a woman, I am concerned about the porn surfers, who
are almost exclusively male, creating a sexually hostile
environment, particularly for female staff and patrons.
On the floor where I work--and I am the architectural
librarian in the main branch--we have male patrons looking at
pornography every day virtually. And they do this sometimes for
hours on end.
They will go throughout the building, and this is allowed
by our administration. We do not censor the Internet in any way
for adults. I also want to say that if the fact that the male
patrons are doing this is not a big enough problem to begin
with, it also encourages bad behavior by these patrons.
    Verbal harassment, even public masturbation, has happened,
and I don't think it should be a surprise to anyone that when
you make hard core porn available in a public building this is
going to happen. I made a complaint at a public board
meeting about this, which in turn prompted an investigation
by the sexual harassment office of the city of Chicago.
They are currently doing an investigation into this matter,
and interviewing staff, and I hope that the truth really
surfaces about what is going on throughout that system. One of
the things again that I am concerned about as a citizen is the
whole idea of the illegal material, particularly child
pornography.
    Bill Harmoning, who is the chief investigator
for high-tech crimes in the Illinois Attorney General's
Office, said that it is a well known fact in law enforcement
that pedophiles do like to go to public libraries and do this
because they cannot be traced.
Again, this is a person in law enforcement saying this. I
have also heard from security guards in the Chicago Public
Library that people are coming in and surfing through this
material. This is a fact.
    Again, considering the heinous nature of these kinds of
images, I find it simply abominable that they are not
doing more to stop it. Finally, as a librarian, I am concerned
what all of this means for the future of public libraries.
The plain fact remains that public libraries have never
been in the business of providing hard core pornography in
print, not to mention illegal obscenity and child pornography.
The argument that we must provide it now simply because it is
available via the uncontrollable medium called the Internet is
absurd.
    Must we now add X-rated bookstore to our list of services?
Is that what the public library has now become? Filtering
opponents often cite acceptable use policies as a solution to
the problem. I have become increasingly convinced, however, that
these policies are not adequate.
    And in many ways they are actually more intrusive and
subjective than filters are, because they imply that a staff
person is watching what people are doing. And in conclusion I
just want to say that I am one of those librarians out there
that does support the Children's Internet Protection Act.
The American Library Association is giving the impression
that all librarians are opposed to this. I do believe that the
hierarchy of the association represents a radical view that is
not shared by either the majority of librarians or the public.
Thank you very much.
[The prepared statement of Laura G. Morgan follows:]
Prepared Statement of Laura G. Morgan, Public Librarian
i. introduction
In a speech discussing the urgent need for the Children's Internet
Protection Act, Senator John McCain stated the following: ``What is
happening in schools and libraries all over America, in many cases, is
an unacceptable situation.'' \1\ My name is Laura G. Morgan, and I am
here today to tell you that unfortunately, the Senator is absolutely
correct. As a librarian in the Chicago Public Library's central branch,
I am well aware of the serious consequences of an unrestricted Internet
access policy. I sincerely thank the Committee for giving me the
opportunity to submit testimony in support of the Children's Internet
Protection Act. I also wish to commend the United States Senators and
Representatives who have supported this important legislation.
---------------------------------------------------------------------------
\1\ U.S., Congress, Senate, Senator John McCain speaking in support
of Amendment no. 3610, 27 June 2000.
---------------------------------------------------------------------------
On March 20, 2001, the American Library Association, the American
Civil Liberties Union and others, filed a legal challenge against the
Children's Internet Protection Act, which became law in December 2000.
At a press conference, ALA president Nancy Kranich referred to the
61,000 members of the Association and stated that ``we are here
speaking for all of them today.'' \2\ This statement is troubling
because I believe there are many library professionals who do not
condone the ALA's legal challenge of CIPA, nor the Association's
ideology regarding Internet access in libraries. I am also deeply
concerned that many statements by the ALA hierarchy are at best
misleading, and at worst, simply not true. I hope that my experiences
as a public librarian in an unrestricted Internet access environment
will expose the seriousness of this issue and the need for the
Children's Internet Protection Act. I also hope my testimony will
encourage you to listen to those who object to CIPA with a great deal
of skepticism.
---------------------------------------------------------------------------
\2\ http://www.ala.org/cipa/kranichremarks.html
---------------------------------------------------------------------------
ii. the chicago public library: a case study
    The Chicago Public Library's central building, where I work, and
its seventy-eight branches are a tremendous asset to the city of
Chicago. Mayor Richard Daley and Library Commissioner Mary Dempsey have
been tireless advocates for improving library services for all of
Chicago's citizens. Since 1989, I have held the position of
architecture librarian, as well as arts periodicals librarian, in the
Visual and Performing Arts Division of the Harold Washington Library
Center. I am truly grateful that I have had the opportunity to work in
one of the finest public libraries in the United States, if not the
world. It is because of this deep regard and commitment that I have for
the Chicago Public Library and the library profession that I have
chosen to speak out publicly against our Internet policy. While my
criticism of unrestricted Internet access should not diminish the many
positive aspects of libraries, I feel that the negative consequences of
such a policy cannot, and should not, be ignored.
    Like the official stance of the American Library Association, the
Chicago Public Library administration is firmly opposed to Internet
filters, even on computers located in children's departments. The
Chicago Public Library policy states:
The Chicago Public Library provides public access to the
Internet as a way of enhancing its existing collections with
electronic resources from information networks around the
world.
While the Internet provides many valuable sources of
information, users are reminded that some information on the
Internet may not be accurate, complete, current, or
confidential. The Library has no control over the information
on the Internet, and cannot be held responsible for its
content.
It is not within the purview of the Library to monitor access
to any resource for any segment of the population. The
Circulation Policy of the Chicago Public Library states:
``The Library makes its collections available to all users
without regard to age, sex, race, national origin, physical
disability, or sexual orientation.''
The responsibility for use of library resources by children
thirteen (13) and under rests with the parent or legal
guardian.
The Chicago Public Library adheres to the principles expressed in
the following documents of the American Library Association (http://
www.ala.org/):
Library Bill of Rights (http://www.ala.org/work/freedom/
lbr.html)
Free Access to Libraries for Minors (http://www.ala.org/
alaorg/oif/free--min.html)
Freedom to Read (http://www.ala.org/alaorg/oif/freeread.html)
Freedom to View (http://www.ala.org/alaorg/oif/
freedomtoview.html) \3\
---------------------------------------------------------------------------
\3\ http://www.chipublib.org/003cpl/internet/policy.html
---------------------------------------------------------------------------
In an article entitled ``Porn Again'' in the Minneapolis/St. Paul
City Pages, the Chicago Public Library Internet policy is summarized as
follows:
``In the children's department, librarians keep an eye on
what kids are looking at and redirect them if they seem to be
looking at inappropriate Web sites, says library commissioner
Mary Dempsey. But in the adult areas, patrons are free to view
anything, including pornographic sites. ``Adults have a right
to look at those things. Adult terminals have privacy screens.
If they want to look at it, that's fine. But you don't have to
look at it, and I don't have to look at it,'' Dempsey says.
``People are free to surf. We're a big city, with 3 million
people. What is objectionable to one person is not necessarily
objectionable to another.'' \4\
---------------------------------------------------------------------------
\4\ Kokmen, Leyla, ``Porn Again,'' Minneapolis / St. Paul City
Pages, 17 May 2000.
---------------------------------------------------------------------------
The major problem with such a policy is obvious. The administration
is giving its tacit approval to patrons who wish to view and print a
vast array of hard core pornographic material that is normally
associated with an x-rated book store or peep show. There is no
precedent for this in public libraries, since traditionally this type
of material was never purchased in print form. Specifically, what I
mean by ``this type of material,'' are sexual images created strictly
for the sake of sexual arousal and gratification. The easy availability
of pornography on the Internet at the Chicago Public Library and in
libraries across the nation has great potential for negatively
affecting the staff, patrons (especially children), and the overall
environment. The administration claims that the ``privacy screens''
solve this problem; however, the screens do not completely block the
view, nor the negative behavior that is sometimes associated with the
habitual porn surfers. In my opinion, the Chicago Public Library
administration did not sufficiently consider all of the legal and
ethical ramifications of the chosen Internet policy. I am deeply
concerned about this issue from four different personal perspectives:
as a mother, as a woman, as a citizen, and as a member of the library
profession.
iii. children and internet pornography
As a mother, I am very concerned about children who access or are
exposed to pornography on library computers, both intentionally and
unintentionally. Due to the library administration's adamant stance
against filters, even in the case of computers used by children, this
happens far too often. Prior to the spring of 2000, I had not given
much serious thought to the issue of children accessing pornography on
the Internet, primarily because, as of that date, I had not witnessed
it on the eighth floor where I work. What focused my attention was
hearing from staff in the Central Library's Children's Department that
children were occasionally accessing pornographic and violent web sites
on the twelve new unfiltered Internet computers donated by the Bill and
Melinda Gates Foundation. One of the more extreme examples involved a
child caught viewing a downloaded porn video displaying a woman
performing oral sex on a man. I was extremely disturbed by this
revelation because I had assumed that the computers in the children's
departments would be filtered. In other words, I had assumed that the
library administration would have chosen to make every effort to block
pornographic web sites from being accessed in the first place. To their
credit, the children's staff tell the kids to get off those sites when
they see it happen, but to me the damage has already been done. Whether
or not children are deliberately accessing these sites or stumbling
upon them by accident is not really the point, either. When it happens,
the images are there for anyone in the vicinity of the computer screen
to see. As an arts librarian and one who has a graduate degree in art
history, I can tell you that images are often much more powerful than
words. The Crimes Against Children Research Center's recent study
entitled Online Victimization: A Report on the Nation's Youth
corroborates this point. The study revealed that a significant number
of young people who are exposed to unwanted sexual material on the
Internet are deeply disturbed by it. Furthermore, the report's authors
ask the following questions. ``What if a quarter of all young visitors
to the local supermarket were exposed to unwanted pornography? Would
this be tolerated? We consider these levels of offensiveness
unacceptable in most contexts.'' \5\
---------------------------------------------------------------------------
\5\ Crimes Against Children Research Center, Online Victimization:
A Report on the Nation's Youth, Funded by the U.S. Congress Through a
Grant to National Center for Missing & Exploited Children (Washington,
D.C.: National Center for Missing & Exploited Children, June 2000), p.
33.
---------------------------------------------------------------------------
Over the past several months, I have spoken to several Chicago
Public Library staff members who have described incidents of children
under the age of fourteen viewing pornography in children's
departments. In defense of their policy, the library administration
claims that staff can monitor what kids are doing at all times while
they are using the computers. Many staff have told me this is simply
not possible. One children's librarian told me that when she is not in
the department due to a day off or lunch, etc., it is a ``free for
all'' in the children's area, and that she often finds porn sites
bookmarked on the children's computers upon her return. Another
children's librarian commented how a young girl told her that the boys
were looking at ``bad things'' on the computers. I had a similar
experience while working at a branch library last December, when a nine
year old girl told me that it bothered her when the boys looked at what
she called ``nasty pictures'' on the computers. What kind of a message
does that give to a little girl about her local library, the place that
is touted as a ``safe haven for a safer neighborhood?'' At that branch,
I also witnessed how adept some of the boys are at hiding what they are
doing by changing the screen as someone walks by. After they left the
library, I could easily tell by looking at the recent search history
and bookmarks that they had accessed extreme XXX porn sites. What I ask
all of you today is this: have we as a society become so desensitized
that the idea of children accessing hard core pornography in a
children's library does not bother us? I sincerely hope this is not the
case.
In addition to children under the age of fourteen accessing porn in
children's departments, minors under the age of eighteen have been
known to access pornography in the subject departments of the central
library, as well as on the adult computers in the branches. I have
witnessed this myself, as well as heard from several employees about
porn viewing incidents involving teenage boys. A librarian told me that
she saw some teens viewing Asian child pornography on the fourth floor
of the central library. One extreme example I witnessed was a young
teen looking at sado-masochistic images of nude women bound with duct
tape over their eyes and mouths. Just last week, I noticed a group of
boys around one of the eighth floor computers shortly before we closed.
After they left and I went over to shut down the computer, I noticed
several hard core porn sites were left open. Another group of teen boys
once left some print-outs by the computer of a porn site that boasted
``Young Teens from Holland.'' I believe it is obvious that many
patrons, and in particular teenage boys, deliberately seek out porn on
Internet computers in libraries. This will continue to be true
regardless of how many ALA touted ``educational programs'' or
``acceptable use policies'' are in place.
iv. internet pornography and the creation of a sexually hostile
environment
As a woman, I am concerned about the porn surfers (who are almost
exclusively male) creating a sexually hostile environment, particularly
for female staff and patrons. Almost every day on the floor where I
work, I see male patrons viewing and sometimes printing pornography.
Security guards have told me that some of the men surf for XXX porn for
hours on end, by going from floor to floor. I was recently told that
the porn surfers now even frequent our ninth floor Special Collections
Reading Room, where one staff member jokingly refers to these men as
``Internet scholars.'' In many cases, therefore, the Internet computers
at the Chicago Public Library become peep show booths. As if the fact
that male patrons are allowed to do this were not bad enough, consider
for a moment the behavior it encourages, including harassment and
public masturbation. I have spoken to numerous staff members who have
experienced these kinds of incidents. One employee told me how a male
patron had pulled up an image of a sex act and said to her ``can you do
this?'' Several employees have experienced porn images being left
intentionally on computer screens. Other clever patrons have figured
out how to change the computer wallpaper to porn images. Some patrons
have been known to intentionally call staff over to ``fix their
computer,'' only for the staff member to find a porn image on the
screen. In the
worst case scenarios of porn viewing and accompanying behavior, male
patrons have been known to masturbate through their clothes, put their
hands in their pants, and sometimes even expose themselves.
Additionally, a library security guard told me that he often finds porn
print-outs in the men's restrooms.
Not surprisingly, patrons have also been offended by these
conditions. A woman told me a few months ago how it made her
uncomfortable that a male patron was viewing and printing ``dirty
pictures'' on the computer next to her. I heard a similar story of a
female patron on our seventh floor who was shocked this was allowed. A
recent incident on our fourth floor involved two patrons signing up for
time on an Internet computer, only to leave quickly upon realizing the
computer directly next to them was being used by a porn surfer. A third
floor librarian told me of a female patron leaving in disgust for the
same reason. It would appear that the library administration is more
concerned about protecting the rights of the porn surfers over everyone
else!
At a library board meeting on September 19, 2000, I spoke out about
these conditions, and mentioned the phrase ``sexually hostile work
environment'' in this context. In response, I was asked to speak to
attorneys in the City of Chicago's Sexual Harassment Office, which is
part of the City's Department of Personnel. It is interesting to note
that complaints by staff regarding Internet pornography had been
routinely ignored or brushed off prior to this date. It was not until I
made a public complaint that anyone finally took this issue seriously
and contacted the City's Sexual Harassment Office. A positive result of
my three-and-a-half-hour meeting with the attorneys on December 1,
2000, was their decision to commence a full-scale investigation into how
Internet pornography is affecting the environment at the Chicago Public
Library. At the very least, I believe this is a step in the right
direction. Considering that the corporate world is taking the issue of
Internet pornography very seriously in light of sexual harassment
lawsuits, I am pleased that the City of Chicago is looking into the
matter. I recently spoke to one of the attorneys who confirmed they are
still in the process of interviewing employees and expect to complete
the investigation within the next few months. Once they complete their
report, they will give it to the City's Law Department, who will in
turn, make any necessary decisions.
v. illegal obscenity and child pornography
As a citizen, I am concerned about patrons who access illegal
material, in particular, child pornography. In a hearing I attended
last September, Bill Harmening, an investigator of high tech crimes in
the Illinois Attorney General's office stated that ``it is common
knowledge in the business of pedophiles and traders of child
pornography to go to your public library and download it because it's
there.'' \6\ Although he was not speaking specifically about the
Chicago Public Library, I have heard accounts by guards and staff that
patrons are accessing child pornography on library computers on
occasion. Considering the heinous nature of these kinds of images, I
find this simply abominable. In addition, many XXX porn sites qualify
as illegal under Illinois obscenity law, and are therefore indefensible
on First Amendment grounds for anyone.
---------------------------------------------------------------------------
\6\ Illinois House Republican Hearing on House Bill 1812, Marion,
Illinois, September 7, 2000.
---------------------------------------------------------------------------
vi. pornography at your library
As a librarian, I am concerned about what all of this means for the
future of public libraries. The plain fact remains that public
libraries have never been in the business of providing pornography in
print, not to mention illegal obscenity and child pornography. The
argument that we must provide it now simply because it is available via
the ``uncontrollable'' medium called the Internet is absurd. Must we
now add ``x-rated bookstore'' to our list of services? Is that what the
``public library'' has become? Think about that, and what that says
about the library as a public institution. Regardless of what people
think of pornography on a philosophical level, I believe that most
Americans would agree that viewing and printing it in a public library
building is highly inappropriate. The library administrators who
prohibit porn surfing often claim that their ``acceptable use
policies'' are a solution to the problem. Such a policy would certainly
deter some of the porn surfers at the Chicago Public Library, but I
have become increasingly convinced that these policies are not
adequate. In addition, such ``tap on the shoulder'' policies are much
more intrusive and subjective than filters, because they imply that
library staff are watching what patrons are viewing on the computers,
all the while making inconsistent individual judgments about site
content.
vii. internet pornography in libraries: a nationwide problem
In his report entitled Dangerous Access, 2000 Edition: Uncovering
Internet Pornography in America's Libraries, former librarian David
Burt documented numerous cases of children accessing pornography,
sexual harassment, adults exposing children to pornography, patrons
accessing illegal material including child pornography, and so on, in
libraries across the country.\7\ He collected the data by making
Freedom of Information Act requests to libraries for their Internet
logs, incident reports, and other data pertaining to Internet use. As
expected, the American Library Association discouraged libraries from
complying with Mr. Burt's requests, thereby resulting in a relatively
small return rate. The Chicago Public Library was, in fact, one of the
libraries that refused his FOIA request. Many people have speculated
that the ALA and many libraries did not want to comply because they
were wary (for good reason) of this kind of negative information
becoming publicly known. In my opinion, it is very obvious that there
is indeed something to ``hide.''
---------------------------------------------------------------------------
\7\ Burt, David, Dangerous Access, 2000 Edition: Uncovering
Internet Pornography in America's Libraries (Washington, D.C.: Family
Research Council, 2000).
---------------------------------------------------------------------------
There has been increasing media coverage of problems relating to
Internet pornography in libraries across the United States. Last year,
a major story broke surrounding the unrestricted Internet access policy
at the Minneapolis Public Library. Several courageous employees spoke
out about the egregious conditions there, and twelve ultimately filed a
charge of a sexually hostile work environment with the U.S. Equal
Employment Opportunity Commission.\8\ Even though conditions improved
once the administration adopted an acceptable use policy, librarian
Wendy Adamson recently informed me that some patrons still attempt to
break the rules and surf for pornography. Another news story about
library pornography involves the 21 branches of the Sno-Isle Regional
Library System
in the state of Washington. As reported in the American Library
Association's online news, ``Councilman Dan Anderson successfully
argued for a council resolution earlier this month that asks the
library to amend its Internet policy to comply with the Children's
Internet Protection Act, to be phased in beginning April 20.'' Several
citizens have voiced complaints regarding adults and children accessing
pornography on the library's computers.\9\ Another recent news story
described how the Camden County, New Jersey Library System decided to
filter every computer due to problems relating to Internet
pornography.\10\
---------------------------------------------------------------------------
\8\ Oder, Norm, ``Minneapolis PL Modifies Net Policy,'' Library
Journal, June 1, 2000, pp. 15-16.
\9\ http://www.ala.org/alonline/news/2001/010402.html
\10\ ``Philadelphia-Area Library Found Internet Filters Far From
Simple,'' The Philadelphia Inquirer, 8 March 2001.
---------------------------------------------------------------------------
viii. deconstructing the anti-filtering arguments of the american
library association
I am well aware of the American Library Association's many
arguments against filters in public libraries and public schools, even
in the case of children's departments. At a few sessions I attended at
the ALA conference in Chicago in July 2000, these points were raised
repeatedly. As the Wall Street Journal stated in an editorial in
September 1999, however, the ALA's ideology ``makes no room for common
sense.'' \11\ One of the Association's primary arguments is that
libraries simply make Internet access available and that parents hold
the sole responsibility of supervising their children when using the
Internet. What this statement does not take into account are the many
responsible parents who do supervise their children but who have no
control over the adult or unsupervised kid accessing a porn site on the
computer next to them. Additionally, by the time a child is of a
certain age, it is neither realistic nor possible to supervise one's
children 24 hours a day. In a speech advocating the mandated use of
filters on tax-funded computers, Senator John McCain stated that
``Parents, taxpayers, deserve to have a realistic faith that, when they
entrust their children to our nation's schools and libraries, that this
trust will not be betrayed.'' \12\
---------------------------------------------------------------------------
\11\ ``Dr. Laura's Theme,'' Wall Street Journal, 3 September 1999,
p. W15.
\12\ U.S., Congress, Senate, Senator John McCain speaking in
support of Amendment no. 3610, 27 June 2000.
---------------------------------------------------------------------------
A second ALA argument against filtering of any kind is that
defending the right of a patron to access a hard core pornography web
site is no different than defending the right of a patron to access
controversial books, music, or videos from library collections. The
Visual and Performing Arts Division in which I work does, in fact,
include books on a handful of artists whose body of work includes
pieces considered controversial. All were carefully selected by
librarians because of the artists' prominence in the established art
world. Most of these books are kept in the closed reference stacks and
patrons must leave an I.D. to use them in the library. I think there is
an obvious difference between these relatively few art books owned by our
department and the thousands of web sites that feature everything from
bestiality to child pornography. If these sites had print equivalents,
I can tell you with certainty that the Chicago Public Library would
never buy them. When filtering advocate and librarian David Burt
offered a free subscription to Hustler magazine to any public library
to prove this point, he had no takers. In a Chicago Sun-Times editorial
regarding Internet access in public schools and Illinois House Bill
1812, writer Dennis Byrne adds, ``might I suggest that if school
administrators and teachers stocked school bookshelves and libraries
with the materials available unfiltered on the Internet, parents would
consider a public lynching.'' \13\ Why, then, do the American Library
Association and some library administrators treat the Internet as an
exception to traditional collection development policies?
---------------------------------------------------------------------------
\13\ Byrne, Dennis, ``Parents Need Help in Fight With Pop
Culture,'' Chicago Sun-Times, 23 August 2000, p. 53.
---------------------------------------------------------------------------
A third argument is that filters don't work. While I do not profess
to be an expert on filters, I have spoken to librarians who work in
libraries with filters on children's computers and even some with
filters on all computers. Everyone knows that no filter claims to be or
is one hundred percent effective, but the librarians who have real
experience with them tell me they suit their purpose quite well. One
library administrator told me that the odds of accessing an
inappropriate site with a filter on is about ``as likely as winning the
lottery.'' The ALA claims that filters give parents a ``false sense of
security.'' As a parent, I can tell you that I would be quite happy
with the odds that the administrator mentioned. In addition, the ALA's
favorite example, that filters block most of the web sites about breast
cancer because of the word ``breast,'' is simply not true.
A fourth argument against filtering or even acceptable use
policies, which prohibit patrons from accessing hard core pornography,
is that only a minority of users actually access objectionable web
sites. My response to this is who is to say how much is too much or too
little? Should the viewing of hard core pornography by children and
adults in public libraries be tolerated on any level? In January 2000,
the Wall Street Journal quoted Sarah Long, the immediate past president
of the ALA, as saying that ``the American Library Association has never
endorsed the viewing of pornography by children or adults.'' \14\ The
editorial continues by saying that the ``problem is, it's never
endorsed their not viewing it, either. Quite the opposite.'' The plain
truth remains that unrestricted Internet access policies permit
numerous instances of porn surfing in libraries across the country. The
few examples I have provided represent only a fraction of the actual
situations witnessed by me and other staff of the Chicago Public
Library. If I had the opportunity to speak to each and every employee,
I am certain that everyone would have their own stories to tell.
Cumulatively, the numbers and situations would be significant. Then
consider the times this must happen on computers with unfiltered
Internet access in other Illinois libraries and elsewhere in the United
States. While some libraries have acted responsibly and at the very
least have installed filters in children's rooms and enforced
acceptable use policies for adults, many have not. The hierarchy of the
American Library Association and some others in the library profession
strongly oppose any state and federal mandates for Internet filtering,
most recently exhibited by their legal challenge to the Children's
Internet Protection Act. I believe they represent a radical view that
is not shared by the majority of librarians or the public. While they
will try to marginalize those of us who do not agree with the official
ALA party line as right-wing extremists, I am proud to say that I have
always considered myself a liberal. And in the end, support of the
Children's Internet Protection Act is not a matter of left or right,
liberal or conservative, but a matter of common sense. It is time for
each and every one of us who is concerned about maintaining a safe and
welcoming environment for all library users to stand up and make our
voices heard.
---------------------------------------------------------------------------
\14\ ``Taste--Review & Outlook: X-Rated,'' Wall Street Journal, 14
January 2000, p. W11.
---------------------------------------------------------------------------
Mr. Upton. Thank you.
Ms. Caywood, welcome.
STATEMENT OF CAROLYN A. CAYWOOD, LIBRARIAN, BAYSIDE AREA
LIBRARY, VIRGINIA BEACH PUBLIC LIBRARY
Ms. Caywood. Thank you. I appreciate this opportunity to
participate in this hearing. My name is Carolyn Caywood. I am
the Bayside Branch Librarian in the Virginia Beach Public
Library, and I have come to tell you how we have handled the
Internet in Virginia Beach, and to answer your questions.
My written testimony will provide more details, and while
those details are specific to Virginia Beach, we have borrowed
from and compared notes with hundreds of other libraries and
schools. So we know that they, too, are working on policies and
processes.
Library boards and school boards are finding what meets
their community's needs, and States are, too. Virginia requires us to have
a policy, and I know that some States even require filters.
I want to make four points. First, decisions about
Internet usage should always be made at the
local level within the bounds of the United States
Constitution.
Library and school boards have this responsibility, and they
exercise it every day. They are the best equipped to make the decisions
that best serve their communities. Second, technology cannot
substitute for an informed community, effective librarians and
teachers, educated families, and trained Internet users.
Third, resources that are devoted to education will be more
effective in protecting our children than will be federally
mandated filters installed at local expense, especially when
that mandate removes the patron's choices.
And, finally, filters do not work the way the CHIPA law
needs them to work. I'm sorry, but I have been pronouncing it
CHIPA for months. I have confidence in our Nation's libraries
and librarians.
Librarians share Congress' concerns underlying the law that
children's experiences on the Internet be safe, educational,
and rewarding. No profession that I know is more concerned
about children's safety, and development, and growth, than
librarians.
We have been unfairly maligned and our position has been
misconstrued by those who are pursuing a different agenda.
Their hype diminishes the concern that every one of the
librarians that I know feels for children as we work on
difficult policy decisions.
Librarians know as well as anybody else that new
technologies can create and exacerbate social issues, and we
deal with this. Virginia Beach receives $25,000 from the E-
rate. We use filters in four ways. First, we have to go with--I
think you would call it KidsNet.
It is a list of selected URLs that are developmentally
appropriate to young children, and they can go to only those
that we have examined and embedded. Second, we block chat. We
agree that chat is not appropriate to library use in our
system.
Third, we provide choice on the other Internet terminals.
You can choose the one that is unfiltered, and you can choose
the one that is filtered according to your needs at the moment.
And the fourth one is that, again using the blocking
ability, we block everything but our library's catalog on the
terminals that are devoted to the catalog. So we use filters in all of
these ways, and yet we would not be in compliance with CHIPA.
We would have to really go back to square one.
We went through a 2 year development process, and we would
need to repeat that to find a new community solution that
complied with the law. I think that CHIPA will have a
devastating impact on the ability of library users to access
constitutionally protected material. I think that it may increase
risks for children whose parents gain a false sense of security
if only those things that Mr. Taylor mentioned are blocked.
This is not what parents are thinking when they think that
their child is using a filtered computer. I believe that
communities must be involved in policy decisionmaking, and
while CHIPA permits some specific choices, it doesn't really
allow for the kind of policy decisionmaking involvement that we
have had in our community.
And it denies local communities the right to determine what
approach they want for their children and families. My branch
has six public Internet access terminals, in addition to the
Kidsnet terminal.
I hope that this hearing will provide us with the first
step toward a dialog about how many other ways we have found
that really work with our communities to handle Internet
access.
[The prepared statement of Carolyn A. Caywood follows:]
Prepared Statement of Carolyn A. Caywood, Librarian, Virginia Beach
Public Library
Thank you for the opportunity to participate in this important
hearing today. My name is Carolyn A. Caywood. I am the Bayside Branch
Librarian in the Virginia Beach Public Library System. My branch serves
a population of 85,000 people and our library system serves a
population of about 450,000 people overall. I have been a librarian for
over twenty-eight years.
I am also a member of the Freedom to Read Foundation Board of
Directors and an active member of the American Library Association
(ALA). However, I am here today in my capacity as a library branch
manager to share with you our experiences in Virginia Beach libraries,
experiences I know to be similar to those of libraries across the
country as they relate to filtering and the implications of the
Children's Internet Protection Act (CIPA) enacted in the last Congress.
As you know, this legislation requires the installation and use by
schools and libraries of technology that filters or blocks Internet
access to various types of images on all computers as a condition of
eligibility for E-Rate discounts or certain technology funding under
the Library Services and Technology Act (LSTA) and the Elementary and
Secondary Education Act (ESEA).
I will leave the discussion of the legal and Constitutional issues
to the attorneys. We are all waiting for the results of the litigation
recently initiated by ALA and others. And, we are all waiting for the
promulgation of rules by the Federal Communications Commission (FCC),
and guidance by the Institute of Museum and Library Services (IMLS) and
the Department of Education to see how the law may be implemented.
The Virginia Beach Public Library System, a department of the City
of Virginia Beach, has developed and implemented its Internet use
policies. While the details are unique to us, our story is similar to
those from hundreds and hundreds of other libraries in the country.
And, the story is comparable also to the K-12 public and private
schools. Communities across the country are already addressing the
issues raised by the Internet. Library boards and school boards have
already grappled with and developed policies and networks that meet the
needs of their communities. Some states, including my state of
Virginia, have their own rules requiring Internet use policies. A few
states require filters of some sort.
I want to make the following points with you in this testimony:
Decisions about Internet usage
policies and procedures should always be made at the local
level within the bounds of the Constitution. Library and school
boards and their communities have the responsibility, which
they are already exercising everyday. They are best equipped to
make decisions based upon the needs, values and resources in
their respective communities;
Technology cannot substitute for an informed community,
effective librarians and teachers, educated families and
trained Internet users;
Resources devoted to education are more effective in the long
run at protecting our children than Federally mandated
filters installed at local expense, especially when that
mandate removes options for patron choices about using filters
or not.
For the record: I want to applaud our Nation's libraries and
librarians. All librarians share the Congress' concerns underlying this
law--that children's experiences on the Internet be safe, educational
and rewarding. No profession is more vitally concerned about children
and their safety, development and growth than our Nation's librarians.
We have been unfairly maligned and our position misconstrued by those
with a different political agenda. Their hype diminishes the concerns
that all of us have for our children as we all struggle to make these
difficult public policy decisions together. Librarians know as well as
anyone else, that, as new technologies proliferate, it is critical that
we balance the extraordinary value they bring to communications and
lifelong learning with responsible, safe use and careful guidance
through education and training.
The core belief of libraries is that knowledge is good. With it,
people can take charge of their future. Librarians take seriously the
First Amendment limits on government, of which we are a part, and we
promote intellectual freedom because that's the only environment in
which learning can thrive. Libraries are not prescriptive; we do not
endorse the contents of our collections or judge the information people
seek. Librarians cannot and should not substitute for parents. These
important Internet decisions must be made by parents.
Libraries are tax-supported institutions generally providing no-fee
public services. We ensure that each person has the opportunity to
learn and discover new ideas and different opinions. In recent years,
that has meant adding Internet access to prevent a Digital Divide
between those with access to electronic information and those without.
Not having Internet access is becoming a form of social
marginalization, but even owning a computer is not enough if a person
lacks the skill to use it effectively. The skill divide is as important
as the economic divide.
I believe the Virginia Beach situation, which is typical of what is
happening across the country, shows how fully and seriously these
responsibilities are taken. On the issue of E-rate and filtering in
Virginia Beach: we get $25,000 from the E-rate. We use filters in three
ways: 1) to present the best web sites for kids; 2) to block chat
rooms; and 3) to provide patron options for Internet searches in the
library branches.
For example, on the ``Kidsnet'' pages of our web site, our library
system uses filters to block everything but the URLs that have been
selected by our library staff. In other words, ALL other URLs are
blocked. Children going to the ``Kidsnet'' site find only materials our
librarians believe are age appropriate and developmentally
appropriate.
We provide ongoing classes and training sessions in the library
branches for different age groups, including family sessions. We
provide an online list of links for parents to learn more about using
the Internet, preferably in conjunction with their children. This list
includes interactive exercises that parents and children can do together
to find out and discuss questions about privacy, using the Internet,
safe web surfing, and so forth. I encourage you to review our web site:
http://www.virginia-beach.va.us/dept/library/families/kid.html
We have had, and continue to have, open, broad and ongoing
discussion within our community about Internet use and when and how we
use filtering. We will continue to apply for the E-rate but we cannot
break faith with our community and the policies it has established
through public dialogue, education, and local decision making. The
relationship between the community and the library in the development
of guidelines for access to the Internet, is extremely important in
Virginia Beach and elsewhere.
As a practicing librarian in a community that has developed a
policy for addressing children's Internet use, I believe that CIPA will
have a devastating impact on the ability of all library users to access
valuable constitutionally protected material. Equally if not more
important, CIPA will actually increase the risks for many children
because filters give parents a false sense of security. What is more,
it strips library boards and local communities of local control and
decision making and will impose extraordinary financial and
administrative burdens on libraries and schools.
As a branch librarian in Virginia Beach, I have had direct
experience with the development and adoption of policies for library
patron access to the Internet. In my experience, the role of the
community in helping to inform and shape a solution is absolutely
critical. My concern with the law is that, while it permits some
discretion for local officials to determine what material is ``deemed
to be harmful to minors'' and what software to use to block content, it
denies local communities the opportunity to determine what approach
will best serve children in these communities in dealing with
challenging content.
It is not just that one solution doesn't fit all communities. It is
also that a Federal mandate on a matter so closely tied to local norms
and values is, in my view, counterproductive and even harmful. The law
may not only discourage communities from doing the hard work to reach
their own solutions and to educate themselves, it also lacks the
legitimacy necessary to foster broad community support.
While no one approach to Internet safety will satisfy everyone in
the community, I believe it is possible, indeed necessary, to work with
the community to fashion a ``bottom up'' approach that respects
community values, to address core concerns and to provide useful
solutions. Not surprisingly, local decision-making processes vary
significantly and the solutions are extremely diverse. But what they
have in common is involvement of the community, understanding of local
norms and values, knowledge of practices that take into account the
information needs of children and teens, and a general good faith
desire to work together to find a solution that respects the diverse
perspectives in the community. Libraries are educating and encouraging
parents and children to work together and have family dialogues about
how best to use the Internet and other library resources by developing
search skills, critical thinking and knowledge of risks and benefits of
using the Internet.
Virginia Beach developed our policies as part of a larger dialogue
on what kind of library services our community wanted and needed. We
started discussing the Internet and filters with the public as early as
1994. We also started a public dialogue about library services as a
whole and how the Internet and other electronic resources fit into this
mix of services. This was done as part of the process we used for
developing long term plans for the expansion, construction and/or
remodeling of our library branches. These public dialogues were
extensive and held throughout the City in a series of eight meetings.
They included discussions of just what the public wanted in terms of the
balance between books and other printed materials vs. electronic
resources.
Starting with these community discussions, our library launched
many Internet education programs for individuals and families. It is
important that our education programs inform all stakeholders about the
Internet and its strengths and weaknesses so that informed decisions
can be made. We continue to provide Internet training for parents and
for families through classes and literature. In this process we
encourage parents to ask whether their children know their own family
values, whether they know and understand how best and safely to search
the Internet, and how to behave online, in chat rooms, and on email.
We discuss with parents that no one sends a toddler out to cross
even a neighborhood street alone. Adults accompany their children and
stay with them at the roadside, until they are mature enough and
trusted enough to cross on their own. As children get older, they learn,
again with more adult training and supervision, how to cross busier
roads. They eventually learn that it is never wise to dash across a
major interstate highway. It just isn't safe. The same type of
incremental education and opportunity can and should be applied to
using online resources.
Our library advisory board, like hundreds of library boards across
the country, has been directly involved in developing and leading the
public discussions that have shaped our policies. Staff at all levels
are also involved. We have provided continuing staff training and
discussion about these issues so that staff understand and feel
comfortable with the community policy. And, because this is a
community-wide issue and we are a department of city government, we
also met with the police department, the sheriff's department, and the
office of the Commonwealth's Attorney during policy development.
We met with the recreation department, the schools, and even the
public works department to inform and explain the community policy. If
someone finds something on the Internet that they think is obscene or
child pornography, we encourage them to go to the police with their
complaint to have it properly investigated. Our policy is not static--
just as the technology is not static. For example, right now we're
amending our polices to deal with instant messaging issues.
In our branch, we have six Internet public access terminals not
counting the terminal devoted to ``Kidsnet.'' Patrons have a choice
about whether to use a terminal that is fully filtered or one without
filtering. One terminal is fully filtered using I-Gear software. We
use its maximum level of filtering on that terminal, which is in
an open desk carrel. There are five other terminals with no filtering.
The unfiltered terminals are designed for maximum privacy so that
no one but that patron can see the screen. We do this in part so that
there is no ``visual startlement'' for any other patrons. You have to
invade their physical space to see what they are looking at. This is
extremely important for all types of users. (Imagine looking up
information about your own cancer treatment and likely prognosis in a
public area.) We respect that different people have different values
and comfort levels. That is why our community developed this flexible
policy that respects patron choice.
Even before we offered public access, we had extensive staff
training and discussion. We are sensitive to the concerns of our
employees to help them understand why and how the policy was developed.
We also have a complaint process, although we remind people that we are
a library, not a court of law; we are not authorized to legally
determine whether something is obscene or not, whether it is
Constitutional or not.
Now with CIPA, those well-reasoned and community-supported outcomes
will be swept away and replaced by a blunt, indeed a crude, instrument
that cannot respect First Amendment freedom, distinguish between the
needs of adults and children, or between the needs of a 7-year-old and
a 17-year-old. The law does not respect the diversity of values of our
communities or the power of concerned adults to find common sense
solutions to protect children. Sadly, the communities that will suffer
most from the CIPA mandate are those where librarians are struggling to
provide the first bridge across the digital divide and most need the E-
rate discounts.
What is expected from librarians under CIPA? Simply put--to do what
cannot be done. As Clarence Page put it so eloquently in a recent
editorial in the Chicago Tribune, CIPA will ``force them to bear the
cost of technology that is expected to do what technology cannot do:
make value judgments about what material may be too pornographic,
hateful, illegal, or violent for human consumption.''
It would be difficult to put a price on the loss of the library as
a ``mighty resource in the free market of ideas'' (6th Circuit 1976).
It would be difficult to put a price on the transformation of the
librarian into a full time content monitor and censor. It would be
difficult to put a price on the replacement of trained librarians and
teachers, working and living within their communities, by a filtering
company which must sell to a national market to make a profit and which
typically refuses to disclose its blocking criteria, its employees'
qualifications, its ``point of view'' or its biases.
Librarians are well aware that Internet access can create or
exacerbate social problems, but we are philosophically committed to
finding answers in humane, not mechanical ways. We look to education,
both for skills and character, rather than to technology, for
solutions. We cannot and should not substitute for parents. It is
precisely because libraries are not a mass medium that we have no way
of knowing what any individual child's parents would choose for that
child. We constantly urge parents to be part of their child's library,
not just Internet, experience because no one knows their child better
or can apply their personal values better. And, we do not want our
parents to have a false sense of security by relying too heavily on
technological measures. The Internet is not the issue--it's people and
behavior that are at issue.
Now, with CIPA, Congress has substituted its judgment for that of
libraries all over our country that have--with their communities--
tackled the tough questions on how best to guide children's Internet
access and reached a diverse set of solutions. When Congress enacted
CIPA, the issue of how best to guide children's Internet access
appeared to be treated as an easy ``yes or no'' decision. In fact, it
is complex and deserves a full range of discussion in the community and
in the Nation. In my experience, those discussions lead people of all
persuasions to recognize that there is no simple answer to this
complicated issue and to encourage us all to work toward a viable
solution.
In the end, the CIPA law forces libraries to make an impossible
choice: submit to a law that forces libraries to deny their patrons
access to constitutionally protected information on the Internet or
forgo vital Federal assistance which has been central to bringing the
Internet to a wide audience. It is because the CIPA law demands that
libraries abandon the essential role that they play in a free society
as the ``quintessential focus of the receipt of information.'' (Third
Circuit 1992) that the American Library Association, the Freedom to
Read Foundation and many local libraries and state library associations
have challenged this law in Federal Court.
Although I do not agree with the decision made by Congress, I am
hopeful that your Subcommittee will recognize the vital role that
libraries play in assisting parents to help their children and
themselves learn to use these marvelous resources in ways consistent
with their family values. Although I believe that CIPA cannot and will
not achieve the goals of the promoters of filtering, and that, in the
process, communities and the First Amendment will be the victims, I am
hopeful that this will start a renewed dialogue between your
Subcommittee, the library community and other stakeholders. I realize
that it is too much to suggest that Congress should revisit this issue
but I believe that we must work together on how best to provide our
children, lifelong learners and students with the skills and the
resources to function effectively and safely in the information age of
the Internet.
Congress must understand that there is no ``one-size-fits-all''
solution that the Federal government can impose that is better or more
thoughtful than the solutions communities adopt. Even as we all wait
for the pending litigation process to be completed, we in the library
community stand ready to work with you and to continue this dialogue.
Mr. Upton. We thank you for your testimony, and again it is
made part of the record in its entirety.
Ms. Getgood, welcome.
STATEMENT OF SUSAN J. GETGOOD, VICE PRESIDENT, EDUCATION
MARKET, SURFCONTROL
Ms. Getgood. Thank you. Chairman Upton and distinguished
members of the subcommittee, I appreciate the opportunity to
speak with you today about Internet filtering technology, the
reasons that so many schools use it, and how it works.
My name is Susan Getgood and I am Vice President for
Education Markets at SurfControl. SurfControl is the owner of
CyberPatrol, the most widely used Internet filtering software
in homes and schools. I have been in the filtering industry for
nearly 6 years, which makes me something of an elder
stateswoman in this area.
CyberPatrol was a member of the plaintiffs' coalition that
successfully challenged the constitutionality of the
Communications Decency Act in 1996. One of the chief arguments
in that case was that filtering technology was more effective
than the law in protecting children from inappropriate content
on line. It still is.
The difference between now and then is that there are
vastly more children on line and the technology is vastly
better. More children are surfing the Internet than ever
before; about 30 million according to the last study.
Educators are well aware of the dangers on the Internet.
Almost all of America's K through 12 schools have Internet
access, many directly in the classroom, and about 60 percent of
these schools already use filtering technology. In deciding to
use filtering technology to safeguard kids, educators have
parents squarely behind them.
According to a 2000 Digital Media Forum study, 92 percent
of Americans thought that pornography should be blocked on
school computers, and most educators agree. Filtering software
puts the choice of how and when children should use the web
where it should be: in the hands of parents and educators.
Filtering software in 1996 was, and in 2001 continues to
be, the most effective way to safeguard kids from inappropriate
content on-line, while safeguarding our First Amendment rights.
Filtering software is safety technology, like seatbelts, for
Internet surfing.
Seatbelts are not 100 percent guaranteed to save a child's
life, but there is not a parent in America who doesn't buckle
up when they get in the car. In the same way, filtering
technology may not be 100 percent foolproof, but our users say
it is more than 90 percent effective, and they demonstrate
their satisfaction with our product by buying it, installing
it, and renewing it year after year. CyberPatrol's renewal
rate is 90 percent.
Educators know that filtering software is reliable,
effective, and flexible enough to allow them to tailor it to
their specific needs. They also know what filtering technology
is not.
It is not a replacement for the guidance of parents and
teachers. Schools implement filtering technology for many
reasons, and clearly the most compelling reason is the desire
to protect children at school from everything from sexually
explicit content to how to build a bomb and how to buy a gun.
Increasingly, we find that schools are also driven by
issues of legal liability and network bandwidth. Schools are
already filtering, as are some libraries, regardless of any law
or government mandate. We currently have more than 20,000
installations of CyberPatrol in schools, school districts, and
libraries, across the country, filtering over 1 million school
computers.
I have been asked to tell you a little bit about how the
technology works. Despite the widespread use of Internet
filtering technology, there is a great deal of misunderstanding
about how it works.
In the case of SurfControl and CyberPatrol, human
reviewers, who are parents, teachers, and trained
professionals, build lists based on published criteria. We do
use artificial intelligence in the research process, but all
sites added to our list of inappropriate sites have been looked
at by a person.
This is an important point because it means that there is
no confusion over chicken breasts and human ones. Filters used
in schools and other institutions are usually server based and
integrate with existing network users and groups for the ease
of use by the library or the school.
In our case, we offer standalone versions of CyberPatrol
for parents at home, and server-based solutions for schools. In
our product, CyberPatrol, keyword filtering is strictly
optional. It allows more control, including blocking search
engine results, which can often be quite descriptive.
Using keyword filtering can also filter out material that
is not inappropriate, a condition often referred to as a false
positive. Because of this, we offer keyword filtering as a
customizable option in our software, but never as the default
technology used to filter websites.
We are often asked why we don't publish the list. We have
spent thousands of dollars and 6 years of work creating a list
that cannot be duplicated and is proprietary. No one has ever
made a credible business case for revealing the list, and
ultimately a company whose mission is to protect kids is not
going to publish a directory of dirty sites.
I am certain that every company in our industry feels the
same way. Filtering software is very effective. Independent
reviews consistently show our CyberPatrol to be 80 to 90
percent effective in filtering out inappropriate content. That
is much more than a passing grade.
But the ultimate test of the filter's effectiveness is how
well it meets the user's needs. Each parent, each school,
decides how it wants to deploy the filter. And last, since I am
running out of time, just a few comments on CIPA.
We believe that CyberPatrol effectively protects children
from adult material and fully satisfies the Children's Internet
Protection Act's requirement that schools and libraries use such
filtering technology to receive their Federal funds. We also
believe in choice, and believe that it should be up to each
library and school to decide what is best for its patrons.
Some schools mistakenly believe that the ACLU and ALA
lawsuits apply to them. They don't. Many schools are
waiting for the FCC ruling regarding certification on April 20.
For the 60 percent of schools in this country that have
already implemented filtering software, this is a crucial date.
We believe that there is an interesting Constitutional case
regarding mandated filtering in public libraries, and we hope
that the ACLU and the ALA will stick to their legal arguments
and not turn to the erroneous argument that filters don't
work.
Filters do work and they work well. We believe that a
simple self-certification is the best solution. We also think
that a message needs to be sent to schools to let them know
that this lawsuit is not about schools, and we hope that this
hearing and the FCC ruling next month will clear up some of the
confusion. Thank you very much.
[The prepared statement of Susan J. Getgood follows:]
Prepared Statement of Susan J. Getgood, Vice President, Education
Market, SurfControl, Inc.
Chairman Upton, and distinguished members of the subcommittee on
Telecommunications and the Internet, I appreciate the opportunity to
speak with you today about Internet filtering technology, the reason so
many schools use it and how it works. My name is Susan Getgood and I am
Vice President for the Education Market at SurfControl.
SurfControl is the owner of Cyber Patrol, the most widely used
Internet filtering technology in homes and schools. I have been in the
filtering industry for nearly six years, which makes me something of an
elder stateswoman in this arena. Cyber Patrol was a member of the
plaintiffs coalition that successfully challenged the constitutionality
of the Communications Decency Act in 1996, ACLU v Janet Reno. One of
the chief arguments in that case was that filtering technology was much
more effective than the law in protecting children from inappropriate
content online. It still is. The difference between now and then is
that the technology is vastly better. And, there are vastly more
children online that deserve protection.
the growth of the net savvy child
More children are surfing the Net at home and school than ever
before. More than 30 million children in the United States have access
to the Internet, according to the Pew Project on the Internet &
American Life. Once online, these children find a wealth of valuable,
educational and entertaining content. But, as you know, not all online
content is meant for kids. The respected National Center for Missing
and Exploited Children estimates that 25 percent of children are
exposed to unwanted and inappropriate content online.
Educators are well aware of the dangers. Almost all of America's K-
12 schools have Internet access, many directly in the classroom and of
these, about 60 percent of schools already use some sort of filtering
device, according to Quality Education Data.
In deciding to use filtering technology to safeguard kids,
educators have parents squarely behind them.
parents and educators speak out
A 2000 Digital Media Forum survey found that 92 percent of
Americans thought pornography should be blocked on school computers.
A Middle and High School Computer Lab Director at Silver Creek
Central School District in New York was recently quoted in the press
talking about the school's wake-up call, and why it decided to buy and
install filtering software. The educator said:
``I checked the history of each computer daily and was appalled
at the Web sites our students were able to access. Students
were visiting sexually explicit sites, gambling, applying for
credit cards, buying products with their parents' credit cards,
sending for free stuff and talking to strangers via chat
rooms.''
Ray Tode, School Technology Officer for Andover, Massachusetts
schools, uses SurfControl's Cyber Patrol:
``The Internet is an important tool for the classroom. But with
the Internet comes inappropriate sites. So we want to filter
out those inappropriate sites to protect our students.''
about filtering software
Filtering software puts the choice of how and when children should
use the Web where it should be . . . in the hands of parents and
educators. Filtering software in 1996 was, and in 2001 continues to be,
the most effective way to safeguard kids from inappropriate Web content
while safeguarding our First Amendment rights of free speech.
Filtering software is safety technology, like seatbelts, for
Internet surfing. Seatbelts aren't 100% guaranteed to save a child's
life, but there's not a parent in America that doesn't buckle their
child's seatbelt when the family gets in the car. Similarly, filtering
technology may not be 100% foolproof, but our users say it is more than
90 percent effective and they demonstrate their satisfaction with our
product by buying it, installing it and renewing their subscriptions
year after year. Cyber Patrol's renewal rate is over 90%.
Educators know that filtering software is reliable, effective and
flexible enough to allow them to tailor it to their specific needs.
They also know what filtering technology is not. It is NOT a
replacement for the guidance of parents and teachers.
surfcontrol
SurfControl is a leading provider of Internet filtering solutions
for homes, schools and businesses. It acquired SurfWatch in 1999 and
Cyber Patrol in 2000. Both of these companies were pioneers in the
Internet filtering industry.
Because SurfControl provides filtering products for all major
sectors--business, education, home and other technology companies--it
understands why each market deploys filtering software.
At home, parents purchase filtering software to protect their
children from inappropriate content online. Corporations implement
filtering software to maximize employee productivity, protect the
company from legal liability arising from potential sexual harassment
and preserve network bandwidth and security.
Schools implement filtering software for ALL of these reasons.
Clearly, the most compelling reason is the desire to protect children
at school from everything from sexually explicit content to how to
build a bomb and how to buy a gun. Increasingly, we are finding that
schools are also driven by the issues of legal liability and network
bandwidth.
This was confirmed by a recent survey we conducted asking 1200
customers how important network bandwidth was in their Internet
management this year. About 70% of the schools said that network
bandwidth was important or very important this year. This compares to
only 55% that noted its importance last year. The growing need to
better manage bandwidth in schools has been given additional importance
with the popularity of file sharing services like Napster and the
widespread use of streaming video.
What this means is that the majority of schools were already
filtering and now even more find it an important Internet management
tool--irrespective of any law or government mandate.
We currently have more than 20,000 installations of Cyber Patrol in
schools and school districts, filtering over 1 million school
computers. Business is booming.
how web filtering works
Despite the widespread use of Internet filtering technology and its
longevity in the marketplace, a great deal of misunderstanding exists
about how it actually works. The most commonly used filters in schools,
like SurfControl's Cyber Patrol and N2H2's Bess, are category list-
based products that filter by IP address or domain name.
In the case of Cyber Patrol, human reviewers, who are parents,
teachers and trained professionals, build the lists based on published
criteria. We use artificial intelligence in the research process, but
ALL sites added to our CyberNOT list of inappropriate content have been
reviewed by a person. This is an important point because it means there
is no confusion over chicken breasts and human ones.
Some products filter at the root, or domain, level. More
sophisticated filters like Cyber Patrol allow restrictions to be set at
directory or page levels, so you don't have to restrict an entire
website if one page contains inappropriate content.
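    The difference between domain-level and directory- or page-level
restrictions described above can be sketched in a few lines of code. This
is an illustrative model only, not Cyber Patrol's actual implementation;
the block-list entries and URLs are hypothetical.

```python
from urllib.parse import urlparse

# Hypothetical block list: an entry may name a whole domain or a
# specific directory within a site. Real products maintain far larger
# lists, compiled by human reviewers against published criteria.
BLOCKED_DOMAINS = {"adult-example.com"}
BLOCKED_PATHS = {("example-news.com", "/adult-section")}

def is_blocked(url: str) -> bool:
    parsed = urlparse(url)
    host, path = parsed.hostname or "", parsed.path
    # Domain-level rule: every page on the host is restricted.
    if host in BLOCKED_DOMAINS:
        return True
    # Directory/page-level rule: only the listed path is restricted,
    # so the rest of the site stays available.
    return any(host == d and path.startswith(p) for d, p in BLOCKED_PATHS)

print(is_blocked("http://adult-example.com/anything"))        # whole domain blocked
print(is_blocked("http://example-news.com/adult-section/x"))  # one directory blocked
print(is_blocked("http://example-news.com/sports"))           # rest of site allowed
```

    The point of the page-level lookup is the one made above: a single
inappropriate page need not make an entire website unavailable.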
The CyberNOT list is divided into 12 categories: Violence/
Profanity, Partial Nudity, Full Nudity, Sexual Acts, Gross Depictions,
Intolerance, Satanic/Cult, Alcohol & Tobacco, Drugs/Drug Culture,
Militant/Extremist, Sex Education, and Questionable/Illegal & Gambling.
Other products used in schools offer similar categories.
Filters used in schools and other institutions are usually server-
based and integrate with existing network users and groups for ease of
administration and security. In our case, we offer standalone versions
of Cyber Patrol for parents at home and server-based solutions for
schools. A new version of Cyber Patrol has been created for Microsoft's
ISA Server, the latest technology for Internet servers.
In Cyber Patrol, keyword filtering is strictly optional. It allows
more control, including blocking search engine results which can often
be QUITE descriptive. Using keyword filtering can also filter out
material that is not inappropriate, a condition often referred to as a
false positive. Because of this, we offer keyword filtering as a
customizable option in Cyber Patrol but never as the default technology
used to filter websites.
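    The false-positive risk that makes keyword filtering an option rather
than a default can be illustrated with a small sketch. The keywords and
URLs here are hypothetical, and this naive substring match is not Cyber
Patrol's actual logic.

```python
# Hypothetical keyword filter: naive substring matching over the URL.
KEYWORDS = {"breast", "xxx"}

def keyword_blocked(url: str) -> bool:
    lower = url.lower()
    return any(keyword in lower for keyword in KEYWORDS)

# The keyword rule catches an adult site...
print(keyword_blocked("http://example.com/xxx-pics"))
# ...but it also blocks a recipe page and a health resource: false
# positives that a human-reviewed category list would not produce.
print(keyword_blocked("http://cooking.example.com/chicken-breast-recipes"))
print(keyword_blocked("http://health.example.org/breast-cancer-screening"))
```

    All three calls return True, which is exactly why a list reviewed by
a person distinguishes chicken breasts from human ones where a bare
keyword match cannot.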
Typically, filtering software is sold as a subscription that
includes the right to use the software for a specified number of users
and a subscription to the vendor's list of inappropriate sites. As an
example, a 100-user license of Cyber Patrol would cost a school about
$1500 per year. We also offer schools an e-rate discount to help
compensate for the fact that e-rate funds cannot be used for filtering
software.
why don't you publish the list?
We are often asked why we don't publish the list of inappropriate
sites. SurfControl has spent thousands of dollars and six years of work
creating a list that cannot be duplicated and is proprietary. No one
has ever made a credible business case for revealing the list, and
ultimately, a company whose mission is to protect kids is not going to
publish a directory of dirty sites. I am certain that the other
companies in our industry have similar feelings.
how effective is filtering software?
Filtering software is very effective. Independent reviews
consistently show SurfControl's Cyber Patrol to be 80 to 90 percent
effective in filtering out inappropriate content. That's much more than
a passing grade.
But the ultimate test of the filter's effectiveness is how well it
meets the user's needs. Each parent, each school decides how it wants
to deploy the filter. The most commonly used filters like Cyber Patrol,
Bess and Net Nanny allow users to make their own choices about what is
restricted or allowed. The user can choose which categories to use,
customize filtering levels to individual kids or classes and even
create their own list of content to be restricted or allowed. For
example, with Cyber Patrol a school can restrict all sexually explicit
content for younger children and allow our Sex Education category,
which includes important resources like Planned Parenthood, for older
children.
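    The per-group customization described above--restricting all sexually
explicit categories for younger children while allowing the Sex Education
category for older ones--might be modeled as follows. The category names
follow the CyberNOT list quoted earlier, but the policy table itself is
hypothetical, not a vendor default.

```python
# Category profiles keyed by user group. A site carries one or more
# list categories; a group's profile names the categories it blocks.
BLOCKED_BY_GROUP = {
    "elementary": {"Partial Nudity", "Full Nudity", "Sexual Acts",
                   "Sex Education"},
    # Older students keep the explicit categories blocked but may use
    # Sex Education resources.
    "high_school": {"Partial Nudity", "Full Nudity", "Sexual Acts"},
}

def allowed(group: str, site_categories: set) -> bool:
    # A site is allowed only if none of its categories is blocked
    # for this group.
    return not (site_categories & BLOCKED_BY_GROUP[group])

print(allowed("elementary", {"Sex Education"}))   # blocked for younger children
print(allowed("high_school", {"Sex Education"}))  # allowed for older children
```

    The design choice this illustrates is that the filter ships the
categories, but the school chooses which ones apply to which children.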
Filtering software, including the server-based software used in
schools, is highly tamper resistant. It is also designed to be easy to
use, for the busy school technology coordinator, and easy to customize,
to satisfy the teachers who need adjustments made to meet educational
goals.
Ultimately, in a competitive market economy, companies like ours
are successful because we offer products that meet the needs of our
customers. Our customers require, and get, the best tools possible for
managing Internet access and our development team works every day to
constantly improve the technology.
surfcontrol's cyber patrol satisfies cipa
SurfControl's Cyber Patrol software effectively protects children
from adult material online and fully satisfies the Children's Internet
Protection Act's requirement that schools and libraries receiving
federal aid for Internet service use such filtering technology.
Thousands of schools and libraries nationwide have been using Cyber
Patrol and other filters for years. Our focus was and continues to be
on schools, not libraries. We do not market to libraries. But we do
believe in choice. We believe it should be up to each local library to
decide what is best for its patrons.
Cyber Patrol does not have separate categories for ``Child
Pornography'' or ``Obscene by Legal Definition.'' These are legal terms
requiring interpretation by attorneys and the courts. But Cyber Patrol
does block illegal and pornographic material. It also filters obscene
speech that has been defined by the courts. And we filter other online
material that many people deem inappropriate for children, such as
gambling, violence, hate speech, cults, alcohol and tobacco. Using the
custom list capability, any user could also create their own
restrictive list, for example, of sites determined by a local court to
be obscene.
The ACLU and the ALA have an interesting constitutional case
regarding mandated filtering in public libraries. We had hoped they'd
stick to the legal arguments, and not turn to the erroneous argument
that filters don't work. Filters do work, and they work well. But they
have not stuck to the legal case and the result has been some
confusion.
Some schools mistakenly believe the ACLU and ALA lawsuits apply to
them. They don't. Many schools are waiting for the FCC ruling regarding
certification on April 20. For the 60% of schools in this country that
have already implemented filtering software, this is a crucial date.
We believe that a simple self-certification is the best solution.
We think that a message needs to be sent to schools to let them know
the lawsuit is not about schools. This hearing and the FCC ruling may
help clear up some of the confusion.
Filtering software products like Cyber Patrol are technical
solutions to help implement school policy and choice. SurfControl makes
the software; our users make their own choices about how they will use
it in their home, school or business. Our job is to meet the needs of
our users and we will continue to do so as those needs, and the
Internet itself, change and evolve. Thank you.
Mr. Upton. You did very well speaking very fast. I would
just note that your entire remarks are made part of the record,
and for purposes of an introduction, I yield to a good friend
and Member of the Subcommittee, Mr. Largent, from Oklahoma.
Mr. Largent. Thank you, Mr. Chairman. I want to welcome a
friend and a constituent from Tulsa, Chris Ophus. He is the
President of FamilyConnect, an Internet filtering service that
we employ in our own home, and, Chris, we want to welcome you
to the Subcommittee and look forward to your testimony.
STATEMENT OF CHRISTIAN OPHUS, PRESIDENT, FAMILYCONNECT
Mr. Ophus. Thank you. I appreciate it. As Congressman
Largent said, my name is Chris Ophus. I am co-founder and
president of a company called FC Technologies, which deals
specifically in filtering technology and in creating workable
solutions.
I am also currently serving as the president of the
Internet Safety Association, which is a group of Internet
filtering technology companies that have come together to
offer solutions, in particular with respect to what this bill
requires.
Like Ms. Getgood, with whom I agree, I have such a
tremendous amount of information that there is no way to cover
it all, so what I would like to do is just cover a couple of
things.
The Internet is without a doubt the most distinctive mass
medium out there. It is a convergence of all the mass media--
radio, television, print, mail. They are all coming together as
one.
And because it is an emergent technology, a unique set of
problems has been created by its open forum, its open software,
and the way the technology works.
But part of the problem here is that we are seeing an
attack on Internet filtering on First Amendment grounds. The
First Amendment is a sacred cornerstone of our government, and
we all believe in it, but there are obvious exceptions:
obscene, illegal, and harmful-to-minors material.
My big question is: why is there all this controversy?
What makes the Internet, as it is used in public schools and
libraries, immune to some of these existing laws that are
already in place?
You have got to have exceptions. For child pornography and
a lot of the violence, rape, molestation, and those types of
things, filters really are the best way to block the bad and
keep the good.
Now, I will say, as would anybody else who works in the
filtering industry, that filters are not foolproof. But one of
the other things we have been seeing a lot of is that there are
all different kinds of filtering products.
There are some that are client side, and some that are
server side, and Ms. Getgood mentioned that. In a recent
Consumer Reports article--and Mr. Johnson mentioned that just a
few moments ago--there was a test done.
And the test basically covered about six products on the
consumer side, and the results were very negative. Because of
that report, I sent a letter to the editor, David Hyme, of
Consumer Reports, outlining some of my concerns with his
report: the small sample size; the unknown criteria, and
whether or not the sample was random; and testing only 6 of the
141 products that GetNetWise.org lists.
And also not testing educational filters. I have a copy of
the letter that he returned to me, and in the next-to-last
paragraph of his response he said, ``We are, however, guilty of
testing only so-called client side software. Since our
founding, we focused on testing products available to
consumers. It is not part of our brief to test software sold
exclusively to schools and libraries.''
Now, the Consumer Reports article is being cited by the
American Library Association and the ACLU as proof positive
that filtering does not work, when in fact what happened is
that Consumer Reports tested some of the lower-level filtering
systems and painted the entire filtering industry with that
brush, and that simply is not true.
The truth is that filtering does work. I would like to make
another comment regarding what was said earlier, and I will include
it in my testimony, because I feel that it is very important. And
that is regarding education: the idea that education alone, or
education combined with other approaches aside from technology
protection measures, can somehow protect us.
Take the example of driver's education. We have driver's
education, and all the driver's education in the world is not going
to stop teenagers, or even adults, from getting into accidents, and
Ms. Getgood even mentioned the seat belt laws.
We have laws that are going to try to do the best that they
can to be able to control and manage this kind of technology. I
would like to quote another gentleman, Christopher Hunter, who
was one of the COPA panelists, as was also mentioned.
And he said that the majority of the reports about Internet
content filters being both under and over inclusive--he was
talking about blocking--are from journalists and anti-
censorship groups who have used largely unscientific methods to
arrive at the conclusion that filters are deeply flawed.
If you look at some of the other testing that is done out
there, there have been some larger tests, some comprehensive
tests, one by David Burt, in ``Dangerous Access 2000,'' where a
particular filtering product was used in the public library in
Cincinnati over a long period of time, with a large sample, and it
was found that the product wrongly blocked sites only .019 percent
of the time. There have been similar studies in other libraries
that have done this type of thing. So, I would say in
conclusion that there is a crying need for Internet filtering
out there.
And of all the decisions that need to be made by
this subcommittee, certainly whether or not filtering is
effective should not be one of them. The technology exists:
artificial intelligence, computer spidering, human review, and
millions of websites categorized in databases. All of those things
come together to provide the effective tools that librarians and
educators need, and already have, to be effective. Thanks.
[The prepared statement of Christian Ophus follows:]
Prepared Statement of Christian Ophus, President, Internet Safety
Association
introduction
    My name is Christian Ophus. I am the co-founder and President of
FamilyConnect, Inc. and S4F Technologies, Inc., a filtering technology
provider founded in 1997 and headquartered in Tulsa, OK.
In addition to my corporate duties, I currently serve as President
of the Internet Safety Association, founded in September 2000 and
headquartered in Washington D.C. The ISA (Internet Safety Association)
was created by leaders in the Internet Content Management Industry to
promote safe use of the Internet for all users.
I would like to thank the U.S. House of Representatives Committee
on Energy and Commerce, Subcommittee on Telecommunications and the
Internet for inviting me to submit testimony.
I will focus my comments specifically on filtering & Internet
content management technology, offering background, current approaches
and tools, and future developments.
technology protection measures--why are they necessary?
The Internet is truly the most comprehensive and unique mass medium
in the history of communication. The Internet is rapidly becoming the
convergence of all other forms of communication. Television, radio,
print, postal service and telephone service, are all available via the
Internet. But even more amazing, is that the Internet has become the
new backbone of these other communication mediums, ensuring that the
Internet industry is here to stay. Our dependency upon this new medium
has flourished, especially in the past decade. The Internet is an
emerging technology that has its own set of problems.
The Internet is essentially an open network with a common language
that allows anyone worldwide to access and transmit information. It is
essentially a public forum, which fosters the free transmission of
information and ideas.
One of the sacred cornerstones of the founding fathers was to
preserve the free transmission of ideas and information. That is why
the very first amendment covered this issue. However, there are obvious
exceptions to the first amendment. Information that is obscene, illegal
and harmful to minors is not protected under the first amendment.
Outside of the Internet, this type of information in any other medium
is prosecutable under existing laws and regulations. Why illegal
content on the Internet has become so controversial is puzzling. One
might ask: What makes the Internet immune to existing
laws and statutes that are already in place to protect individuals from
material that is deemed detrimental in nature?
Although the Internet is a viable tool for business, education and
commerce, there is a significant amount of obscenity and illegal
information. The goal is to limit access to this type of material
without affecting the overall Internet experience for the user.
Filtering technology is the best alternative to solving these issues.
Historically, there has been controversy concerning the
effectiveness of filters. The rapid growth and dynamic nature of the
Internet make Internet filtering a constant moving target.
In the mid-nineties, a few companies emerged in an effort to offer
technological solutions to the ever-expanding problem of detrimental
and illegal activity on the Internet.
The first approach relied on artificial intelligence to block
access to pornographic or objectionable material. These systems were
based on keyword filters that would filter incoming data and look for
words such as ``sex'', ``XXX'' or ``breast''. This type of approach
was, in fact, good at identifying pornographic & illegal websites, but
inadvertently blocked legitimate site searches such as ``Middlesex'',
``Super bowl XXX'' or ``chicken breast recipes'', etc. To solve this
problem, new ways of filtering would have to be developed.
Many opponents of filtering use the argument that filters still
make these kinds of mistakes. Today's technology has risen far above
these early products by using computers that scour the Internet coupled
with human review to ensure a high level of accuracy.
    In fact, today's technology protection measures are more advanced
than ever before. Not every filtering product is the same. Just as
there are different types of automobiles, some filters have more
features than others, some are more expensive, and some were created
with specific purposes in mind. If your desire were to race in the
Daytona 500, you would not drive a Yugo. If your goal were fuel
economy, you would not drive a Hummer. Similarly, there
are different types of filters for different objectives. Some are less
expensive and offer less protection and less control. At the same time,
there are filtering products that have been specifically designed to
operate in a more commercial application such as large corporations,
schools and libraries.
To ensure successful lasting implementation of a technology
protection measure, you must fit the product with the application.
Opponents of filtering have misled the public into believing that
filtering does not work, or more accurately, does not work well. The
justification for this claim has been a few isolated studies where the
testing criterion is questionable and the results generalized.
In March 2001, Consumer Reports published an article about
filtering technology where 6 off-the-shelf filtering products were
tested. The results indicated that the tested products did poorly when
the testing criterion was applied. The article proceeded to question
the government's imposition of filtering on schools and libraries
through the Children's Internet Protection Act, citing that the test
results were clearly negative.
In response to the article, I wrote the editor of Consumer Reports
on February 23, 2001 and questioned the products tested and the
criterion used to test the effectiveness. Here is an excerpt of that
letter:
``First, the objectionable content site sample used, 86, was
obviously but a small fraction in comparison to the vast number
of adult and illegal websites on the web. To effectively test
any filter, a more appropriate sample might have been 10,000 or
even higher.
Second, a thoughtful set of criteria should be established in
the selection of sites to be tested to ensure that the sites
chosen are a statistically accurate representative sample of
the range and type of objectionable sites found on the web.
Your article did not indicate what criterion, if any, was used
to determine which 86 sites were to be used. For example, we do
not know if the author searched for 86 obscure sites or chose a
random sample from a popular search engine. The answer to that
question would dramatically affect the outcome of your informal
survey.
Third, only six of the 141 filter-related products listed on
the popular information website www.getnetwise.org were tested.
The products tested, with the exception of AOL's parental
controls, are client-side products. No server-side filter
systems were tested. Also, some of the most popular filter
programs were not included in the test.
Fourth, none of the filters tested are those typically used
in the educational space. Filters such as N2H2, X-stop, I-gear,
S4F and Web Sense were not even mentioned, and these products
represent the vast majority of the access-control market share.
Would it not be reasonable to test those products that are most
commonly used and perhaps those who have made the greatest
advancement in creating solutions that work for everyone?
Fifth, the test conducted did not include one of the most
important aspects of filtering, the ability of the software to
be overridden or bypassed by web-savvy kids. A filter can be a
false sense of security to a parent or educator if it can be
easily bypassed. Features such as this contribute greatly to
the overall value and effectiveness of a filter.
I hope you can see how these seemingly innocent oversights
lead to erroneous, generalized conclusions. The fact is, there
have been significant advancements by many companies even in
the past year that validate the claim that filtering works and
is effective in protecting children from illegal and dangerous
information.''
In response to my letter, I received a return letter dated March
7th, 2001 where the editor admitted that the products tested were from
the consumer level and not those used in the educational space.
``We are, however guilty of testing only so-called client-
side software. Since our founding in 1936, we've focused on
testing products available to consumers at the retail level. It
is not part of our brief to test software sold exclusively to
schools or libraries. By analogy, we would test garden hoses,
sponges and auto polish, but not commercial car-wash
equipment.''
I encouraged the editor to consider a more comprehensive test where
some of the more popular and broadly used filters could be included. I
am sure the results would be entirely different.
David Burt, in his written testimony before the COPA commission in
July of 2000, cited several larger studies of Internet filtering
products where the outcome of filtering effectiveness was quite
different.
    In the Dangerous Access, 2000 edition by David Burt, the filter
product Bess, used at the public library in Cincinnati and Hamilton
County, wrongly blocked sites only .019% of the time.
A study by Michael Sims ``Censored Access in Utah Public Schools,
1999'' found error blocking rates at .036%. These numbers are a far cry
from so-called tests being highlighted by filtering opponents.
Christopher Hunter, a COPA panelist said:
``The majority of reports about Internet content filters
being both under inclusive and over inclusive have come from
journalists and anti-censorship groups who have used largely
unscientific methods to arrive at the conclusion that filters
are deeply flawed.''
current approaches to content filtering
There are two typical approaches to filtering--inclusion filtering,
and exclusion filtering.
Inclusion Filtering--White Lists
With inclusion filtering, Internet users are permitted access to
particular ``allowed'' sites. This type of filtering can be 100%
effective--assuming the person or organization that compiled the
white list shares the same set of values as the Internet user.
Because of the global nature of the Internet, it is difficult to
create such a list with a globally accepted set of criteria. The
main drawback of
inclusion filtering is that the ``acceptable list'' would have to be
enormous to be accurate. The creation of a blocked list tends to be
more manageable.
Exclusion Filtering
Exclusion filtering is based on black lists (or block lists) of
objectionable sites. This is a more common form of filtering than
inclusion filtering, and has the advantage that black lists will
invariably be smaller than white lists. A second advantage is that
unrated sites are presumed to be innocent till proven guilty, and so do
not need to be automatically excluded.
Both types of content filtering require a constant effort to
maintain a valid and updated list for use by the user. The most
effective approach is to use the benefit of computer technology,
coupled with unique capabilities in human review.
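In pseudocode terms, the two list-based approaches can be sketched as follows; the host names and lists here are purely illustrative stand-ins, not any vendor's actual data.

```python
from urllib.parse import urlparse

# Hypothetical illustrative lists -- real products maintain databases
# of millions of categorized sites, updated continuously.
WHITE_LIST = {"www.loc.gov", "www.nasa.gov"}
BLACK_LIST = {"badsite.example"}

def allowed_by_inclusion(url):
    """White list: permit only sites explicitly on the allowed list."""
    return urlparse(url).hostname in WHITE_LIST

def allowed_by_exclusion(url):
    """Black list: presume sites innocent until proven guilty; block
    only sites explicitly on the blocked list."""
    return urlparse(url).hostname not in BLACK_LIST
```

The asymmetry described above is visible here: the white list rejects every unrated site, while the black list admits them.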
what content can be blocked?
In the early days, companies offered 1 or more categories of
blocked sites, offering little or no control to the end-user. Today,
most companies offer multiple categories and varying levels within
these categories, giving complete control and flexibility of
application to the end-user.
Some filtering providers offer as many as 35 categories allowing
the administrator complete local control over what is being blocked.
Here is an example of a typical category listing from N2H2:
Adults Only, Auction, Chat, Drugs, Education, Electronic
Commerce, Employment Search, For Kids, Free Mail, Free Pages,
Gambling, Games, Hate/Discrimination, History, Illegal, Jokes,
Lingerie, Medical, Message/Bulletin Boards, Moderated, Murder/
Suicide, News, Nudity, Personal Information, Personals,
Pornography, Recreation/Entertainment, School Cheating Info,
Search, Search Terms, Sex, Sports, Stocks, Swimsuits,
Tasteless/Gross, Tobacco, Violence, Weapons
    Most of the above categories are not classified as illegal or
detrimental in nature, but they give the user a wide range of control
when determining what information is appropriate for the viewer or,
more commonly, for the application in which the filter is being used.
An employer may want to block access to job sites or other non-work
related sites to reduce employee Internet abuse in the workplace.
Several studies have indicated that loss of productivity from Internet
use has cost employers billions of dollars each year.
    The point is that filtering products today offer the user a wide
range of options and combinations that allow the user to determine
what is and is not blocked. In the educational space, the local
school board can determine what information is appropriate to block
based upon community standards, federal laws and the individual
states' harmful-to-minors statutes.
how filtering is accomplished
There are several approaches to filtering content. As technology
has progressed, the most effective methods have been improved, new ways
to filter have been developed and many products have taken the best
features from each approach and created a hybrid of several methods.
There are four primary methods that are used in varying degrees.
URL Filtering
This is the most common, and most effective form of filtering, and
involves the filtering of a site based on its URL (i.e. its address).
It provides more fine-grained control than packet filtering, since a
URL can specify a particular page within a large site, rather than
specifying the IP address of the computer that hosts the Content.
S4F Technologies adds an average of 5,000--7,000 new URL's to its
database each week. Computer spiders scour the Internet using a
sophisticated search mechanism that collects potential sites for human
review. Spidering computers run programs that systematically read
through the World Wide Web and collect URL's (Uniform Resource
Locators) that match a particular set of criteria established by a
filtering department. These computers can run 24 hours a day and collect
potential candidates to be added to the database. However, spiders are
not perfect, and using spiders alone as the mechanism for fortifying a
blocked site database would result in overblocking. That is why human
review must be used when accurately building a blocked database.
During the human review process, using custom browsers, sites can
be positively identified and properly added to the database. As soon as
a site is added, it is active in the blocked list for all to use. If a
site is inadvertently blocked, it is reviewed and a decision is made
within 24 hours at the most. If the site contains child pornography, it is
automatically forwarded to the National Center for Missing and
Exploited Children.
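The URL lookup itself can be sketched in simplified form; the blocked-URL entries below are hypothetical illustrations, showing how a URL rule can block a single section of a site while leaving the rest of the host accessible, which an IP-level block cannot do.

```python
from urllib.parse import urlparse

# Hypothetical blocked-URL database: entries may name a whole host,
# or only a specific path prefix within an otherwise acceptable site.
BLOCKED_URLS = {
    ("badsite.example", "/"),          # entire site blocked
    ("bighost.example", "/adult/"),    # only one section blocked
}

def is_blocked(url):
    """Return True if the URL falls under any blocked host/path prefix."""
    parsed = urlparse(url)
    host, path = parsed.hostname, parsed.path or "/"
    return any(host == h and path.startswith(p) for h, p in BLOCKED_URLS)
```

Under this sketch, bighost.example/adult/ pages are blocked while bighost.example/news/ pages remain reachable, the fine-grained control the passage describes.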
One of the challenges facing filtering departments is managing the
constant change of the Internet. When a website is reviewed, it may not
contain obscene material, but at some later point, the author of the
website may change the content that now would be considered
inappropriate. Conversely, a site with content that may have at one
time been considered pornographic or illegal could change and be
perfectly acceptable. So, in addition to keeping up with new sites that come
online daily, filtering departments must constantly review those sites
that are already categorized.
Considering the ongoing task of Internet content data management,
coupled with the constant change in the Internet snapshot, filtering
companies do an amazing job of keeping up.
Keyword Filtering
Keyword filtering was the first generation of filtering. With
keyword filtering, content is scanned as it is being loaded into a
user's computer for keywords, which are included in a black list. A
site is blocked if it contains any of the words in the block list.
The advantage of keyword filtering is that it adds very little
computational overhead. The main disadvantage is that it checks text
only and cannot block objectionable pictures; in addition, some
products' filters are indiscriminate, as context is not taken into
account.
    However, one of the advances of S4F Technologies is the
development of IKSSB (Intelligent Keyword Search String Block out),
in which the keyword component operates as a secondary line of
defense to the primary specific URL block-out database and has the
ability to decipher the difference between a website containing
pornography and one whose text merely contains the word pornography.
For example the IKSSB can differentiate between searches for
``breast'' and ``chicken breast recipes'' or another example, the
difference between ``sex'' and ``Middlesex, England''. Both of these
examples have been tirelessly used by opponents of filtering to claim
that keyword filters can block useful sites.
                        IKSSB Keyword Search String Examples
------------------------------------------------------------------------
Blocked keyword         Permitted search strings
------------------------------------------------------------------------
Sex                     Middlesex, England; Sexually Transmitted
                        Diseases; Sex Education; Sextant
Breast                  Chicken Breast; Breast Cancer
------------------------------------------------------------------------
Although S4F uses this filter component as a secondary line of
defense, it exhibits the technological adaptation of filtering
companies to remedy earlier filter problems. Technical issues regarding
filters have been overcome by most leading companies in the filtering
space.
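A simplified sketch of this kind of context-aware keyword check follows; the keyword and phrase lists are hypothetical stand-ins for the proprietary IKSSB data, not the actual algorithm.

```python
# Hypothetical lists standing in for the proprietary IKSSB database.
BLOCKED_KEYWORDS = {"sex", "breast"}
PERMITTED_PHRASES = {
    "middlesex", "sexually transmitted diseases", "sex education",
    "sextant", "chicken breast", "breast cancer",
}

def search_string_blocked(query):
    """Block a search string on a bare keyword hit, unless the hit
    occurs inside a known-innocent phrase."""
    q = query.lower()
    # A query containing a permitted phrase is allowed through.
    if any(phrase in q for phrase in PERMITTED_PHRASES):
        return False
    # Otherwise, block if any blocked keyword appears in the query.
    return any(kw in q for kw in BLOCKED_KEYWORDS)
```

This is how a filter can pass ``chicken breast recipes'' or ``Middlesex, England'' while still stopping the bare keywords.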
Packet Sniffing
Content is delivered over the Internet in packets of information.
Each packet has the IP address of where it is going to, as well as the
IP address of where it has come from. Packet sniffing involves
examining the IP address of where the Content has originated. This
approach moves the point of filtering to the level of the router
offering increased speed and efficiency. There are several companies
that are developing packet-sniffer products at this time.
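In simplified form, a packet-level filter at the router inspects each packet's source address before forwarding it; the blocked networks below are illustrative documentation addresses (RFC 5737 ranges), not real blocked hosts.

```python
import ipaddress

# Illustrative blocked source networks (RFC 5737 documentation ranges).
BLOCKED_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def forward_packet(source_ip):
    """Router-level decision: drop packets whose source address
    originates in a blocked network; forward everything else."""
    addr = ipaddress.ip_address(source_ip)
    return not any(addr in net for net in BLOCKED_NETWORKS)
```

Because the decision is made at the router on addresses alone, it is fast, but it is also coarser than URL filtering: it can only block a whole host, never a single page.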
Image recognition filtering
A handful of companies have produced filtering products that
examine images as they are delivered to a user. This is a relatively
recent approach, and relies on techniques such as the detection of skin
tones, or indeed on the analysis of images themselves. It is
computationally quite intensive, and computers will invariably
experience difficulty in distinguishing between art and pornography. A
photograph that is artistic in nature cannot be distinguished from that
of obscenity. These types of value judgments can only be made by human
review. Video and other streaming media further complicate the
filtering task by supplying a constant flow of images to be examined
for undesirable content.
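As a toy illustration of the skin-tone detection mentioned above, one widely cited rule of thumb classifies an RGB pixel as skin-colored and flags an image when too many pixels qualify; the thresholds are a generic heuristic, not any product's actual algorithm, and, as noted, such a test cannot tell art from obscenity.

```python
def is_skin_pixel(r, g, b):
    """A commonly cited RGB rule of thumb for skin tones (approximate)."""
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and abs(r - g) > 15)

def flag_image(pixels, threshold=0.4):
    """Flag an image for human review when the fraction of
    skin-colored pixels exceeds the threshold."""
    if not pixels:
        return False
    skin = sum(1 for p in pixels if is_skin_pixel(*p))
    return skin / len(pixels) > threshold
```

Note that the output is only a flag for human review; the value judgment itself, as the text says, still requires a person.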
where does content filtering occur?
There are four technical components of filtering systems: browser-
based, client-side software, proxy servers, and server-side filtering
servers.
No filter is foolproof. There are 146 filtering tools listed on the
popular website www.getnetwise.org. Each of these products essentially
falls into one of the categories below. It is important to note that no
filtering system is designed to work well in every application.
Some of the lower-end products would not be recommended for use in
schools and libraries because they lack the specific features that
educators need to create the best filtering scenario for their school,
library, and community. Conversely, products used in the corporate
space may need more flexibility of categories, while schools and
libraries might be interested only in blocking sites that fall into
the obscene, illegal, and harmful-to-minors categories, and parents
might have other desires still.
Client-Side Software--This type of method is typically marketed at
the consumer level. Filtering can be implemented by placing a software
program on the end-user's computer. The software then runs while the
user is online, performing the particular filtering functions. Client-
side software may require the end-user to configure the software and
download updated website lists.
The security loopholes with client-side software are a concern.
Many smart children can disable filtering software faster than a parent
or teacher can install it. In addition, there are quick and easy
programs written to disable the major companies' software with the
click of a mouse. These programs are circulated among children, who
simply download them from the Internet, place them on a floppy disk,
and pass them around.
Proxy Servers--Filtering functionality can be removed from the end-
user's computer and placed on a server somewhere else on the Internet,
called a proxy. With a proxy server, all website traffic must go from
the end-user's computer through the proxy server, then to the rest of
the Internet.
Proxy servers offer more security than client-side software. All
users must go through this proxy server to be able to access the
Internet ``proper''. To do so, the client is required to configure
their software to ``point to'' this proxy server to be able to access
Web pages and ftp files. Failure to do so will result in blocked
access to the Internet. A proxy filter can be selective about what it
blocks, and can be configured to block or permit access to a range of
Internet-based services.
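Pointing client software at a filtering proxy can be sketched with Python's standard library; the proxy address below is a hypothetical placeholder for a school's or library's actual filtering server.

```python
import urllib.request

# Hypothetical filtering proxy address; in practice this would be the
# school's or library's server, through which all Web traffic must pass.
FILTER_PROXY = "http://filter.example.school:8080"

def build_filtered_opener():
    """Build a URL opener whose every request is routed through the
    filtering proxy instead of going directly to the Internet."""
    handler = urllib.request.ProxyHandler({
        "http": FILTER_PROXY,
        "https": FILTER_PROXY,
    })
    return urllib.request.build_opener(handler)
```

Once client software is configured this way, requests that bypass the proxy simply fail, which is the enforcement property the passage describes.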
    Browser Settings--Filters using built-in browser settings typically
use a ratings system. These systems are less intrusive but typically
less accurate.
Microsoft Internet Explorer provides content security settings for
the Internet Content Ratings Association's RSACi ratings, the most
popular ratings system on the market. However, if a site is not rated,
it is not accessible. Popular sites that are not rated include ESPN,
CNN, eBay, Amazon, and AOL. In fact, most sites are not rated, making
them inaccessible to the user.
Some filtering software ``decides'' what to block based on how a
site is rated--not entirely unlike the way parents use movie ratings.
This method offers fewer features and less precision compared to some
of the higher-end server-side products.
Hybrid Filters--There is a new filtering method that utilizes the
best features from each of the other methods. This hybrid system has
varying forms. S4F Technologies' patent-pending system uses a server-
side component that works in tandem with a thin client-side software
interface. By using more than one method, the user is able to take
advantage of the benefits of server-side filtering, including real-
time access to the most up-to-date database, along with the speed
benefit and user-control features of client-side technology.
future advancements in filtering technology
Filtering technology providers have dedicated thousands of man-
hours and millions of dollars in research and development to create
real solutions for schools, libraries, homes and businesses. At best,
the filtering industry is only 7 years old. The advancements in
technology over the past 2-3 years alone have brought about products
that combine artificial intelligence, advanced algorithms, intelligent
keyword databases, computer spidering technology, and millions of
accurately categorized websites, all while increasing speed,
efficiency and manageability through cutting-edge system design and
engineering.
Internet filtering is not foolproof. The dynamic of the Internet as
it relates to filtering can be likened to virus detection software.
Products in the virus detection industry use similar algorithms: they
monitor packets being transmitted over networks, and they have
extensive databases of known viruses and their signatures. Yet these
virus detection tools are not foolproof; still, network administrators
worldwide use these programs to protect their networks because they
can offer a high level of protection, even if it is not 100%.
    It seems that the opponents of filtering technology wish to
dismiss the use of any filtering software because it might be only
95-99% effective. Opponents are trying to hold filtering software to
a higher
standard than other types of similar and related products. Windows and
Macintosh operating systems, Internet dial-up connections, computer
manufacturers and virtually any software application manufacturer all
create and sell products that are not fool-proof and error free. That
is why software companies continue to release updates and create new
versions, to keep up with the ever-changing marketplace. It is an
acceptable part of the computer industry.
Future filtering technology advancements will see the convergence
of several of the approaches reviewed.
are there other alternatives to filtering?
Some of the opponents of the Children's Internet Protection Act
have suggested that filtering is not necessary; rather, a strong
education program that trains children how to have a positive Internet
experience is all that is needed.
Although I feel that education is a great way to teach children
about the dangers of the Internet, it is surely no replacement for
technology protection measures. The biggest problem is that much of the
pornographic and illegal exposure to minors is accidental. The National
Center For Missing and Exploited Children released a study where 1 in 4
minors reported viewing unwanted material. It is a well-known fact
that in an effort to increase viewership, operators of obscenity
websites will use unrelated keywords and misleading URL's to attract
unsuspecting users to their site. Once the image is viewed, the damage
is done and the law has been broken. All the education in the world
cannot stop that from happening.
    To illustrate why this argument is erroneous, consider driver's education.
Millions of teenagers and adults each year take some form of driver's
education or training. Yet the government has put seatbelt laws in
place to protect people from harm. All the driver's education in the
world cannot stop accidents from happening. Seat belt laws do not
guarantee to protect the passenger 100% of the time, in the same way
that Internet filters cannot ever guarantee 100% perfect performance,
yet they are a great tool to divert the vast majority of Internet abuse
in schools and libraries.
Monitoring has been considered as an alternative to filtering. This
approach places the burden of policing the Internet on educators and
librarians, who cannot possibly manage the activities of every Internet
user. Once again, if sites are accidentally seen, the damage has been
done.
can existing technology protection measures meet the requirements of
the children's internet protection act?
    The answer is a resounding yes. The Children's Internet Protection
Act requires that a school or library select a technology protection
measure of its own choosing--not one chosen by the government--
through a public hearing and the creation of an Internet safety
policy. The local board determines what to block based upon Federal
and state laws as well as local community standards.
This law encourages public education and empowers consumers and
local authorities to work together to create a solution that is right
for everyone. Schools and libraries have the affirmative duty to
protect minors while in their custody. Using technology protection
measures shows that educators are taking reasonable steps to protect
their kids. Filtering technology exists, and it is effective.
The leading filtering products in the educational space already
have the necessary functionality to meet the requirements of the law.
Here is a profile of those products:
CIPA related features comparison of the most popular filters in public schools and public libraries.
(provided by David Burt of N2H2)
----------------------------------------------------------------------------------------------------------------
                                     N2H2        WebSense    SurfControl   Symantec     Secure       8e6
                                     Bess                    Cyber         I-Gear       Computing    Technologies
                                                             Patrol                     Smart        X-Stop
                                                                                        Filter
----------------------------------------------------------------------------------------------------------------
Separates pornography from sex       \1\Yes      \2\Yes      \3\Yes        \4\Yes       \5\Yes       \6\Yes
 education, artistic nudity, etc?
Can be overridden at workstation     Yes         Yes         Yes           Yes          Yes          Yes
 level by teacher or librarian?
Ability to set different levels      Yes         Yes         Yes           Yes          Yes          Yes
 of filtering (age, etc.)
Provides page where student or       \7\Yes      \8\Yes      \9\Yes        May be       \11\Yes      \12\Yes
 patron may request that a site                                            added by
 be blocked or unblocked?                                                  school or
                                                                           library.\10\
K-12 Market Share (IDC)\13\          19.9%       6.4%        18.2%         5.1%         7.7%         2.6%
Library Market Share\14\             20%         6%          51%           NA           2%           2%
----------------------------------------------------------------------------------------------------------------
\1\ N2H2 offers six sex-related categories: ``Adults only'', ``Lingerie'', ``Nudity'', ``Porn'', ``Sex'', and
``Swimsuits''. Additionally, N2H2 has four ``Allow exception categories'' related to sexual material:
``Education'', for sexually explicit material that is of an educational nature, ``History'', for material of
historic value, such as the Starr Report, ``Medical'', for material such as photographs of breast reduction
surgery, and ``Text'', for pornographic or sexual material that only contains text. Category descriptions
available at http://www.n2h2.com/solutions/filtering.html
\2\ WebSense offers five sex-related categories: ``Adult content'', ``Nudity'', ``Sex'', ``Sex Education'', and
``Lingerie and Swimsuit.'' Category descriptions available at http://www.websense.com/products/about/database/
index.cfm
\3\ Cyber Patrol offers four sex-related categories: ``Partial Nudity'', ``Full Nudity'', ``Sexual Acts'', and
 ``Sex Education''. Category descriptions available at http://www.surfcontrol.com/products/
 cyberpatrol__for__education/product__overview/cybernot__cats.html
\4\ I-Gear offers six sex-related categories: ``Sex/Acts'', ``Sex/Attire'', ``Sex/Personals'', ``Sex/Nudity'',
 ``SexEd/Advanced'', and ``SexEd/Sexuality''. Category descriptions available at http://www.symantec.com/nis/
 category__defs.html
\5\ SmartFilter offers four sex-related categories: ``sex'', ``nudity'', ``obscene'', and ``mature''. Category
 descriptions available at http://www.securecomputing.com/index.cfm?sKey=86
\6\ X-Stop offers three sex-related categories: ``R-rated'', ``obscene'', and ``pornography''. Category
 descriptions available at http://www.8e6technologies.com/docs/Manual__nt__proxy45.pdf
\7\ N2H2 end users who feel they are unfairly blocked can request a review, or request a site be blocked at
http://www.n2h2.com/solutions/request__review.html
\8\ WebSense end users who feel they are unfairly blocked can request a review, or request a site be blocked at
http://database.netpart.com/site__lookup/. Users may also test a site to see if it is blocked or not.
\9\ Cyber Patrol end users who feel they are unfairly blocked can request a review, or request a site be blocked
at http://www.cyberpatrol.com/cybernot/ Users may also test a site to see if it is blocked or not.
\10\ I-Gear end users who feel they are unfairly blocked can request a review, if the system administrator has
 created a custom block page. This process is described at http://service1.symantec.com/SUPPORT/igear.nsf/
 9ad8bd108cd5c204852568bf005eef45/afb45fe0adfcb6af85256919004f1032?OpenDocument&Highlight=0,contact
\11\ Smart Filter end users who feel they are unfairly blocked can request a review, or request a site be
blocked at http://www.securecomputing.com/index.cfm?sKey=234 Users may also test a site to see if it is
blocked or not.
\12\ X-Stop end users who feel they are unfairly blocked can request a review, or request a site be blocked at
http://www.8e6technologies.com/submit/index.html
\13\ ``Worldwide Market for Internet Access Control'', Chris Christensen, IDC, 2000. Page 11.
\14\ ``School Library Journal's School Internet Filtering Survey'', Dr. Ken Haycock, Cahners Research, August
2000. Page 19.
evidence of librarian satisfaction with filters
Statistics show a dramatic increase in filter use in libraries.
A new study \1\ by the U.S. National Commission on Libraries and
Information Science shows a dramatic increase in the number of Public
Libraries using Internet filters. In 1998, just 1,679 public libraries
offering public Internet access filtered some or all Internet
access.\2\ In 2000, that number more than doubled to 3,711,\3\
representing an increase of 121%.
One in four Public Libraries offering public Internet access now uses
filters.
Overall, 24.6% of Public Libraries offering public Internet access
use filtering on some or all terminals.\4\ This percentage represents
an increase from 14.6% in 1998.\5\ The number of Libraries filtering
has more than doubled while the overall percentage of Libraries
filtering has not, because the total population of Libraries offering
public Internet access has grown from 11,519 in 1998 \6\ to 15,128 in
2000.\7\
The most dramatic gains came in Libraries filtering some Internet
access, which increased from 801 or 7.0% in 1998,\8\ to 2,265 or 15.0%
in 2000.\9\ Data from this study indicate a 65% increase since 1998 in
Public Libraries filtering all public Internet access. The number of
Libraries that filter all access has climbed from 878 or 7.6% in
1998,\10\ to 1,446 or 9.6% in 2000.\11\ Nearly 1,500 public libraries
(one out of every ten) filter all access today.
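The increase percentages cited above follow directly from the survey
counts quoted in this statement; as a minimal arithmetic check (a
sketch using only the figures reported here, not the underlying survey
data):

```python
# Counts quoted in this statement from the 1998 and 2000 NCLIS surveys.
filtering_1998 = {"some": 801, "all": 878}    # of 11,519 libraries online
filtering_2000 = {"some": 2265, "all": 1446}  # of 15,128 libraries online

total_1998 = sum(filtering_1998.values())  # 1,679 libraries filtering
total_2000 = sum(filtering_2000.values())  # 3,711 libraries filtering

# Percentage increase in libraries filtering some or all access.
pct_increase_total = round((total_2000 - total_1998) / total_1998 * 100)

# Percentage increase in libraries filtering all access.
pct_increase_all = round(
    (filtering_2000["all"] - filtering_1998["all"])
    / filtering_1998["all"] * 100
)

print(pct_increase_total, pct_increase_all)  # 121 65
```

Both results match the 121% and 65% figures in the text; note that the
24.6% overall rate is the survey's own figure (the sum of the rounded
15.0% and 9.6% shares), not a recomputation.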
A Survey shows that librarians and teachers are highly satisfied with
filters.
In April-May of 2000, respected library researcher Dr. Ken Haycock
conducted a survey of school librarians and public librarians on the
use of filtering software, for the magazine School Library Journal, a
publication of Cahners Research.\12\
An astonishing 90% of public librarians who used filters responded
that ``the software serves its purpose'' either ``very well'' or
``somewhat well''.\13\
The study asked both school and public librarians who used filters
to rate their level of satisfaction with filtering software in several
ways.
SCHOOL INTERNET FILTERING SURVEY,
Page 8, Table 15.
------------------------------------------------------------------------
                                     Total       Total       Total
                                    Sample %    Public %    School %
------------------------------------------------------------------------
Very/Somewhat Satisfied..........      76          76          76
Very satisfied...................      37          43          36
Somewhat satisfied...............      39          33          40
Somewhat/Very Dissatisfied.......      24          24          24
Somewhat dissatisfied............      14          10          15
Very dissatisfied/Not at all
 satisfied \14\..................      10          14           9
------------------------------------------------------------------------
SCHOOL INTERNET FILTERING SURVEY,
Page 9, Table 16.
------------------------------------------------------------------------
                                     Total       Total       Total
                                    Sample %    Public %    School %
------------------------------------------------------------------------
Very/Somewhat Well...............      88          90          87
Very well........................      37          48          34
Somewhat well....................      51          42          53
Not very well/Waste of Money.....      12          10          13
Not very well....................       9           8           9
Waste of money \15\..............       3           2           4
------------------------------------------------------------------------
News stories and public statements by librarians and library patrons
reinforce the research
Claudia Sumler, Director of the Camden County (NJ) Library System:
A library committee that had been keeping tabs on filtering
technology heard about a sophisticated filtering product being
used in local schools. ``We got it on a trial basis, and it
seemed to work,'' Sumler said. Called I-Gear, the application
is produced by Symantec Corp., maker of Norton AntiVirus
software. I-Gear resides on the computer server, rather than on
individual PCs, and Sumler said it allows librarians to set a
variety of levels for blocking Web sites.
She said that if a patron complains that the technology is
blocking a legitimate site, librarians easily can override the
controls. ``If there are complaints, librarians are to deal
with them right away,'' Sumler said. ``We don't want to deny
access.'' . . . ``We think this works for us,'' she said.\16\
David C. Ruff, executive director of the Rolling Meadows (IL) Library:
Expanding the filtering technology to block obscenity and
pornography on the library's 20 public computers was based on
the library's satisfaction with the Cyber Patrol software and
the desire to simplify some administrative procedures, said
David C. Ruff, the library's executive director . . . In the
week since the filtering policy was expanded, patrons have not
noticed the difference, Ruff said.\17\
Joan Adams, director of the Jefferson Parish (LA) Public Libraries:
After several months of wrangling with software companies,
parish officials on Thursday finally finished installing
filtering software on about 100 computers, cementing the Parish
Council's promise to do what it legally can to keep perverts
and smut out of public libraries. But for most computer users
who sat quietly pecking away at their keyboards, the added
restrictions were hardly detectable . . . ``I've gotten a lot
of `what if?' questions from the librarians,'' [Library
Director Joan] Adams said. ``But the average computer user
doesn't even notice it is there.''
So far, the WebSense software does not seem to be slowing
down the speed of library computers, a common side effect to
installing filtering software, library network administrator
Dwight Bluford said. The software program also seems to be
fairly on target with the sites it blocks. That's because
WebSense searches the content of Internet Web sites to
determine if there is offensive content, not the keywords, he
said. ``It seems to be working well,'' Bluford said. And
because it can be locally manipulated, ``we also have the
ability to immediately block a site if we get a complaint from
a patron, or to unblock a legitimate site if it is blocked.''
\18\
Library patrons and staff at the Plano (TX) Public Library:
James Engelbrecht wasn't too happy when Plano libraries were
compelled to install Internet filtering software on their
computers late last year. Because he doesn't have Internet
access at home, Mr. Engelbrecht uses the computers at the
L.E.R. Schimelpfenig and Maribelle M. Davis libraries about
twice a week. ``When it was first implemented, I wasn't crazy
about it,'' Mr. Engelbrecht said of the filtering policy. ``I
thought it was another bureaucratic layer.'' To his surprise,
the BESS filtering software hasn't impeded his ability to
navigate his way around cyberspace. ``It's not burdensome,'' he
said. ``If I do find a site blocked, I can ask to use an
unfiltered computer.'' While the controversial policy was
debated for a year before it was launched in December, its
implementation appears to have been fairly undramatic.\19\
Erin Noll Halovanic, Information Systems Librarian at Kenton County
(KY) Library:
Halovanic says if a customer complains about not being able
to access a site that's supposedly suitable, she reviews it on
an unfiltered staff PC and unblocks the site if she finds it
appropriate for the library. And that seems to be a good
compromise for Halovanic who admits, ``As a librarian,
filtering absolutely curdles my blood. It goes against my
training as a librarian and my belief in librarianship.
However, when it comes to the choice between pandering sexual
materials and between protecting people's personal rights, I
choose filtering over the alternative.'' \20\
Margaret Barnes, Director, Dallas (OR) Public Library:
After much conversation and serious reflection, it was
determined that a workable approach, enabling the Dallas
Library to furnish access to the public, would be the
installation of a filter system on all public Internet stations
. . . During the almost 1\1/2\ years that we have been
providing this service we have had no one formally or really
informally register an objection about a filter system being in
place on the workstations. We have received countless positive
comments about this service from all ages in our community.\21\
Judith Drescher, Director, Memphis-Shelby County (TN) Library:
The library system's switch to pornography-blocking software
has gone so smoothly that it could be considered a nonevent.
The Memphis area's chief librarian, Judith Drescher, told a
Shelby County Commission committee Wednesday that more than
half the 26 public queries about blocking software had nothing
to do with the new software . . . In a report given to the
commission's education and libraries committee, Drescher
stated, ``Since installation, the library has received no
requests from the public to review and block a site. Library
staff has submitted five sites for review, all of which were
blocked.'' \22\
Footnotes for Survey and Quotations
\1\ PUBLIC LIBRARIES AND THE INTERNET 2000: SUMMARY FINDINGS AND
DATA TABLES. A report based on research sponsored by the U.S. National
Commission on Libraries and Information Science (NCLIS) and conducted
by John Carlo Bertot and Charles R. McClure. NCLIS Web Release Version,
September 7, 2000 (visited February 8, 2001) (hereinafter ``INTERNET 2000'').
\2\ U.S. NATIONAL COMMISSION ON LIBRARIES AND INFORMATION SCIENCE,
MOVING TOWARD EFFECTIVE PUBLIC INTERNET ACCESS: THE 1998 NATIONAL
SURVEY OF PUBLIC LIBRARY INTERNET CONNECTIVITY. A report based on
research sponsored by the U.S. National Commission on Libraries and
Information Science and the American Library Association and conducted
by John Carlo Bertot and Charles R. McClure. Washington, DC: U.S.
Government Printing Office, 1999 (visited February 8, 2001) (hereinafter ``THE 1998 SURVEY'').
Out of a total population of 11,519 public libraries providing public
Internet access (see Figure 8, p. D-10), 878 or 7.6% filtered all
terminals (see Figure 48, p. D-50), and 801 or 7.0% filtered some (see
Figure 49, p. D-51).
\3\ INTERNET 2000, at Figure 11, p. 18. Out of a total population
of 15,128 public libraries providing public Internet access (see Figure
4, p. 11), 1,446 or 9.6% filtered all terminals (see Figure 11, p. 18),
and 2,265 or 15% filtered some (see Figure 11, p. 18).
\4\ INTERNET 2000, at Figure 11, p. 18.
\5\ THE 1998 SURVEY, at Figure 48, p. D-50, and Figure 49, at p. D-
51.
\6\ THE 1998 SURVEY, at Figure 8, p. D-10.
\7\ INTERNET 2000, at Figure 4, p. 11.
\8\ THE 1998 SURVEY, at Figure 49, p. D-51.
\9\ INTERNET 2000, at Figure 11, p. 18.
\10\ THE 1998 SURVEY, at Figure 48, p. D-50.
\11\ INTERNET 2000, at Figure 11, p. 18.
\12\ SCHOOL LIBRARY JOURNAL'S SCHOOL INTERNET FILTERING SURVEY by
Cahners Research, conducted by Dr. Ken Haycock of the University of
British Columbia. August, 2000. (hereinafter ``SCHOOL INTERNET FILTERING
SURVEY'')
\13\ SCHOOL INTERNET FILTERING SURVEY, at Table 16, p. 9.
\14\ SCHOOL INTERNET FILTERING SURVEY, at Table 15, p. 8.
\15\ SCHOOL INTERNET FILTERING SURVEY, at Table 16, p. 9.
\16\ ``Philadelphia-Area Library Found Internet Filters Far from
Simple'', The Philadelphia Inquirer, March 8, 2001.
\17\ ``Meadows library expands filters on Internet access'',
Chicago Daily Herald February 25, 2001.
\18\ ``Library's new Internet filters in place; Program installed;
few seem to notice'', The Times-Picayune (New Orleans), January 30,
2001.
\19\ ``Internet filtering accepted; Libraries quietly implement
policy'', The Dallas Morning News, June 30, 2000.
\20\ ``I-Gear for Education Success Stories: Kenton County Public
Library'', Symantec Website, available at http://www.symantec.com/sabu/
igear/igear__educ/story__2.html
\21\ ``Surfwatching the Internet'', by Margaret Barnes, Oregon
Library Association Quarterly, Volume 3, Number 4--Winter 1998.
\22\ ``Porn-Blocking Software Works at Library'', The Commercial
Appeal (Memphis), January 6, 2000.
Mr. Upton. Right on the money. Thank you. It was exactly 5
minutes. Good work, Mr. Largent. You were always one that could
work the clock in the inbounds line. Ms. Morgan, you heard me
describe a little bit about the Kalamazoo library situation,
where they monitor, and you have to have an access card to be
able to use the equipment, and it seems to work based on the
numbers that they have suggested to me. Does the Chicago
library system have anything like that?
Ms. Morgan. No. And actually the system that you described
sounds very good, but I think you also mentioned when you were
describing it that policies and situations vary very widely
throughout the country, and I think that's why this law is so
good, because it evens the playing field.
It says that this is a standard that we want in our schools
and libraries, and it is a standard that we need to support and
promote. There are just far too many situations and libraries,
and there are stories all across the country of very
unacceptable things happening.
And again, you described something that is fairly optimal,
where you have the access card for the child, et cetera. Many
libraries don't have that. We certainly don't have that.
Mr. Upton. In your written testimony, you cited a May 2000
news article which discussed the Chicago Libraries
Commissioner's view on Internet use in the Chicago public
library, and that article stated, and I quote, in the adult
areas of the library, patrons are free to view anything,
including pornographic sites. The Commissioner--and I presume
your boss----
Ms. Morgan. The big boss, yes.
Mr. Upton. The big boss, Ms. Dempsey, further states,
``Adults have a right to look at those things. Adult terminals
have privacy screens if they want to look at it. That's fine.
But you don't have to look at it, and I don't have to look at
it. People are free to surf. We are a big city with 3 million
people. What is objectionable to one is not objectionable to
another.''
Ms. Caywood, does the American Library Association stand
behind the Chicago Public Library Administrator's position,
that anything should go for adult Internet users in publicly
funded libraries?
Ms. Caywood. I am a member of the American Library
Association. We believe in abiding by the law. As Mr. Taylor
pointed out, there are laws that make certain things illegal. I
can't speak for every librarian, but I have prosecuted and I
will when people break the law.
Mr. Upton. Now, you in your library in Virginia Beach, you
indicated in your testimony that you block chats?
Ms. Caywood. Yes.
Mr. Upton. Do you have a system like I described that we
have in Kalamazoo?
Ms. Caywood. Our solution is considerably different from
Kalamazoo's. It is a different community. For example, Virginia
Beach is a resort area. One of our big sources of income is
tourism, and we would be involved in an endless issuing of
little cards if we tried to make a system that was card
controlled, and yet accessible to people who wanted to e-mail
back home.
What we do is we provide choices. We encourage families to
visit the library together and on the whole they do. And they
choose what they need at that moment, and it works for them.
Mr. Upton. You know, as I look at this issue, and as I look
back at the debate and the work by folks like Mr. Largent and
Mr. Pickering, and Members of the Senate as well, there is an
analogy that I take a look at, and that is the old debate that
we once had with the National Endowment of the Arts, a
federally taxpayer subsidy.
And at points, certainly in the late 1980's and the early
1990's, there were a number of graphic or pornographic events
that they funded in a number of ways that alarmed most Members
of Congress on both sides of the aisle.
And to his credit, a Member from Michigan, Paul Henry, took
up the cause as a member of the then Education and Workforce
Committee, and in fact indicated that for dollars to go to the
arts community in the future, they would have to subscribe to
certain standards.
And a number of the things that they had funded in the
past, and you might remember the jar of urine with a
crucifix inside, were no longer to be included as part of the
funding.
That resolution passed, and those safeguards were put into
play because it was taxpayer money. Folks in the arts community
who wanted to offer those types of performances would have to
subscribe to the standards if they were going to get Federal
funds, or else they would not receive Federal funds and would
have to look elsewhere.
And I think that this is very much the same thing. I mean,
we are troubled. Again, as I look at my local libraries, they
have a system that works. Yet when we look literally an
hour-and-a-half from my home, in Chicago, where I was earlier
this week, though I didn't go into the library there, you have
got a system that is quite a bit different.
And I guess my last question is, since my time is rapidly
expiring, Mr. Johnson, do you feel that there is a right for
the libraries then to offer pornography then without some type
of restriction?
Mr. Johnson. Mr. Chairman, I think part of the problem in
these debates is the confusion between what is and isn't
protected speech. Obviously, obscenity is not protected speech
under the First Amendment, nor is child pornography protected
speech.
But the problem is that everybody assumes that this is
something that you know when you see it, and that is not the
case under the law. It requires that there be a judicial
determination.
In other words, unprotected speech is not unprotected until
a court says it is unprotected. So you can't just say I know it
when I see it. What we are saying is that these filters do tend
to be over- and under-inclusive, and therefore they block more
speech than is necessary and than is constitutionally
permitted.
Pornography is in fact protected under the constitution.
Pornography is separate from the issue of obscenity. However, I
think you also need to note that Congress does not have carte
blanche to tie funding restrictions to content. The situation
that you are talking about with the----
Mr. Upton. Well, we do at the NEA.
Mr. Johnson. Well, you do in the NEA, and that is a whole
different situation. If you look at the case, NEA versus
Finley, it specifically said that it would be a different
situation if Congress tried to tie that money in such a way
that it was viewpoint discriminatory.
What they said was that what you did in that situation was
to make decency a part of the requirement. It was not the sole
requirement. But they said that if you had engaged in viewpoint
discrimination that would be a whole different situation.
If you look at the recent case of Velasquez versus Legal
Services Corporation, there Congress tried to tie funding as
well to the Legal Services Corporation, and restrict the
ability of the Legal Services Corporation to engage in speech,
as well as the clients.
There the Supreme Court overturned that and said that was
in fact viewpoint discrimination, and that's exactly what
happens here under CHIPA. You are engaging in viewpoint
discrimination by saying we are not going to have certain types
of information in the library, and therefore you install these
types of filters that not only don't block all of the types of
information that should be blocked, but then block other
information as well.
Mr. Upton. My time has expired, and I am sure that other
members will come back to this. I recognize Mr. Sawyer.
Mr. Sawyer. Thank you, Mr. Chairman. The testimony this
morning has been interesting, and I hope useful. I am
particularly interested, Ms. Getgood, in terms of with your
product specifically. How do you counsel schools and libraries
to make the decision to use your product?
Ms. Getgood. Well, I guess to start with, we don't counsel
libraries at all. In fact, we don't market to libraries. We
really focus on schools. And so the first thing I would say is
that it is a twofold process, and one is to have an acceptable
use policy which states what you intend the Internet to be used
for in the classroom.
And in fact if you are a library that wants to use
filtering, it states what the rules are in the library. And
then the filtering software is there to back that up. It is to
help you manage that policy.
Mr. Sawyer. Well, that is it precisely. I am not talking
about marketing to libraries, but if a library comes to you
and says we are in the market for a filtration system, how do
you adapt your product to the needs of a particular library?
If Ms. Caywood came to you, she might ask quite different
questions than if Ms. Morgan came to you, and yet I would
assume that your responses would be different.
Ms. Getgood. The software is only a tool that gives them a
number of different choices that they can make, and this is
specific to CyberPatrol.
But also most of the products in this space would be the
same way. We offer categories of content, and which they can
choose to use, and we offer the ability to override those
categories. So if you want to allow specific content and
disallow other content, you can do that.
And, in fact, if you wanted to create your own list of
content--for example, that which has been deemed to be obscene
by your local court, you could do that as well. So the software
is really just a tool that they can use to implement their own
policies.
Mr. Sawyer. What kind of training do you provide to the
people who do the categorizing in your organization?
Ms. Getgood. Our researchers are all parents and teachers,
or trained professionals, who have been taught how to apply our
criteria, which are published on our website. And
again this is specific to CyberPatrol, but in fact most of the
companies in the filtering industry do the same kinds of
things.
And so if you are a purchaser of the products, you know in
advance what the criteria are, and then we train our
researchers very, very intensely to apply those criteria.
Mr. Sawyer. Ms. Caywood comes to you and asks you for your
help. What type of training do you provide to the folks who
work in her library, and for that matter, for the volunteers
who work in her library system in trying to acclimate people to
use this technology?
Ms. Getgood. In actuality, filtering software is pretty
easy to use and it doesn't require a tremendous amount of
training. We do give them some background, in terms of what the
categories are, and how they can apply them.
When you install the software, in fact you can see if
something that you wanted to block was blocked or not blocked,
depending on your own choices. But it is pretty easy to use to
start with.
Mr. Sawyer. Can a school or library determine what has been
blocked?
Ms. Getgood. Absolutely. You can tell in two ways. We
actually have search engines that you can use to check in
advance. Any one of you could check to see what most filters
have on their lists by going to our web sites and typing in
``is this site blocked,'' and it would tell you.
But in fact if you are using the software, it is pretty
easy to tell if something has been blocked, because you are
either allowed to go there or you are not.
Mr. Sawyer. Ms. Caywood, have you had problems in your
system with people who have complained about what has been
blocked from them?
Ms. Caywood. No, because all they have to do is get up from
a filtered computer and walk over to an unfiltered one. We
don't unblock at all in our current situation, because what we
offer is a choice of machines. So we have stayed out of that.
Mr. Sawyer. Have you had complaints where inappropriate
sites came up on the machines that were dedicated to children?
Ms. Caywood. No, I have not, but then we use I-Gear, which
was a local product when we bought it, but it is now owned by
Symantec. And we have it set completely to the fullest extent
that it will go.
And we have it that way because we don't want to risk
anybody walking by and being surprised. What we find is that
when people use a filtered computer their expectation is that
they won't be offended.
They are not thinking in terms of legal terms of art, like
obscenity and child pornography. They are thinking in terms of
I don't want to see a picture of a lion eating a zebra.
Mr. Sawyer. What kind of training do you provide your
volunteers?
Ms. Caywood. Now, our volunteers, their job is to help
people who need to know this is the mouse, and this is how you
move that around. This little thing that goes down the side is
a scroll bar, and you move that up and down.
They are not there to deal with content. If someone says,
now, how do you or how can I find a site on prostate cancer,
they would immediately take that to a librarian, who would come
over and work with them on that.
The volunteer's function is to help with acclimating people
to using computers. There are still a lot of people that are
frightened, you know, that the computer is going to come at
them.
Mr. Sawyer. Can I ask one more question? Ms. Getgood, I
asked you initially about the best way for a school or a
library to decide whether or not to use your product. Do you
think the country needs a Federal law requiring libraries and
schools to use products like yours?
Ms. Getgood. No.
Mr. Sawyer. You do not?
Ms. Getgood. No.
Mr. Sawyer. Mr. Chairman, could you----
Mr. Upton. Do you want to elaborate and then we will go to
Mr. Terry.
Ms. Getgood. Sure. Basically, because schools are already
using filters, and they are using them for the compelling
reason that they protect children from inappropriate content
online.
They are also using them because they help them preserve
bandwidth, and protect them from kids downloading too many
files from file sharing services like Napster.
So there are a lot of really good reasons why they have
been installing filters all along, and we don't think a law is
necessary.
Mr. Sawyer. Thank you, Mr. Chairman.
Mr. Upton. Mr. Terry.
Mr. Terry. Mr. Chairman, thank you. Mr. Johnson, I want to
follow up with the Chairman's question. I appreciate the legal
discussion and pulling out a couple of pages from your brief,
and as a former lawyer, I guess once a lawyer, always a lawyer.
I appreciate that, but I am not sure I really understood
the answer in reference to the question. So I am going to ask
it again and maybe in a little bit different way. And that is
do you believe that people--and let's start with adults--have
the right to access hardcore pornography at a public library?
Well, without going into the legalese and quotations of NEA
versus Finley and all of that. You and the conglomerate of your
organization, and not necessarily you personally.
Mr. Johnson. Well, let me first of all point out that when
you say hardcore pornography, you are almost getting into the
obscenity area, and so you end up in a legal distinction there.
Mr. Terry. Well, I want to start at the extreme and work
back, and I want to know where the ACLU allows us to draw the
line, or suggest that we draw the line. Is it hardcore?
Mr. Johnson. Well, clearly hardcore, if it is obscenity, is
not protected under the First Amendment, and therefore would
not be allowed to be seen in the library.
Mr. Terry. As you are saying, there may be some hardcore
that a Judge would say is not obscene, but some is. So then the
library should not have the right to control--and then we will
talk about what control is--access to those types of sites in
general?
I am trying to find out that if some are and some aren't,
is your position then that it should be unfettered and people
should have the right to look at that?
Mr. Johnson. Well, I think from a legal standpoint, yes,
there is a First Amendment right to access to pornography,
because it is protected expression under the First Amendment.
Mr. Terry. Even through our public libraries?
Mr. Johnson. Well, I think the public libraries may have
other ways that they can deal with the situation to try to
restrict it, and you don't necessarily even need to do it with
content based types of regulations.
For example, you can have Internet use policies that limit
the time that people spend on the computer. And so they are not
going to be spending a lot of time doing that sort of thing.
Mr. Terry. I agree with that, and so if arousal takes place
at 2 or 3 minutes, we cut it off at 2 or 3 minutes? I think we
get more absurd by talking about time standards.
Most of the public libraries in Omaha, Nebraska, by the way
already have like a 15 minute or half-an-hour time limit, just
because there are so few terminals to users, but that is a
different issue.
Let me ask it a different way then. If it is case by case
in essence, some hardcore may be pornographic, and some may
not. Some may be protected speech. Then would it be proper for
the librarian, for Ms. Morgan, or Ms. Caywood, to stand there
at the terminal and in essence observe and make a judgment
about whether or not that site is pornographic?
Mr. Johnson. Well, I think you would run into some problems
with librarians making those sorts of judgments, and in essence
being police. Now, obviously, if there is something that they
believe that is illegal, then they should report that.
And I believe that Ms. Caywood has indicated that she does
that and many librarians do report instances of what they
believe to be illegal activity.
Mr. Terry. So the line would be that if they observe
someone accessing a pornographic site they can turn that
person in, but they wouldn't have the right to somehow control
access to that site.
Mr. Johnson. Well, it depends on what you mean by control
access. If you have the tap on the shoulder type policy or
whatever, and you indicate that they should not continue in
that area--and like you said, many of the Internet use policies
do that.
Mr. Terry. Well, that is the point that I wanted to get to.
Mr. Pickering. Would the gentleman yield just for a second?
Mr. Terry. I will give myself 10 seconds. What we are now
doing is talking about technology versus a person getting to
make that type of decision. I yield whatever time I have left.
Mr. Pickering. To Ms. Morgan, if you have to go tap a man
observing hardcore pornography, child pornography, obscenity,
what kind of hostile work environment does that create for you,
and would not the ACLU be concerned about the rights of someone
like Ms. Morgan being put into a hostile context by your
recommendations of how to restrict access?
That you have no ability to use tools of technology. You
only can use someone tapping someone on the shoulder to keep
them from observing what is clearly inappropriate. Ms. Morgan?
Ms. Morgan. I will just say first of all, and I will repeat
again, that we are not allowed to do the tap on the shoulder.
So that is No. 1. No. 2, if we were, I would find that
much worse than having the technology. It seems to me that the
technology, the filtering technology, is a much more effective
means of dealing with this.
As I mentioned in my talk, these tap on the shoulder
policies are much more intrusive than filtering. That implies
that a staff person is watching what people are doing on the
Internet at all times, and looking for this, looking for the
child porn, and looking for the hardcore porn.
It also implies that the individual staff person who is
observing that patron at the moment is making that decision,
and actually there are so many different staff people out
there that there is going to be a lot of inconsistency in what
they think it is.
And again I find that much more subjective, the tap on the
shoulder idea, which has actually been recommended by the
American Library Association. It is much more subjective and
actually leads to maybe a much greater concern about
censorship than filtering does.
And regarding sexual harassment, there is no doubt that
this is an issue. I think you are all aware that in the
corporate world that Internet porn is a big issue with sexual
harassment lawsuits that have been settled, and in a couple of
cases over $2 million each.
This is not something that we can dismiss. As I said, it is
almost all male porn viewers, and the vast majority of people
that work in libraries are women. And certainly many of the--I
have had female patrons complain about this.
And when we look at this whole issue of what kind of an
environment that we want in a library, I think that this is
absolutely key to all of this entire argument and to this law,
and that is again why I think it is a good law. Thank you.
Mr. Upton. Mr. Markey.
Mr. Markey. Mr. Johnson, could you draw the distinction for
us that you make between filtering devices for K-12 schools and
filtering devices in community libraries?
Mr. Johnson. I am not sure that I understand the question.
Mr. Markey. Do you believe that there are different
constitutional protections that should apply for libraries, as
distinguished from K-12 classrooms?
Mr. Johnson. Well, absolutely, because you have got--when
you are talking about public libraries, you are talking about a
traditional means of providing information to the community, to
not only children, but adults as well.
And when you start trying to restrict the information to
the level of what is appropriate for children in a public
library, then you are avoiding the entire purpose of the public
library. So there are two distinct purposes obviously between a
public library and the K-12 educational system.
Mr. Markey. So what constitutionally do you believe can be
put in place in a library to protect children against it?
Mr. Johnson. Well, some of the ways are included in my
testimony. For instance, the educational programs, library web
pages, and so forth that are already done. Now, I think that
some of the characterization has also been inaccurate, because
we are not saying that under no circumstances can there be any
sort of filtering.
As Ms. Caywood has indicated, they have filtering on some
of the computers, but they don't have filtering on the others.
It is up to the parents to decide whether the child uses one or
the other, and so----
Mr. Markey. Inside of the library?
Mr. Johnson. Inside the library as I understand it, and so
you have got the option there of----
Mr. Markey. So you are saying that the library itself
should segregate computers for children from computers for
adults?
Mr. Johnson. Well, I think that would be a permissible area
to at least for--particularly for younger children under 13, if
you want to have a filtered library terminal for children under
13, for example.
    Mr. Markey. Why did you pick age 12? Why not age 14? Why
12?
Mr. Johnson. Well, the problem is that once you start
getting into the teen years, it is more--the courts have not
been particularly clear at what stage children start having
more constitutional rights.
And so the ACLU's position has generally been in the teen
years that they would have more constitutional rights, yes.
Mr. Markey. I think probably the Members of this Committee
would give more protection to 13 year olds. As you can see, we
have broadcast television, and so using the analogy of
broadcast television to the library, where adults watch, but
children do as well.
And although theoretically the programming in the evening
is supposed to be targeted at adults, we know that children
watch as well. So as a result, there are standards as to what
can be aired, because it is a community environment, even
though it is primarily for adults in the evening.
And we use that as a way of ensuring that children, the
most vulnerable audience, are not exposed to images, ideas,
that their parents don't believe generally speaking that they
are prepared for yet.
So that is I think kind of the core of this discussion,
because we make that kind of an analogy here. While it may be
for adults, children necessarily are a part of the same
community simultaneously.
And again I am kind of sympathizing here with Ms. Morgan,
because as you point out, most of these librarians are women.
So you could have a small woman trying to tap a large male on
the shoulder, saying that is inappropriate for viewing, and
that could create quite an unhealthy dynamic in many cases in
libraries across the country. So just a tap on the shoulder
system might not be the best.
Mr. Johnson. That is only one of many options.
    Mr. Markey. I am just responding to her, and I am trying to
eliminate that, because I think that we would not want to put a
lot of women into a situation of trying to do that.
What are the reactions, Ms. Caywood and Ms. Morgan, when
these filtering devices are put into place? Do you get protests
from parents that their children are being exposed? What is the
level of opposition that you get from parents when the
libraries have these filtering devices?
Ms. Caywood. Bearing in mind that in the Virginia Beach
public library everyone has a choice which machine to use
according to their needs at the moment, or their desire for
filtering or not filtering, we have not had any protests.
Mr. Markey. You have not had any protests?
Ms. Caywood. No. But we also went through an extensive
process of work with the community on what they wanted. I will
say that people preferentially use the unfiltered computers.
The last one to be turned to is the filtered one.
Mr. Markey. And can I just ask one final question of Ms.
Getgood. You explained quite well that each one of these sites
is viewed by a human being.
Ms. Getgood. Correct.
    Mr. Markey. And as a result, there is no confusion between,
as you point out, a chicken breast and a human. Each site
has had a decision made on it by someone who works for you in
providing a service to a home or to a school, or a library,
rather than being generically grouped with every single website
with that word in it for blocking; is that correct?
Ms. Getgood. That's correct.
Mr. Markey. And how successful has it been as a result? We
know that it can't be perfect, and I guess that's my view. On
the one hand, you can argue that it is an unconstitutional
infringement of First Amendment rights of Americans, and at the
same time you can argue that it is imperfect in blocking out
sites.
But you can't have both arguments simultaneously. Either it
is too good or it is not good enough. And we do know that it is
imperfect, because something might slip through, but I think
that is what parents would prefer to have as something that is
in place that can help.
Ms. Getgood. On balance, I would say it has been--that
filtering software has been very successful in meeting the
needs of local communities and local schools, and indeed local
libraries to achieve that compelling desire to protect kids
from inappropriate content.
Mr. Markey. Thank you.
Mr. Upton. Mr. Pickering.
Mr. Pickering. Thank you, Mr. Chairman. Mr. Johnson, just
to follow up on our earlier conversation, and knowing the ACLU
is very concerned about the civil rights of all Americans,
according to a recent USA Today story there are seven
Minneapolis librarians that are filing a discrimination
complaint with the Equal Employment Opportunity Commission,
saying that library patrons viewing pornography on the Net
have helped create an intimidating, hostile, and offensive
working environment.
Would the ACLU be interested in representing those seven
librarians who have to work in a hostile work environment?
Mr. Johnson. Well, without knowing all of the details,
Representative Pickering, I can't tell you that we would or
would not, because we don't know all about the specific
allegations that the plaintiffs are making. The problem that
generally these kinds of cases have is that the working
environment under the sexual or under Title VII----
Mr. Pickering. That's okay. This is about Internet
filtering. We won't go into sexual harassment and the details
of that. I was just wondering which side you were on; the
adults wanting to see pornography in a public place, publicly
subsidized, or with the women, the mothers, the sisters, the
daughters, who work in libraries who are trying to create a
healthy learning environment, instead of having to work in a
hostile work environment.
Mr. Johnson. We are on the side of the Constitution, sir.
Mr. Pickering. I am not exactly sure, because the bill
specifically addresses that, which is I believe in your
testimony not constitutionally protected speech, child
pornography, which I think you would agree is not
constitutionally protected speech; and obscenity, using the
well established precedent in terms and definitions of
obscenity, is not constitutionally protected speech.
    And the third criterion would be harmful to minors, which is
also a well established term of art, and using community
standards and community input. The agenda here really is not to
look at in my view whether technology, filter technology, is
efficient, or whether it underblocks or overblocks.
But I think it is an extreme agenda to give your
interpretation of the Constitution to access for all people to
things that you wish were constitutionally protected.
    And as you testified earlier, children as young as 12
or 13 should have access to this type of material, and if you
look at the American Library Association and their bill of
rights, they say the American Library Association opposes all
attempts to restrict access to library services, materials, and
facilities based on the age of library users.
It goes on to say in another place that libraries and
librarians should not deny or limit access to information
available via electronic resources because of its allegedly
controversial content, or because of the librarian's personal
beliefs, or fear of confrontation.
I think it is clear that there is an extreme agenda to
legitimize pornography and obscenity, and make it accessible to
people of all ages and all places, regardless of the danger
that can create, or the hostile working environment that it
could create. The other----
Mr. Johnson. That is a mischaracterization of our
testimony, Representative Pickering, but that's fine.
Mr. Pickering. Well, let's talk about mischaracterizations
and distortions of the COPA Commission's finding, saying that
it did not make--that it made a finding that filter technology
is effective. It did not make a recommendation for or against.
It was neutral and it was silent. You characterized the
COPA Commission's recommendation as against filtering
technology. That is a distortion and a mischaracterization.
Mr. Johnson. That is not what I said, Representative
Pickering. What I said was that they specifically did not make
a recommendation for or----
Mr. Pickering. You said that Congress did not follow their
recommendation. They were silent.
Mr. Johnson. The Congress did not follow their
recommendations because they did not include mandatory
blocking. What they did was they made several
recommendations----
Mr. Pickering. Didn't the COPA Commission find that
filtering technology is effective?
Mr. Johnson. Excuse me?
Mr. Pickering. Wasn't that one of their findings? Did the
COPA Commission find that filter technology is effective?
Mr. Johnson. They found that it was very problematic
because it overblocked information, and I have a copy of the
COPA Commission report here.
Mr. Pickering. And they also had a finding that it was an
effective means, an effective tool.
Mr. Johnson. They found it was an effective tool in some
circumstances, but it was not effective necessarily because of
the overblocking. And they found that there were some problems
with regard to the First Amendment. And I have a copy of the
COPA Commission report if you would like to take a look at it.
Mr. Pickering. As I listened to your testimony, and as I
look at the ALA's bill of rights, I do think that the agenda
here is to make access to this type of material available to
all, with no restrictions, and I think that is not best for our
children, and it is not best for those who work in libraries or
schools.
If you look at your other option, instead of a tap on the
shoulder, segregating adult and minor computers, you could set
up a haven for child predators and pedophiles to be able to go
into public libraries, escaping legal scrutiny to have the
access to that type of information, and with no supervision.
    I just don't see those as workable or acceptable ways to
protect our children and the work place. Filter technology
is an effective way, and it is not an obtrusive or intrusive
way to accomplish our objectives here.
I do think it is constitutional, and Mr. Chairman, I yield
back.
Mr. Upton. The gentleman's time has expired. Mr. Largent,
do you want to go now or do you want to come back?
Mr. Largent. I will go now. Ms. Caywood, are you a parent?
Ms. Caywood. No, I am not.
Mr. Largent. Mr. Johnson, are you a parent?
Mr. Johnson. No, I'm not.
Mr. Largent. Mr. Johnson, do you believe that exposure to
obscenity is harmful to minors?
Mr. Johnson. Well, I think that minors can be exposed to
obscenity under many circumstances, and I think it is the
parent's responsibility to educate their children.
Mr. Largent. No, that was not the question. The question is
do you personally, and I am not talking about the ACLU. Do you
personally believe that exposure to obscenity, or even legal
pornography, is harmful to minors? I am talking about 8 and 9
year olds, 10, 11, 12, 13; is it harmful to them?
Mr. Johnson. My answer is with the parental supervision,
no, because the parents can explain what the difference is, and
why this is inappropriate material. I mean, that is what a
parent's responsibility is to do, is to----
Mr. Largent. So you would say that without parental
supervision it is harmful?
Mr. Johnson. Well, I am not sure that there has been any
study that indicates that it is necessarily harmful. But what I
am saying is----
Mr. Largent. So then you would say that exposure to
pornography, legal or illegal, is not harmful? I am just asking
for a yes or no answer. You said it was not harmful if under
adult supervision, and then I said, okay, without adult
supervision, it is harmful; and you are saying no. So I am
confused by your response.
Mr. Johnson. Well, what I am saying----
Mr. Largent. Is or is it not harmful?
Mr. Johnson. I don't believe it is probably harmful. What I
am saying is----
Mr. Largent. Okay. That's all I needed to know. That is
what I needed to know. Forty percent of children--we have been
told that 40 percent of children are first exposed to obscenity
at libraries or schools. Ms. Caywood, do you believe that the
Children's Internet Protection Act is an unnecessary Federal
mandate?
Ms. Caywood. We are doing just fine the way we are. You
brought up the fact that I have never had children. However, I
have been entrusted with other people's on many occasions.
Mr. Largent. Sure.
Ms. Caywood. Two families were willing to let me take their
12 year olds to Richmond to testify to the COPA Commission. I
have been a children's librarian for 28 years.
Mr. Largent. That's fine, but let me get back to my
questions, because we have got a vote on the floor. You said
that someone coming into your libraries at Virginia Beach can
choose a filtered or an unfiltered computer. Can a child choose
an unfiltered computer at your library?
Ms. Caywood. Yes.
Mr. Largent. They can?
Ms. Caywood. Yes.
Mr. Largent. Can you check out Playboys to minors at your
library? If a 9 year old comes in and says I would like to
check out the Playboy, would he get it?
    Ms. Caywood. We have never been asked to have a
subscription to Playboy.
Mr. Largent. So you don't have any pornography in your
library, any written pornography?
Ms. Caywood. Well, we certainly have some art material that
parallels the NEA material that some of you would not be happy
with.
Mr. Largent. Why don't you subscribe to Playboy?
Ms. Caywood. We have never been asked.
Mr. Largent. By who?
Ms. Caywood. By our community.
Mr. Largent. By your committee?
Ms. Caywood. By our community. We have a process where
people request materials that they would like to have in the
library, and that's not been requested yet.
Mr. Largent. But if you had Playboy as a subscription at
your libraries would you check it out to an 8 or 9 year old?
Ms. Caywood. We don't check out our magazines either. They
are used in the library.
Mr. Largent. Could a child have access to a Playboy if you
had it in your library?
Ms. Caywood. Yes, I imagine that they could use that like
any other library material if we had it in the library.
Mr. Largent. Right. If you had it, then they could have
access to it. That was a hard question to get to, but we did
it. Mr. Taylor, you talked about the legal terms of art that
are pretty well defined by the Courts, whether it is obscenity,
or harmful to minors, and so forth.
And you said the question is not--I mean, what I drew from
that was the question is not what are you going to block, but
how you are going to block it; is that correct?
Mr. Taylor. Well, the Act leaves it up to the school or
library to work with their filter to decide what kind of sites
they want to block within those three classes. I put the three
tests in my testimony because each of those three tests
excludes the kinds of sites that the ACLU and the American
Library Association complain might get blocked by mistake.
And CHIPA doesn't ask any library or school in the United
States under any of those three categories to block any images
or written material that deals with abortion, or sexual
orientation, or sex education, or hate speech, or Nazis, or
art. All those kinds of categories of agendas are not
harmful to minors, and they are not obscene for minors.
They are not obscene for adults under the Miller test and
they are not child pornography under the Federal or State
statutes. They don't have to block them under CHIPA. If a
library wants to block it, just like if a library doesn't want
to carry Playboy, they don't have to.
But the policy of, well, if we want to carry Playboy, we
are going to give it to kids, and if we want to have an
unfiltered terminal, means that what Congress is dealing with
is that you
have got terminals where you walk up to them and you type
Lolita into a search engine, and you get child porn. If you
type Deep Throat, you get hardcore porn, and that is what the
CHIPA bill says you must try to stop.
Mr. Upton. Excuse me for interrupting, but we have about 3
minutes left. We are going to come back. Mr. Shimkus has
additional questions. So we are going to come back in about 15
minutes. We have two votes.
[Brief recess]
Mr. Upton. That is the last vote for a little while, and I
know that I talked to a number of Members on this vote, and
again they apologize for not being here. We have got another
major subcommittee in action underneath us in 2123 Rayburn, and
a number of us are on both subcommittees.
So I am absent down there and they are absent up there, but
I know that Mr. Shimkus had some additional questions. The
gentleman recognizes the gentleman from Illinois.
Mr. Shimkus. Thank you, and I apologize to the panel,
because usually I would have been very supportive of getting
done, but this is such a pressing issue and of concern that I
really wanted to have a chance to ask some questions and get
into a short discourse, especially since I wasn't able to be
here for opening statements.
I knew that there were going to be votes at 10:15, and so I
just stayed over in the Capitol where I was. I was trying to
make good use of my time. And it is too bad that Mr. Markey is
not here, because maybe his site was blocked not because of
connections as were talked about today.
Maybe it was just his ideological stand that someone
filtered out, but that is a future debate that I always--he and
I have a good time, and Ms. Caywood, it is pronounced CHIPA
from what I understand, and part of the reason I think is
because of Chip Pickering.
And I have harassed him about naming that in his--in giving
him that much credit to have a bill actually named after his
first name. So he is not here to harass either. So I better get
down to business.
A couple of things that I wanted to try to briefly cover. I
have two things, Mr. Chairman, if I have permission to
submit into the record.
Mr. Upton. Without objection.
[The information referred to follows:]
[GRAPHIC] [TIFF OMITTED] T2836.001
[GRAPHIC] [TIFF OMITTED] T2836.002
[GRAPHIC] [TIFF OMITTED] T2836.003
[GRAPHIC] [TIFF OMITTED] T2836.004
[GRAPHIC] [TIFF OMITTED] T2836.005
[GRAPHIC] [TIFF OMITTED] T2836.006
[GRAPHIC] [TIFF OMITTED] T2836.007
[GRAPHIC] [TIFF OMITTED] T2836.008
[GRAPHIC] [TIFF OMITTED] T2836.009
[GRAPHIC] [TIFF OMITTED] T2836.010
Mr. Shimkus. And then I also have--I am not going to ask
for this petition to be submitted for the record, because it is
quite large, but I am going to give it to you just so you can
look through it.
And it is not from my district. It is from a school in
California, and it is in Congressman Lewis' district. And the
two things that I want to submit is a letter by the Lewis
Center for Educational Research, Academy for Academic
Excellence; and there are two portions of the letter that I
especially want you all to have. One is that one of our second
grade students, under adult supervision, misspelled a name, and
we all know the story.
The URL--the children's website was automatically connected
into a sexually explicit porn site. Immediately he did as he
had been taught, and called the teacher, and clicked the X box,
and instead of the site closing down, it moved to another
explicit picture. Each click did the same, and the child was
removed quickly and the computer turned off.
    The truth is, in another part of the letter--this is from
the principal of the school--that pornographers cannot continue
to increase profits without attracting a bigger audience. Like
the elephant in the room that no one will talk about, we all
know that the industry targets hormonally active teenagers. It
is simply a lie that they want to protect. My
class has come up with a simple constitutional way of keeping
adult porn off computers.
A way to allow every parent, whether computer literate or
not, to block unwanted materials, and I hope that you will take
seriously the efforts of these young people and consider their
proposals as a relevant and practical solution.
    I want to read that, along with the petition and also for
the record, a paper written by Brandon Smith on the 23rd of
February. It was a paper written for a school project on the
First Amendment, and I know that the ACLU would probably like
to--this young man actually did a very good job of talking
about it.
    And his basic premise stems from the petition and all this
stuff. An issue that I have addressed a couple of times, and
that Congressman Largent and I were talking about during some
of the opening statements, is the Triple X domain name.
So I am going to throw that out for everybody, and if we
could just go down the line. A lot of us are struggling with
the Triple X domain name, and from the industry folks, or the
filterers, I want to know does that help?
    From the ACLU, the constitutional issue; and why is it no
different from zoning ordinances of local communities? And what
if we would address legislation in a bill that would not
mandate people to leave their sites--we actually know that
these pornographic sites move around anyway--but we would have
a location where those people can go if that is their desire?
Would it not be more helpful in the filtering? Why not?
And, Mr. Taylor, why don't you start with that? How would you
address the Triple X domain name?
Mr. Taylor. So far our organization, the National Law
Center for Children and Families, has been opposed to the
creation of a Triple X or a top level domain for the
pornography syndicates. I don't want to elevate them to the
same level in the worldwide web that government and
education, and commerce, and military are. All the
pornographers would put their sites there, but they would not
get off the cache card or the dot.com. The law would be
challenged----
Mr. Shimkus. Let me go on to a discourse. Let me go, but in
marketing, as in Triple X adult book stores, they would draw.
They would draw the people who are searching for that type of
material. Do you agree?
Mr. Taylor. It would make it easier for a filter to block
it if they were all there and nowhere else. But they are not
going to be just----
Mr. Shimkus. They would be elsewhere, too. I don't think we
are going to be able to prohibit elsewhere.
Mr. Taylor. And that's why I don't think it is going to
help. All it will do is that if a kid goes up to an unfiltered
terminal in a library, and punches in give me everything on the
X-domain, he gets it all. So it makes it easier to follow.
Mr. Shimkus. Well, I don't think we are going to limit
filtering software.
Mr. Taylor. No.
Mr. Shimkus. I think there are some people who are
proposing it, but I don't think that is going to happen. But
would a Triple X domain help in the filtering software?
Mr. Taylor. I don't think so. I think that the filtering
technology uses the same search technology to find the
pornography without it being here.
Mr. Shimkus. Well, let me go to the industry folks. Ms.
Getgood and Mr. Ophus, will the Triple X domain make it--will
it help?
Ms. Getgood. I don't think it will help solve the problem.
I think it might help block the material in that Triple X
domain more efficiently. However, it doesn't prevent material
being in other domains, and it doesn't get away from the issue
of who administers and who is responsible for maintaining that
Triple X domain.
I mean, you know the joke is that it is a trip to the White
House that no kid should take, and that is to WhiteHouse.com,
and people who name their websites that sort of thing aren't
necessarily going to be that responsible to go into a Triple X
domain. So there is an issue of it is just not going to achieve
the goal.
Mr. Shimkus. Mr. Ophus.
Mr. Ophus. I would agree with Ms. Getgood on that point. It
is something that we as a company considered. We use very
sophisticated computer spidering technology that basically has
these lists of criteria that constantly scour the Internet
looking for these types of websites.
    And then those sites are put into a queue, and like many of
the other filtering products are subjected to human review. So
the Triple X domain would obviously make it easier to put those
sites into the queue, but I think it would also make it easier
for some kids to find those sites.
    So I think there are benefits and negatives on both sides.
    Mr. Shimkus. Mr. Johnson, briefly, the First Amendment
debate on our Triple X site?
    Mr. Johnson. Yes, sir. This is one of the few occasions on
which I guess Mr. Taylor and I agree that this would not be a
good idea. The First Amendment zoning laws, where they talk
about secondary effects, do not apply when you are talking
about specific content, and there is a Supreme Court case,
Boos versus Barry, which specifically said that when you are
talking about content the zoning analysis does not apply.
Mr. Shimkus. The other issue that I would like to briefly
address is an issue that as a former high school teacher that
we have an impossible time of ever identifying when a child
becomes an adult.
In other societies, it was when they killed their first
bear, or when they went to their first battle. We can't do
that, and that's why 12 year olds and 13 year olds, 16 year
olds--what is it, 16 you get your drivers license, and 18, you
can serve and carry a weapon in the military, but you can't
drink alcohol until you are 21.
This whole debate of when is an adult an adult has always
been troubling for me, and I don't know if in society if you
can ever identify that. But that makes it also difficult in
this debate.
And I raise that because, Mr. Johnson, you talked about why
libraries and not schools, and you went into--well, at least in
your written testimony, one of my questions was why are you
fighting against libraries and not schools.
And that question was sort of asked by another Member, and
you were trying to address the age of the adult access of
material at libraries, where that is not the case for children
in schools.
But if we have this debate about when is an adult an adult,
there are problems in that area that I see. Why are you not as
concerned about schools, or if you are successful in the
position, along with the American Library Association, in
prohibiting filtering, would you next turn to schools?
Mr. Johnson. Well, we are not ruling out a challenge to the
schools, but because of the different missions between public
schools and the public libraries, it made sense to only pursue
one at a time.
And so we have chosen to pursue with the public libraries
at this point, but we have not ruled out a suit against the
schools.
Mr. Shimkus. And the last question, Mr. Chairman, with your
indulgence, for Ms. Caywood. I visited my local library, which
was opposed to CHIPA, or CIPA as I call it.
And, of course, they have the libraries position, but they
are a smaller library. They are not the Chicago library system,
where you may have a hundred times more users than people to be
able to monitor. So I really have a great appreciation for Ms.
Morgan and her position.
But they were very strong advocates for the library
position. While you may not be supportive of the content of
CHIPA, do you support the spirit of the law?
    Ms. Caywood. That is an interesting question. If the spirit
of the law is to create a healthy environment for learning,
then I would have to say that I am certainly for that.
If the spirit of the law is that the Federal Government
knows more than local government, then I would have to say that
I disagree with that. What do you think is the spirit of the
law?
Mr. Shimkus. I think the spirit of the law is to protect
children from pornography any way we can, even if there are
some failures in the system. The other question I was going to
ask, but I am going to only take my 10 minutes of questioning
time, was to ask what is an acceptable rate?
If we can filter out 99.9 percent, I think that is pretty
darn good, and as a parent, I would appreciate 99.9 percent
assistance, understanding that no one is perfect. You cannot
get everything.
Pornography has a disastrous impact, especially on hormonal
young boys, and it helps lead to destructive lifestyles,
destructive choices, and it is a detriment to our society.
And if we don't do something to be involved in protecting
our children, I don't know who is going to do it, and that's
what I think the intent of the law is. Mr. Chairman, thank you
very much, and I would like to thank you all for coming back
during the break and allowing me a chance to ask some
questions. I appreciate it, Mr. Chairman. Thank you very much.
Mr. Upton. Before the gentleman leaves, I would note for
the record that it is CHIPA for Chip Pickering, and not CIPA.
Mr. Shimkus. I am not going to call it CIPA. I refuse.
Mr. Upton. I have one additional question, and then I will
yield. I think both Mr. Largent and Mr. Pickering have
additional questions, and if other members come, I will
obviously entertain their questions as well.
I look at myself as a dad first, and as a Legislator
second, and I guess, Mr. Taylor, the question I have for you is
do you think that it is a right for Americans visiting a local
library, a public library, using a computer system that is
funded by the taxpayers, and literally all of them because of
the E-rate that goes into the schools and libraries, that they
have a right if they choose to have access to hardcore
pornography?
Mr. Taylor. My answer to that is no. I mean, the Internet
access funded by the Government is not an entitlement program.
It is a gift. And it is intended to make these technologies
available to people.
And so if the Congress says to libraries and schools that
we want to give you all this money so you will have access to
the Internet, but we don't want you to give the porno
industry, or the pedophiles, access to our kids, or even to
adult addicts.
And so libraries and schools don't have any right, nor
duty, to give pornography to adults or to minors, and adults
don't have a right to go into the public library. They can't
demand that the library carry Deep Throat in their video
collection, or that they subscribe to Playboy or Hustler.
They can't demand that the library change its selection
policies. A library doesn't carry porn anywhere else in its
system. All this bill does is say for the kinds of materials
that you would never choose to put on your shelf, here is a
technology means to prevent it from coming in uninvited.
Mr. Upton. Mr. Johnson, do you think that they have a right
if they want? Do they have a right to use taxpayer funded
equipment to access pornography, something like a Deep Throat
or something else if that is available?
Mr. Johnson. Well, your question first of all assumes that
this technology is going to stop that, and we know that it
doesn't.
Mr. Upton. But we heard from Ms. Getgood and Mr. Ophus that
they made some pretty good advances from where things were a
few years ago.
Mr. Johnson. Well, as late as yesterday, there was a site
blocked for breast cancer information. Once again, we always
hear that this is not the case.
Mr. Upton. Well, let's say we get the technology that it is
going to be better than 90 or 95 percent that will filter out
Deep Throat. Do they have a right--can someone walk into the
Chicago Public Library, where they don't have a system like
they have in Kalamazoo, or my home town of St. Joe, do they
have a right to say, hey, I want to get access to that, and I
am going to play it right here; yes or no?
Mr. Johnson. Well, the strictly technical answer is yes, if
it is protected under the Constitution. And as I have said,
there is a distinction between what is obscenity and what is
pornography.
But again if you look at the statistics of the amount of
information on the web, there is no way that there is a human
intervention on every one of these websites. There is no way
physically that this can be done.
Mr. Upton. Ms. Morgan, you had something additional to say
to another Member that had some questions. If you could just
respond, and then I will yield to Mr. Pickering.
Ms. Morgan. Actually, it was just in a comment about the
whole concept of selection in public libraries. I have been a
librarian since 1989, and I am the architectural librarian, as
well as the--I am in charge of all of the arts periodicals.
In that department where I work, as I said, I make a lot of
selections every day. If somebody came up to me and asked me to
purchase for the library collection some sort of pornographic
magazine because they considered it great art, I would
say no.
Again, we make those decisions all the time. In the art
department, we do have some of the so-called controversial art
books. Those are kept in closed reference stacks. People have
to leave an ID to look at them.
Oftentimes Robert Mapplethorpe is brought up in these
discussions, and even the Commissioner of the Library brought
up the Mapplethorpe books that we own when I was making
complaints about the hardcore porn.
I think that is a very bad argument. I think that we can
make a distinction very clearly between the things that we
select for our departments, even those few art books that we
have in closed stacks, and this vast array of pornographic
material that is on the Internet; everything from bestiality,
which yes indeed I have seen people look at that, to the child
pornography.
I don't think that there is anyone that can make the
argument that those two concepts are the same thing. I see them
as very different situations.
Mr. Upton. Mr. Pickering.
Mr. Pickering. Mr. Taylor, Mr. Johnson raises
constitutional questions as to whether this by overblocking or
underblocking would not meet the least restrictive means test,
or that it would have a viewpoint discrimination. How would you
respond to those points that Mr. Johnson raises?
Mr. Taylor. The first is that I hope that the Department of
Justice says that the least restrictive means test doesn't
apply. This is not a statute passed by Congress under the
police power to put obligations on the public for public
health, safety, and morals.
This is not a crimes statute, like the CDA or COPA was.
This is a funding measure which should be judged by the Courts
under a much different standard. If you don't want the Federal
money, you don't have to abide by their wishes.
This is not an obligation to the public. This is only a
trade in exchange for assistance. But because they don't have a
right to get that material--and one of the important parts of
the answer is that in this Communications Decency Act in 1996,
one of the parts of the Act that wasn't challenged and is still
on the books is the Good Samaritan immunity provision that says
no civil liability can attach to any person who voluntarily
takes actions to block access to material that is obscene,
offensive, even if it is protected speech.
So libraries don't have to give people pornography even if
it is protected. They don't have to give them access to breast
cancer sites even though they are not illegal, and even though
CHIPA says not to block them.
But CHIPA doesn't ask them to block any of the sites that
they have complained of, and they would only block a breast
cancer site if you set it at a word filter option, not under a
porn category, and only at the highest, strictest
levels.
These filters--I mean, you have I-Gear, which is Symantec's.
It is used by the entire New York City School District. You
have got CyberPatrol, X-Stop, N2H2. These are filters that have
been customized for library and school settings that don't
block any of these sites.
So the filter technology can be told to do it, and if it
has an error rate of 1.5 percent, the public doesn't have a right to
receive that, at least not in a public library.
So the Courts should say if you find out that a site is
wrongfully blocked, the filter will unblock it. The library
will insist on it, and it will be done. So this bill really
won't impact protected speech for more than minutes maybe if
somebody really wanted to do it. But even if it did, you are
not entitled to have the government buy you that which is
protected speech.
Mr. Pickering. The fundamental difference between, say, the
Communications Decency Act and CHIPA is that all the legal
terms addressed in CHIPA are well established: child
pornography, obscenity, harmful to minors. There are no new
definitions and no new terms, only well-established legal
definitions of what is not constitutionally
protected speech.
On the difference between the least restrictive means and
on the question of viewpoint discrimination, the difference is
that this is a funding issue, just like we condition
transportation funds on blood alcohol levels for driving, on
drinking age, and on seatbelts.
The logic is the same: brakes and seatbelts are not 100
percent effective either, but this is a tool, a technology tool.
a technology tool. And I do think that from a technological
point of view that it is somewhat disingenuous to say that it
is too big. There are too many sites with a search engine or
with a filter technology. That is what the technology does.
It is well suited to be able to block that which is defined
in this Act as child pornography or obscenity. And, Ms. Caywood,
the third section of the bill does require local community
input. We are not imposing a one-size-fits-all, federal-only
approach.
We were very sensitive to that issue, and if you look at
the third provision of the bill, that is the part of the
process which includes the local community, your viewpoints,
and views of parents and families in that community, in
establishing community standards.
I think that this bill is very well structured. And,
finally, Mr. Ophus, my last question on the effectiveness of
technology. Some concerns have been raised about whether
filter technologies are effective or not.
The Consumer Reports raised questions about that. Would you
address the Consumer Reports and what they did in their study,
and what they looked at, and why that is not a complete
picture, and then where we are today on the effectiveness of
filter technology.
Mr. Ophus. Thank you. I appreciate the opportunity to say
that, because now on several occasions I have heard comments
like the one made just yesterday by Mr. Johnson, that
chicken breast sites were blocked.
The fact of the matter is that Consumer Reports, and I
cited this in my original oral testimony, stated that filtering
is not effective. And actually Consumer Reports did something
that I don't think I have ever seen them do before. They ran a
spin-off article that sat right in an in-set to this report,
stating this is why the government shouldn't impose these kinds
of filters on our schools and libraries.
Well, that's why the rebuttal, or the letter that came back
from the Consumer Reports--and actually I will be happy to read
it again here to you--was so important, because he only tested
six products.
Now, Mr. Johnson made a comment a minute ago about chicken
breasts. The problem with his comment is what filter was it,
and when did that happen, because without those pieces of
information, it really is an invalid point.
I know for a fact the technology that we use, S4F
technology, has a provision in the key word element called
intelligent key word search and block out, which is a nice big
long acronym.
But basically what it means is that years ago, literally
2½ years ago, we solved the problem of keyword search
filtering, where a search on sex would be blocked, but not
sexually transmitted diseases or Middlesex, England, or
chicken breasts, though perhaps breasts alone.
So the technology has been around already for a couple of
years to stop those types of keyword filtering, and it is
literally the most consistent argument that I still hear coming
from opponents of filtering, this keyword issue, when in
fact it is not an issue whatsoever.
So in the response notice, real quickly, in the Consumer
Reports, the Consumer Reports editor, David Hyde--and I am
happy to give you a copy of the letter. I have it with me--
basically said in his response that you are right. We are
guilty of testing only so-called client side software.
The significance of that is that client-side software is
almost never used in the education or library space. It is
typically a product that you would put on a home computer.
There are, as Ms. Getgood mentioned, server side products, and
ours is one of them, that we might call more industrial
strength, more powerful.
And the databases are immediately up to date, and
there is no downloading of databases. So basically he said
that the Consumer Reports test was specifically
about products that existed in the retail or the consumer
space.
We never did intend, nor do we intend in the future, to
test any kind of products in the educational or school space.
So using Consumer Reports as proof positive that educational
filtering in libraries doesn't work is erroneous. It is a bogus
argument without a doubt.
Mr. Pickering. One final question. If there is a site that
is erroneously blocked, whether it is the Markey site or
whatever it might be, how long would that take to correct by a
school or library?
Mr. Ophus. It depends upon the system you are using. Again,
there are different filters, and every company that I know has a
different set of features. There are some filters where the
second the administrator puts in the URL, saying this site was
wrongly blocked, it immediately changes it in the master
database.
Mr. Pickering. How long does that take?
Mr. Ophus. Seconds. I mean, nanoseconds from the moment you
hit the enter button, because the databases on server-side
products and proxy servers are real time.
As soon as that site is added to that list, it is now
available to be used. Now, there are some products that they
may want to e-mail a URL in, and then the human review, either
committee or person, looks at it and says yes or no. So it
could be 24 hours or it could be longer.
Mr. Pickering. But a school or library could say, look, we
want to have a very flexible quick process, and they could work
it out where as soon as it is identified, it could be
corrected.
Mr. Ophus. With our particular technology, when the
administrator puts in what is called an override, they can type
in what is called pass through lists. We want this site to be
allowed to be passed through. It immediately makes it
available, and it also e-mails it to our review board for our
permanent use with everybody else in the world.
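The administrator override Mr. Ophus describes amounts to consulting a local pass-through (allow) list before the vendor's block list, while forwarding each override for central review. A minimal sketch, with invented class and method names and no claim about S4F's real interface:

```python
# Hypothetical sketch of a pass-through (override) list. URLs the
# local administrator has approved are allowed immediately, and each
# override is queued for the vendor's review board. Names invented.
class FilterProxy:
    def __init__(self, block_list):
        self.block_list = set(block_list)
        self.pass_through = set()   # local admin overrides
        self.review_queue = []      # overrides forwarded to the vendor

    def add_pass_through(self, url):
        self.pass_through.add(url)
        self.review_queue.append(url)  # e-mailed to the review board

    def is_allowed(self, url):
        if url in self.pass_through:   # override wins immediately
            return True
        return url not in self.block_list
```

Because the pass-through set is checked first, an erroneously blocked site becomes reachable the moment the administrator adds it, without waiting for the vendor's review to update the master database.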
Mr. Pickering. And that would probably be quicker, more
efficient, and more effective than, say, a tap on the shoulder?
Mr. Ophus. Well, the issue about the tap on the shoulder is
this. If you have a librarian walking over and tapping a
shoulder, they are making a filter judgment at that point.
In essence, it is really no different than a filtering
company which actually may have more stringent criteria making a
value judgment. So there still is a value judgment being
passed in the tap-on-the-shoulder issue.
Mr. Pickering. Mr. Ophus, thank you very much. I appreciate
what you and your company are doing. Mr. Chairman, thank you
for this. I would like to ask for unanimous consent to have
additional materials submitted to the record.
Mr. Upton. Without objection.
[The information referred to follows:]
[Monday, April 2, 2001--USA Today]
Study: Net users cite child porn as top online threat
WASHINGTON (AP)--Americans think child pornography is the worst
danger on the Internet, according to a survey released Monday.
They are divided over whether they mind federal agents spying on e-
mail, according to the Pew Internet and American Life Project study.
``The Internet is not necessarily the boogeyman when it comes to
how Americans feel about fighting crime,'' said Susannah Fox, author of
the study.
``They're very concerned about online crime, but they don't see e-
mail as particularly threatening or requiring more surveillance from
law enforcement,'' Fox said.
Seventy percent of the respondents said they were anxious about
computer viruses, with 80% worried about fraud and 82% concerned with
terrorist activity online.
But most respondents, 92%, said they were worried about child
pornography, and half of the respondents rated child porn as the single
most heinous online crime, far higher than any other choice.
``As soon as we asked the question, it was overwhelming how people
reacted negatively to child pornography,'' Fox said. ``It's something
that may or may not touch the lives of every American, but everybody is
horrified.''
Concerns about criminal activity also outweighed Americans' fears
about the government looking at e-mail.
While only 31% said they trust the government to do ``the right
thing'' most of the time or all of the time, 54% of Americans approve
of the FBI monitoring a suspect's e-mail.
Only about one in five Americans said they have heard of the FBI's
controversial e-mail monitoring system, previously called ``Carnivore''
and now renamed ``DCS 1000.'' Of those, 45% said it is a good law-
enforcement tool, but an equal number said it was a threat to the
privacy of ordinary citizens.
``Knowing about Carnivore doesn't seem to change people's minds
very much,'' Fox said, adding that the respondents were more
comfortable giving that power to the FBI rather than to generic ``law
enforcement agencies.''
But while Americans don't mind the FBI checking e-mail, 62% of the
respondents said they want new laws to protect their privacy.
The results were based on a telephone survey of 2,096 adults, of
which 1,198 were Internet users, taken from Feb. 1 to March 1. There is
a sampling error of plus or minus 2% for questions posed to the whole
group, and plus or minus 3% for questions to the Internet users.
______
[February 11, 2000]
Librarian Resigns After Being Ordered To Provide Pornography To
Children
Seattle, WA--After pursuing a rewarding career for over 10 years, a
Seattle reference librarian has been forced to resign her position,
after being ordered to provide public access to graphic internet
pornography sites on library terminals. Not only was the librarian
required to allow adults unchecked, unlimited, and unregulated access
to these sites, but she was also required to allow such access to
children as well. According to library policy, anyone, no matter what age, is
allowed total library internet access to anything except illegal child
pornography.
When she brought her concerns before the library board, the board
decided to filter the internet terminals in the children's areas, but
refused to restrict access by children and teens to the terminals in
the adult sections of the library. Another area of concern was that
high school student employees would be subjected to these pornographic
sites when providing computer assistance to library patrons.
``Since the moment this librarian began expressing her concern for
the children to the library board, she has been the subject of
intimidation and ridicule,'' said Brad Dacus, President of Pacific
Justice Institute. ``There are other employees that feel similarly to
this librarian, but are not able to deal with the intimidation that she
has experienced. They have, consequently, been afraid to express their
concerns and objections. No one should ever face the loss of their
career in the effort to protect our children, and no staff should ever
be forced to view pornographic material as part of the requirements of
their job.''
Pacific Justice Institute is a non-profit organization dedicated to
the defense of religious freedom, parents' rights and other basic
constitutional civil liberties.
______
[The Associated Press State & Local Wire--March 1, 2001]
Stilwell library closing stirs controversy
The closing of the Stilwell Public Library for several days stemmed
from a misunderstanding, not controversy over Internet use, a library
official said Thursday.
The library closed Feb. 22 and didn't reopen until Wednesday. A
member of the Stilwell Library Advisory Board said allegations of
patrons using the library's computers to access pornography were one
reason for the closing. But Marilyn Hinshaw, the director of the
Eastern Oklahoma District Library System, said Thursday she was unaware
of any connection between the library's closing and Internet service.
``Most of the library patrons of Stilwell have embraced and
benefited from the Internet access offered at the library,'' said
Hinshaw, whose office governs the Stilwell library and 13 other
libraries in northeastern Oklahoma.
Stilwell officials said Librarian Pat Gordon shut down the library
when advisory board members tried to move around some equipment with
the authority of the City Council. Hinshaw said a misunderstanding was
to blame and was cleared up when city officials assured that moving
furniture, files and computers would be discussed in greater detail
before further action was taken.
Mayor Marilyn Hill-Russell said Gordon abruptly locked up the
library while the chairman of the library board and two board members
tried to move some equipment for the City Council in order to reopen a
meeting room.
The city owns and maintains the building where the library is
housed. The council passed a resolution more than three months ago to
restore a storage-type room back to a meeting room, the mayor said. But
Bob Perkins, a member of the Stilwell Library Advisory Board, said
there also had been several complaints lodged with the board about
patrons viewing child pornography on computers in the enclosed room.
``It's common knowledge around town that if you want to watch porn,
then go to the public library in Stilwell, because you can hide,'' said
Perkins, who thought the computers should be placed in the middle of
the library.
Hinshaw said in a news release that the five computers with
Internet access resulted from a Gates Foundation grant. They were
housed in a converted staff office ``and staff are in and out on a
regular basis, using the fax machine which also is located there.''
``It's not the ideal way to accommodate this need, it is just the
least expensive,'' she said. ``As you would expect, competition for
space in the 3,200 square foot building makes for anything but easy
answers.''
The library reopened Wednesday. ``I'm glad to say we're back,''
Gordon said Thursday, ``and it's business as usual.''
______
[Newsweek, July 17, 2000]
cybersex
Not on the Reading List
thanks to internet access, librarians have a new job: keeping their
patrons from tuning into porn
By Sarah Downey
Librarian Wendy Adamson likes to keep up readers' interests. She
knows who likes a good mystery novel and who prefers the latest romance
yarn. But she draws the line at helping patrons indulge their sexual
curiosity on the Internet. ``One guy was really into bondage. A lot of
them had a thing for torture scenes,'' says Adamson, who saw the images
on monitors after the Minneapolis Public Library connected to the Net
in 1996. Several dozen people got in the habit of surfing for cyberporn
at the main library, Adamson says, sometimes for eight straight hours.
The Internet revolution has changed the local library. Circulation
is up, budgets are up and, with more high-tech resources, the role of
librarian now includes thwarting sex acts on the premises. One of
Adamson's colleagues stumbled on three teenagers, apparently heated up
by what they'd been watching on the computer, having group sex in the
bathroom. Circulation supervisors in a library in Austin, Texas,
witnessed an adult patron telling children how to access Internet porn.
``They were being exposed to things they'd really rather not see,''
says assistant library director Cynthia Kidd.
Librarians tend to support the First Amendment, so the idea of
restricting Internet access doesn't come easily. But with porn seekers
continuing to increase, 15 percent of the nation's 9,000 public-library
systems (Austin's included) now use filters. The software has flaws;
the American Library Association says it arbitrarily suppresses access
to otherwise harmless material.
Still, censorship debates become irrelevant when sites violate
obscenity and child-pornography laws. In May a lawyer for Adamson and
11 of her colleagues filed a sex-discrimination claim against the
library with the federal Equal Employment Opportunity Commission,
charging that access to Internet sex sites created ``an indisputably
hostile, offensive and palpably unlawful working environment.''
Pressure from anti-porn taxpayers finally led library director Mary
Lawson to ban the viewing of ``sexually offensive'' material.
Undercover cops now patrol the computer terminals.
Other cities have tried different remedies. After a convicted child
molester's 1999 arrest for distributing child porn from a computer at
the L.A. Public Library, officials opted for no-sex search engines on
some computers. Denver took similar action, says library director Linda
Cumming. Beyond that, though, ``the librarians need to understand it's
just a condition of the job today,'' Cumming says. She tells her staff
sympathetically, ``I'm sure this isn't what you expected when you went
to library school.''
______
[USA Today--May 8, 2000]
Porn Makes Workplace Hostile, 7 Librarians Say
The news behind the Net by Janet Kornblum
Seven Minneapolis librarians filed a discrimination complaint with
the Equal Employment Opportunity Commission, saying that library
patrons viewing pornography on the Net have helped create an
``intimidating, hostile and offensive working environment.''
Specifically, they are complaining about the library's policy of
allowing unrestricted access to the Net, saying they and patrons are
constantly subjected to offensive and inescapable images on screen and
off, their attorney says.
In a letter to the library board president and director, attorney
Robert Halagan says librarians ``should not have to choose between
their jobs and working in a hostile, sexually perverse and dangerous
workplace.''
But Judith Krug of the American Library Association, an
organization that opposes filtering, says librarians do have an
alternative: making library computers more private. Filters, she says,
weed out ``valuable, important information that's constitutionally
protected.''
Halagan says privacy screens are inadequate: They only block from
an angle. The city, he adds, must ``provide an environment that is not
hostile and offensive. They're going to have to make some choices.''
______
[The Wall Street Journal--January 14, 2000]
Taste--Review & Outlook: X-Rated
While Tallie Grubenhoff stood at the checkout counter of the Selah,
Wash. (pop. 5,000), library with her toddler daughter, she noticed a
rowdy group of preteens around a computer. Her other kids drifted over
to see what all the fuss was about. The six-year-old came back with the
answer: They'd been watching ``a lady bending over with something in
her mouth going up and down and she was a naked lady.''
But the worst was yet to come. The librarian informed Mrs.
Grubenhoff that she was powerless to prevent children from accessing
Internet porn because the word from her boss was that doing so would
violate their free-speech rights. And that informing their parents, she
added, would violate their privacy rights.
Welcome to the American library, where Marian the Librarian is fast
making room for the Happy Hooker.
Mrs. Grubenhoff isn't the only one with a horror tale; most
American parents are understandably disturbed by the terrors that lurk
on the freewheeling Internet for their children. And their fears have
reached the politicians; in at least two presidential debates, Sen.
John McCain came out for the mandatory installation of blocking
software in libraries. All the more reason to wonder why, as the
American Library Association's midwinter conference begins today, the
subject hasn't even made it onto the group's agenda.
``We think filters is a simplistic approach,'' ALA President Sarah
Ann Long told us. Indeed, the most the ALA will do this weekend is to
issue a lowly fact sheet that states that ``the American Library
Association has never endorsed the viewing of pornography by children
or adults.''
Problem is, it's never endorsed their not viewing it, either. Quite
the opposite. Virtually all the ALA's energies appear directed toward a
highly politicized understanding of speech. As one ALA statement puts
it, libraries ``must support access to information on all subjects that
serve the needs or interests of each user, regardless of the user's age
or the content of the material.'' One gets the sense that the activists
at the ALA consider Larry Flynt less of a threat than Dr. Laura, who's
complained about ALA opposition to efforts to ensure that minors are
protected from pornographic Web sites on library computers.
Maybe blocking software is not the solution. We do know, however,
that there are answers for those interested in finding them, answers
that are technologically possible, constitutionally sound and eminently
sane. After all, when it comes to print, librarians have no problem
discriminating against Hustler in favor of House & Garden. Indeed, to
dramatize the ALA's inconsistency regarding adult content in print and
online, blocking software advocate David Burt three years ago announced
``The Hustler Challenge''--a standing offer to pay for a year's
subscription to Hustler for any library that wanted one. Needless to
say, there haven't been any takers.
Our guess is that this is precisely what Leonard Kniffel, the
editor of the ALA journal American Libraries, was getting at last fall
when he asked in an editorial: ``What is preventing this Association .
. . from coming out with a public statement denouncing children's
access to pornography and offering 700+ ways to fight it?''
Good question. And we'll learn this weekend whether the ALA
hierarchy believes it worthy of an answer.
______
[The Wall Street Journal, Thursday, February 3, 2000]
letters to the editor
Porn Surfers Invade the Library
Your Jan. 14 editorial ``X-Rated'' (Taste page, Weekend Journal),
contrary to ALA Council member Maurice J. Freedman's defensive claims
(Jan. 20, Letters) was right on the mark. However, Mr. Freedman is
accurate in stating that ``libraries have policies to manage Internet
use.'' The problem is those policies seldom include real protection for
either the employees or the patrons, except the patrons accessing
pornography.
Most of the porn surfers get total ``privacy'' and ``freedom to
view'' in the majority of libraries with Internet access. Rather than
anti-porn rules, ALA ``leadership'' prefers to recommend ``privacy
screens,'' creating instant peep-booths at taxpayer expense and making
it harder for librarians to monitor the behavior until an occasional
behavior signals a problem even ``free access'' fans can't ignore.
Otherwise librarians are frequently told to leave the patrons
completely alone, regardless of their web activities. More often than
not, children can access ``adult'' sites on the Internet or view
adults' lewd Internet surfing without parental knowledge or permission.
In the occasional cases of more stringent rules for some unfiltered
systems, librarians may give polite verbal warnings or, at best,
temporary dismissal from Internet use. In those latter cases, the
working librarians (usually female) are forced to view the obscenity
and enforce the rule with the porn viewer (usually male) who is not
inclined to comply without argument. That sexually harassing or hostile
job environment is illegal in every other government workplace. Even
when the material accessed is child pornography, most libraries'
``acceptable use'' policies do not instruct librarians to stop it or
report it to the local police as the law would seem to require.
ALA ``leadership'' is also responsible for:
Cooperating with the ACLU and pornographers, like Hugh Hefner,
by rewarding libraries where community efforts to get porn-
filtering are thwarted such as Loudoun County, Va. There the
so-called ``local'' anti-filter group was directly aided in its
set-up by the ALA itself.
Refusing to support moderation of the current recommended
online standards of access to everything ``regardless of
content or age of user.'' At the October '98 preliminary
meeting for the President's Online Summit regarding children's
safety issues, Judith Krug, longtime ALA-OIF spokesperson,
refused to endorse public library rules against accessing
Internet obscenity and child pornography, two categories
already outside of Constitutional protection. At the recent
mid-winter ALA conference the only new agreement was a ``task
force to study the issue'' of age and access.
Cooperating with the ACLU and pornographers by threatening
libraries who do filter with expensive lawsuits and by
intimidating librarians who would otherwise speak out despite
the fact that no circuit court has ever made a precedent-
setting decision declaring filters unconstitutional.
Encouraging public libraries from coast to coast to stall
against or refuse cooperation with public research into their
Internet pornography incidents. A growing number of systems are
refusing to even keep such records, so there is nothing to
report.
The only public voice of reason that has surfaced in recent months
from within the ALA hierarchy is Leonard Kniffel's. His gutsy October
'99 editorial in the American Libraries magazine dared to say
``children and pornography don't mix'' and even more bravely asked,
``What is preventing this Association . . . from coming out with a
public statement denouncing children's access to pornography and
offering 700+ ways to fight it?''
Karen Jo Gounaud, President
Family Friendly Libraries, Springfield, Va.
Mr. Upton. Mr. Largent.
Mr. Largent. Thank you, Mr. Chairman. I have just a few
questions. Mr. Johnson, I am reading here from American Civil
Liberties Union Policy Number 4, Censorship of Obscenity,
Pornography, and Indecency, that your organization put out.
In there it states that much expression may offend the
sensibilities of people, and indeed have a harmful impact on
some. But this is no reason to sacrifice the First Amendment.
The First Amendment does not allow suppression of speech
because of the potential harm. Do you agree with that
statement?
    Mr. Johnson. Well, yes, we do, because of the fact that
there isn't a principled basis for making some of these
distinctions that you are talking about with regard to the
First Amendment, when it says that Congress shall make no law
abridging the freedom of speech. That is hardly ambiguous.
Mr. Largent. Sir, would you say that the Supreme Court's
decision on--the decision to say that to stand up in a crowded
theater and yell fire, that that would be constitutionally
protected speech?
Mr. Johnson. Well, first of all, that wasn't a decision
that said that you couldn't do that. That was an example in
dicta that was being used.
Mr. Largent. So you disagree with that as an example? I
mean, people should be able to do that?
Mr. Johnson. Well, that wasn't what I said. What I said was
that your characterization of the Supreme Court opinion as such
was not correct.
Mr. Largent. Okay.
Mr. Johnson. But what I am saying is that when you take a
look at what the Court has done with regard to, for instance,
Brandenburg versus Ohio, when you talk about the imminence of
danger, that is what the yelling--or as Abbie Hoffman said,
yelling theater in a crowded fire.
Basically that is what it was regarding, and I don't have a
problem when you are talking about the imminent danger of
speech being curtailed to some extent. But when there isn't
that imminent danger, then yes there is a problem with saying
that speech should be curtailed simply because of its effect on
the person who hears that.
I mean, after all, any good information is going to have an
effect on the listener, and it may not be the effect that you
want. But nonetheless if you start saying that because it may
have a bad effect on somebody, then we are going to curtail
that speech, you now give the government power to curtail all
speech, because any good orator may end up affecting somebody.
But if it is not the effect that the government wants, the
government will now have the ability to limit that speech.
Mr. Largent. So in reality what we are arguing over here is
just degrees of the limits that the government can place upon
free speech, because you just said you are not necessarily
opposed to someone standing up in a crowded theater and yelling
fire, that should not be protected free speech?
    Mr. Johnson. Well, assuming that there is no fire. I mean,
obviously if there was a fire, then that's different. But what
I am saying is that if there is an imminent danger, and what I
am talking about is an imminent danger, and not just----
Mr. Largent. That's exactly what I said you said. So again
what we are talking about then is the degree to which we limit
free speech, because what you just said should be allowable is
a degree of limitation on free speech. Would you agree with
that?
Mr. Johnson. Well, only to the extent that it encourages
action.
Mr. Largent. All I am saying is do you agree that what you
just said is a limitation on just total free speech?
Mr. Johnson. Well, I would agree that it is a limitation of
action, where you have speech coupled with imminent action.
Mr. Largent. That's a great lawyer answer for saying
exactly what I just said. So again we are just talking about--
your degree of limitation is this much; whereas, maybe some
people, including the ACLU, I'm sure, would feel like the
degree of limitation is this much on pornography, and
obscenity, and access to it by our children, right?
So we are just talking about degrees, but we have already
crossed the Rubicon of saying that there are some limitations
that we can all agree upon should be placed on the First
Amendment, and so it is just degrees.
And basically when you get back to that argument, then it
becomes or it goes back to the community values, community
standards, that the Supreme Court did talk about in terms of
defining obscenity, right? I mean, we are just talking about
degrees here.
Some communities have a great tolerance, and Chicago
obviously has more tolerance than I think they should have.
They wouldn't have that same level of tolerance in Tulsa, but
they have a greater degree of tolerance, or community
standard, and their threshold of what they will tolerate as
free speech is a lot higher.
And in Tulsa, Oklahoma, it is a lot lower hurdle, but again
we have already crossed the argument. I mean, you have, as
representing the ACLU here, you said there should be some
limitations, and you want to lawyer it all that you want.
But you said there should be some limitations on free
speech. I agree with that.
Mr. Johnson. And so we agree on something, I suppose, from
that standpoint. But again what you are talking about, in terms
of limitations, the Court has been very clear that these so-
called degrees have to be very carefully evaluated. And you
don't just say, well, it is just a matter of degree. So we are
going to start limiting speech.
Mr. Largent. Would you say that not allowing somebody to
stand up in the theater and yell fire is a suppression of
speech because of potential harm?
Mr. Johnson. No, I would not characterize it that way.
Mr. Largent. Oh, my gosh. This is unbelievable. Okay. How
would you characterize it?
Mr. Johnson. Well, what you are doing is you are saying
that because there may be harm, okay?
Mr. Largent. Potential harm.
Mr. Johnson. Potential harm, and----
Mr. Largent. Isn't that why we say you shouldn't stand up
in a crowded theater and yell fire when there is no fire?
Mr. Johnson. No, not when you look at Brandenburg versus
Ohio.
Mr. Largent. Then why should you not yell fire in a crowded
theater when there is no fire? Why should you not do that if it
is not because of the potential harm?
Mr. Johnson. It is because of the imminent harm, and not
potential harm. There is a difference between imminent and
potential.
Mr. Largent. What is the difference?
Mr. Johnson. The difference is potential may be somewhere
down the road, and what you are talking about in Brandenburg
versus Ohio is an imminent harm. In other words, that something
is going to happen right now, and when you yell fire in a
crowded theater where there is no fire, then what you are doing
is immediately causing problems because of the stampede effect.
But what you are talking about is some potential harm down
the road because of the effect of that particular speech. That
is not what the First Amendment allows in terms of curtailing
speech, because if you do that, then you give the government
carte blanche essentially to restrict any speech, because it
may have an effect somewhere down the road.
And that is the distinction between potential harm versus
imminent harm. And like you said, Brandenburg versus Ohio talks
about the imminence and not the potential.
Mr. Largent. Okay. Mr. Chairman, those are all the
questions that I have.
Mr. Pickering. Mr. Chairman, I have just one follow-up
question on that. Would you say that child pornography, the
production and distribution of child pornography, and then the
viewing of child pornography in public places like a library or
school to minors, would that be imminent harm?
Mr. Johnson. No, it would not.
Mr. Pickering. Okay. Thank you.
Mr. Upton. Well, that concludes the hearing. I appreciate
your time this morning. I have to say that there is going to be
a lot of interest as all of us watch how the FCC is going to
implement CHIPA, and how the courts are going to rule as we
attempt to protect our kids in the digital age.
I would note listening to the discussion that there are
many Americans and again many Members of Congress that indeed
view taxpayer-funded pornography that is accessible at public
libraries as a real problem in this day and age.
It does seem as though we have the technology at our
fingertips that has come a long way from the days of old, and I
salute that work and obviously we will watch very carefully in
the coming days and weeks ahead. Thank you very much. This
hearing is adjourned.
[Whereupon, at 12:58 p.m., the subcommittee adjourned.]
[Additional material submitted for the record follows:]
American Civil Liberties Union
Washington National Office
April 5, 2001
The Honorable Fred Upton
2333 Rayburn House Office Building
Washington, DC 20515-2206
Re: Hearing on CHIPA before the Subcommittee on Telecommunications and
the Internet, April 4, 2001
Dear Congressman Upton: Attached is a letter to Congressman Edward
Markey regarding the hearing yesterday before your Committee. I
respectfully request that the letter be made a part of the record.
If you have any questions, please call me at 202-675-2334.
Sincerely,
Marvin J. Johnson
______
American Civil Liberties Union
Washington National Office
April 5, 2001
The Honorable Edward J. Markey
2108 Rayburn House Office Building
Washington, DC 20515-2107
Re: Hearing on CHIPA before the Subcommittee on Telecommunications and
the Internet, April 4, 2001
Dear Congressman Markey: During your comments yesterday during the
hearing, you seemed to imply that the regulation of Internet content
imposed by CHIPA was justified because of the government's limited
ability to regulate broadcast media. That contention was soundly
rejected by the United States Supreme Court in Reno v. ACLU (1997). The
Court refused to analogize the Internet to the broadcast media,
saying instead that it was more analogous to the print media. Thus, the
Internet is entitled to the highest protection under the First
Amendment, similar to books, newspapers, and magazines.
If you have any questions, please call me at 202-675-2334.
Sincerely,
Marvin J. Johnson