[House Hearing, 114 Congress]
[From the U.S. Government Publishing Office]
THE EVOLUTION OF TERRORIST PROPAGANDA: THE PARIS ATTACK AND SOCIAL
MEDIA
=======================================================================
HEARING
BEFORE THE
SUBCOMMITTEE ON TERRORISM, NONPROLIFERATION, AND TRADE
OF THE
COMMITTEE ON FOREIGN AFFAIRS
HOUSE OF REPRESENTATIVES
ONE HUNDRED FOURTEENTH CONGRESS
FIRST SESSION
__________
JANUARY 27, 2015
__________
Serial No. 114-1
__________
Printed for the use of the Committee on Foreign Affairs
Available via the World Wide Web: http://www.foreignaffairs.house.gov/
or
http://www.gpo.gov/fdsys/
______
U.S. GOVERNMENT PUBLISHING OFFICE
92-852 PDF WASHINGTON : 2015
-----------------------------------------------------------------------
For sale by the Superintendent of Documents, U.S. Government Publishing
Office Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800;
DC area (202) 512-1800 Fax: (202) 512-2104 Mail: Stop IDCC,
Washington, DC 20402-0001
COMMITTEE ON FOREIGN AFFAIRS
EDWARD R. ROYCE, California, Chairman
CHRISTOPHER H. SMITH, New Jersey ELIOT L. ENGEL, New York
ILEANA ROS-LEHTINEN, Florida BRAD SHERMAN, California
DANA ROHRABACHER, California GREGORY W. MEEKS, New York
STEVE CHABOT, Ohio ALBIO SIRES, New Jersey
JOE WILSON, South Carolina GERALD E. CONNOLLY, Virginia
MICHAEL T. McCAUL, Texas THEODORE E. DEUTCH, Florida
TED POE, Texas BRIAN HIGGINS, New York
MATT SALMON, Arizona KAREN BASS, California
DARRELL E. ISSA, California WILLIAM KEATING, Massachusetts
TOM MARINO, Pennsylvania DAVID CICILLINE, Rhode Island
JEFF DUNCAN, South Carolina ALAN GRAYSON, Florida
MO BROOKS, Alabama AMI BERA, California
PAUL COOK, California ALAN S. LOWENTHAL, California
RANDY K. WEBER SR., Texas GRACE MENG, New York
SCOTT PERRY, Pennsylvania LOIS FRANKEL, Florida
RON DeSANTIS, Florida TULSI GABBARD, Hawaii
MARK MEADOWS, North Carolina JOAQUIN CASTRO, Texas
TED S. YOHO, Florida ROBIN L. KELLY, Illinois
CURT CLAWSON, Florida BRENDAN F. BOYLE, Pennsylvania
SCOTT DesJARLAIS, Tennessee
REID J. RIBBLE, Wisconsin
DAVID A. TROTT, Michigan
LEE M. ZELDIN, New York
TOM EMMER, Minnesota
Amy Porter, Chief of Staff Thomas Sheehy, Staff Director
Jason Steinbaum, Democratic Staff Director
------
Subcommittee on Terrorism, Nonproliferation, and Trade
TED POE, Texas, Chairman
JOE WILSON, South Carolina WILLIAM KEATING, Massachusetts
DARRELL E. ISSA, California BRAD SHERMAN, California
PAUL COOK, California BRIAN HIGGINS, New York
SCOTT PERRY, Pennsylvania JOAQUIN CASTRO, Texas
REID J. RIBBLE, Wisconsin ROBIN L. KELLY, Illinois
LEE M. ZELDIN, New York
C O N T E N T S
----------
Page
WITNESSES
The Honorable Mark Wallace, chief executive officer, Counter
Extremism Project.............................................. 6
Mr. J.M. Berger, author.......................................... 40
Mr. Evan Kohlmann, chief information officer, Flashpoint Partners 46
Ms. Rebecca MacKinnon, director, Ranking Digital Rights, New
America........................................................ 56
LETTERS, STATEMENTS, ETC., SUBMITTED FOR THE HEARING
The Honorable Mark Wallace: Prepared statement................... 9
Mr. J.M. Berger: Prepared statement.............................. 42
Mr. Evan Kohlmann: Prepared statement............................ 49
Ms. Rebecca MacKinnon: Prepared statement........................ 58
APPENDIX
Hearing notice................................................... 76
Hearing minutes.................................................. 77
THE EVOLUTION OF TERRORIST
PROPAGANDA: THE PARIS ATTACK
AND SOCIAL MEDIA
----------
TUESDAY, JANUARY 27, 2015
House of Representatives,
Subcommittee on Terrorism, Nonproliferation, and Trade,
Committee on Foreign Affairs,
Washington, DC.
The committee met, pursuant to notice, at 2:30 p.m., in
room 2172 Rayburn House Office Building, Hon. Ted Poe (chairman
of the subcommittee) presiding.
Mr. Poe. The subcommittee will come to order.
    Without objection, all members may have 5 days to submit
statements, questions, and extraneous materials for the record,
subject to the length limitation in the rules.
Terrorists' use of social media has exploded over the past
several years. Terrorist groups from ISIS to the Taliban use
social media platforms to recruit, radicalize, spread
propaganda and even raise money.
Section 219 of the Immigration and Nationality Act states
that it is unlawful to provide a designated foreign terrorist
organization with material support or resources, including any
property--tangible or intangible--or services, among them,
communication, equipment, and facilities.
If foreign terrorist organizations are using American
companies to spread propaganda and raise money, the question
that remains is: Is this a violation of American law? That is
the question for us today.
I asked the Department of Justice this question directly in
August 2012. Their answer? They refused to say, as they put it,
in the abstract whether a particular company is violating the
law or not under this section. So they didn't give a definitive
answer.
American newspapers would have never allowed our enemies in
World War II to place ads in, say, the New York Times for
recruitment of people to go and fight against America. So why
do social media companies allow terrorist content on their
platforms?
Terrorists know the benefit of social media. Social media
is easy to use, it is free, and it reaches everyone in the
world. We have seen this most recently with the attacks in
Paris; and after the attack, terrorists and their supporters
took to social media to praise the attack, recruit new
jihadists and fund-raise.
    Twitter has become one of the terrorists' most popular
platforms. As you can see here on the monitor--I believe we
have the monitors ready--a British jihadi in Syria is bragging
about ISIS and is threatening America.
We have another example of that. Here is an example of
terrorists' use of social media. It is a Facebook fan page for
Khorasan Group in Syria complete with a message board and
photos.
    The Khorasan Group is a group set up by al-Qaeda in Syria
specifically to attack the United States and Europe. In April
2013, the al-Qaeda branch in North Africa, known as AQIM, held an online
press conference on Twitter, allowing users to submit questions
that were answered by the terror group and posted back on
Twitter the following week.
In February 2014, a Saudi cleric launched a fund-raising
drive on Twitter for jihadists in Syria. The rise of lone
wolf terrorism in recent years has been fueled in part by
terrorists' use of social media.
The Boston bombers made two pressure cooker bombs. The
recipes for those bombs were published before the attack in al-
Qaeda's Inspire magazine. That magazine was released and
promoted on social media.
    Some people make the excuse that there is no point in
shutting down a social media account because it will pop up
again.
But that is not always true. For years, Twitter was asked to
shut down an account of the designated foreign terrorist
organization, al-Shabaab, which pledged allegiance to al-Qaeda.
In 2013, al-Shabaab live tweeted its attack on the Westgate
Mall in Kenya that killed 72 people. Twitter then shut down the
account. Al-Shabaab tried to reopen accounts on Twitter but
after getting shut down by Twitter each time, it finally quit.
Twitter is far worse than its peers about proactively
finding and removing terrorist content. One of our witnesses
wrote in late 2013 that the gap between Twitter's practices and
industry standards is large enough to raise the specter of
negligence.
YouTube is a popular platform for jihadists as well. Videos
are especially effective in attracting funding and
donations. Every major video released by al-Qaeda is uploaded
to YouTube and, as soon as they are released, to jihadist
forums.
    ISIS posts videos on YouTube and on a service called Vimeo
that depict graphic violence. However, YouTube does try to
remove them but can't get them all.
In September 2010, I did send a letter to YouTube urging
them to change their policy when it came to terrorist accounts.
They did, allowing any user to flag a video for terrorist
content, but have since changed that policy and instead take
videos down if they post graphic content or train terrorists.
Facebook is also a favorite social media site for
terrorists and jihadists. Fortunately, Facebook has redoubled
its efforts to proactively identify and remove that content.
    In 2011, the White House published a counter-radicalization
strategy that acknowledged terrorists' use of the Internet and
social media to spread hate and violence. The report also
committed the administration to devising a strategy to deal
with this phenomenon. However, no such strategy has been
published by the administration.
Then I sent a letter with a number of other colleagues in
September 2012 urging the FBI to do more to reduce terrorists'
use of Twitter. The FBI refused, saying they gained
intelligence about groups and individuals from their social
media activity, even though it is apparent that this social
media activity recruits terrorists who want to kill.
    That may be true, but any intelligence value must be
weighed against the benefits that terrorist groups gain from
this activity.
The debate should take place and it should inform our
policies about how to deal with this threat. At the very least
we need a strategy, and that is the purpose--one of the
purposes of this hearing.
I will now yield 5 minutes to the new ranking member, Mr.
Keating from Massachusetts, for his opening comments.
Mr. Keating. Thank you, Mr. Chairman.
Let me start off by thanking you for holding this important
hearing and a timely hearing at that. Further, I would like to
note this is indeed my first subcommittee hearing as ranking
member and I look forward to working with you in the future.
We begin this Congress with news of the terrible shootings
in Paris. Our condolences continue to be with the friends and
families of those victims and with all those who have been
impacted similarly by senseless tragedies in Boston, New York,
Brussels, Sydney, Peshawar, Nairobi and, unfortunately, the
list can go on and on.
This month's heartbreaking and gruesome attacks against
Charlie Hebdo and the Hyper Cacher market in Paris have
resoundingly brought people together from across the Atlantic
and from all walks of life to express their strong commitment
to pluralistic, democratic and tolerant societies.
    Yet the space in which terrorists and criminals operate to
recruit and radicalize like-minded or just plain hateful
individuals is the same democratic, open medium in which open
societies exercise their very freedoms, the kind of freedoms
that these extremists abhor.
There is no doubt that social networking, the Internet and
propaganda have become the premier recruitment and
radicalization tools for terrorist gangs and those expanding
their reach far into Europe and the United States.
    This leads to a problem where the simplest, quickest
strategies to eliminate this type of harmful influence can also
compromise the very basis of a free society, in effect
complementing the terrorists' cause.
    In a recent report issued by the Bipartisan Policy Center,
two former co-chairs of the 9/11 Commission argue that while
``the use of the Internet to radicalize and recruit
homegrown terrorists is the single most important and
dangerous innovation since the terrorist attacks of
September 11, 2001,'' ``[a]pproaches that are aimed at
reducing the supply of violent extremist content on the
Internet are neither feasible nor desirable.''
    While advocating for the government to retain its
capability for aggressive takedowns of foreign-based Web sites
to stop a terrorist attack, the report recommends a strategy of
building partnerships with Internet companies, the private
sector, foundations, philanthropists and community groups to
build capacity and to help potentially credible messengers such
as mainstream groups, victims of terrorism and other
stakeholders to become more effective in advocating and
conveying their messages.
As a former district attorney, I too have seen the profound
effect of working to raise the voices of those within
communities across the U.S. that work toward peace and
multicultural acceptance.
While we debate ways in which to balance security needs in
a free society, it is important to revisit our
counterterrorism strategies to ensure that they are adequately
incorporating the role of modern technology and communications.
As I mentioned earlier, there is a larger piece of this
puzzle, and that is the mindset of militants who come from
Western nations to join brutal gangs that go on to rape, kill
and divide thousands if not millions.
As a transatlantic community, we can only fight the lure of
terrorism by determining its causes and devising appropriate
counter measures. In particular, I feel the messages promoting
the heritage and very cultural history of the Mideast and North
Africa will be important to help young people define their true
identities instead of listening to backwoods propaganda seeking
to destroy this history.
Today, radicalization, online or otherwise, is occurring
across the world in rural and urban settings, wealthy and poor
communities and among all educational levels.
In the long run, we must ensure that the course of action
we pursue not only targets terrorist groups but the polarizing
policies that often lead to societal division, and to do this,
a balance between security and liberties must be maintained.
    The subject of today's hearing is of the utmost concern to
our national security. I look forward to hearing from our
witnesses, and I thank them for being here and for their
perspectives on this timely issue.
Thank you, Mr. Chairman. I will yield back my time.
Mr. Poe. I thank the gentleman.
The Chair will recognize other members for their 1-minute
opening statement. The Chair recognizes the gentleman from
California, Colonel Cook, for 1 minute.
Mr. Cook. Thank you, Mr. Chairman.
I want to compliment you on having this hearing. As
somebody who has been characterized as being born in Jurassic
Park, this is a hearing on a subject that, I don't know how
many years ago--10 years ago, what have you--I didn't have a
clue about, and, unfortunately, there are a lot of Americans
who still do not understand social media and the importance of
it.
    I am also somebody who spent a long time in the military and
read all the books, including Sun Tzu on knowing your enemy.
With this new enemy that we have, international terrorism,
every week, every day something horrible happens, and they are
using a weapons system that, unfortunately, I and many of my
colleagues were very, very naive about.
I have had an education the last few years or I wouldn't be
here. We all use it now. I think everybody in this room uses
social media, and it is something that young people listen
to--the 30-second, the 15-second sound bite, even a minute--and
it is almost addictive.
And, obviously, our enemies are enemies of democracy. They
have used this so effectively in recruiting and finding out
exactly how to get to people and using it as a strategy against
us.
So I actually believe we are going to need more of these
hearings. Unfortunately, a lot of our colleagues couldn't make
it. But this is the wave of the future because it works,
unfortunately.
So thank you again for having this very timely hearing. I
yield back.
Mr. Poe. Gentleman yields back his time. The Chair will
recognize the former ranking member of this subcommittee, the
gentleman from California, Mr. Sherman.
Mr. Sherman. Judge, Bill, I am very much looking forward to
working with you on the subcommittee in this Congress. I should
point out that this subcommittee came into existence in 2003
and for 12 years I have been either chair or ranking member of
this subcommittee.
It began as the Subcommittee on Terrorism, Nonproliferation
and Human Rights. Two years later, the human rights part was
transferred to another subcommittee. Then in the 110th Congress
as well as the 111th, I was able to serve as chair of the
subcommittee and persuade then-Chairman Lantos to add the
economic jurisdiction of the full committee to this
subcommittee, dealing with trade promotion, dealing with trade
licensing and other limits on exports.
And so I look forward to this next 2 years with the chair,
the ranking member and all the members of the subcommittee.
As to the matter at hand, I look forward to hearing from
our witnesses on not only how we can be on defense and take
down the bad stuff, but how we can be on offense and use social
media and traditional media to get our message out.
As to taking down the bad stuff, that is what First
Amendment lawyers would call prior restraint if we did it
through government fiat. So among our possible policies are to
simply name and shame and nudge these Internet publishers, if
you will, to take down the bad stuff.
If we want to go further and use the power of the state to
take down information, I think it is incumbent on Congress to
craft a new statute defining what the responsibilities of these
Internet companies are, and I yield back.
Mr. Poe. I thank the gentleman.
I will introduce the witnesses that we have before us today
and then they will each be allowed to give us 5 minutes of
their testimony.
Ambassador Mark Wallace is the CEO of the Counter Extremism
Project. He is a former U.S. Ambassador to the United Nations.
Prior to his political work, he practiced law as a commercial
litigation attorney.
Mr. J.M. Berger is an author and analyst studying
extremism. He is also the founder of the Web site
IntelWire.com, which publishes investigative journalism,
analysis, and primary source documents on terrorism and
international security.
Mr. Evan Kohlmann is the chief information officer at
Flashpoint Partners where he focuses on innovation and product
development. Mr. Kohlmann has served as a consultant in
terrorism matters to various government and law enforcement
agencies throughout the world.
Ms. Rebecca MacKinnon is the director of the Ranking
Digital Rights program at New America. She is the co-founder of
Global Voices Online and author of the book, ``Consent of the
Networked: The Worldwide Struggle for Internet Freedom.''
The Chair now will recognize Ambassador Wallace. We will
start with you. You have 5 minutes.
STATEMENT OF THE HONORABLE MARK WALLACE, CHIEF EXECUTIVE
OFFICER, COUNTER EXTREMISM PROJECT
Mr. Wallace. Chairman Poe, Ranking Member Keating and
members of the subcommittee, thank you for the opportunity to
testify on the hijacking and weaponization of social media by
extremist groups to radicalize and recruit new members and to
plan violent attacks against innocent people.
The evidence of social media's reach can be seen in the
thousands of people who continue to pour into Syria and Iraq in
response to online propaganda by radical extremist groups and
the grim aftermath of terror attacks that bear witness to the
power of social media to radicalize and encourage violence.
This hearing can lead to a better understanding of the
growing problem of social media abuse and a more coordinated
and cooperative relationship between technology companies like
Twitter and those who want to stop extremists from anonymously
abusing social media platforms.
American companies have led the world in revolutionary
online technology and social media. Unfortunately, these open
platforms are also the tools of choice to spread messages of
hate and for extremist groups like ISIS to propagandize,
radicalize, recruit and commit cyber jihad.
A major focus of the Counter Extremism Project's work is to
combat extremist recruitment, rhetoric and calls for acts of
terror online, starting with Twitter.
Through our crowd sourcing campaign, #CEPDigitalDisruption,
we have researched and reported hundreds of extremists to
Twitter and to law enforcement. The question today is whether
or not companies like Twitter will partner to combat those
extremists who hijack and weaponize social media for terror.
We have reached out in the spirit of cooperation to
Twitter. The response we get from Twitter is dismissive to the
point of dereliction. A Twitter official has said publicly
that ``one man's terrorist is another man's freedom fighter.''
This statement is insipid and unserious. Social media sites
have a responsibility to act against extremists. An American-
born jihadi from Minneapolis operates on Twitter with the alias
Mujahid Miski.
He is one of the most influential jihadis using Twitter and
has tweeted some of the most heinous content we have seen,
including threats to behead CEP's president, the former
Homeland Security adviser, Fran Townsend.
He boasted he has been suspended from Twitter 20 times and
keeps coming back, yet Twitter does nothing to remove his new
accounts. As a result, we have been playing a never ending game
of Whac-A-Mole in trying to stop him.
We have raised these issues to Twitter. Twitter has not
taken further action against him. I respectfully request that a
copy of the tweets we have reported over the course of our
digital disruption campaign be included along with my prepared
testimony as part of this hearing's record.
Mr. Poe. Without objection, it will be made part of the
record.
Mr. Wallace. Thank you, sir.
I would like to clarify why our focus is on Twitter. In the
case of jihadis online, Twitter is the gateway drug. This is
where vulnerable people are first exposed to radical content.
From Twitter, the conversation moves to platforms like AskFM,
where those being recruited can ask questions, for example:
What is life like in ISIS, or how can I get to Syria?
Then the conversation moves to private chat applications
like Kik or WhatsApp. The path I just described is not
fictional. It is exactly how three Denver girls were
radicalized and tried to join ISIS.
We must stop recruitment at the gateway, Twitter. We stand
ready to work with governments and any company in finding the
right mix of remedies that effectively attacks this growing
problem while protecting our values and liberties.
There are immediate actions that Twitter should take.
Twitter should grant trusted reporting status to governments
and groups like ours to swiftly identify and ensure Twitter's
expeditious removal of extremists online.
The reporting process on Twitter is long and cumbersome. A
more accessible reporting protocol should be added for users to
report suspected extremist activity.
    America's leading tech companies should adopt a policy
statement that extremist activities will not be tolerated--
simple but important.
Twitter has a system where people can verify their
accounts. This concept can be the foundation for a tiered
system whereby unverified accounts are restricted and subject
to streamlined review.
    When one of the most influential pro-ISIS Twitter
accounts, ShamiWitness, was publicly revealed to be an Indian
businessman, it shook the cyber jihadi network. He immediately
stopped his online jihad.
Twitter should reveal detailed information, including the
names and locations of the most egregious cyber jihadis. We can
collectively agree that the most egregious of cyber jihadis do
not deserve anonymity or the right to engage in hate and
incitement of terror speech.
    The FBI shut down Silk Road. There are other enforcement
successes: online drug distribution, child pornography, tobacco
sales and sex trafficking, among others. If we can confront
these activities, there are strategies that we can use on those
who hijack and weaponize social media.
Thank you, Chairman Poe, Ranking Member Keating and members
of the subcommittee, and I would just like to introduce Alan
Goldsmith, Jen Lach, Darlene Cayabyab and Steven Cohen who are
really the brains of the operation because it depends on young
people to understand these complicated networks. I just wanted
to introduce them.
[The prepared statement of Mr. Wallace follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
----------
Mr. Poe. The Chair will next recognize Mr. Berger for his
5-minute testimony.
STATEMENT OF MR. J.M. BERGER, AUTHOR
Mr. Berger. Thank you, Mr. Chairman, and thank you, members
of the committee.
I want to talk a little bit about the scope of the problem
and sort of try and put some hard numbers on what we are
talking about here because a lot of the discussion we have
about this is often very general and in principle--we know it
is bad but we don't know exactly what it is.
    We are going to focus on Twitter partly because it is
easier to do this kind of analysis on Twitter and also because,
as the chairman noted and as Ambassador Wallace noted, Twitter
has a particular problem with this and is in the process of
adjusting its approach, as opposed to Facebook and YouTube,
which have made changes over the last couple of years.
So in a forthcoming study on ISIS' use of Twitter, which
was commissioned by Google Ideas and will be published by the
Brookings Institution's Project on U.S. Relations with the
Islamic World, technologist Jonathan Morgan and I set out to
develop metrics that could define the size and function of the
Islamic State's presence on Twitter.
While our analysis is not complete, we can confidently
estimate that throughout last fall at least 45,000 Twitter
accounts were used by ISIS supporters. This figure includes
accounts that were both created and suspended during the time
it took us to collect the data.
The size of the network has certainly changed since this
estimate but it remains only a minuscule fraction of the
overall Twitter user base. Our research began at the same time
that Twitter started an aggressive campaign of suspending
accounts so it reflects some of the effects of those
suspensions.
What it doesn't do is give us a baseline to look at to see
what the environment without suspensions is, which is
unfortunate, but the timing dictated that.
Almost three-quarters of ISIS supporters on Twitter that we
studied had fewer than 500 followers each. Only a handful had
more than 20,000.
Suspended users--people we were able to determine
definitively had been suspended as opposed to changing their
name or deleting their own account--had generally tweeted three
times as often as those who were not suspended, and received
almost 10 times as many retweets from other ISIS supporters.
Suspended users averaged twice as many followers as those
who were not suspended. When users are removed from the system,
when they are suspended or they delete themselves or for
whatever reason they stop taking part, we did see some evidence
that the existing accounts compensate.
So other people step up or new accounts are created. The
accounts that already exist increase their activity. But the
preliminary evidence suggests that they can't fully regenerate
the network if suspensions continue at a consistent pace.
One big part of this debate, you know, has been this Whac-
A-Mole concept. It is, like, you know, does it help to delete
these accounts, does it help to suspend these people? And I
think that so far what we are seeing is there is pretty good
evidence that it does limit what they can do online.
    We confirmed at least 800 ISIS supporter suspensions
between last fall and this month, and there are indications
there were thousands more that we could not confirm, possibly
well over 10,000 more.
While tens of thousands of accounts remain, ISIS supporters
online called the effects of these suspensions devastating.
There are three important benefits to the current level of
suspension.
First, they reduce ISIS' reach among users at risk of
radicalization. People don't spring from the womb fully
radicalized. They have to find a path to radicalization, to
talk to a recruiter, to get information about the movement.
Suspensions don't eliminate that path but they increase the
cost of participation.
    Second, while ISIS' reach has been reduced, enough accounts
remain to provide important open source intelligence. That is
the other piece of this debate: Is there valuable intelligence
that we are losing out on when we suspend these guys?
    And, you know, if you have 30,000 or 40,000 accounts that
all have very limited reach, you can get a lot of intelligence
from that without necessarily allowing them to operate
unfettered.
Third, the targeting of the most active members of the ISIS
supporter network, which is what is currently happening in
terms of the Twitter suspensions we have seen, undercuts ISIS'
most important strategic advantage on this platform, which is
about 2,000 to 3,000 supporter accounts that are much more
active than ordinary Twitter users.
This is an explicit strategy of ISIS. They put out
documents about it. They have a name for the group--they call
them the mujtahidun, which is Arabic for industrious--and they
are the people who drive this activity.
    The reason we are talking about this now is that these
overachievers who get online and are extremely active are able to
drive a lot more traffic. They are able to cause ISIS hashtags
to trend and get aggregated by third parties.
They are able to influence search results. So if somebody
is searching for information on Baghdad they might get an ISIS
threat instead of whatever information they were trying to
seek.
So what we see right now is that there is a lot of pressure
on this network and I think that there is a balance that we are
pretty close to achieving. But there is definitely room for
improvement.
[The prepared statement of Mr. Berger follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
----------
Mr. Poe. The gentleman yields back his time.
The Chair now recognizes Mr. Kohlmann for his 5-minute
opening statement. Mr. Kohlmann?
STATEMENT OF MR. EVAN KOHLMANN, CHIEF INFORMATION OFFICER,
FLASHPOINT PARTNERS
Mr. Kohlmann. Thank you, Mr. Chairman. Thank you, members
of the committee.
As more young people from the U.S. and other Western
countries seek to depart to join jihadi front lines abroad,
there has been an increasing public awareness of the role that
online social media is serving in recruiting them to the
cause.
Yet, recently there has been a noticeable divergence from
traditional jihadi chat forums to the slicker interfaces and
enormous global audience that has been afforded by services
like Facebook and Twitter.
Indeed, the trend toward jihadists exploiting Western
commercial social media platforms has been in full view in the
aftermath of this month's terrorist attacks in Paris.
    Though relatively little is known about how the Kouachi
brothers and Amedy Coulibaly were using social media, claims of
responsibility for the attacks in Paris emerged quite quickly
from al-Qaeda in the Arabian Peninsula, AQAP, all of which were
distributed exclusively via Twitter.
On January 9, AQAP's media wing used its account on Twitter
to disseminate download links for a message from its official,
Hareth al-Nadhari, praising the Paris attacks and lamenting
only that, ``I wish I had been there with you.''
On January 14, again, using the exact same Twitter account,
AQAP distributed download links for a direct video recorded
claim of responsibility for the Paris attacks from senior
official, Nasr al-Ansi, in which he declared, ``The one who
chose the target, laid the plan and financed the operation is
the leadership of this organization.''
In fact, as of right now, AQAP, which is a designated
terrorist organization under U.S. law, has not one but two
official accounts on Twitter: One for releasing videos and one
for releasing breaking news updates.
    Nor is AQAP alone. Other allied factions such as al-Qaeda
in the Islamic Maghreb have also begun to eschew the
traditional route of publishing media on these forums and
instead are releasing material directly on Twitter.
Over the past 3 months, AQAP's public Twitter account has
only been disabled by administrators on four occasions. Each
time it has been disabled, AQAP has merely created a new
account with the same name appended with 1, 2, 3, 4,
respectively. There is not much mystery as to which Twitter
account AQAP will register next, unless you have trouble
counting to five.
    Nonetheless, Twitter is not the only offender here, and this
leads to another aspect of jihadi social media that surfaced as
a result of Paris: the Internet video that featured Amedy
Coulibaly claiming responsibility for the attacks in the name
of ISIS.
In the video, Coulibaly condemned recent Western air
strikes on ISIS and threatened, ``If you attack the Caliphate,
if you attack the Islamic State, we will attack you.''
Links to this video were first posted on ISIS' main online
chat forum, alplatformmedia.com and, naturally, the question
that follows from this analysis is: How is ISIS able to operate
its own official .com social media platform on the Internet in
order to disseminate its media?
And the answer to that question is another billion-dollar
San Francisco-based company called CloudFlare, which aims to
shield Web sites from being targeted by spammers, cyber
criminals and denial of service attacks.
CloudFlare in essence serves as a gatekeeper to control the
flow of unwanted visitors to a given site. It has advanced
detection features that thwart attempts by automated robots to
scrape data from and monitor these forums.
In fact, two of ISIS' top three online chat forums,
including alplatformmedia.com, are currently guarded by
CloudFlare.
Without such protection, these sites would almost certainly
succumb to the same relentless online attacks that have
completely collapsed several major jihadi web forums in recent
years.
In 2013, after CloudFlare was accused of providing
protection to terrorist Web sites, the company CEO insisted
that,
``It would not be right for us to monitor the content
that flows through our network and make determinations
on what is and what is not politically appropriate.
Frankly, that would be creepy.''
He also asserted,
``A Web site is speech. It is not a bomb. There is no
imminent danger it creates and no provider has an
affirmative obligation to monitor and make
determinations about the theoretically harmful nature
of speech a site may contain.''
It is extremely difficult to reconcile the logical paradox
that it is currently illegal under U.S. law to give pro bono
assistance to a terrorist group in order to convince them to
adopt politics instead of violence but it is perfectly legal
for CloudFlare to commercially profit from a terrorist group by
assisting them to disseminate propaganda which encourages mass
murder.
In fact, CloudFlare's CEO has been adamant that,
``CloudFlare abides by all applicable laws in the countries in
which we operate and we firmly support the due process of
law.''
The multi-billion-dollar U.S. companies who provide social
media services to ISIS and al-Qaeda are well aware that, the
way American law is presently structured, it is almost
impossible for them to ever be held responsible for the mayhem
that their
paying users might cause.
The only real incentive they have to address this problem
is when it becomes so glaring, as it was in the case of James
Foley, that they are briefly forced to take action to save
public face.
Permitting U.S. commercial interests to simply ignore vital
national security concerns and earn profits from consciously
providing high-tech services to terrorist organizations is not
an acceptable legal framework in the 21st century.
Thank you very much.
[The prepared statement of Mr. Kohlmann follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
----------
Mr. Poe. I thank the gentleman.
Now we will hear from our final witness, Ms. MacKinnon, for
your 5-minute opening statement.
STATEMENT OF MS. REBECCA MACKINNON, DIRECTOR, RANKING DIGITAL
RIGHTS, NEW AMERICA
Ms. MacKinnon. Thank you very much, Mr. Chairman, Ranking
Member Keating, members of the committee.
So how do we fight terrorism and violent extremism, which
are obvious problems as we have just been hearing, in the
Internet age while not undermining the core principles and
freedoms of democratic and open societies?
As it happens, yesterday I returned from the Philippines
where I participated in a conference of bloggers, activists and
citizen journalists from all over the world, people who believe
in freedom of expression, the open Internet and multicultural
tolerance.
I can tell you terrorists are not the only people who are
using social media powerfully and effectively. However, many
people connected to this community face serious threats of
censorship and imprisonment when they write about subjects or
advocate policy positions that their governments find
threatening.
In countries like Ethiopia, Russia, Turkey, Egypt, Morocco,
China and elsewhere, some have even been charged under broad
anti-terror laws that are habitually used as tools to keep
incumbent regimes in power.
In response to the tragic massacre in Paris, the French
Government has called for United Nations member states to work
together on an international legal framework that would place
greater responsibility on social networks and other Internet
platforms for terrorists' use of their services.
In addressing the problem of terrorists' use of social
networking platforms, I believe the United States should adhere
to the following principles.
First, multi-stakeholder policymaking. The U.S. opposes
U.N. control over Internet governance because many U.N. member
states, such as some of the ones that I just listed, advocate
policies that would make the Internet much less free and open.
Instead, the U.S. supports a multi-stakeholder approach
that includes industry, civil society and the technical
community alongside governments in setting policies and
technical standards that ensure that the Internet functions
globally.
In constructing global responses to terrorists' use of the
Internet, we need a multi-stakeholder approach for the same
reasons.
Second, any national level laws, regulations or policies
aimed at regulating or policing online activities should
undergo a human rights risk assessment process to identify
potential negative repercussions for freedom of expression,
assembly and privacy.
Governments need to be transparent and accountable with the
public about the nature and volume of requests being made to
companies. Companies need to be able to uphold core principles
of freedom of expression and privacy grounded in international
human rights standards.
Several major U.S.-based Internet companies have made
commitments to uphold these rights as members of the multi-
stakeholder Global Network Initiative.
Guidelines for implementing these commitments include
narrowly interpreting government demands to restrict content or
grant access to user data or communications, challenging
government requests that lack a clear legal basis, transparency
with users about the types of government requests received and
the extent to which the company complies, and restricting
compliance to the online domains over which the requesting
government actually has jurisdiction.
Third, liability for Internet intermediaries, including
social networks, for users' behavior must be kept limited.
Research conducted around the world by human rights experts and
legal scholars shows clear evidence that when companies are
held liable for users' speech and activity, violations of free
expression and privacy can be expected to occur as companies
preemptively and proactively seek to play it safe and remove
anything that might get them in trouble.
Limited liability for Internet companies is an important
prerequisite for keeping the Internet open and free.
Fourth, development and enforcement of companies' terms of
service and other forms of private policing must also undergo
human rights risk assessments.
Any new procedures developed by companies to eliminate
terrorist activity from their platforms must be accompanied by
engagement with key affected stakeholders, at-risk groups and
human rights advocates.
Fifth, in order to prevent abuse and maintain public
support for the measures taken, governments as well as
companies must provide effective, accessible channels for
grievance and remedy for people whose rights to free
expression, assembly and privacy have been violated.
Thank you for listening, and I look forward to your
questions.
[The prepared statement of Ms. MacKinnon follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
----------
Mr. Poe. I thank all of our panelists for being here. I
agree with you, Ms. MacKinnon. This is a very complex issue. I,
like everybody else on the dais here, am a great believer in
the First Amendment.
It is first because it is the most important, and anything
Congress does to try to make exceptions is always suspect. But
the Immigration and Nationality Act's Section 219 says that no
one can aid a foreign terrorist organization.
    So we are not talking about some individual who makes some
comments on the Internet or tweets something. The first
requirement is that it is a foreign terrorist organization that
is doing this.
It seems to me that that legislation--giving aid to a
foreign terrorist organization--was upheld in Holder v.
Humanitarian Law Project in 2010. I think that is the only case
where the Supreme Court addressed the issue of Internet, free
speech and foreign terrorist organizations.
    So we set aside all those other folks out there that are
saying things on the Internet--I would like to just address
that specific issue: a foreign terrorist organization, a member
of a foreign terrorist organization, recruiting folks to jihad,
radical jihadists, to kill other people, like Americans.
What suggestions specifically other than the one Ms.
MacKinnon has made--several that she has made--do any of the
rest of you have on that specific issue? I know that companies
vary and many are, I think, trying to cooperate and bring down
these sites on their own.
Mr. Kohlmann, would you like to weigh in on that question?
Foreign terrorist organization, member of a foreign terrorist
organization, using the Internet to recruit jihadists to kill
folks, being very specific about that question.
    Mr. Kohlmann. Sure. I think to the average person, the idea
of how you would find terrorist propaganda on Twitter, or how
to find the important parts, sounds like a gargantuan task.
    But the reality is that the companies we are talking
about already have technology that is capable of doing
this without human intervention. And how do I know that?
It is the same reason that when you go on YouTube or
Twitter you don't see child pornography. You don't see stolen
commercial videos. There is a reason for that. It is not just
happenstance.
The reason is because of the fact that the companies that
operate those social media platforms have a strict policy when
it comes to things like child pornography and stolen
copyrighted material and they have proactive means of removing
them.
The exact same way that they remove that material they can
also remove terrorist propaganda. It is just a matter of
switching the search terms, the hash values, the images that
they are looking for. The answer is that they don't have an
incentive to do that right now.
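    [For illustration: the following is a minimal sketch, in
Python, of the kind of hash-and-keyword matching Mr. Kohlmann
describes. The blocklist digests, blocked terms, and function
names are hypothetical; production systems rely on perceptual
hashing, trained classifiers and human review rather than exact
matching of this sort.]

    # Minimal illustrative sketch (hypothetical data): exact-hash and
    # keyword matching against small blocklists.
    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known prohibited
    # media files. (The digest below is simply the SHA-256 of the
    # bytes b"test", so the demo at the bottom matches it.)
    KNOWN_BAD_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    # Hypothetical list of search terms associated with prohibited content.
    BLOCKED_TERMS = ["example banned phrase", "another banned slogan"]

    def media_is_known_bad(file_bytes):
        """Return True if the uploaded file's digest matches a known-bad hash."""
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

    def matching_blocked_terms(post_text):
        """Return the blocked terms (if any) found in a post, case-insensitively."""
        lowered = post_text.lower()
        return [term for term in BLOCKED_TERMS if term in lowered]

    if __name__ == "__main__":
        sample_post = "An ordinary message containing an EXAMPLE BANNED PHRASE."
        print(matching_blocked_terms(sample_post))  # ['example banned phrase']
        print(media_is_known_bad(b"test"))          # True

    [In this sketch, detecting already-known material is a lookup
rather than a judgment call, which is the point Mr. Kohlmann
makes about reusing existing filtering infrastructure.]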
Mr. Poe. And what should that incentive be?
Mr. Kohlmann. Well, look. Right now there is no legal
remedy for anyone in the event that these companies are hosting
a terrorist Web site.
I mean, Twitter has never been sued and it has never been
held criminally liable or civilly liable by anyone. Why? The
answer is because of the way Internet hosting provider law is
written right now: If an Internet hosting provider does not
have active knowledge of what is going on, it is not really
responsible.
And look, I don't want to crack down on the freedom of
speech and I don't want to make Internet companies responsible
for everything that their users do, because there are some
things their users do that we will never really be able to know
about.
But there is a certain level of basic responsibility that
companies like Twitter and CloudFlare are failing to meet. We
are not asking that they find every single terrorist Web site
or they shut down every single terrorist video, just to make a
best effort. And anyone who says that the effort that is being
made right now is a best effort has no idea what they are
talking about.
Mr. Poe. Okay. I have a question for you, Ambassador. Once
again, I am talking specifically not about terrorists. I am
talking about members of a foreign terrorist organization,
which the law specifically addresses currently.
Ambassador Wallace, the FBI follows these chats and they
don't seem to encourage the bringing down of some of this
Internet material because they want to follow the bad guys all
over the world, what they are saying, who they are, et cetera.
What is your reaction to that?
Mr. Wallace. I think it is very clear that the intelligence
value of having everything open and accessible is incredibly
overstated. It is very much like, with due respect, the
demagoguery suggesting that somehow we are all talking about
impairing First Amendment rights.
All of us support the First Amendment here but this isn't
free speech. This is hate speech, and I think that, having
previously served in our Government and having been a consumer
of our intelligence data, we have so many good tools that allow
us to track terrorists' activity that we don't need to solely
rely on the open forums.
The value of taking down these recruiters, these
propagandizers, far exceeds the intelligence value that we
would get from fully tracking all the individual users of
social media.
So I think it is very clear. Maybe at one point when there
were only a few abusers a long time ago there might have been
intelligence value. But right now, the Internet is awash with
those that would propagandize, recruit and incite terror. We
have to take these down, and as J.M. said, it matters. It has
an effect.
Mr. Poe. The Chair will yield to the ranking member 5
minutes for his questions of the panel.
Mr. Keating. Thank you, Mr. Chairman.
    One area--and I would initially put it to Mr. Berger
because he alluded to the metrics that were used--but in
your analysis, and I will throw it open to the other witnesses
as well, part of the difficulty will be--you know, the chairman
cited one specific example but as you go along it becomes a
little more difficult.
What material, you know, and to what extent when you were
looking at your metrics did you draw the line in some of these
postings to have them fit into your analysis? You had to draw a
line somewhere if you had metrics.
Can you give us some examples of what, in your analysis,
was on one side of the line and what was on the other?
Mr. Berger. So for this particular paper what we wanted to
do was----
Mr. Poe. Would you speak up a little bit, please?
Mr. Berger. Sure. I don't know if--okay.
Mr. Poe. I am just a little deaf so talk louder, Mr.
Berger.
Mr. Berger. For this particular paper what we did was we
wanted to identify people who were specifically ISIS supporters
and not supporters of other jihadist groups.
So what we developed was a metric to sort the 50,000
accounts we had really robust information on and we evaluated
them based on whether they appeared to be interested in just
ISIS and whether they were promoting ISIS or whether they were
more broadly interested in following jihadist activity.
    So in this case, we got very, very specific. What I will
say about the intelligence question--and the metrics in this
kind of material are relevant to that--is that it is possible
to sift out the noise here.
So we did a demographic study that we will publish in
detail on 20,000 ISIS supporters. But within that group it is
eminently possible to zero in on who the media people are, on
who the foreign fighters are, who is in the country, who is not
in the country.
    You know, the issue that you run into with this is that you
can't do it 100 percent. So we created a sample group for our
demographics of 20,000 accounts that is 95 percent ISIS
supporters.
So if you are going to approach this problem legislatively
or encourage companies to take a more aggressive role, one of
the things you have to do is figure out first where you are
going to draw the line, whether it is going to be a member of
the organization. There aren't 20,000 ISIS members on Twitter.
There are 20,000 ISIS supporters that we can point to.
    So how much involvement do they have to have, and how are we
going to determine that without going in with a search warrant
and really getting, you know, very invasive about how we are
going to get that information out of the company?
Mr. Keating. So you did it based on, you know, people that
you identified through your analysis as ISIS. Can I just be a
little broader and more thematic in this?
Can you give me any examples just off the top of your head
where it is clear, you know, where you are on one side of the
line where it is a difficult choice, and the other side of the
line when it isn't? Because those are the kind of decisions----
Mr. Berger. Sure.
    Mr. Keating [continuing]. We might have to make, and I
would ask anyone if they wanted to venture in. Ms. MacKinnon,
did you get a chance? Where would you say--can you give an
example where it is clearly an issue where action should be
taken and it is one where even though it might be a close call
it is not?
Ms. MacKinnon. I am not a counter terrorism expert so I am
not going to go outside of my field of expertise. But I,
certainly, can say that the question is: Who is going to make
the determination where the line is drawn, right? Is it the
company? Is it the government? Is it someone else? Is it an
outside expert?
Mr. Keating. And do they use a common----
Ms. MacKinnon. And in order to determine what side of the
line this person falls on, is the company going to need to
conduct an investigation of that person and where they are
coming from?
    This leads to an issue: There is already a great public
backlash about the amount of information that companies are
collecting on people and the way in which it is shared with law
enforcement and national security agencies.
    And so companies, in thinking about not just their domestic
trust with users but their trust with international users,
which is the main growth area for all of these companies--are
they going to have to start building their own profiles on,
you know, users of interest in order to decide which side of
the line those users fall on?
Mr. Keating. Okay. Let me just ask the other witnesses that
we have. What could we do to establish those kind of guidelines
that would be useful from company to company? Can it be done in
a uniform way?
    Mr. Wallace. Sure, I will take a quick crack at it. Look,
the clear line to us is incitement of violence, right? I mean,
there are a lot of lawyers in the room. Incitement of violence
or of terror is clearly over the line.
Threatening to behead Fran Townsend on Twitter, I think,
shouldn't be on Twitter. I think that is very clear and
constitutes a bright line. I think we would all agree that
shouldn't be there.
Mr. Keating. But where it gets a little gray?
Mr. Wallace. Where it gets a little bit gray is saying that
you support these groups. I would say that now is the time to
change. Inspire magazine is a classic example.
This is a publication that has been providing material
support for al-Qaeda for a long time. We have been tolerating
it under the right of free expression.
There is an excellent op-ed in the New York Times I think 2
days ago that said, ``No more al-Qaeda magazines.'' I think now
we can say that as it pertains to terrorist organizations, we
have taken a decision that promoting these groups is a
violation of law.
We should not tolerate hate speech that supports these
entities and we shouldn't allow the Internet versions of
Inspire magazine.
Mr. Keating. All right. I will just have this one comment,
Mr. Chairman, and yield back. The answers were basically group-
centered, and when it comes to that, we have to move forward
somehow and grasp the content--maybe we will deal with that in
a second round.
I yield back, Mr. Chair.
Mr. Poe. The Chair will recognize the gentleman from South
Carolina, Mr. Wilson, for 5 minutes.
Mr. Wilson. Thank you, Mr. Chairman.
I thank all of you for being here today and I want to thank
you, Ambassador, for pointing out the circumstances of Whac-A-
Mole because it seems like that is where we are. Then you
pointed out that we can be successful, and have been, in
blocking child pornography, drug sales and human trafficking.
And, Mr. Kohlmann, thank you for pointing out about stolen
copyrighted material. There is hope, and for the American
people we need this because, indeed, as Ms. MacKinnon has
pointed out, the First Amendment rights that we so respect
certainly do not include promoting mass murder.
And I just sincerely hope that with the good minds who are
here that, indeed, positive programs can be developed. In fact,
Ambassador, could you tell us about the Think Again campaign
and has there been success or limitations based on that
particular program by the State Department?
Mr. Wallace. You know, there are various tools in the
toolshed. One of them is the counter narrative argument and
that has been the State Department's effort of trying to win
the war of ideas.
At the Counter Extremism Project, we take the position that
we should be pursuing all items on the menu, ordering every item
on the menu. And the counter-narrative option is important.
Obviously, the State Department has had some fumbling around
initially with the Think Again program; it has had some
difficulty. Our focus right now is there are many tens of
thousands of these actors on the Internet.
I think if we focused on the seed accounts, those that are
really driving this conversation, and work cooperatively with
the online platforms and systematically took them down, it
would provide opportunities for the State Department and others
to engage in legitimate counter narrative conversations because
they would have the advantage of not having the jihadis online.
So I think this is something that we need to do
collectively and collaboratively.
Mr. Wilson. And, to me, it is so important that we counter
the brainwashing messages that are utterly bizarre. A couple
years ago I was in Pakistan and I was reading a newspaper that
was very vibrant and seemed very positive and very open minded,
and then I read an op-ed and it was accusing the United States
of intentionally targeting mosques and making all kinds of
bizarre accusations that had no basis at all in reality.
And then I looked to see who the author of the article was:
Fidel Castro. How would he know this? It was an utter
fabrication. And so whoever would like to answer, how are our
governments and civil service organizations using social media
platforms to counter terrorist messaging and propaganda?
Mr. Kohlmann. I would just say this. I would say that it is
a great thing to counter terrorist propaganda. I would say that
thus far the efforts of the State Department and social media
to do this have not been very successful, and I can tell you
that from directly studying them.
Most of the time when State Department social media
representatives get involved on jihadi forums or any forums
that have people from the Middle East on them they have to
identify themselves, first of all, as being State Department
representatives, and that kind of ends the discussion right
there because the rest of the people then start spouting off
about--why is America sticking its nose in our business, and
why are there spies observing our conversations and what not.
So that program by and large, in my opinion, is a complete
failure. The most successful single thing we can do to counter
their ideology is show where the rubber meets the road. And
what do I mean by that?
    Right now, ISIS and al-Qaeda, in particular AQAP, are
locked in this test of wills where they are putting
out nasty, nasty stuff about each other on the Internet in
English and Arabic and all sorts of languages.
ISIS just put out a whole magazine in which they accused
al-Qaeda and the Taliban of being deviant morons. Now, that is
what needs to go out there. That is what we need to be
rebroadcasting: the fact that these guys each think the others
are a bunch of clowns.
There is no honor in this. There is no courage or valor.
They each think the others are idiots, and if you put that out
there and you show that these guys are really amateurs, they
are clowns, that most of the people that are involved in this
don't even believe in the ideology, that is where you really
crack the seal.
That is where you start breaking the hold that these folks
have in social media. You have to show that they are full of
it, and they are, and the only way you can do that is by
showing their own videos in which they are massacring people,
massacring Muslims.
There is no explanation for that anywhere in their
propaganda. You have to show that. That is what weakens them.
Mr. Wilson. Well, again, thank you, and--to show the truth.
Thank you so much and, indeed, how sad it is that the chief
victims of what is going on are fellow Muslims first. We seem
to be second. Thank you.
Mr. Poe. The Chair recognizes the gentleman from
California, Mr. Sherman, for 5 minutes.
Mr. Sherman. I want to focus first on getting our message
out. The Internet as a tool favors the side that is trying to
get information out, and it imposes grave legal questions,
technological questions, and just Whac-A-Mole difficulties on
somebody who is trying to keep information from getting out. So
if we can get our message out to defeat their message, the
technology is with us.
I want to bring to the attention of this subcommittee
something I have mentioned, I think, in the full committee, and
that is that the State Department refuses to hire a single
Islamic expert--not a single person who is really qualified to
quote Hadith and Koranic verses. Not one.
And so we are in a circumstance where we think the best
argument to use on those who are close to embracing Islamic
extremism is to say they kill children, isn't that obviously
bad?
Well, in the world of Islamic extremists, maybe that is not
one of the top 10 sins. If we had some understanding of basic
Islam, and then of extremist Islam, from people who have not
just passing knowledge but who have memorized the Koran, then
we could do a much better job.
But that would mean taking State Department jobs away, or
at least one, from people with fancy degrees from U.S. and
Western European universities, and it has been completely
rejected by the State Department, which thinks it is going to
make arguments conceived in our own minds to people of a
completely different mindset.
So, I mean, these are folks who barely know enough not to
hold a get-together with ham sandwiches and beer to discuss
what Islam does not allow, okay?
Mr. Kohlmann, do they have the technology not only to
deactivate a particular user but to deactivate that IP address,
that computer, so that they can't just log in from that
particular site and give a different name?
Mr. Kohlmann. One hundred percent, and----
Mr. Sherman. Do they use it?
Mr. Kohlmann. No, and I--that is----
Mr. Sherman. Wait a minute. So you go online and you put up
something so bad that Twitter actually does take you down.
Mr. Kohlmann. They don't ban the IP, no.
Mr. Sherman. You eat lunch, you go back on, you use the
same computer to put up similar material, but you identify
yourself, you know, with a different name, and they leave you
up?
Mr. Kohlmann. There is a jihadist that just commented the
other day. He actually tweeted at Twitter and said why don't
you just stop this pantomime and stop doing this whole thing
where you shut down our accounts occasionally; it just takes us
2 minutes to create a new account when you shut one down.
They----
Mr. Sherman. And they can do it from the same computer?
Okay.
Mr. Kohlmann. Yes. Twitter doesn't look at these kinds of
things because, again, they don't have any incentive to.
Mr. Sherman. Well, that raises the next issue, and that is
how we put the right kind of pressure on these organizations.
At a minimum, this subcommittee ought to be involved in naming
and shaming.
But then you go beyond that to perhaps changing our tax
laws, which doesn't raise some of the same First Amendment
arguments, or otherwise penalizing those that carry the message
at least when the author is an identified foreign terrorist
organization, because that doesn't require delving into content
and parsing words.
Even if it is just weather reports from Mosul, if they are
brought to you by ISIS, they shouldn't be on Twitter. Just to
give you an illustration of how difficult it is to get our law
enforcement authorities to take seriously anything that is a
few steps away from the dead body, something that is in the
realm of finance and propaganda, I brought to the attention of
Eric Holder himself a video showing Americans in Orange County
raising money for Hamas.
They still haven't even lost their tax exemption, so we are
subsidizing it, and the Americans who were on the flotilla that
took building materials to Gaza and turned them over to Hamas
have not received even a letter of inquiry.
So we live in this world where, yes, if we see you with a
gun or a bomb, we know you are a threat, but if you violate our
clearest laws in a white-collar way, we don't want to do
anything.
So I realize it is going to be tougher to force, by rule of
law, the taking down of certain messages, because where do you
draw the line between those who advocate for ISIS and those who
say, well, ISIS isn't quite as bad as Brad Sherman says they
are?
But we can certainly take down anything that claims,
whether it is true or not, to be posted by a foreign terrorist
organization. Ms. MacKinnon, you haven't commented. You have
been an advocate for privacy here. Why not just take it down if
it says brought to you by any organization on the U.S. foreign
terrorist organization list?
Ms. MacKinnon. Well, I think at root here we have a trust
problem that is going three ways. There has been sort of a
history over the last couple of years, particularly in light of
the Snowden revelations, of Internet companies feeling that
they need to restore trust with their users in terms of what
kinds of information they are handing over to the government
and what kinds of requests they are responding to, and so there
is an incentive on the part of the companies not to comply
further.
Mr. Sherman. My time has expired. But if these rich
companies making a fortune can't lose a few percentage points
on their profit to help us in the war on terrorism, there is
something the matter with their souls, and I yield back.
Mr. Poe. The Chair will recognize the gentleman from Texas,
Mr. Castro, for 5 minutes.
Mr. Castro. Thank you, Chairman Poe. Thank you to each of
the panelists who are here to testify before us. We appreciate
you being here and your sharing your wisdom.
You know, I think, like most Americans, after there is an
attack like the one in Paris, for example, or the Boston
bombing, and we see people take credit for that on Twitter--one
of the social media sites--you ask yourself, you know, why the
hell do these people have a Twitter account or a Facebook
account. I think that is what the average American thinks.
So I certainly support asking Twitter to be cooperative in
developing protocols to make sure that we root some of this
stuff out, as you have suggested Facebook and others have done.
And so I have a few questions, though.
Have they done that for any nation? Are there different
rules in the United States versus Europe, for example, or
somewhere else?
Mr. Kohlmann. As far as I am aware, there are no different
rules in terms of terrorist organizations. It really seems--at
least if we take the example of Twitter--that Twitter,
generally speaking, only takes action when there is a public
embarrassment, when there is a public spectacle. So when the
James Foley video came out, all of a sudden you see public
comments from Jack Dorsey.
You see Twitter all of a sudden rashly knocking out a whole
bunch of accounts, and then all of a sudden silence for months.
Then, all of a sudden, there will be a new video that will make
it to a front page headline on CNN or MSNBC, and then once
again Twitter will go on a rampage for a week. But, again, that
is just for----
Mr. Castro. Let me ask Ms. MacKinnon and anyone can chime
in.
Ms. MacKinnon. Sure. Yes. A lot of these companies--
Twitter, Facebook and Google, in particular, that I have some
familiarity with--generally have policies around the world
where they will, in countries where they have operations,
respond to lawful requests--so requests that are made in
accordance with local law officially, you know, in writing.
Mr. Castro. Right.
Ms. MacKinnon. But if those requests do not have legal
basis in that jurisdiction, they will not comply. Then, of
course, they have terms of service that restrict speech that
may or may not be legal in a given place.
Mr. Castro. Well, I guess, and I think this is a tough
question because the United States and Americans obviously
value the First Amendment a lot, and you have to start making a
distinction about when something crosses over from speech and
gets closer to expression and action.
For example, I know that somebody on the panel made the
comment that this is hate speech and I would agree that a lot
of it is. But there is a lot of hate speech on the Internet.
And so, for example, how do you make the distinction
between Islamic terrorism and domestic terrorism? When there
were thousands of children coming across the U.S.-Mexico
border, turning themselves over to the Border Patrol, there
were militias organizing on Twitter and Facebook and all the
social media sites to go down there with arms, with weapons,
and a few of them had confrontations with law enforcement.
So how do you draw that distinction? Or are we just going
to say as Americans we are going to do it for Islamic terrorism
but we are not going to draw a line for domestic terrorism?
I think those are some of the tough questions that we have
got to answer among ourselves. And, like I said, I support
movement and action on this issue. I think it is prudent. But
there are some very deep and very tough questions that we need
to answer.
Mr. Berger. I just wanted to say there are some precedents
for this. I mean, so, for instance, France has a law against
anti-Semitic speech and Twitter was complying with that law to
provide information on users.
You know, the other thing that I think is not necessarily
informing the conversation we are having here right now is that
Facebook, YouTube and Twitter do cooperate with law enforcement
requests to some extent and they do take accounts down based on
government requests, to some extent.
One reason we don't know about that is because a lot of
that happens under national security letters and other forms of
requests that they are not allowed to disclose, and one thing
that would help us understand this better is if they were
allowed to have a little more transparency about----
Mr. Castro. Sure. Maybe let Ambassador Wallace also.
Mr. Wallace. Good to see you, Congressman.
Mr. Castro. Yes.
Mr. Wallace. Look, I fully agree. But I don't think that we
need to reinvent the definition of hate speech in this hearing.
There has been an entire body of constitutional law that has
developed around hate speech and that has been pretty clear.
So I agree with you, sir, that hate speech is hate speech.
It should come down and we should take action on hate speech.
It shouldn't be allowed.
But I think we are looking for a bright line, Mr. Keating.
You know, I think the distinction from the well-developed law
on hate speech is to take down those that are designated
terrorist organizations, those that provide material support,
whether it is ideological or otherwise. We have said that those
actors are doing things that are hateful, for lack of a
better----
Mr. Castro. Designated by the United States Government?
Mr. Wallace. Correct. Correct. And I think that it should
be without doubt that if it is an AQAP supporter or an ISIS
supporter or Inspire magazine, they should come down now. But I
fully agree with you, Congressman. You know, hate speech is
hate speech.
Mr. Castro. Can I ask one more question?
Mr. Poe. Sure.
Mr. Castro. But would you put the same restrictions on an
organization that is going to recruit another Timothy McVeigh
or Terry Nichols?
Mr. Wallace. Yes.
Mr. Castro. Well, but that is not part of this
conversation, right?
Mr. Wallace. Well----
Mr. Castro. So you start getting into a broader--and I
agree. I just think you start getting into a broader
conversation of moving it beyond Islamic terrorism into
domestic terrorism also.
Mr. Wallace. Right. I mean, Congressman, you and I have
spent much time together. I think everyone agrees on the nature
of bad actors like Timothy McVeigh.
But right now, we have to be honest with ourselves that the
grave national security concern, the threat to global security,
is these cyber jihadis who are propagandizing.
I certainly don't want to suggest in any way that we should
allow the next Timothy McVeigh to stand, or somebody else who
would brutally seek to harm lawful or unlawful immigrants.
We shouldn't. But, obviously, the focus right now has been
because of--there are so many examples. So I don't mean to
diminish----
Mr. Castro. Sure. No, no, no.
Mr. Wallace [continuing]. Those examples in any way, sir,
and I fully agree with you, of course.
Mr. Castro. Yes. Sure. Thank you.
Mr. Poe. The gentleman yields back. We are in the middle of
votes. One last comment, then I will yield to the ranking
member for a final comment as well.
The law makes a distinction between a foreign terrorist
organization and other organizations using the Internet,
including domestic terrorist organizations. My understanding is
that you cannot provide any assistance to a foreign terrorist
organization, even helpful assistance.
Like in Holder v. Humanitarian Law Project--they weren't
advocating terrorism. They were advocating peace. But the
Supreme Court said you cannot assist a foreign terrorist
organization, and it is a violation of Section 219 of the law
whether it is peace or advocating jihadist movements, and I
think Congress has an obligation to look into this whole matter
and try to see if we need to get involved.
As Mr. Berger pointed out, some of these organizations--
Google, for example--are doing what they can when asked to or
on their own to take down some of these sites. Twitter, not so
much.
But I appreciate all four of you being here and the
comments, I think, by the panelists and by the members were
excellent. And I will yield the last comment--give you the last
word, something I never do.
Mr. Keating. Never done, and I appreciate that. I am sure
it is just because it is my first hearing.
Mr. Poe. It is.
Mr. Keating. I just want to thank--this has been an
important hearing, I think, and a frustrating one because it is
sort of like trying to grasp a watermelon seed. Once you think
you have it, it slips through your fingers again.
But it is important to begin this dialogue, and there are
some areas, I have learned today, where we can be helpful,
where maybe we can limit action to specific, you know, groups
or individuals and not get involved in some of the other
issues.
But even that becomes complex, because dealing with
different languages, different laws, and different countries
makes it very difficult.
But I think one thing we can agree on is that it is
important for us all, going forward, to try and get our hands
around this a little bit and to see what we can do, whether it
is hate speech or existing law.
But, you know, you have got companies. You are their guests
on those platforms--you know, guests of those companies as
well. So I think that working with the private side, having
those discussions, will really serve a great benefit, and I
hope today was a time when we could refocus on this from such a
broad perspective, as frustrating as the conversation was.
Thank you all for being here.
Mr. Poe. I thank all four of you for being here. It is very
important information you have given us. I thank the members
for participating as well, and the subcommittee is adjourned.
Thank you very much.
[Whereupon, at 3:50 p.m., the subcommittee was adjourned.]
A P P E N D I X
----------
Material Submitted for the Record
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]