[House Hearing, 115th Congress]
[From the U.S. Government Publishing Office]




 
FACEBOOK, GOOGLE AND TWITTER: EXAMINING THE CONTENT FILTERING PRACTICES 
                                  OF 
                          SOCIAL MEDIA GIANTS

=======================================================================

                                HEARING

                               before the
                       COMMITTEE ON THE JUDICIARY
                        HOUSE OF REPRESENTATIVES

                     ONE HUNDRED FIFTEENTH CONGRESS

                             SECOND SESSION

                               __________

                             JULY 17, 2018

                               __________

                           Serial No. 115-64

                               __________

         Printed for the use of the Committee on the Judiciary
         
         
         
         
         




        Available via the World Wide Web: http://www.govinfo.gov
        
        
        
        
                                _________ 

                   U.S. GOVERNMENT PUBLISHING OFFICE
                   
33-418                       WASHINGTON : 2018             
        
        
        
        
                       COMMITTEE ON THE JUDICIARY

                   BOB GOODLATTE, Virginia, Chairman
F. JAMES SENSENBRENNER, Jr.,         JERROLD NADLER, New York
    Wisconsin                        ZOE LOFGREN, California
LAMAR SMITH, Texas                   SHEILA JACKSON LEE, Texas
STEVE CHABOT, Ohio                   STEVE COHEN, Tennessee
DARRELL E. ISSA, California          HENRY C. ``HANK'' JOHNSON, Jr., 
STEVE KING, Iowa                         Georgia
LOUIE GOHMERT, Texas                 THEODORE E. DEUTCH, Florida
JIM JORDAN, Ohio                     LUIS V. GUTIERREZ, Illinois
TED POE, Texas                       KAREN BASS, California
TOM MARINO, Pennsylvania             CEDRIC L. RICHMOND, Louisiana
TREY GOWDY, South Carolina           HAKEEM S. JEFFRIES, New York
RAUL LABRADOR, Idaho                 DAVID CICILLINE, Rhode Island
BLAKE FARENTHOLD, Texas              ERIC SWALWELL, California
DOUG COLLINS, Georgia                TED LIEU, California
KEN BUCK, Colorado                   JAMIE RASKIN, Maryland
JOHN RATCLIFFE, Texas                PRAMILA JAYAPAL, Washington
MARTHA ROBY, Alabama                 BRAD SCHNEIDER, Illinois
MATT GAETZ, Florida                  VALDEZ VENITA ``VAL'' DEMINGS, 
MIKE JOHNSON, Louisiana                  Florida
ANDY BIGGS, Arizona
JOHN RUTHERFORD, Florida
KAREN HANDEL, Georgia
KEITH ROTHFUS, Pennsylvania

          Shelley Husband, Chief of Staff and General Counsel
       Perry Apelbaum, Minority Staff Director and Chief Counsel

                            C O N T E N T S

                              ----------                              


                             JULY 17, 2018

                           OPENING STATEMENTS

The Honorable Bob Goodlatte, Virginia, Chairman, Committee on the 
  Judiciary
The Honorable Jerrold Nadler, New York, Ranking Member, Committee 
  on the Judiciary

                               WITNESSES

Ms. Monika Bickert, Head of Global Policy Management, Facebook
    Oral Statement
Ms. Juniper Downs, Global Head of Public Policy and Government 
  Relations, YouTube
    Oral Statement
Mr. Nick Pickles, Senior Strategist, Public Policy, Twitter
    Oral Statement


FACEBOOK, GOOGLE AND TWITTER: EXAMINING THE CONTENT FILTERING PRACTICES 
                         OF SOCIAL MEDIA GIANTS

                              ----------                              


                         TUESDAY, JULY 17, 2018

                        House of Representatives

                       Committee on the Judiciary

                             Washington, DC

    The committee met, pursuant to call, at 10:09 a.m., in Room 
2141, Rayburn House Office Building, Hon. Bob Goodlatte 
[chairman of the committee] presiding.
    Present: Representatives Goodlatte, Smith, Chabot, Issa, 
King, Gohmert, Jordan, Poe, Marino, Labrador, Collins, 
DeSantis, Buck, Ratcliffe, Gaetz, Johnson of Louisiana, 
Rutherford, Handel, Rothfus, Nadler, Lofgren, Jackson Lee, 
Johnson of Georgia, Deutch, Bass, Jeffries, Cicilline, Lieu, 
Raskin, Jayapal, Schneider, and Demings.
    Staff Present: Shelley Husband, Staff Director; Branden 
Ritchie, Deputy Staff Director; Zach Somers, Parliamentarian 
and General Counsel; John Coleman, Counsel, Subcommittee on the 
Constitution and Civil Justice; Dan Huff, Counsel, Subcommittee 
on Regulatory Reform, Commercial and Antitrust Law; Amy Rutkin, 
Minority Chief of Staff; John Doty, Minority Senior Advisor; 
Perry Apelbaum, Minority Staff Counsel; Danielle Brown, 
Minority Deputy Chief Counsel and Parliamentarian; Aaron 
Hiller, Minority Deputy Chief Counsel, Oversight Counsel and 
the Subcommittee on the Constitution; James Park, Minority 
Chief Counsel, Subcommittee on the Constitution; Slade Bond, 
Minority Chief Counsel, Subcommittee on Regulatory Reform, 
Commercial and Antitrust Law; David Greengrass, Minority Senior 
Counsel; Arya Hariharan, Minority Counsel; Matthew Morgan, 
Minority Professional Staff Member; and Veronica Eligan, 
Minority Professional Staff Member.
    Chairman Goodlatte. Good morning. The Judiciary Committee 
will come to order. And without objection, the chair is 
authorized to declare a recess of the committee at any time.
    We welcome everyone to this morning's hearing on Facebook, 
Google, and Twitter: Examining the Content Filtering Practices 
of Social Media Giants. And I'll begin by recognizing myself 
for an opening statement.
    Today we continue to examine how social media companies 
filter content on their platforms. At our last hearing which we 
held in April, this committee heard from Members of Congress, 
social media personalities, legal experts, and a representative 
of the news media industry to better understand the concerns 
surrounding content filtering. Despite our invitations, 
Facebook, Google, and Twitter declined to send witnesses. 
Today, we finally have them here.
    Since our last hearing, we've seen numerous efforts by 
these companies to improve transparency. Conversely, we've also 
seen numerous stories in the news of content that's still being 
unfairly restricted. Just before July 4, for example, Facebook 
automatically blocked a post from a Texas newspaper that it 
claimed contained hate speech. Facebook then asked the paper to 
review the contents of its page and remove anything that does 
not comply with Facebook's policy.
    The text at issue was the Declaration of Independence. 
Think about that for a moment. If Thomas Jefferson had written 
the Declaration of Independence on Facebook, that document 
would have never seen the light of day. No one would be able to 
see his words because an algorithm automatically flagged it, or 
at least some portion of it, as hate speech. It was only after 
public outcry that Facebook noticed this issue and unblocked 
the post.
    Facebook may be embarrassed about this example. This 
committee has the opportunity today to ask, but Facebook also 
may be inclined to mitigate its responsibility, in part, 
because it was likely software, not a human being, that raised 
an objection to our founding document.
    Indeed, given the scale of Facebook and other social media 
platforms, a large portion of their content filtering is 
performed by algorithms without the need of human assistance. 
And Facebook is largely free to moderate content on its 
platform as it sees fit. This is in part because, over 20 years 
ago, Congress exempted online platforms from liability for 
harms occurring over their services.
    In 1996, the internet was just taking shape. Congress 
intended to protect it to spur its growth. It worked: the 
vibrant internet of today is no doubt, in part, a result of 
Congress' foresight.
    But the internet of today is almost nothing like the 
internet of 1996. Today we see that the most successful ideas 
have blossomed into some of the largest companies on Earth. 
These companies dominate their markets, and perhaps rightfully 
so, given the quality of their products.
    However, this raises another question: are these companies 
using their market power to push the envelope on filtering 
decisions to favor the content they prefer? Congress 
must evaluate our laws to ensure that they are achieving their 
intended purpose. The online environment is becoming more 
polarized, not less. And there are concerns that discourse is 
being squelched, not facilitated.
    Moreover, society as a whole is finding it difficult to 
define what these social media platforms are and what they do. 
For example, some would like to think of them as government 
actors, as public utilities, as advertising agencies, or as 
media publishers, each with its own set of legal implications 
and potential shortfalls. It's clear, however, that these 
platforms need to do a better job explaining how they make 
decisions to filter content and the rationale for why they do 
so.
    I look forward to the witnesses' testimony.
    It is now my pleasure to recognize the vice ranking member 
of the Judiciary Committee, the gentleman from Maryland, Mr. 
Raskin, for his opening statement.
    Mr. Raskin. Mr. Chairman, thank you very much.
    In terms of today's hearing, clearly the majority would 
prefer to focus on made-up threats, fabricated, phony, and 
inflated threats instead of the real threats that are facing 
the United States of America. So today, we resume consideration 
of the entirely imaginary narrative that social media companies 
are biased against conservatives, companies, I should add, 
whose platforms were used to spread vicious Russian propaganda 
that helped to elect Donald Trump President.
    It is ironic but entirely predictable that today's hearing, 
which is ostensibly about the silencing of minority voices, 
begins by silencing the minority on this committee, by denying 
us a witness of our own choosing, as is our committee custom 
and is our right. This decision to exclude a minority witness 
continues the majority's recent outrageous assault on the 
committee's standard rules, conventions, and practices.
    Yesterday, the ranking member sent a letter to the chairman 
to protest this decision, and I now ask unanimous consent that 
it be entered into the record.
    Mr. Issa. I object.
    Mr. Raskin. On what grounds?
    Mr. Issa. Every bit of the discussion you're having is 
outside the fair decorum of this body.
    Mr. Raskin. Mr. Chairman, who controls the time right now?
    Chairman Goodlatte. The gentleman will suspend. And the 
gentleman from California has made his objection. However, I 
would urge the gentleman to reconsider. It is the custom of 
this committee that we make documents in order, and I don't see 
any reason why we should stop doing that now. So----
    Mr. Issa. Mr. Chairman, I do not object to the letter. I do 
object to the gentleman's depiction of so much, so 
inappropriate to the decorum of the body, but I withdraw my 
objection.
    Chairman Goodlatte. I thank the gentleman and duly note 
his--without objection, the document will be made a part of the 
record.
    Mr. Raskin. Thank you, Mr. Chairman. And thank you for 
withdrawing that objection.
    This hearing was called as a followup to the one that we 
conducted in April on the content filtering practices of social 
media platforms. So to be clear, the majority intends to pick 
up where it left off with Diamond and Silk when we last met, 
and to dedicate one of the last working days before the 5-week 
August recess to this conservative fantasy, instead of 
examining a long list of real and pressing issues facing 
America, beginning with the crisis caused yesterday by 
President Trump's abject humiliation before the eyes of the 
world in his cooperation with Vladimir Putin and his choosing 
Putin's narrative over that of the U.S. Intelligence Community 
and the U.S. law enforcement community.
    The majority would have us believe this conspiracy theory 
about anti-conservative bias despite the fact that the 
Republican Party controls every elected component of our 
Federal Government: the House, the Senate, the White House, the 
Supreme Court. And they're working, of course, to try to 
control the workings of the FBI and the Department of Justice, 
as well as the majority of State legislators and governorships, 
which, in turn, have allowed them to gerrymander congressional 
and State legislative seats to cement their political control 
over our country.
    While there are legitimate questions to be raised about 
social media companies' practices in general, alleged anti-
conservative bias is simply not one of them. We continue to go 
down a road of pure fantasy.
    It might instead be helpful to know what these companies 
are doing to weed out the prevalence of false information, fake 
news spread by hostile foreign powers and by others in order to 
poison our political discourse and divide our people. It might 
also be useful to know how social media companies enforce 
community standards that target racist, bigoted, or other 
inappropriate content and whether their enforcement practices 
need more refinement and focus.
    Finally, we might take advantage of the fact that we have 
representatives from three of the major social media companies 
here to ask them what they are doing to protect their users' 
data privacy and whether we ought to consider establishing a 
single governing framework to protect user data as the European 
Union has done. And, in fact, the State of California has moved 
dramatically in that direction recently as well.
    One need only point to the revelation surrounding the 
unauthorized use of Facebook user data by the political 
research firm Cambridge Analytica to see the true dangers posed 
by a lack of such protection for user data privacy and the 
people who use social media every day.
    Mr. Chairman, there is no evidence to back the specious 
claim that social media companies intentionally target 
conservative content for disfavored treatment because of their 
political ideology. Moreover, even if they were, it would be 
their right as private companies to do so, just like Sinclair 
and FOX News have a clear ideological bent and clearly promote 
their own form of censorship on their own media platforms.
    Rather than wasting our time pursuing fairy tales, I hope 
the majority will find some time to examine the pressing 
substantive issues that now should be the focus of our hearing 
and our committee instead.
    I yield back the balance of my time.
    Chairman Goodlatte. We welcome our distinguished witnesses. 
And if you'd all please rise, we'll begin by swearing you in.
    Please raise your right hand.
    Do you and each of you solemnly swear that the testimony 
that you are about to give shall be the truth, the whole truth, 
and nothing but the truth, so help you God?
    Thank you very much.
    Let the record show that all the witnesses answered in the 
affirmative.
    Our first witness is Monika Bickert, the head of Global 
Policy Management at Facebook. Our second witness is Juniper 
Downs, the global head of Public Policy and Government 
Relations at YouTube. And our third and final witness is Nick 
Pickles, a senior strategist of public policy at Twitter.
    I look forward to hearing from all of our witnesses today.
    Your written statements will be entered into the record in 
their entirety, and we ask that you summarize your testimony in 
5 minutes. And to help you stay within that time, there's a 
timing light on the table in front of you. When the light 
switches from green to yellow, you have 1 minute to conclude 
your testimony.
    Welcome to all of you.
    And Ms. Bickert, you may begin.

TESTIMONY OF MONIKA BICKERT, HEAD OF GLOBAL POLICY MANAGEMENT, 
   FACEBOOK; JUNIPER DOWNS, GLOBAL HEAD OF PUBLIC POLICY AND 
    GOVERNMENT RELATIONS, YOUTUBE; AND NICK PICKLES, SENIOR 
               STRATEGIST, PUBLIC POLICY, TWITTER

                  STATEMENT OF MONIKA BICKERT

    Ms. Bickert. Thank you.
    Chairman Goodlatte, Ranking Member Nadler, and members of 
the committee, thank you for the opportunity to be here today. 
My name is Monika Bickert, and I am the vice president of 
Global Policy Management at Facebook. We appreciate this 
committee's hard work as it examines content filtering policies 
on social media platforms.
    At Facebook, our mission is to give people the power to 
build community and bring the world closer together. More than 
2 billion people come to our platform each month to stay 
connected with friends and family, to discover what's going on 
in the world, to build their businesses, and to share what 
matters most to them. Freedom of expression is one of our core 
values, and we believe that the Facebook community is richer 
and stronger when a broad range of viewpoints are represented 
on our platform.
    Chairman Goodlatte. Let me ask the witness to suspend for a 
moment, and let me ask those members of the audience who are 
displaying things in violation of the decorum of the committee 
to take them down.
    Thank you very much. You may proceed.
    Ms. Bickert. Thank you, Mr. Chairman.
    People share billions of pictures, stories, and videos on 
Facebook every day. Being at the forefront of such a high 
volume of sharing means that we are also at the forefront of 
new questions about how to engage in automated and manual 
content filtering to keep our communities safe and vibrant. We 
know that there have been a number of recent high-profile 
content removal incidents across the political spectrum, and we 
are working to respond to the concerns raised by the Facebook 
community, this committee, and others. Let me highlight a few 
of the things that we are doing.
    First, we recently published a new version of our community 
standards, which includes the details of how our reviewers, our 
content reviewers, apply our policies governing what is and 
what is not allowed on Facebook. We've also launched an appeals 
process to enable people to contest our content decisions. We 
believe this will also enhance the quality of our automated 
filtering.
    We have engaged former Senator Jon Kyl to look at the issue 
of potential bias against conservative voices. Laura Murphy, a 
national civil liberties and civil rights leader, is also 
getting feedback directly from civil rights groups about bias 
and related topics.
    As part of Facebook's broader efforts to ensure that time 
on our platform is well spent, we're also taking steps to 
reduce the spread of false news. False news is an issue that 
negatively impacts the quality of discourse on both right and 
left, and we are committed to reducing it. We are working to 
prioritize news that is trustworthy, informative, and locally 
relevant. We are partnering with third-party fact-checking 
organizations to limit the distribution of stories that have 
been flagged as misleading, sensational, or spammy. We 
recognize that some people may ask whether in today's world it 
is possible to have a set of fact checkers that are widely 
recognized as objective.
    While we work with the nonpartisan International Fact-
Checking Network to make sure all our partners have high 
standards of accuracy, fairness, and transparency, we know this 
is still not a perfect process. As a result, our process 
provides for appeals. And if any one of our fact checkers rates 
a story as true, we do not down rank that content.
    Similar to our community standards, we have also published 
advertising policies that outline which ads are and are not 
allowed on Facebook. We recently announced changes designed to 
prevent future abuse in elections and to help ensure that 
people on Facebook have the information they need to assess 
political and issue ads. This is significant and challenging 
engineering work.
    Our goal is transparency, and we will continue to strive to 
find a right balance that is not overinclusive or 
underinclusive. We hope that these improvements will ensure 
that Facebook remains a platform for a wide range of ideas.
    Before I close, I do want to acknowledge the video bloggers 
known as Diamond and Silk. We badly mishandled our 
communications with them. And since then, we've worked hard to 
improve our relationship. We appreciate the perspective that 
they add to our platform.
    And, finally, I want to reiterate our commitment to 
building a community that encourages free expression. We 
recognize that people have questions about our efforts, and we 
are committed to working with members of this committee, our 
users, and others to continue this dialogue.
    I appreciate the opportunity to be here today, and I look 
forward to your questions. Thank you.
    Chairman Goodlatte. Thank you, Ms. Bickert.
    Ms. Downs, welcome.

                   TESTIMONY OF JUNIPER DOWNS

    Ms. Downs. Thank you.
    Chairman Goodlatte, Vice Ranking Member Raskin, and members 
of the committee, thank you for the opportunity to appear 
before you today. My name is Juniper Downs, and I serve as the 
global policy lead for YouTube.
    The internet has been a force for creativity, learning, and 
access to information. Products like Google Search and YouTube 
have expanded economic opportunity for small businesses; given 
artists, creators, and journalists a platform to share their 
work; and enabled billions to benefit from a broader 
understanding of the world.
    Supporting the free flow of ideas is core to our mission to 
organize the world's information and make it universally 
accessible and useful. We build tools that empower users to 
access, create, and share information like never before. We 
build those products for everyone in the U.S. and around the 
world. People will value these services only so long as they 
continue to trust them to work well and provide them with the 
most relevant and useful information. We have a natural and 
long-term incentive to make sure that our products work for 
users of all viewpoints.
    We strive to make information from the web available to all 
of our users, but not all speech is protected. Once we are on 
notice of content that may violate local law, we evaluate it 
and block it for the relevant jurisdictions. For many issues, 
such as defamation or hate speech, our legal obligations may 
vary as different jurisdictions deal with these complex issues 
differently. In the case of all legal removals, we share 
information about government requests for removal in our 
transparency report.
    Where we've developed our own content policies, we enforce 
them in a politically neutral way. Giving preference to content 
of one political ideology over another would fundamentally 
conflict with our goal of providing services that work for 
everyone.
    Search aims to provide all users with useful and relevant 
results based on the text of their query. Search handles 
trillions of queries each year, and 15 percent of the queries 
we see each day we've never seen before. For a typical search 
on Google, there are thousands, even millions, of web pages 
with potentially relevant information. Building a search engine 
that can serve the most useful and relevant results for all of 
these queries is a complex challenge that requires ongoing 
research, quality testing, and investment.
    Every year, we make thousands of changes to Search to 
improve the quality of our results. In 2017, we ran over 
270,000 experiments with trained external evaluators and live 
user tests, resulting in more than 2,400 improvements to 
Search.
    We put all possible changes through rigorous user testing 
and evaluation. We work with external search quality evaluators 
from a range of backgrounds and geographies to measure the 
quality of search results on an ongoing basis. These evaluators 
assess how well a website gives searchers what they're looking 
for and rate the quality of the results. These ratings help us 
benchmark so we can meet a high bar for users of Google Search 
all around the world. We publish our search quality evaluator 
guidelines and make them publicly available through our how 
search works website. Our ranking algorithms have one purpose 
only: delivering the best possible search results for our 
users.
    YouTube's mission is to give everyone a voice and show them 
the world. It has democratized how stories and whose stories 
get told. We work to provide a place where people can listen, 
share, build community, and be successful.
    To put our work in context, it's important to recognize the 
scale of our services. More than 1.5 billion people come to 
YouTube every month. We see well over 450 hours of video 
uploaded every minute. Most of this content is positive. In 
fact, learning and educational content drives over a billion 
views on YouTube every single day.
    Many creators are able to make a living using the platform. 
YouTube channels making over six figures in revenue are up 40 
percent over the last year. And digital platforms like YouTube 
have long been a place for breaking news, exposing injustices, 
and sharing content from previously inaccessible places.
    We are dedicated to access to information and freedom of 
expression, but it's not anything goes on YouTube. We've 
developed robust community guidelines which we publish to 
provide clear guidance on the rules of the road. For example, 
we do not allow pornography, incitement to violence, or 
harassment. Keeping YouTube free from dangerous, illegal, or 
illicit content not only protects our users, it's a business 
imperative.
    Our policies are crafted to support an environment where 
creators, advertisers, and viewers alike can thrive. That 
includes certain restrictions we may apply to content, 
including disabling advertising on videos that don't comply 
with our advertiser-friendly guidelines and age restricting 
content that may not be appropriate for all audiences.
    We also provide user controls like restricted mode, an 
optional setting for users who want to filter out more mature 
content. Of course, videos that are unavailable in restricted 
mode or are not monetized through advertising remain available 
on the site.
    We don't always get it right, and sometimes our system 
makes mistakes. We hear these concerns from creators of all 
stripes. Accordingly, we have a robust process for appeal of 
both the monetization and removal decisions. We encourage our 
users to take advantage of this process if they feel we've 
acted in a way that's inconsistent with our policies.
    As I mentioned from the start, we build our products for 
all of our users from all political stripes around the globe. 
The long-term success of our business is directly related to 
our ability to earn and maintain the trust of our users. We 
will continue to pursue that trust by encouraging and acting on 
feedback on ways we can improve.
    Thank you for the opportunity to outline our efforts in 
this space. I'm happy to answer any questions you may have.
    Chairman Goodlatte. Thank you, Ms. Downs.
    Mr. Pickles, welcome.

                   TESTIMONY OF NICK PICKLES

    Mr. Pickles. Chairman Goodlatte, Vice Ranking Member 
Raskin, and distinguished members of the committee, thank you 
for the opportunity to be here today. My name is Nick Pickles. 
I'm the senior strategist on Twitter's public policy team.
    Twitter's purpose is to serve the public conversation. We 
have committed Twitter to help increase the collective health, 
openness, and civility of public conversation, and to hold 
ourselves publicly accountable towards progress. Twitter's 
health will be built and measured by how we help encourage more 
healthy debate, conversations, and critical thinking. 
Conversely, abuse, spam, and manipulation detract from it.
    We are looking to partner with outside experts to help us 
identify how we measure the health of Twitter, keep us 
accountable, to share our progress with the world, and to 
establish a way forward for the long term.
    We strive to protect expression, including views that some 
of our users may find objectionable or with which they 
vehemently disagree. We do not believe that censorship will 
solve societal challenges, nor that removing content will 
resolve disagreements. Threats of violence, abusive conduct, 
and harassment are an attack on free expression intended to 
silence the voices of others, thereby robbing Twitter of 
valuable perspectives and threatening the free expression that 
we seek to foster.
    Accordingly, the Twitter rules prohibit this and other 
types of behavior on our platform. Our rules are not based on 
ideology or particular sets of beliefs. Instead, the Twitter 
rules are based on behavior. Accounts that violate our rules 
can be subject to a range of enforcement actions, including 
temporary and, in some cases, permanent suspension. We are 
increasing the transparency of these decisions so that users 
better understand our rules and why we are taking action.
    Because promoted tweets, our ads, are presented to users 
from accounts they have not chosen to follow, Twitter applies a 
more robust set of policies that prohibit advertising on, among 
other things, adult content, potentially unsafe products, and 
offensive content.
    We see a range of groups across the political spectrum 
regularly use our advertising to promote a variety of issues 
and causes. Our enforcement processes rely both on technology 
and manual human review. Every day, we have to make tough 
calls, and we do not always get them right.
When we make mistakes, we acknowledge them, and we strive 
to learn from them. For example, our decision to hold 
Congressman Blackburn's campaign launch advertisement was a 
mistake. And when it was brought to our attention, we rectified 
it the same day. We apologized to her campaign at the time, and 
I'd like to apologize to her again today. Importantly, the 
tweet itself was never removed from Twitter.
    We've made significant progress combating abuse and 
manipulation. But our work will never be complete. We have made 
more than 30 policy and product changes since the beginning of 
last year. Additionally, we recently took steps to remove 
locked accounts from follower counts globally. This step will 
ensure that indicators that users rely on to make judgments 
about an account are as accurate as possible. This change 
applies to all accounts active on the platform regardless of 
the content they post.
    We also recently have integrated new behavioral signals 
into how tweets are presented in search results and 
conversations, targeting behavior that may not violate our 
rules but is disruptive. Significantly, this approach enables 
us to 
improve the overall health of the platform without always 
needing to remove content.
    Some critics have described these efforts as a banning of 
conservative voices. Let me make clear to the committee today 
that these claims are unfounded and false. In fact, we have 
deliberately taken this behavior-led approach as a robust 
defense against bias as it requires us to define and act upon 
bad conduct, not a specific type of speech.
    Our success as a company depends on making Twitter a safe 
place for free expression. We are proud of the work we do in 
the world. However, we will never rest on our laurels.
    As senior strategist, my role is at the intersection of 
public policy, product, and trust and safety work. This 
juncture is unique in allowing an insight into how our company 
defends free expression. And I hope to provide both insight and 
reassurance to the committee today.
    Thank you again, and I look forward to your questions.
    Chairman Goodlatte. Thank you, Mr. Pickles.
    We'll now proceed under the 5-minute rule with questions, 
and I'll begin by recognizing myself.
    All three of you represent companies that have very strong, 
in many instances, dominant market shares in the sectors that 
you provide services. So I'll ask this to each of you, and 
we'll start with you, Ms. Bickert.
    All other things being equal, which company is likelier to 
be concerned about consumers leaving in response to 
discriminatory filtering practices, one with a 75 percent 
market share or one with 10 percent?
    Ms. Bickert. I'm sorry, Mr. Chairman. I want to make sure 
that I understand your question. Would you mind repeating it?
    Chairman Goodlatte. Sure. All other things being equal, 
which company is likelier to be concerned about consumers 
leaving in response to discriminatory filtering practices, one 
with a 75 percent market share or one with 10 percent?
    Ms. Bickert. Well, I know that at Facebook we want to make 
sure that everybody feels welcome. We are a platform for broad 
ideas across the political spectrum, and we don't want anybody 
on our platform to feel discriminated against. We want to make 
sure that our policies are applied neutrally and fairly.
    Chairman Goodlatte. And you think that the lack of 
competition in your space does not in any way affect that 
position?
    Ms. Bickert. Mr. Chairman, I know that right now, the 
average user of social media in the United States uses 
approximately eight internet communication services. So,
clearly, people have a choice in the United States when they go 
online. And Facebook is one service they can use, but they can 
also use many others.
    Chairman Goodlatte. But Facebook owns more than one of 
those, right?
    Ms. Bickert. We do. We have Facebook, we have Instagram, 
and we have WhatsApp. But, again, users have a lot of choice 
here, as do advertisers.
    Chairman Goodlatte. Ms. Downs.
    Ms. Downs. Thank you, Mr. Chairman. We operate in an 
incredibly competitive environment. The tech industry is very 
dynamic. There are new players and entrants to the market all 
of the time. And so we have a natural incentive to continue 
delivering the most trustworthy, high-quality product to our 
users because we know competition is always one click away.
    Chairman Goodlatte. Mr. Pickles.
    Mr. Pickles. Thank you, Mr. Chairman. And it's always 
useful to remember that our companies themselves are quite 
different. Twitter is arguably smaller than our peers here 
today.
    But from our perspective, the primary focus is that every 
user has a paramount right to free expression. And provided 
they act within our rules, we're going to defend that right for 
them. I think the question of why we make those decisions comes 
down solely to the behavior of the user and whether they 
violated our rules.
    Chairman Goodlatte. Thank you.
    Ms. Bickert, do you think that the host of content 
producers whose speech has been filtered by Facebook would 
complain as loudly if they could simply switch to a competitor?
    Ms. Bickert. Mr. Chairman, I think that people do have a 
choice to use other services. We are keenly aware that users 
have choice, that advertisers have choice. And that's why we 
work hard to make sure that Facebook is a place where both 
users and advertisers want to be.
    Chairman Goodlatte. Thank you.
    Ordinarily, the sort of liability exemptions the social 
media platforms enjoy are only granted to regulated utilities, 
like phone companies. The rationale is that since phone 
companies do not have full discretion to determine whom to 
serve, to set the terms and conditions of their services, or to 
interfere with the content they must carry, they should not be
held culpable for harms caused by use of their services. 
Nonutilities, by contrast, are typically subject to judicial 
liability if they have not done enough to mitigate harms from 
use of their services.
    At some point, for example, hotels have a legal obligation 
to curb sex trafficking in their rooms, or clubs have a legal 
obligation to curb sale or use of illegal drugs on their dance 
floors, and pawnshops have a legal obligation to curb fencing 
of stolen goods in their stores. Property owners have a legal 
obligation to curb hazards on their grounds, and traditional 
newspapers and programming networks have a legal obligation to 
curb defamation over their outlets.
    Therefore, I'd like to ask, I'll start with you, Ms. Downs, 
why should your company be treated differently than these other 
nonutilities that I've just described?
    Ms. Downs. YouTube is a service provider that hosts user-
generated content at an unprecedented scale. And section 230 
was crafted to allow service providers like us to remove user-
uploaded content that violates content policies without 
assuming publisher liability for all the user-generated content 
on our site. Without section 230, we wouldn't be able to remove 
harmful content like child pornography without fear of 
liability.
    Chairman Goodlatte. Mr. Pickles.
    Mr. Pickles. I think it's fundamental to competition: how 
do we ensure that new entrants can come into the market and 
compete with our businesses? And section 230 is an essential 
part of that.
    And just to build on that point, in our case, for example, 
we're able to now detect 95 percent of terrorist accounts on 
Twitter ourselves using our own technology and remove them 
quickly in 75 percent of cases before they have even tweeted. 
So we're able to take those strong steps because of the legal 
framework that's in place but that also protects people 
competing with us.
    Chairman Goodlatte. My time's expired.
    The chair recognizes the ranking member, the gentleman from 
New York, Mr. Nadler, for 5 minutes.
    Mr. Nadler. Thank you, Mr. Chairman.
    Before I begin my questions, I have a motion at the desk.
    Mr. Chairman, on July 10, 2001, the Phoenix field office of 
the FBI forwarded a memorandum to headquarters to advise the 
Bureau of an effort by Osama bin Laden to send associates to 
the United States to enroll in civil aviation courses. In the 
words of former CIA director George Tenet, the system was 
blinking red.
    For a host of complicated reasons, the Bush administration 
did not follow up adequately. And 2 months later, on September 
11, in my district, the World Trade Towers fell.
    Mr. Chairman, last Friday, Special Counsel Robert Mueller 
indicted 12 Russian nationals for hacking into the Democratic 
National Committee, the Democratic Congressional Campaign 
Committee, and several State election systems. This indictment 
is a remarkable piece of forensic work. With the aid of the 
intelligence community, the special counsel can name the 
specific Russian military intelligence officers at the keyboard 
on a given day. And in the words of Director Coats, a 
distinguished former Republican Senator and President Trump's 
hand-picked Director of National Intelligence, quote, our 
digital infrastructure is literally under attack, unquote, by 
the Russian Government as we speak.
    Mr. Chairman, this latest indictment can surely be seen as 
the equivalent of the Phoenix memo about 9/11. It is a warning. 
We must heed it.
    Yesterday in Helsinki, President Trump said he does not 
believe it. He sided with Vladimir Putin over his own 
intelligence community. And he continues to undermine American 
law enforcement proclaiming on the world stage that our laws 
are meaningless, that the work of investigators has been 
worthless, and that no one should take the special counsel 
seriously.
    This is a catastrophe in the making. If we do not take any 
action, the American people may not trust the outcome of the 
next election. And instead of taking action in this committee, 
instead of refuting the President with information you and I 
have both read, Mr. Chairman, we spent 6 more hours questioning 
Lisa Page about Cheryl Mills' laptop and Hillary Clinton's
email.
    This is a national emergency, and our silence is 
unacceptable. Our Nation is under attack. Accordingly, under 
committee rule 3(b) and House rule XI (g)(2)(A), I move that 
the committee do now go into executive session for
the purposes of discussing the evidence in our possession that 
speaks directly to the special counsel's indictment and to the 
President's apparent submission to the Russian Government.
    Mr. Cicilline. Second.
    Chairman Goodlatte. The motion is not debatable, and the 
clerk will call the roll.
    Mr. Issa. Mr. Chairman, a point of order.
    Did you recognize him for a motion or recognize him for an 
opening statement?
    Chairman Goodlatte. I recognized him to question 
witnesses. He's offered this motion. It's not
debatable. We're going to vote on it immediately.
    And the clerk will call the roll.
    Ms. Adcock. Mr. Goodlatte?
    Chairman Goodlatte. No.
    Ms. Adcock. Mr. Goodlatte votes no.
    Mr. Sensenbrenner?
    [No response.]
    Ms. Adcock. Mr. Smith?
    [No response.]
    Ms. Adcock. Mr. Chabot?
    [No response.]
    Ms. Adcock. Mr. Issa?
    Mr. Issa. No.
    Ms. Adcock. Mr. Issa votes no.
    Mr. King?
    Mr. King. No.
    Ms. Adcock. Mr. King votes no.
    Mr. Gohmert?
    Mr. Gohmert. No.
    Ms. Adcock. Mr. Gohmert votes no.
    Mr. Jordan?
    Mr. Jordan. No.
    Ms. Adcock. Mr. Jordan votes no.
    Mr. Poe?
    [No response.]
    Ms. Adcock. Mr. Marino?
    Mr. Marino. No.
    Ms. Adcock. Mr. Marino votes no.
    Mr. Gowdy?
    [No response.]
    Ms. Adcock. Mr. Labrador?
    Mr. Labrador. No.
    Ms. Adcock. Mr. Labrador votes no.
    Mr. Collins?
    [No response.]
    Ms. Adcock. Mr. DeSantis?
    [No response.]
    Ms. Adcock. Mr. Buck?
    [No response.]
    Ms. Adcock. Mr. Ratcliffe.
    [No response.]
    Ms. Adcock. Mrs. Roby?
    [No response.]
    Ms. Adcock. Mr. Gaetz?
    Mr. Gaetz. No.
    Ms. Adcock. Mr. Gaetz votes no.
    Mr. Johnson of Louisiana?
    Mr. Johnson of Louisiana. No.
    Ms. Adcock. Mr. Johnson votes no.
    Mr. Biggs?
    [No response.]
    Ms. Adcock. Mr. Rutherford?
    Mr. Rutherford. No.
    Ms. Adcock. Mr. Rutherford votes no.
    Mrs. Handel?
    Mrs. Handel. No.
    Ms. Adcock. Mrs. Handel votes no.
    Mr. Rothfus?
    Mr. Rothfus. No.
    Ms. Adcock. Mr. Rothfus votes no.
    Mr. Nadler?
    Mr. Nadler. Aye.
    Ms. Adcock. Mr. Nadler votes aye.
    Ms. Lofgren?
    Ms. Lofgren. Aye.
    Ms. Adcock. Ms. Lofgren votes aye.
    Ms. Jackson Lee?
    [No response.]
    Ms. Adcock. Mr. Cohen?
    [No response.]
    Ms. Adcock. Mr. Johnson of Georgia?
    Mr. Johnson of Georgia. Aye.
    Ms. Adcock. Mr. Johnson votes aye.
    Mr. Deutch?
    Mr. Deutch. Aye.
    Ms. Adcock. Mr. Deutch votes aye.
    Mr. Gutierrez?
    [No response.]
    Ms. Adcock. Ms. Bass?
    Ms. Bass. Aye.
    Ms. Adcock. Ms. Bass votes aye.
    Mr. Richmond?
    [No response.]
    Ms. Adcock. Mr. Jeffries?
    [No response.]
    Ms. Adcock. Mr. Cicilline?
    Mr. Cicilline. I vote for America. Aye.
    Ms. Adcock. Mr. Cicilline votes aye.
    Mr. Swalwell?
    [No response.]
    Ms. Adcock. Mr. Lieu?
    [No response.]
    Ms. Adcock. Mr. Raskin?
    Mr. Raskin. Aye.
    Ms. Adcock. Mr. Raskin votes aye.
    Ms. Jayapal?
    Ms. Jayapal. Aye.
    Ms. Adcock. Ms. Jayapal votes aye.
    Mr. Schneider?
    Mr. Schneider. Aye.
    Ms. Adcock. Mr. Schneider votes aye.
    Mrs. Demings?
    Mrs. Demings. Aye.
    Ms. Adcock. Mrs. Demings votes aye.
    Chairman Goodlatte. Has every member voted who wishes to 
vote?
    The clerk will report.
    Mr. Issa. Mr. Chairman, while they're counting that, I have 
a question for you. Isn't--don't the rules require that each 
member receive a copy of the motion?
    I've just received it. And since it wasn't spoken about, 
I'm surprised at what it says.
    Can we see that every member has a copy of the motion 
before we close the vote?
    Chairman Goodlatte. The clerk will report.
    Ms. Adcock. Mr. Chairman, 10 members voted aye, 12 members 
voted no.
    Chairman Goodlatte. And the motion is not agreed to. But 
the gentleman's point is well taken, and the motion will be 
distributed to all the members.
    How much time is left on the gentleman from New York?
    Okay. The gentleman is recognized.
    Mr. Nadler. Thank you.
    I'll begin my questioning by saying that to the shock of 
the Nation yesterday, President Trump stood next to Vladimir 
Putin and accepted the Russian President's word over America's 
own intelligence community's assessment that the Russian 
Government attacked our democracy during the 2016 Presidential 
election.
    He said, quote, I have great confidence in my intelligence 
people, but I will tell you that President Putin was extremely 
strong and powerful in his denial today, close quote.
    He went on to call the special counsel probing into that 
attack a disaster for our country and a total witch hunt.
    I trust the assessment of the U.S. Intelligence Community, 
not that of Vladimir Putin. And the facts are clearly laid out 
in the grand jury indictment obtained by the special counsel. 
Thirteen Russian nationals associated with several Russian-
based organizations, quote, posing as U.S. persons and creating 
false U.S. personas operated social media pages in groups 
designed to attract U.S. audiences. These groups and pages 
which addressed divisive U.S. political and social issues 
falsely claimed to be controlled by U.S. activists when, in 
fact, they were controlled by defendants. The defendants also 
used the stolen identities of real U.S. persons to post on 
defendants' organization controlled social media accounts. Over 
time, these social media accounts became defendants' means to 
reach significant numbers of Americans for
purposes of interfering with the U.S. political system, 
including the Presidential election of 2016, close quote.
    That's obviously from the indictment issued a few months 
ago.
    Now, do each of you agree that the Russian Government 
exploited the social media platforms your companies provide to 
attack our democracy?
    Why don't we go left to right. Ms. Bickert first.
    Ms. Bickert. Ranking Member Nadler, as we have stated 
publicly, we did find accounts run by the Russian Internet 
Research Agency that posted content on Facebook both before 
and after the 2016 election, and we did remove those accounts 
and report on them.
    Mr. Nadler. Thank you.
    Ms. Downs.
    Ms. Downs. Thank you, Ranking Member Nadler. We take 
election interference very seriously. And as we described last 
year, we did find limited activity on our services; limited 
because of the strong security controls we had in place leading 
up to the election. But we found two accounts linked to the 
Internet Research Agency that had a total spend of less than
$5,000 on our advertising products, and 18 YouTube channels 
containing a thousand videos. We terminated all of those 
accounts pursuant to our investigation.
    Mr. Nadler. Thank you.
    Mr. Pickles.
    Mr. Pickles. Thank you, sir. Yes, we removed accounts we 
believe were linked to the Internet Research Agency. And also,
based on the findings of the U.S. Intelligence Community, took 
the decision to off-board Russia Today and all its associated 
entities from our advertising products worldwide.
    Mr. Nadler. Thank you.
    To the extent you haven't answered this question just now, 
what steps have your companies taken to prevent further attacks 
on our democracy?
    If you've already answered it, you just say that.
    Ms. Bickert. Thank you, Congressman. We are working 
actively with academics, with others in industry, with 
government officials to make sure we're doing all we can to 
protect the elections coming up here and around the world. 
We're also improving our technology to help us find bad actors 
earlier.
    Ms. Downs. We're committed to working with Congress to 
ensure the integrity of our elections. We've undertaken a wide 
range of approaches, including a suite of tools called Protect 
Your Election that we've been using in our outreach to 
campaigns. We've worked with both the RNC and the DNC to 
educate them about these tools to protect election websites 
from hacking and interference. We also have adopted new 
transparency measures around election advertising requiring 
verification of those purchasing ads and ad labeling, and we 
will publish election ads publicly.
    Mr. Pickles. Thank you. So we have also improved 
advertising transparency through ads.twitter.com. One point I 
would like to flag is the year-on-year improvement in 
technology that we have made: we now challenge 9.9 million 
accounts every week for suspicious activity, and that's a 299 
percent increase on this time last year.
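The 299 percent figure Mr. Pickles cites implies a year-earlier 
baseline of roughly a quarter of the current volume, which a 
quick calculation recovers (the current figure is his stated 
number; the baseline is derived, not stated):

```python
# A 299 percent increase means the current figure is 3.99 times
# the year-earlier figure, so the baseline follows by division.
current = 9_900_000          # accounts challenged per week, as stated
increase = 2.99              # 299 percent, expressed as a fraction
previous = current / (1 + increase)
print(f"{previous:,.0f} accounts challenged per week a year earlier")
```

That works out to roughly 2.5 million accounts challenged per 
week at this time the previous year.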
    Mr. Nadler. Thank you.
    Ms. Bickert, according to Cambridge Analytica whistleblower 
Christopher Wylie, Facebook data from 87 million Facebook 
profiles was used to develop psychographic profiles, which 
were then used by the Trump campaign to target users with
online pro-Trump advertisements.
    Cambridge Analytica reportedly acquired this data from 
researcher Aleksandr Kogan who collected the data through a 
person--personality quiz app he created for Facebook's 
platform. According to news accounts, Facebook learned that Mr. 
Kogan had passed this data to Cambridge Analytica in 2015, 
demanded that this data be deleted, and asked the parties to 
certify that it was deleted.
    At that time, did Facebook take any additional steps, 
beyond the self-certification, to confirm this data had been 
deleted? And currently, when Facebook determines that data has 
been acquired or used by a third-party app in violation of 
company policies, does your company take active steps to 
confirm that any improperly acquired user data is secured or 
destroyed?
    Chairman Goodlatte. The time of the gentleman has expired. 
The witness may answer the question.
    Ms. Bickert. Thank you, Mr. Chairman.
    Congressman, after we were notified of the potential breach 
in December of 2015, we did take steps to obtain a 
certification from Cambridge Analytica. And this is--I'll echo 
the comments of our CEO when he testified before Congress on 
this. But we are taking many steps now to make sure that we 
understand the extent of any data that may have been passed to 
Cambridge Analytica and that we are also taking steps to make 
sure that this has not happened with other apps, or that if we 
do uncover any abuse, that we disclose it to anybody who may 
have been affected.
    Mr. Nadler. Thank you very much.
    Chairman Goodlatte. The chair recognizes the gentleman from 
California, Mr. Issa, for 5 minutes.
    Mr. Issa. Thank you, Mr. Chairman.
    Ms. Downs, I'm sorry you're from YouTube. I understand the 
request was for Google, and you're a subsidiary. And you may 
find this not as on point to YouTube, but there are similar 
examples at YouTube, so--but I'm going to phrase this in a 
Google corporate fashion, if you don't mind.
    Last month, Google provided a link for the California GOP, 
the official website of the Republican Party in California, 
that, unfortunately, because of a decision made by Wikipedia 
not to discipline and control their own content, had a 
reference to Nazism for the California Republican Party.
    Now, I'm a big supporter of emerging technologies, and I 
will always defend that things can happen in emerging 
technology that are unintended and, over time, they get 
corrected, and each of your three companies and the many 
companies that your companies have acquired deal with that 
every day. But when Google was a younger company, it was a 
blue box reference company, meaning that, by definition, if I 
Google searched something, what I would end up with is a list 
of places that I could then click on and go to.
    In the case of Wikipedia, currently, Google is using 
Wikipedia, scraping the information, and essentially using it 
almost as though it were its own content, meaning you're providing not
a link to this site but you're, in fact, putting their 
information out as your information.
    Since Wikipedia is an external, fairly broad, in many 
cases, list of people, sometimes with political biases that 
will deliberately distort or do bad things to a site, and 
YouTube faces the same situation, how are we to hold you 
accountable when, in fact, instead of simply being a search 
source, you, in fact, are scraping the information? And this 
could be--obviously, we could look at how you treat restaurants 
in some cases making them your own, and so on. But 
specifically, when you absorb the content, aren't you absorbing 
the responsibility? And since, in the case of Wikipedia, 
clearly you were not scrubbing the content.
    Ms. Downs. Thank you. So knowledge panels are derived from 
a variety of sources across the web, including Wikipedia and 
other sources like the CIA----
    Mr. Issa. That's not the question, ma'am. The question is, 
aren't you absorbing the responsibility, and
shouldn't we hold you responsible at least to the level of care 
that newspapers, ever so poorly, are held to?
    Ms. Downs. So we have robust protections in place to 
protect from this type of vandalism. And when we include 
information from other sites, like Wikipedia or any other site, 
we also include a link to their site so that users can click 
through to the original website and read the information there. 
So we still are following the traditional model of search to 
link to information across the web. It's just an opportunity 
for users to get information at a glance alongside organic 
search results.
    In the case of the California Republican Party, you're 
correct that Wikipedia was vandalized. We have protections in 
place to protect our services from showing information that's 
shared across the web pursuant to that kind of vandalism. 
Unfortunately, our systems didn't catch it in time in this 
instance, but we did fix it as soon as we were on notice and 
apologized to the California Republican Party for the----
    Mr. Issa. So now for each of you, a question that 
piggybacks on the chairman's question. Your technologies are 
now, across the board, by definition, mature--not just because 
they're more than a decade old in most cases, but because of 
the quarterly speed, if you will, that goes on in San Jose and 
in other emerging technology areas. If we don't hold you 
accountable after a decade, then the reality is we never get 
past a decade. New technologies typically come in.
    So for each of your technologies, why is it today that 
this side of the dais shouldn't begin looking at holding you 
accountable for what you publish--no matter where you scrape 
it from, if you make it your own, if you adopt it, why 
shouldn't we hold you at least to the level of care that we 
hold public newspapers and other media to?
    And I'll go right down the aisle.
    Ms. Bickert. Thank you, Congressman. We feel a tremendous 
sense of accountability for how we operate our service, and it 
is in our business interest to make sure that our service is a 
safe place.
    Mr. Issa. Mine is a strict liability question. Should we 
open you up to litigation under the standards of care that 
other media, if you will, are held to?
    And if you could answer briefly, because my time is 
expired, each of you.
    Ms. Bickert. Congressman, we believe that section 230 of 
the Communications Decency Act is essential for online 
companies like those represented here today. And we also 
believe it's consistent with operating safe products that give 
consumers choice.
    Mr. Issa. Anyone else, please?
    Ms. Downs. We believe that the openness that's enabled by 
230 has brought tremendous benefits to the world. And for most 
of our products and services, we don't do the things many 
traditional publishing operations do, like author or copy edit 
content.
    Mr. Pickles. I think such an approach puts speech at 
risk, and it risks competition. Our role is to have clear
rules, to enforce those rules well, and to be more transparent 
in how we're doing that to build trust and confidence.
    Mr. Issa. Mr. Chairman, I appreciate all their comments, 
but I would note that free speech was created and supported by 
a newspaper system from our founding that lived by different 
rules. And I yield back.
    Chairman Goodlatte. I thank the gentleman.
    The chair recognizes the gentlewoman from California, Ms. 
Lofgren, for 5 minutes.
    Ms. Lofgren. Well, I think this is such an interesting 
hearing, motivated, I think, by a sense of persecution on the 
part of Republicans and conservatives that somehow they're 
being unfairly treated when they have a majority in the House, 
the Senate, and the White House. And when analysis by NewsWhip 
shows that conservative news sites have three times more user 
engagement than liberal ones do, there's been no evidence 
whatsoever that I have seen, or that the majority has been 
able to provide, that there's any bias whatsoever.
    I--you know, the idea that we would adopt SOPA somehow in 
response to this feeling of persecution is astonishing to me. 
But I'd like to get into another issue, which is really the 
business model that is used in the digital environment that I 
think has an unintended consequence.
    Whether it's content discovery or user engagement or 
targeted advertising, your algorithms target what a user wants 
to see. And so, in other words, what I see is really tailored 
to my interests. And that's really for an advertising purpose. 
But the net result is that Americans have been isolated into 
bubbles. Now, where the purpose was really to sell ads, the net 
effect is that all of us have sort of ended up in echo chambers 
with confirmation bias that has allowed the American public to 
be exploited by our enemies.
    And, you know, the Russians tried to attack our 
infrastructure. We know that now, from the indictments, the 
Russian military was involved. This isn't meddling. This is an 
attack on the United States. And our people have been made more 
vulnerable because of the isolation that is the side product of 
your advertising model.
    So I'm wondering if you have, each of you, given some 
thought on how the model might be adjusted so that individuals 
who end up in these bubbled echo chambers can be freed from 
those echo chambers and have a more generic experience so that 
Americans can begin talking to each other again instead of just 
being led down the rabbit hole of conspiracy theories related 
to one political theory or another.
    If each of you could share your thoughts on that.
    Ms. Downs. Thank you, Congresswoman. We let users know when 
information that we're providing to them has been personalized. 
We believe transparency there is very important. So, for 
example, on YouTube, with the recommended videos in Watch Next, 
we label something ``Recommended for you'' when it is based on 
things that the user has watched before and is a personalized 
recommendation. And our search----
    Ms. Lofgren. But most people don't look at that. They're 
just getting the next view because it's a preference, and so 
you can sell ads. But it isolates further and further down 
that rabbit hole.
    What else are you doing?
    Ms. Downs. So in our search and recommendations 
generally, we aim to show users information from a variety of
sources. In fact, we have research that shows that users do 
like to engage with content from a variety of sources. So we 
are very conscious of not wanting to isolate people. And 
because we have such a breadth and depth of content on our 
services, we aim to design products in a way that shows content 
from diverse sources.
    Ms. Lofgren. What about Facebook?
    Ms. Bickert. Thank you, Congresswoman. We've got a number of
initiatives that are designed to increase the breadth of 
information that people come across if they're interacting with 
news on Facebook. First, we announced back in, I believe it was 
December, that we were working with third-party fact checkers. 
If we have indications that something on Facebook, a news story 
may be false, then we are sharing beneath the news article 
related articles from around the----
    Ms. Lofgren. Let me ask a follow-up question, because 
millions of Americans were sent material by the Russian 
military. Would Facebook contact each Facebook user and say, 
you were sent this by the Russian military in an effort to 
influence you?
    Ms. Bickert. We did proactively send notice to anybody 
who saw content that was put on Facebook by Russia's IRA and 
let them know. And all of those accounts violated our policies.
The mistake we made was we didn't catch them fast enough, and 
we've improved our systems to make sure that we do.
    What we're doing going forward to combat the issue that 
you mentioned of people getting into bubbles: first, I would 
note that our research--and we're looking at studies from 
other places as well--suggests that people actually encounter 
a broader range of views when they are online than when they 
are offline.
    On Facebook, people, in general, have about 23 percent of 
their friends coming from political ideologies different from 
their own. So we know that diversity is already out
there. What we're trying to make sure we're doing is giving 
people the information to make educated choices about the news 
they want to interact with. And we're doing that through the 
related articles feature and other programs that we have put 
out since December and January.
    Ms. Lofgren. Thank you, Mr. Chairman.
    Chairman Goodlatte. The time of the gentlewoman has 
expired.
    The chair recognizes the gentleman from Iowa, Mr. King, for 
5 minutes.
    Mr. King. Thank you, Mr. Chairman. And I thank all the 
witnesses.
    I know that you can't necessarily see the entire gallery 
behind you. But I would point out that there's not one gray 
head in the entire packed gallery today. There must be a 
message in that for all of us sitting on this panel and in this 
room and for America. A lot of young people have stepped up and are
paying attention to where this goes. And I would say remember 
these days and look back about 20 years ago when section 230 
was passed with an anticipation of what the internet would grow 
into and with great care taken to make sure it had the kind of
flexibility to grow into the companies we have before us today. 
We shouldn't be surprised if we have a few problems and maybe 
some serious ones that have emerged. But on the other hand, we 
do have a lot of freedom.
    And so I'd turn first to Ms. Bickert, and I'd point out 
that it's a matter of congressional record that Mr. Jim Hoft 
of Gateway Pundit has introduced information into the record 
that, between 2016 and 2018, he saw his Facebook traffic cut 
by 54 percent. Could you render an explanation of that for him 
and for me, Ms. Bickert?
    Ms. Bickert. Thank you, Congressman. I can't speak to any 
one individual's decline in reach or popularity on Facebook. I 
can say that we do change the way that our news feed algorithm 
works. The algorithm basically--it's individualized, and it 
gives people--it sorts or ranks content for each individual 
user based on people that they follow, pages that they follow, 
groups that they belong to. So it's an inventory of content 
that they have chosen.
    And we do make changes to that over time, and we have made 
some this year that might affect whether or not people have--it 
might affect their reach in some way. But there are also other 
factors such as how appealing their content is or what sorts of 
content they're producing and who they're trying to reach that 
would also affect it.
    Mr. King. But we actually did speak to Diamond and Silk. 
But their issue--and they watched their traffic drop too. And I 
saw them repeat a tweet after you lifted the--apparently the 
algorithm that had cut down on their distribution or their 
content.
    But what you've described to me, I think, are a series of 
judgment calls that are being made. Can you be more precise on 
how an algorithm actually works?
    Let me just try this definition. A series of if-then 
formulas that are written so that--let's just say if a certain 
word shows up, then that sets up a software alarm bell that, 
perhaps, connected with another word or two or a phrase would 
cause it automatically to be kicked out. Is that a fair 
explanation of what goes on?
    Ms. Bickert. It works a little differently than that, 
Congressman. What the algorithm looks to is what is the type of 
content--it looks at things like what is the type of content 
that an individual user tends to interact with, what's the 
recency of a certain piece of content, what type of engagement 
is that content generating.
    There is no--there is no point at which an individual 
Facebook employee decides where an individual piece of content 
will go in somebody's news feed. This is based on giving users 
the content that is the most relevant to them based on their 
interactions with the----
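    [The ranking Ms. Bickert describes--sorting each user's 
inventory of content by affinity for the content type, recency, 
and engagement--can be sketched, purely for illustration, as a 
simple scoring function. Every name, signal, and weight below is 
a hypothetical stand-in chosen by the editor, not Facebook's 
actual code or parameters:]

```python
import math
import time

def rank_feed(posts, user_affinity, now=None):
    """Sort a user's inventory of posts by a hypothetical relevance score.

    The three signals mirror those described in the testimony: how much
    the user tends to interact with this type of content (affinity),
    how recent the post is, and how much engagement it is generating.
    """
    now = time.time() if now is None else now

    def score(post):
        # Affinity: how often this user interacts with this content type.
        affinity = user_affinity.get(post["type"], 0.1)
        # Recency: exponential decay; older posts score lower (illustrative constant).
        hours_old = (now - post["created_at"]) / 3600.0
        recency = math.exp(-hours_old / 24.0)
        # Engagement: log-damped so viral posts don't dominate entirely.
        engagement = math.log1p(post["likes"] + post["comments"])
        return affinity * recency * engagement

    return sorted(posts, key=score, reverse=True)
```

    [As the testimony notes, the real system involves many more 
signals and no per-post human decision; this sketch only shows 
the general shape of such per-user ranking.]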
    Mr. King. Okay. But there are still judgment calls 
involved, and you have people that are ethics experts that are 
applying a certain strategy to the algorithms. Is that a fair 
assessment?
    Ms. Bickert. We definitely do have--the algorithms are 
written by people, and we do definitely look at the----
    Mr. King. With the counsel of your ethicists----
    Ms. Bickert. We definitely make sure we are taking into 
account ethics and fairness as we work on our algorithms----
    Mr. King. Have you used the Southern Poverty Law Center as 
one of those advisory groups?
    Ms. Bickert. No, Congressman, we do talk to more than a 
hundred organizations in the course of setting our content 
policies, and that includes organizations from around the 
world. They do not have any--no organization has decisionmaking 
over our content policies----
    Mr. King. But SPLC has not been under contract or been a 
formal advisor to Facebook in any way?
    Ms. Bickert. No, Congressman, not that I'm aware. We 
have talked to SPLC along with more than a hundred other 
organizations in the course of getting input from our community 
about what we can do better.
    Mr. King. Okay. I could go further with that. But, instead, 
I think I'll just in the seconds I have left, I would ask you 
to contemplate an alternative, Ms. Downs, now--and by the way, 
I tweeted out the picture of the gallery, so you know that, Mr. 
Pickles.
    But, Ms. Downs, I think you have a sense and a concern 
about where this is going, and I'm all for freedom of speech 
and free enterprise and for competition and finding a way that 
we can have competition itself that does its own regulation so 
government doesn't have to. But if this gets further out of 
hand, it appears to me that section 230 needs to be reviewed. 
And one of the discussions that I'm hearing is, what about 
converting the large behemoth organizations that we're talking 
about here into public utilities? How do you respond to that 
particular query, Ms. Downs?
    Chairman Goodlatte. The time of the gentleman has expired. 
The witness may answer the question.
    Ms. Downs. Thank you, Chairman. As I said previously, we 
operate in a highly competitive environment. There are--the 
tech industry is incredibly dynamic. We see new entrants all 
the time. We see competitors across all of our products at 
Google, and we believe that the framework that governs our 
services is an appropriate way to continue to support 
innovation.
    Mr. King. Thank you. I yield back.
    Chairman Goodlatte. The chair recognizes the gentleman from 
Georgia, Mr. Johnson, for 5 minutes.
    Mr. Johnson of Georgia. Thank you, Mr. Chairman. Yesterday, 
the President of the United States humiliated America on the 
world stage. He lavished praise upon a dictator known to have 
interfered with our election in an effort to destabilize our 
democracy and then proceeded to criticize American law 
enforcement's investigation into the Russian misconduct.
    When asked at the now infamous press conference with 
President Putin whether or not he, quote, held Russia at all 
accountable for anything in particular, end quote, Trump said 
that he, quote, holds both countries responsible, end quote. 
President Trump has gone from acting like a dictator to 
prostrating himself before a foreign dictator on the world 
stage. This is a moment of great national peril where Americans 
must be alarmed at the danger that lies ahead for our country 
under current leadership.
    No longer can we pretend that what President Trump is doing 
is normal. Rather than this hearing bullying Facebook about how 
it treated Diamond and Silk or bullying Twitter about how it 
treated Marsha Blackburn, the powerful House Judiciary 
Committee should be holding hearings on what the Russians are 
doing now to disrupt the upcoming November elections. Instead 
of holding a salacious 11-hour marathon hearing last week about 
Peter Strzok's emails to Lisa Page, this committee should be 
holding hearings on how to protect rather than undercut the 
Mueller investigation.
    Now, moving to my questions, I want to thank you all for 
being here today. In 2016, both the Russian Government and 
independent foreign actors took advantage of advertising rules, 
account rules, and posting rules, to help sway the Presidential 
election in favor of Donald Trump. And for the past year and 9 
months, we have been paying the price. From Russian bots to 
fake news, Americans heading to the polls have had to grapple 
with the question of what is real and what is fake? I care 
about the security and sanctity of our elections, and I believe 
that we all share a responsibility in keeping misinformation 
out of our elections and minimizing foreign influence.
    Mr. Pickles, in January of 2018, Twitter disclosed that it 
had removed more than 50,000 Russia-linked accounts. Isn't that 
true?
    Mr. Pickles. Yes, sir.
    Mr. Johnson of Georgia. How many Russian-linked accounts 
have you suspended since that time?
    Mr. Pickles. I don't have that figure at hand. I'm happy to 
follow up. However, I would say our systems are catching 
behavior from accounts across the spectrum. Our systems are 
designed to defend Twitter from manipulation by any actor. And 
we see different actors attempting to use our platform. That's 
why we now catch 9.9 million accounts every week and challenge 
them because of suspicious behavior. That is up 299 percent in 
1 year.
    So I think we're doing much more to learn from 2016 and to 
put robust protections in place for the forthcoming elections.
    Mr. Johnson of Georgia. Thank you. How do you differentiate 
these fake accounts from real people? Are you using software, 
or are these human vetting decisions?
    Mr. Pickles. Thank you for that question. It's a good 
opportunity to explain our work. So, firstly, we have a lot of 
technology----
    Mr. Johnson of Georgia. Quickly.
    Mr. Pickles [continuing]. That is working across this. And, 
secondly, we do have teams of people. So we have a dedicated 
information quality team at the company, which was established 
after 2016, to focus on these issues, to understand behavior 
about actors, and then reinforce our technology. So it's a 
combination of the two.
    Mr. Johnson of Georgia. Thank you.
    Ms. Bickert, Facebook's 2016 woes with fake news have been 
well documented. Foreign actors used your platform to generate 
and cultivate untrue stories, and Russian-backed Facebook posts 
reached millions of Americans. Your company has been working to 
fight fake news since then, but misinformation still exists. 
Does Facebook believe that it has a responsibility to fact-
check its platform, and looking ahead to November, are you 
planning on doing anything different for the midterm elections 
compared with what you're doing now?
    Ms. Bickert. Yes, Congressman, we're doing a lot more, and 
I think we've gotten a lot better since the 2016 election. I'll 
point to three things quickly. One is we've gotten much better 
at removing fake accounts. The accounts that Russia's IRA had 
on Facebook around the 2016 election were inauthentic. We now 
have a mix of technical tools and human reviewers that have 
gotten much faster at identifying and removing those types of 
accounts. And before the French election, the German election, 
we removed tens of thousands of such accounts that we know to 
have been inauthentic.
    The second thing we're doing is requiring much greater 
transparency around advertising. Now if somebody runs a 
political or issue ad in the United States, you can see who 
paid for that advertisement. You can also see all the ads that 
that entity is running, even if they are not targeting you at 
all. We're requiring identity verification for anybody who is 
running those ads.
    And, finally, we are working to reduce the spread of false 
news through Facebook. We know this is a problem, and we're 
doing things like working with third party fact-checkers to 
identify when content might be false and then providing 
relevant information to users so that they can make an informed 
decision about what to trust.
    Mr. Johnson of Georgia. Thank you.
    And, Mr. Chairman, Ms. Downs, could she respond to that 
briefly?
    Chairman Goodlatte. Yes. And I will, without objection, 
extend another minute because I want to follow up with one of 
the questions that Ms. Bickert answered of yours.
    Mr. Johnson of Georgia. I'll yield to the gentleman.
    Chairman Goodlatte. The second point you made about 
disclosing who paid for a political ad. Do you also disclose 
how much they paid or the rate they paid for the ad?
    Ms. Bickert. No, Mr. Chairman, we just disclosed--I believe 
we just disclosed who paid for it. I can follow up with details 
on how we do that.
    Chairman Goodlatte. Thank you. Ms. Downs, you can answer 
the question.
    Mr. Johnson of Georgia. Thank you.
    Ms. Downs. Google has strong security protections in place 
to protect against state interference. And we have worked to 
extend those security protections to campaigns. We've trained 
over 1,000 campaign professionals on a suite of tools called 
Protect Your Election. These are digital tools designed to 
protect election websites and political campaigns from digital 
attacks.
    As I mentioned, we also have implemented more transparency 
with election advertising, as I described previously, and we're 
continuing to fight misinformation on our products, both 
through enforcement of our policies against deceptive behavior 
and through surfacing more authoritative content when people 
are looking for news or current events. We do that across both 
search and YouTube.
    Mr. Johnson of Georgia. Thank you, and I yield back.
    Chairman Goodlatte. The chair recognizes the gentleman from 
Texas, Mr. Gohmert, for 5 minutes.
    Mr. Gohmert. Thank you, Mr. Chairman.
    We appreciate you being here today. And I want to thank my 
colleagues across the aisle for their concerns about Russian 
interference with our elections because it's been going on for 
70 years. It helped Truman get elected in 1948. Eisenhower 
called the Russians on it in '56, the manipulation there. 
Khrushchev bragged in '60 that he helped throw the election to 
Kennedy. It's been going on. The Progressive Party, as they 
were called previously, and now it's been reemerged, and the 
Democratic help to Jimmy Carter. It's--or the help from 
Russians.
    So I am thrilled that we're going to get help across the 
aisle to get to the Russian input stopped. But I need to ask 
each of you: You have been asked specifically about Russian use 
of your platforms, but did you ever find any indication of use 
of your platform utilized by the Chinese, North Korea, or any 
other foreign country, intelligence, or agency of that country? 
First, Ms. Bickert.
    Ms. Bickert. I would note, Mr.--I would note, Congressman, 
that we are not in North Korea or China. In terms of whether 
we've seen attacks on our services, we do have--we are, of 
course, a big target. We do have a robust security team that 
works to----
    Mr. Gohmert. But that is not my question. It's just very 
direct question. Have you found use? You don't have to be in 
North Korea to be North Korean intelligence and use--we have 
foreign governments' intelligence agencies in this country. So 
have--it would seem to me you were each a little bit vague 
about how, oh, yes, we found hundreds or whatever. I'm asking 
specifically, were any of those other countries, besides 
Russia, that were using your platform inappropriately? It 
should be a yes or no.
    Ms. Bickert. I don't have the details. I know we definitely 
work to detect and repel attacks----
    Mr. Gohmert. I know that, but were any of them foreign 
entities other than Russia?
    Ms. Bickert. I can certainly follow up with you on that.
    Mr. Gohmert. So you don't know? You sure seemed anxious to 
answer the Democrats' questions about Russia influence, and you 
don't really know of all the people--of all of the groups that 
inappropriately used your platform, you don't know which were 
Russians and which were other foreign entities?
    Ms. Bickert. Congressman, we certainly have seen 
attacks from people other than Russians. As far as the details 
of from whom those attacks have come, I would have to have my 
team follow up with you on those.
    Mr. Gohmert. So you don't know about China? You're sure 
about Russia, but you don't even know about China?
    Ms. Bickert. I would have to have my team follow up with 
you----
    Mr. Gohmert. So you came prepared to help the Democrats 
establish about Russia, but you can't point out any other 
country. Is that right?
    Ms. Bickert. Congressman, we have put public statements----
    Mr. Gohmert. Well, let me go--you're not answering the 
question.
    Let me go to Ms. Downs. How about on Google, did you detect 
any other countries besides Russia utilizing your platform 
inappropriately?
    Ms. Downs. Our security team is trained to protect our 
services from foreign interference----
    Mr. Gohmert. Are we going to get to an answer to my 
question?
    Ms. Downs. So the team has certainly----
    Mr. Gohmert. Are we going to get to an answer of my 
question? Did you find any other countries besides Russia that 
were using your platform inappropriately? Very simple.
    Ms. Downs. The investigation that we conducted was specific 
to Russian interference in the 2016 election, but----
    Mr. Gohmert. You don't know if China did or not?
    Ms. Downs. My guess would be that our security team has----
    Mr. Gohmert. You're here to guess?
    Ms. Downs [continuing]. Seen attempts at breaching our 
security from other foreign governments as well, but that 
information is held confidentially, even internally.
    Mr. Gohmert. So you're only here to condemn the Russians. 
Thank you.
    How about you, Mr. Pickles, are you prepared to identify 
any other foreign countries or just here to help the Democrats 
blast Russia after 70 years of Russia helping Democrats?
    Mr. Pickles. Well, certainly happy to help the committee 
and yourself understand our work to defend elections.
    Mr. Gohmert. I understand that. But did you find any other 
countries besides Russia that inappropriately used Twitter?
    Mr. Pickles. So we suspend these accounts because they are 
breaking our rules.
    Mr. Gohmert. I understand that. Did you find any other 
countries or their agencies inappropriately using Twitter?
    Mr. Pickles. Well, to echo points of my colleagues, I think 
our services, people----
    Mr. Gohmert. So did you find any other countries besides 
Russia that inappropriately used your Twitter?
    Mr. Pickles. Sir, I'm happy to follow up on that specific 
question.
    Mr. Gohmert. But you did not come prepared to answer any 
questions about any other country but Russia? Is that correct?
    Mr. Pickles. So I think it was important on the election--
--
    Mr. Gohmert. You answered the question about Russia. You 
can't answer about China? Yes or no.
    Mr. Pickles. So we make these decisions based on our 
rules----
    Mr. Gohmert. You're very good at dodging, refusing to 
answer the questions. Let me just say, I think Mr. Raskin had the key 
to the solution here when he said that he didn't think they 
discriminated, but if they did, they have every bit as much 
right as FOX News and Sinclair. There's the key. They should be 
just as liable as FOX News and Sinclair. I yield back.
    Chairman Goodlatte. The time of the gentleman has expired.
    And the gentlewoman from Texas, Ms. Jackson Lee, is 
recognized for 5 minutes.
    Ms. Jackson Lee. Please forgive me, I am probably going to 
be coughing, and I apologize. But let me thank Ms. Bickert, Ms. 
Downs, and Mr. Pickles, first of all, for representing the kind 
of technological engines that have been a real asset, an anchor 
of America's genius.
    It is a responsibility of Congress to give guidance and 
regulation, as we have noted the expanse of both, including--
not both, it's three of you: Twitter, Facebook and Google. And 
I think you recognize that in your businesses that it's 
important for congressional oversight.
    Ms. Bickert, I'm just getting a yes or no. Is that your 
appreciation?
    Ms. Bickert. Yes, definitely.
    Ms. Jackson Lee. Ms. Downs.
    Ms. Downs. Yes, we're always happy to work with Congress.
    Ms. Jackson Lee. Mr. Pickles.
    Mr. Pickles. Yes, absolutely.
    Ms. Jackson Lee. And you have done so. And so I'm going to 
have a line of questioning. I know my colleague earlier, Mr. 
Raskin, mentioned a lot of issues that this committee should be 
addressing, and my time is going, but I will reiterate: We have 
not had hearings dealing with the snatching of children from 
families. And we have not had hearings dealing with the 
intrusion and the invasion of the election, in particular 
by certain countries.
    So let me just ask you this: On July the 13th, the Mueller 
investigation issued an indictment of 12 Russian intelligence 
officers, some of them military. Again, I will be asking yes or 
no questions. Ms. Bickert, did your company have any 
involvement in that indictment?
    Ms. Bickert. Congresswoman, we have cooperated with the 
investigations since we have been asked, and we've been public 
about that cooperation.
    Ms. Jackson Lee. You have cooperated, but did you have any 
direct involvement with the ultimate result of an indictment?
    Ms. Bickert. I can't speak to the indictments or how they 
were put together, but we have cooperated with investigations.
    Ms. Jackson Lee. Ms. Downs.
    Ms. Downs. Not to my knowledge.
    Ms. Jackson Lee. Mr. Pickles.
    Mr. Pickles. We have cooperated, but specifically on the 
indictment, not to my knowledge.
    Ms. Jackson Lee. So we have three people here that have 
provided the necessary information, but if we are to 
extrapolate what a prosecutor does--and we know that they do 
that independent of the information that they receive. So I 
want to pose a comment and then proceed with a series of 
questions.
    First of all, as I indicated, this committee needs to 
proceed with hearings involving the question of the Russian 
intrusion and stealing of the 2016 election. And I have come to 
a conclusion now that it was truly stolen. And dealing with 
these engines that have been effective for the United States on 
that issue seems to be a stretch and inappropriate. But I do 
think it's important that we have the ability to allow freedom 
to the extent that people are utilizing the First Amendment.
    Do you believe that the First Amendment covers your--each 
of your companies? Ms. Bickert, I'm just going down the line.
    Ms. Bickert. Well, Congresswoman, we do have community 
standards about what is acceptable and not acceptable. But 
certainly, to the extent that the First Amendment regulates the 
governments, we operate consistent with U.S. laws.
    Ms. Jackson Lee. Ms. Downs.
    Ms. Downs. As a private company, we aren't bound by the 
First Amendment, but obviously we work with the U.S. government 
on meeting any of our obligations in terms of how speech is 
regulated in the U.S.
    Mr. Pickles. I'd echo those points. Just to emphasize 
again, we have our own rules that we're proactive in enforcing 
to make sure that speech on Twitter is within those rules.
    Ms. Jackson Lee. Well, I think what you're saying, I think 
private companies are bound by the First Amendment. I think 
what you're saying is that we can't cry ``fire'' in a crowded 
theater. So you're able to regulate accordingly.
    Let me ask the question: Ms. Bickert, what are you doing to 
prevent Unite the Right, the organizer of the Charlottesville 
rally, from using the FB to plan their upcoming rally mid-
August if it is declared and is conspicuously hate speech?
    Ms. Bickert. Congresswoman, any time that we see somebody 
organizing an event for violent purposes or engaging in calls 
for violence or hate speech, we will remove it from the site.
    Ms. Jackson Lee. Ms. Downs, what are you doing as relates 
to your company's data privacy policies and methods in which 
you comply with these policies?
    Ms. Downs. We've invested considerable resources at Google 
to create one of the most sophisticated privacy programs in 
existence. Thousands of employees are dedicated across the 
company daily to ensure that we protect the privacy and 
security of our users. Our three guiding principles are 
transparency, control, and choice. We believe in being 
transparent with users, communicating in clear language, and 
there's a single destination called My Account that we have 
created where users can see all of the data we collect and 
store, and have control over revoking any permissions they have 
given previously, et cetera. It is a very well-used site, over 
1.6 billion visitors in 2016.
    Ms. Jackson Lee. Mr. Pickles, can you share any of the 
changes--you've made about 30--of your product changes, so can 
you share some of them with us?
    Mr. Pickles. Absolutely. So one example might be we rolled 
out a new policy focusing on violent extremist groups--those 
are groups who focus on encouraging violence against 
civilians--where we clarified our policy. We also rolled out a 
change last 
week, you may have seen, where we updated people's following 
numbers to make sure they're authentic and don't include locked 
accounts. And we've also rolled out for the U.S. midterms 
specific new labeling for accounts that belong to candidates in 
those elections.
    Chairman Goodlatte. The time of the gentlewoman has 
expired.
    Ms. Jackson Lee. I thank you, Mr. Chairman. My final 
sentence is just to thank them. They are international 
companies; I wanted to clarify that. But I also want to clarify 
that they do represent an economic engine that we have to 
appreciate, work with, and protect others from bad speech.
    Chairman Goodlatte. We thank the----
    Ms. Jackson Lee [continuing]. And also recognize their 
value.
    Chairman Goodlatte. The chair recognizes the gentleman from 
Pennsylvania, Mr. Marino, for 5 minutes.
    Mr. Marino. Thank you, Chairman.
    I want to thank all--over here. I want to thank all of you 
for being here. I am going to start with you, Mr. Pickles, 
because Ms. Bickert gets hit with usually the first, tell us 
what you think. I'm going to read a legal term: libel. You're 
all familiar with that. Libel is to publish in print, 
including pictures, writing, or broadcast through radio, 
television, or film, an untruth about another which will do 
harm to that person or his or her reputation by tending to 
bring that target into ridicule, hatred, scorn, or contempt of 
others. So we all understand what that is.
    Given the fact that your companies reach, I think, far more 
people than a conglomeration of the newspapers, radios, and 
televisions combined in the United States, have any of you sat 
back and considered libel, or do you think you are immune from 
it? Sir?
    Mr. Pickles. I'm happy to start, and thank you for the 
opportunity to outline what we do in this area. So, as I say, 
we have clear rules that govern what happens on Twitter. Some 
of those behaviors are deplorable, and we want to remove them 
immediately. So terrorist content is one example where we now 
detect 95 percent of the terrorist accounts that we removed----
    Mr. Marino. I understand that, sir. But how about--we, in 
Congress, we put up with it all the time. I know that we are 
public officials, the same way as people in the movies or so 
on, but do you specifically look for and address republication, 
which can be used in a defamation case? Do you look at libel 
and defamation content?
    Mr. Pickles. So we--as I say, we focus on our rules. Those 
rules govern a wide range of behavior.
    Mr. Marino. With all due respect, I've heard you focus on 
your rules about 32 times today. Do you look for libel or 
defamation in your company's opinion?
    Mr. Pickles. Sir, I think our company's opinion is 
expressed in those rules that are publicly available----
    Mr. Marino. Okay. Now you've answered my question. Thank 
you. Next.
    Ms. Downs. Thank you, Congressman. So YouTube is a platform 
for user-generated content, and we respect the law in the 
nearly 200 countries where we operate, which means once we're 
on notice of content that may violate the law, we take action 
by blocking it for the relevant jurisdictions.
    Mr. Marino. Is it specific towards defamation and/or libel?
    Ms. Downs. Including defamation removal, yes.
    Mr. Marino. Because you know the reproduction of those 
statements--have you ever been sued, that you know of, based on 
defamation and libel?
    Ms. Downs. I don't know the answer to that question.
    Mr. Marino. Okay.
    Ms. Bickert.
    Ms. Bickert. Thank you, Congressman. Similar to YouTube, we 
do have a notice and takedown approach where people can submit 
legal--notifications to us of illegal speech, and that would 
include, where appropriate, notifications of libel.
    Mr. Marino. Where do you draw the line, each of you? We 
still have some time. I'll start with you, Ms. Bickert. Where 
do you draw that line? Do you have specific rules, a policy, 
that determine in your company's opinion what is libel and what 
is defamation?
    Ms. Bickert. There's two ways this might be addressed. One 
would be if it was a violation of say our bullying or 
harassment policies.
    Mr. Marino. Because we know young people are committing 
suicide because of things that are said about them on the 
internet. But please go ahead.
    Ms. Bickert. And we take that threat extremely seriously, 
which is why we don't allow bullying, and we do consider it a 
safety-related policy.
    Mr. Marino. I appreciate that.
    Ms. Bickert. Those are our own lines. And then separately 
we sometimes will receive notifications that are legal takedown 
requests for speech that breaches the law in a specific 
jurisdiction. Our legal team evaluates those, and if 
appropriate, then we'll remove that speech.
    Mr. Marino. Ms. Downs.
    Ms. Downs. We have a very similar process in place for 
legal removals. Once we're on notice of content that has been 
deemed to violate the law, then our team evaluates it and then 
blocks it for the relevant jurisdiction. On top of that, we 
also have 
content policies for our platforms like YouTube that prevent 
and prohibit things like harassment and so on.
    So we take those policies particularly seriously when it 
comes to young people, and we have various protections in place 
to ensure that we're enforcing them robustly.
    Mr. Marino. Mr. Pickles.
    Mr. Pickles. One additional, hopefully useful, piece of 
information is we also work with a site Lumen, which is a 
project that discloses when we do take content down, subject to 
legal orders as described. And we also publish the number of 
times we do that in a transparency report.
    Mr. Marino. Do you do that through AI, artificial 
intelligence, and/or individuals reviewing? Could you all three 
quickly answer that because my time has just run out, and I 
yield back.
    Mr. Pickles. People.
    Ms. Downs. We use a mix of humans and technology to enforce 
our policy, and legal removal requests are reviewed by a 
special legal team.
    Chairman Goodlatte. The time of the gentleman has expired. 
The chair recognizes the gentlewoman from California, Ms. Bass, 
for 5 minutes.
    Ms. Bass. Thank you very much, Mr. Chair. And I also want 
to thank my colleague, Representative Deutch, for allowing me 
to go out of order. There's been an awful lot of discussion and 
questions today about social media and conservatives, but I 
wanted to ask, particularly, Ms. Bickert, about over-censorship 
of activists on the left as well, and what is Facebook doing to 
address disproportionate censorship of people of color on the 
platform. And an example is, over the last few years, multiple 
well-known black activists have had their content removed or 
accounts banned for speaking out about racial injustice.
    What is your plan to ensure that voices like this are not 
silenced on your platform?
    Ms. Bickert. Thank you, Congresswoman. It is so important 
to us that we are a platform for all of these voices. We know 
that at our scale we have billions of posts every day, and we 
review more than a million reports every day. We know that we 
sometimes will make mistakes, and sometimes those mistakes have 
affected activists.
    One of the things that we're doing is offering appeals so 
that if we get a decision wrong, people can ask us to take a 
second look, and we will. That is live now. We're continuing to 
increase the robustness of the appeals process, but that is 
something that we--something that we rolled out after talking 
to many of these groups.
    Ms. Bass. How would a person know about the appeal? So, in 
other words, if you remove content, then do you send a message 
saying you can appeal?
    Ms. Bickert. So, if somebody posts something on Facebook 
and we remove it, we send a notification, and then that gives 
them the opportunity to say: Facebook, I think you got it 
wrong.
    And then we will----
    Ms. Bass. When did you start doing this?
    Ms. Bickert. We began--well, let me be clear. We did have 
appeals for pages and groups and profiles for years. But in 
early May, late April, early May of this year, we began 
offering appeals for when somebody has a post or a photo 
removed. By the end of this year, we're hoping to have this 
also available for--if you reported something to us, and we 
have failed to remove it, and you think it should be removed. 
That's one thing we're doing.
    We're also talking to a number of groups directly, and 
that's something that we do with groups across the political 
spectrum and around the world to understand mistakes that we've 
made and how our policies affect them. And we've also hired 
Laura Murphy, a prominent civil rights attorney, and the law 
firm Relman, a civil rights law firm, to do a comprehensive 
assessment of how our policies and our practices affect civil 
rights.
    Ms. Bass. So maybe some of the activists that had their 
sites removed are not aware of that, so you know, Facebook 
might consider taking another look at that for some of the 
groups that have had their sites removed.
    Ms. Bickert. Thank you, Congresswoman.
    Ms. Bass. What is Facebook doing to prevent Russians from 
pretending to be black activists and buying ads targeting black 
users? You remember that. The ads that went out really to 
discourage African Americans from voting. And then the pretend 
sites that seemed as though they were from black activists, but 
they were not.
    Ms. Bickert. Congresswoman, there are two primary things 
that we're doing. The first is we've gotten a lot better at 
removing fake accounts like those that were--that the Russian 
IRA had on Facebook. Those were inauthentic accounts. We should 
have caught them sooner, but they always violated our policies. 
We have now gotten much better at finding those accounts.
    The second thing we're doing is requiring transparency 
around political and issue ads in the United States such that 
now, if you see an ad on Facebook about a political issue, you 
can see who has paid for that ad. You can also see any ad that 
a particular entity is running by going to their page and 
clicking on a button that will take you to their ads library, 
even if you haven't been targeted with any of the ads.
    Ms. Bass. You mentioned a woman that was working with you 
from the civil rights arena, and you announced that there was a 
civil rights review that would evaluate how the platform impacts 
people of color. Specifically, what is the audit looking at?
    Ms. Bickert. We want to make sure that the policies that we 
have are being enforced consistent with civil rights. So we 
need to understand how these policies are affecting all these 
different communities. We're hopeful that what we learn from 
this assessment and from the audit will help us become a better 
platform that will respect voices across the political 
spectrum.
    Ms. Bass. So, since you're receiving so much criticism 
today, let me just end by thanking Facebook, actually, for 
never taking down memorial pages for people who have passed 
away, even for years and years after they have passed away. I 
appreciate that.
    Chairman Goodlatte. Would the gentlewoman yield?
    Ms. Bass. Yes.
    Chairman Goodlatte. I just want to follow up with a 
question. Ms. Bickert made a comment again about the ability 
to--now requiring that a campaign has to disclose who is paying 
for the ad, which I think is a good thing. And I wanted to 
follow up--I asked earlier about whether you also disclosed the 
rates. Now, with television or radio, an opposing campaign, I 
guess the media, can probably check and find out what the rates 
are. Can you do that with Facebook, too? If the opposing 
campaign sees a lot of Facebook advertising, can they find out 
the rate and see if their rate is comparable to the rate for 
the ad that they're seeing----
    Ms. Bickert. Mr. Chairman, the way that Facebook's ads 
pricing works, through an auction model, is something that's 
fairly complex, and I'm afraid I'm probably not the best 
equipped to explain the details of that, but we can certainly 
follow up with you----
    Chairman Goodlatte. Yeah, we'll follow up, and I do--would 
like to know more about that.
    Ms. Bickert. Will do. Thank you.
    Chairman Goodlatte. Thank you very much. The chair 
recognizes the gentleman from Texas--actually, I have a 
unanimous consent request, too, and you may want to comment on 
this, if anybody allows you time. I just want to put this in 
the record. Asia Times, June 2, 2018, ``Is Facebook helping 
Vietnam suppress online dissent?''
    Without objection, that will be made a part of the record.
    The gentleman from Texas, Mr. Smith, is recognized for 5 
minutes.
    Mr. Smith. Thank you, Mr. Chairman. Americans deserve the 
facts objectively reported. They know media bias is pervasive. 
A recent Morning Consult poll found that only a quarter of 
voters now trust the media to tell them the truth, a record 
low. The media savages the President and portrays his 
administration in the worst possible light. Over 90 percent of 
his network news coverage has been negative, higher than any 
other President.
    The muting of conservative voices by social media also has 
intensified. Social media companies have repeatedly censored, 
removed, or shadow-banned conservative journalists, news 
organizations, and media outlets that do not share their 
liberal political views. Facebook's new algorithm for what 
users see on their timeline has disproportionately harmed 
conservative publishers. They are getting fewer readers, while 
their liberal counterparts haven't been impacted to the same 
degree.
    Recently, Google's employees easily convinced the company's 
management to cut ties to contracts with the military. And 
Google has long faced criticism from fact checkers over 
manipulating search results to slight conservatives. Google has 
also deleted or blocked references to Jesus, Chick-fil-A, and 
the Catholic religion. When will it stop?
    Also alarming are the guidelines being written by these 
companies to define hate speech. Facebook's newly published 
community standards, which determine what content is allowed, 
define these terms for the American people. It violates 
Facebook rules, quote, to exclude or segregate a person or 
group, end quote. So a conservative organization, for example, 
calling for illegal immigrants to be returned to their home 
country could be labeled a hate group by the platform and their 
content removed all together.
    Some platforms have allowed liberal interest groups to 
determine what information is available to the public. The 
Southern Poverty Law Center is allowed to influence platform 
guidelines and sometimes censor content that they deem hate 
speech. The SPLC has a hate map that lists over 900 
organizations. These include pro-life, religious freedom, and 
border security groups, all popular with the American people. 
And they are unfairly targeted by the SPLC.
    It's no secret the social media organizations are typically 
controlled and run by individuals who lean liberal, sometimes 
radically so. It will require a constant effort by these 
entities to neutralize this relentless bias if in fact they 
really want to do it. All media entities should give the 
American people the facts, not tell them what to think.
    Mr. Chairman, I'd like to ask all of our panelists today 
one question that is pretty direct and I think can be answered 
yes or no. And the question is this: Would you pledge publicly 
today to make every effort to neutralize bias within your 
online platforms?
    And, Ms. Bickert, we'll start with you.
    Ms. Bickert. Congressman, we're making those efforts now. 
There is no place for bias on Facebook.
    Mr. Smith. Thank you.
    Ms. Downs.
    Ms. Downs. Yes, we design products that are for everyone, 
and we enforce our policies in a politically neutral way.
    Mr. Smith. And you feel every effort should be made to try 
to neutralize the bias?
    Ms. Downs. Correct. We design our algorithms to check for 
bias.
    Mr. Smith. Mr. Pickles.
    Mr. Pickles. I think you're right to highlight that people 
have biases when they come to work, and our focus, as you say, 
should absolutely be making sure that bias is not a factor. And 
our rules are enforced impartially.
    Mr. Smith. Thank you all.
    Thank you, Mr. Chairman. I yield back.
    Chairman Goodlatte. Would the gentleman yield?
    Mr. Smith. Yes, I will be happy to advance my time to the 
chairman.
    Chairman Goodlatte. I wonder, Ms. Bickert, are you familiar 
with the story about the contention that Facebook's content 
filtering practices, which, as has been testified here earlier, 
comply with local law, is effectively censoring free speech in 
Vietnam?
    Ms. Bickert. Mr. Chairman, I can speak to how we respond to 
requests under local law. If we get a request from a government 
telling us that speech is illegal, the first thing that we do 
is see if that speech complies with our community standards. If 
it doesn't, then we will remove it. If it does comply with our 
standards but is nevertheless illegal, we will look at the 
legal requests that we've received. Our legal team will look at 
the requesting authority, the process itself, who was affected 
by the speech, any human rights implications, and then we will 
make a decision about whether or not we should restrict content 
in accordance with that local law. If we do, then we remove 
that content only in the jurisdiction where it is illegal, and 
we report on that in our government transparency report.
    I do want to emphasize that this is very different from us 
providing data to a foreign government. There is a process 
through which governments can ask us for data, like an FBI 
search warrant, let's say, and our legal team analyzes those. 
But I think you mentioned that the article was about Vietnam. 
We do not store data in Vietnam. They can present legal process 
to us. If they do, we will scrutinize that, and you can see 
from our government transparency report that those requests 
from Vietnam are very low, and we don't comply with all of 
them.
    Ms. Lofgren. Mr. Chairman, could I ask unanimous consent?
    Chairman Goodlatte. The gentleman's time has expired, but 
we'll follow up in writing with a question about that.
    And the gentlewoman is recognized.
    Ms. Lofgren. I ask unanimous consent to put into the record 
a letter sent by a bipartisan group this week on the issue in 
Vietnam to both Facebook and Google.
    Chairman Goodlatte. Without objection, that will be made a 
part of the record.
    The chair recognizes the gentleman from Florida, Mr. 
Deutch, for 5 minutes.
    Mr. Deutch. Thank you, Mr. Chairman.
    Mr. Chairman, there was an interesting article in this 
morning's Wall Street Journal entitled ``Publishing Executives 
Argue Facebook is Overly Deferential to Conservatives.'' Ms. 
Bickert, I just wanted to follow up on what you had talked 
about earlier. In particular, the review that's being led by 
former Senator Jon Kyl, along with the Heritage Foundation 
about what many articles recently point out as unsubstantiated 
claims of anticonservative bias. But the question is: You put 
this together. They're conducting this review. After the review 
started, my understanding is that the RNC Chair, Ronna McDaniel 
and Brad Parscale, the campaign manager for the President's 
reelection campaign, then doubled down on the narrative, 
complained of suppression of conservative speech. And rather 
than pointing to this review that is taking place, instead 
there were meetings immediately scheduled between the head of 
the RNC, the President's reelection campaign, and high-ranking 
officials at Facebook. Is that right?
    Ms. Bickert. I'm afraid I don't know about that. I could 
follow--have our team follow up.
    Mr. Deutch. If you could just follow up on that and get 
back to us, we'd appreciate it.
    I represent Parkland, Florida, and in this discussion of 
social media the first thing that comes to mind to me is the 
savage attacks on the student survivors of Stoneman Douglas. 
One of the most virulent strains of these attacks was that the 
students didn't survive a school shooting, that they were 
crisis actors, that they were planted by some mysterious cabal 
to finally get Congress to do something about gun violence.
    And in the weeks after the shooting, Alex Jones' YouTube 
channel posted a video that was seen by 2.3 million subscribers 
alleging that these were merely--that these were actors and not 
real students who had experienced the most horrific thing 
anybody could possibly imagine. The video violated 
YouTube's rule against bullying, and it was removed. An article 
posted to Slate.com describes this as a strike against the 
channel.
    Ms. Downs, how many strikes does a channel get?
    Ms. Downs. Typically, a channel gets three strikes, and 
then we terminate the channel.
    Mr. Deutch. So the reason I ask is, Alex Jones obviously is 
one of the conspiracy theorists whose brand is bullying. He 
waged similar attacks against the families whose 6- and 7-year-
old kids were slaughtered at Sandy Hook, and he's not the only 
one. Truthers have spread these lies claiming that Sandy Hook 
never happened at all.
    A Slate article references a study by Jonathan Albright, 
director of the Tow Center for Digital Journalism at Columbia 
who found 9,000 videos on YouTube with titles that are--and I 
quote, a mixture of shocking, vile, and promotional themes that 
include rape game jokes, shock reality, social experiments, 
celebrity pedophilia, false flag rants, and terror-related 
conspiracy theories dating back to the Oklahoma City attacks in 
1995.
    Ms. Downs, does Google think that this is a problem, and 
what is the solution that you're coming up with to address it?
    Ms. Downs. Thank you for the question. So, as you noted, 
when Alex Jones posted the video you described saying that the 
survivors of the Parkland massacre were crisis actors, that 
violated our harassment policy. We have a specific policy 
that says if you say a well-documented violent attack didn't 
happen and you use the name or image of survivors or victims of 
that attack, that is a malicious attack, and it violates our 
policy.
    In terms of conspiracy theory content generally, our goal 
is to promote authoritative content to our users. So we have 
two principles that guide the way here. That's the first one, 
as we want to provide users with authoritative, trustworthy 
and----
    Mr. Deutch. I'm sorry to cut you off. I only have a minute 
and a half, and I don't really need to hear what you're trying 
to provide. I want to know how you're dealing with all these 
conspiracy theorists on your platform.
    Ms. Downs. So the first way is by demoting low-quality 
content and promoting more authoritative content. And the 
second is by providing more transparency for users. So we're 
introducing boxes that provide factual information at the top 
of results that have shown themselves to turn up a lot of 
information that is counterfactual, such as searching for the 
Earth is flat on YouTube, where you see a lot of videos 
claiming----
    Mr. Deutch. Okay. Your response is to put a box saying, 
``Nope, the Earth is not flat''?
    Ms. Downs. Correct.
    Mr. Deutch. I have a question, Ms. Bickert, for you. You 
recently decided not to ban Infowars. Can you explain that 
decision? And do you use a strikes model like YouTube?
    Ms. Bickert. Congressman, we do use a strikes model. What 
that means is, if a page or a profile or a group is posting 
content and some of that violates our policies, we always 
remove the violating post at a certain point, and it depends--
it depends on the nature of the content that is violating our 
policies. At a certain point, we would also remove the page or 
the profile or the group at issue.
    Mr. Deutch. So the question is, how many strikes does a 
conspiracy theorist who attacks grieving parents and student 
survivors of mass shootings get? How many strikes are they 
entitled to before they can no longer post those kinds of 
horrific attacks?
    Ms. Bickert. I want to be very clear that allegations that 
survivors of a tragedy like Parkland are crisis actors, that 
violates our policy and we removed that content. And we would 
remove and continue to remove any violations from the Infowars 
page. If they posted sufficient content that it violated our 
threshold, the page would come down. That threshold varies 
depending on the severity of different types of violations.
    Mr. Deutch. Thank you. I yield back.
    Chairman Goodlatte. The chair recognizes the gentleman from 
Idaho, Mr. Labrador, for 5 minutes.
    Mr. Labrador. Thank you, Mr. Chairman.
    Ms. Downs, you mentioned in your opening statement or 
sometime in the beginning that there was only limited activity 
on your side from some of the Russian trolls and some of these 
entities. Is that correct?
    Ms. Downs. That is correct.
    Mr. Labrador. What did you mean by ``limited activity''?
    Ms. Downs. Pursuant to our investigation around the 2016 
election, we found two accounts that had a spend of less than 
$5,000 in advertising and 18 YouTube channels with just over 
1,000 videos in English that we terminated as soon as we 
identified them. Those were all linked to the Internet Research 
Agency.
    Mr. Labrador. Mr. Pickles, would you consider that limited 
activity that happened on Twitter?
    Mr. Pickles. We have 336 million users. As a proportion of 
that, yes, this was a small proportion, but the accounts we 
believe that were linked to the Internet Research Agency did 
run to several thousand. That was too many. We have taken steps 
to make sure----
    Mr. Labrador. But there were millions of users, and there 
were several thousand of these accounts.
    Mr. Pickles. Yes.
    Mr. Labrador. Ms. Bickert, what about on Facebook?
    Ms. Bickert. Congressman, we have more than 2 billion 
people using the site every month, and we had fewer than 500 
pages, groups, and accounts.
    Mr. Labrador. So what all three of you are telling us is 
that the Democrats' campaign was so weak that this limited 
activity apparently influenced the elections and caused the 
United States to actually choose the wrong person for 
President. Is that what you're telling us? That's a rhetorical 
question. You don't have to answer it. I yield the rest of my 
time to the current chairman of the committee.
    Mr. Gaetz [presiding]. I thank the gentleman for yielding.
    Mr. Pickles, is it your testimony or your viewpoint today 
that Twitter is an interactive computer service pursuant to 
section 230, sub(c)(1)?
    Mr. Pickles. I'm not a lawyer, so I don't want to speak to 
that, but I understand that, under section 230, we are 
protected by that, yes.
    Mr. Gaetz. So, if section 230 covers you, and that section 
says no provider or user of interactive computer service shall 
be treated as the publisher or speaker of any information 
provided by another, is it your contention that Twitter enjoys 
a First Amendment right under speech while at the same time 
enjoying section 230 rights?
    Mr. Pickles. Well, I think we discussed the way the First 
Amendment interacts with our companies. As private companies, 
we enforce our rules, and our rules prohibit a range of 
activities.
    Mr. Gaetz. I am not asking about your rules. I'm asking 
about whether or not you believe you have First Amendment 
rights. You either do or you do not.
    Mr. Pickles. I'd like to follow up on that. As someone 
who's not a lawyer, I think it very important----
    Mr. Gaetz. Well, you're the senior public policy official 
for Twitter before us, and you will not answer the question 
whether or not you believe your company enjoys rights under the 
First Amendment?
    Mr. Pickles. Well, I believe we do, but I would like to 
confer with my colleagues.
    Mr. Gaetz. So what I want to understand is if you say, I 
enjoy rights under the First Amendment and I'm covered by 
section 230 and section 230 itself says no provider shall be 
considered the speaker, do you see the tension that that 
creates?
    Mr. Pickles. Yes, but I also note that we have worked with 
Congress previously to identify why it's important to remove 
content that is of child sexual abuse and why it's important 
to----
    Mr. Gaetz. Well, let's explore some of those extremes then. 
I know Twitter would never do this; I'll disclaim that. But 
could Twitter remove someone from their platform because 
they're gay or because they're a woman?
    Mr. Pickles. Well, we would remove someone breaking our 
rules, and that behavior is not prohibited under our rules.
    Mr. Gaetz. So it's your contention that Twitter does not 
have the ability then to remove someone because they are gay or 
because they are a woman?
    Mr. Pickles. I would say that is not part of the context 
of whether they break our rules.
    Mr. Gaetz. Okay. Well, Jared Taylor is a horrible human 
being who you're currently litigating with, but that litigation 
seems--the transcript from it seems to have some tension with 
what you're telling Congress. The court in that litigation 
asked the question: Does Twitter have the right to take 
somebody off its platform because it doesn't like the fact that 
the person is a woman or gay? And the response from the 
attorney for Twitter was: The First Amendment would give 
Twitter the right, just like it would give a newspaper the 
right to choose to not run an op-ed from someone because she 
happens to be a woman. Would Twitter ever do that? Absolutely 
not. Not in a million years. Does the First Amendment provide 
that protection? It absolutely does.
    So was your lawyer correct in that assessment? Or were you 
correct when you just said that that would not be permitted?
    Mr. Pickles. Well, I'm not familiar with the facts of that 
case, and I can appreciate--I can't comment on ongoing 
litigation. But this is absolutely a critical public policy 
issue, one that is important we debate, because as our 
companies seek to reassure you in this committee, the way that 
we take our decisions in a neutral way, not taking into account 
political beliefs, I think the fact our rules are public and 
that we're taking steps to improve the transparency of how we 
improve the enforcement of those rules are important steps to 
take.
    Mr. Gaetz. Right, but it is not in service of transparency 
if your company sends executives to Congress to say one thing, 
that you would not have the right to engage in that conduct and 
then your lawyers in litigation say precisely the opposite. 
That serves to frustrate transparency.
    But my time has expired, the gentleman from Rhode Island, 
Mr. Cicilline, is recognized for 5 minutes.
    Mr. Cicilline. Thank you, Mr. Chairman. I begin by saying 
that America has stood as a beacon of hope to the rest of the 
world, and America has been defined by a set of ideals that 
have established our moral leadership in the world, propelled 
by our respect for freedom, human rights, and the rule of law.
    Yesterday, the President's statements and behavior as well 
as his conduct in the preceding weeks has severely damaged our 
standing in the world by siding with Russia and its brutal 
thuggish dictator against the United States, and he has created 
a crisis in our beloved country. But rather than conducting any 
oversight of these important issues relating to the integrity 
of our elections, the disgraceful conduct of the President, and 
the threat to our democracy, we have this hearing.
    So I'd like to begin by associating myself with the remarks 
of several of my colleagues about the seriously misguided 
priorities of this committee under Republican leadership. Let's 
make something very clear: There is no evidence that the 
algorithms of social networks or search results are biased 
against conservatives. It is a made-up narrative pushed by the 
conservative propaganda machine to convince voters of a 
conspiracy that does not exist.
    But in spite of studies and research by data analytic firms 
that show that there is no systemic bias against conservatives 
online, the Republican effort to advance its victimhood complex 
is somehow working. Over the past 2 years, Facebook has bent 
over backwards to placate and mollify conservatives based on 
this fiction. It's refused to evenly enforce its platform 
policies against hate speech, conspiracy theories, and 
disinformation. It will not ban pages that share dangerous 
hoaxes. And it has tailored its news feed algorithm to boost 
posts based solely on engagement, resulting in significantly 
more traffic to hyperpartisan conservative sources and 
misinformation at the expense of local news and other sources 
of trustworthy journalism.
    Facebook also fired its team of news curators in response 
to Republican criticism and legitimized the baseless claims of 
top Republican officials, including President Trump's campaign 
manager, by initiating a study on this issue conducted by the 
conservative Heritage Foundation and a former Republican 
Senator.
    But aside from the obvious hypocrisy of the conservative 
agenda to delegitimize any information that undermines their 
media narrative, why does this matter? It matters because 
nearly three-quarters of Americans access news online through 
Facebook and Google while more than two-thirds of online 
traffic is channeled through Facebook and Google. Last year, 
these two companies alone pulled in more than $42 billion from 
online ads, more than 60 percent of all online ad revenue, and 
are projected to account for 83 percent of growth in the 
digital ad market.
    It is overwhelmingly clear that this enormous unchecked 
power to dictate and profit from what people see online is a 
fundamental threat to the free and diverse press in our vibrant 
democracy. News publishers, local businesses, and media 
companies are at the mercy of the dominant corporations. But 
don't take it from me, take it from the chief executive of News 
Corp, who recently warned that we have entered into an era in 
which the pervasiveness of the largest digital platforms makes 
Standard Oil look like a corner gas station; or the chairman of 
The New York Times, who recently referred to Facebook CEO Mark 
Zuckerberg's approach to news, and I quote, a terrifyingly 
naive perspective that makes my blood run cold, end quote; or 
the editor in chief of Wired who said earlier this year that 
news publishers have been reduced to, and I quote, 
sharecroppers on Facebook's massive industrial farm, end quote.
    There's no question that we've reached a tipping point. 
We're at the precipice of sacrificing the news organizations 
that are essential to uncovering corruption, holding the government 
and powerful corporations accountable, and sustaining our 
democracy to the profit margins of a few dominant companies.
    As Justice Robert Jackson remarked in 1937, we cannot 
permit private corporations to be private governments. We must 
keep our economic system under the control of the people who 
live by and under it. It's long overdue that we take these 
concerns seriously, rather than listening to the fevered dreams 
and conspiracy theorists of conservatives. That's why we must 
restore and protect our democracy by creating an even playing 
field for news publishers and giving power back to Americans 
through greater control over their data.
    So, Ms. Bickert, I'll start with you. My question is: I've 
introduced legislation to ensure fairness and an even playing 
field between publishers and dominant platforms such as 
Facebook. This bill provides for a limited safe harbor for news 
publishers to band together for purposes of negotiating branding, 
attribution, and interoperability of news. What objection does 
Facebook have to collective bargaining by news publishers to 
promote access to trustworthy sources of news? And the second 
question is: Facebook and other companies are required by 
Article 20 of the European Union's General Data Protection 
Regulation to give consumers the ability to take their data 
from Facebook to a competing service. Why has Facebook not made 
this right available to American users? And does Facebook 
oppose giving American consumers the right to move their data 
to competing services like you do as part of this agreement 
with the European Union?
    Ms. Bickert. Thank you, Congressman. We definitely support 
data portability. In fact, we've had that in place for years. 
And the services that we apply--that we offer in Europe for 
data portability, we are also offering similar options for 
users in the United States. And we have offered such options 
for years. That means that people can take their data with them 
from Facebook to another service.
    I would note that when we hear concerns from any community 
on Facebook, whether it is news publishers or whether it is 
people on the right or people on the left, we want to make sure 
that we understand those concerns, are responsive to them. We 
always want to apply our policies fairly to all of these 
groups. That's the reason that we are undertaking various 
audits and assessments. We just want to make sure that we're 
doing our job right and that we're understanding if our 
policies are in fact being applied as fairly as we intend for 
them to be. We can always do better.
    Mr. Gaetz. The gentleman's time has expired.
    Mr. Cicilline. Can she answer the first question?
    Mr. Gaetz. Well, we've gone a minute over, Mr. Cicilline. 
So I am going to recognize the gentleman----
    Mr. Cicilline. Mr. Chairman, I have a unanimous consent 
request.
    Mr. Gaetz. The gentleman is recognized to make his 
unanimous consent request.
    Mr. Cicilline. I request unanimous consent to enter the 
following materials into the record, a 2017 report by NewsWhip, 
an internet analytics firm, on the rise of hyperpolitical 
media; an article in Nieman Lab entitled ``Has Facebook's 
algorithm change hurt hyperpartisan sites? According to this 
data, nope''; an article by April Glaser in Slate entitled 
``Facebook won't make the bed it lies in.''
    Mr. Gaetz. Without objection.
The gentleman from Louisiana, Mr. Johnson, is recognized 
for 5 minutes.
    Mr. Johnson of Louisiana. Thank you, Mr. Chairman.
    I thank all of you for being here. I know this is a 
difficult subject area. Prior to my election to Congress, I was 
a constitutional law attorney, and I litigated free speech 
cases in the courts for almost 20 years, defending religious 
liberty and the First Amendment. And so I'm very wary of 
censorship efforts. Sometimes they are well intended, but 
there's always a high degree of subjectivity, and it causes 
problems. And I'm still trying to understand what standards 
each of your organizations utilize to determine exactly how 
offensive or controversial or fake content is defined. And as 
you've noticed, many of us have come in and out of the hearing 
because we have other things going on. If you've answered some 
of these, I apologize in advance. But this is a question that I 
think my constituents back home really want to know because 
they ask me this all the time: How do each of your companies 
define fake news?
    Let me start with Ms. Bickert.
    Ms. Bickert. Thank you, Congressman. First, I want to say 
that we actually published a new version of our standards in 
April that gives all of the detail of what we tell our content 
reviewers in terms of how to apply our policies against things 
like hate speech or bullying, so forth. We also include in 
there a section on what we're doing to combat false news.
    It is not against our policies, meaning we don't remove the 
content just for being false. What we do instead is we try to 
provide--if we have an indication that a news story is false, 
and that would be because it's been flagged as potentially 
being false, and then we've sent it to third-party fact 
checkers who have rated it false, then we will provide 
additional information to people who see that content. And that 
will be related articles from around the internet. We will also 
try to counter any virality of that post by reducing its 
distribution.
    Mr. Johnson of Louisiana. Okay. Ms. Downs.
    Ms. Downs. Our goal is to provide our users with 
trustworthy information that's responsive to what they're 
looking for.
    Mr. Johnson of Georgia. So how do you define fake news?
    Ms. Downs. Fake news is obviously a term used to describe a 
spectrum of content. So at one end of the spectrum you have 
malicious, deceptive content that's often being spread by troll 
firms, et cetera. That content would violate our policies and 
we would act quickly to remove the content and/or the accounts 
that are spreading it.
    In the middle, you have misinformation that may be low 
quality. This is where our algorithms kick in to promote more 
authoritative content and demote lower quality content. And 
then, of course, you even hear the term used to refer to 
mainstream media in which case we do nothing. We don't embrace 
the term in that context.
    Mr. Johnson of Georgia. Let me ask you before I move to 
Twitter. If a content reviewer determines that something is 
fake news, is there an appeals process for the person who 
produced that content? I mean, are they notified formally?
    Ms. Downs. Any time we remove content for violation of our 
policies, the user is notified and given a link to an appeals 
form.
    Mr. Johnson of Georgia. How long does the appeals process 
take?
    Ms. Downs. I'm not familiar with the average turnaround 
times, but I could get back to you with that information.
    Mr. Johnson of Georgia. Well, I wish you would. I mean, the 
news cycle obviously is constantly changing. So if the appeals 
process takes days or weeks, then it's a moot point by the end 
of that process.
    And our concern is, of course, that you would--any 
organization, any company, would filter things that they or 
their internal reviewers may not agree with. And 
then by the time the appeals process is exhausted, it's stale 
content anyway. And if the objective was to pull it down and 
take it out of the public's view, then that was accomplished 
just because of the time delay. So there's a due process 
concern that we have, even though you're not the government. I 
mean, it still should apply here, I think.
    Let me ask you. I'm getting to Twitter next, but hold on.
    Are individuals outside of your company consulted with 
regard to appropriate content or the purveyors of the content?
    Ms. Downs. Our policies are developed by us, but we 
sometimes consult experts when we feel we need additional 
expertise to understand particular kinds of content. However, 
all enforcement decisions are made internally by the company.
    Mr. Johnson of Georgia. A controversy developed this year 
with regard to you guys about the Southern Poverty Law Center, 
the SPLC. And they labeled some mainstream Christian and 
conservative organizations as hate groups because they didn't 
like what they were doing. And then Google and YouTube used the 
SPLC designation, at least allegedly, to flag the content of 
those groups.
    Did that happen? Do you admit that that happened?
    Ms. Downs. So that refers to a program that we call the 
Trusted Flagger program, which is one where we engage NGOs and 
government agencies with expertise on the particular kinds of 
things we prohibit per policy. They get access to a bulk 
flagging tool so they can flag videos to us in bulk rather than 
one at a time.
    They do not have any authority to remove content, restrict 
content, or demote content on our services. All of those 
decisions are made by us. So we're leveraging our expertise, 
but decision-making authority is retained by the company.
    Mr. Johnson of Georgia. I guess this goes to the appeals 
process. But, I mean, some of these groups I know personally 
are legitimate, well-respected, faith-based organizations. And 
I just want to say SPLC is not a neutral watchdog organization. 
So I'm glad to know they don't get editorial control where I 
think some of that needs to be looked at.
    I got 2 seconds. I didn't get to get to Twitter. But I 
appreciate you all being here.
    I'll yield back, Mr. Chairman.
    Mr. Gaetz. The gentleman from California, Mr. Lieu, is 
recognized for 5 minutes.
    Mr. Lieu. Thank you, Mr. Chair.
    I served on Active Duty in the U.S. military. I never 
thought I would see the American Commander in Chief deliver the 
talking points of the Kremlin. This Judiciary Committee has 
oversight over the Department of Justice. Our President 
disparaged members of the Department of Justice. Are we having 
a hearing on that? No.
    As we sit here today, there are nearly 3,000 babies and kids 
ripped away from their parents by the Trump administration. 
They have not been reunified yet. Are we having a hearing on 
that? Because we have jurisdiction over immigration. No.
    Instead, we are having this ridiculous hearing on the 
content of speech of private sector companies. It's stupid 
because there's this thing called the First Amendment. We can't 
regulate content. The only thing worse than the Alex Jones 
video is the government trying to tell Google not to do it, to 
prevent people from watching the Alex Jones video. We can't 
even do it if we tried. We can't even do any legislation on 
this committee. And we're having this ridiculous second 
installment hearing on the very first hearing about Diamond and 
Silk not getting enough likes on Facebook.
    So let me just ask some very basic questions so the 
American public understands what a dumb hearing this entire 
hearing is.
    So, Ms. Bickert, are you a private company?
    Ms. Bickert. Congressman, we are.
    Mr. Lieu. All right. And you report to a board of 
directors, and you're publicly traded, correct?
    Ms. Bickert. Yes, we do.
    Mr. Lieu. Okay. And as a publicly traded private sector 
company, one of your goals and your duties to shareholders is 
to maximize profit, correct?
    Ms. Bickert. Yes.
    Mr. Lieu. So if it turns out that elevating stories about 
cats generates more profit for your company, you have the 
absolute right to do that, don't you?
    Ms. Bickert. Within certain guardrails, we work to maximize 
the profitability of our company, yes.
    Mr. Lieu. Thank you.
    All right. So, Ms. Downs, are you a private company?
    Ms. Downs. Yes.
    Mr. Lieu. And you report to your shareholders. You have a 
duty to your shareholders, right?
    Ms. Downs. Correct.
    Mr. Lieu. All right. And so if it turns out that you 
don't play Diamond and Silk because people don't like to watch 
them, but you elevate, let's say, pictures of kittens and that 
makes you more money, you could absolutely do that. Isn't that 
right?
    Ms. Downs. We could.
    Mr. Lieu. Okay. Thank you.
    All right. So, Mr. Pickles, I'm going to ask you the same 
question. You're a private sector company?
    Mr. Pickles. Yes, sir.
    Mr. Lieu. All right. And you have a duty to your 
shareholders to maximize profit?
    Mr. Pickles. Yes, sir.
    Mr. Lieu. Okay. So if it turns out that you've got accounts 
that can generate revenue for you, but they're saying all sorts 
of crazy things, for example, that--let's say, I don't know, 
red tomatoes taste worse than purple tomatoes, but that 
generates revenue for you, you could talk about that and elevate that on your 
platform, right?
    Mr. Pickles. Yes, sir.
    Mr. Lieu. All right. I notice all of you talked about your 
own internal rules, because that's what this should be about. 
You all get to come up with your own rules, but not because 
government tells you what to do or that government says you 
have to rule this way or that way. And the whole notion that 
somehow we should be interfering with these platforms from a 
legislative governmental sort of point of view is anathema to 
the First Amendment, and really, it's about the marketplace of 
ideas. So if you're a user and you don't like the fact that, 
you know, I don't know, Facebook isn't playing Diamond and 
Silk, well, go to some other social media platform. Go find 
Diamond and Silk on a website and watch their videos. Or if you 
don't like how Twitter is operating, well, go use WeChat or go 
use KakaoTalk or go use some other social media platform.
    I don't even know why we're having this hearing. We should 
be having a hearing on the President of the United States, 
statements he has made that show that he has this bizarre 
relationship to Vladimir Putin, who is not our friend. Russia 
is not an ally. And yet we're sitting here talking about 
something we have no control over, we cannot regulate.
    And it's from--actually, it's not a partisan issue. There 
were--there were questions from members of my own side that 
also trouble me. Because, again, you all need to be able to do 
whatever you want to do that maximizes your profit based on 
your internal rules, not because the House Judiciary Committee 
says that you shouldn't play, you know, Alex Jones or you 
shouldn't play Diamond and Silk or whatever it is that 
conservatives come up with or liberals come up with. This is an 
issue of the First Amendment. That's why it's made America 
great.
    Thank you all for being here. Just keep on doing what 
you're doing. Your duty is to your shareholders, not to the 
members of this Judiciary Committee.
    I yield back.
    Mr. Gaetz. Will the gentleman yield for a question?
    Mr. Lieu. Sure.
    Mr. Gaetz. Thank you. Thank you for yielding.
    So I understand your argument and agree with most of it as 
it relates to the First Amendment. But is your view of section 
230 that it's consistent with First Amendment principles? Or do 
you read section 230 to say that if you choose to be a public 
forum, then you surrender those First Amendment rights for the 
liability protections you get to host content?
    I yield back for the answer.
    Mr. Lieu. Thank you. I am a supporter of section 230.
    I yield back.
    Mr. Gaetz. The gentleman from Ohio, Mr. Jordan, is 
recognized for 5 minutes.
    Mr. Jordan. I thank the chairman.
    Ms. Bickert, what percentage of digital advertising market 
does Facebook have?
    Ms. Bickert. Congressman, I can say that advertising I know 
is, in the United States, a $650 billion industry. We have 
about 6 percent of that.
    Mr. Jordan. Six percent of--and just from the digital 
platform, though. I'm not talking advertising in general.
    Ms. Bickert. I don't have an exact statistic on that. 
Sorry. We can follow up on that.
    Mr. Jordan. How about you, Ms. Downs?
    Ms. Downs. I'd have to follow up with that number as well.
    Mr. Jordan. Mr. Pickles.
    Mr. Pickles. I can follow up on that.
    Mr. Jordan. Because it's been reported that it's like 
somewhere around three-quarters of all digital advertising 
marketing dollars are with the three of you guys. Is that not 
accurate?
    That's a number I've heard. Seventy-five percent of digital 
advertising market is controlled by Facebook, Google, and 
Twitter. That's not accurate?
    Ms. Bickert.
    Ms. Bickert. Congressman, again, I know that we have about 
6 percent of the overall advertising market, which is about 
$650 billion.
    Mr. Jordan. Yeah. But that's not what I'm asking you. I'm 
not asking overall advertising. I'm talking about the digital 
area.
    Ms. Bickert. I'd be happy to have our team follow up with 
you on that.
    Mr. Jordan. Okay. If you could, that'd be fine.
    Ms. Bickert, in your opening statement, you talked about 
fake news. I think you called it false news. And there's been 
discussions about third-party fact checkers who assist you in 
determining what, in fact, is fake news or false news.
    Can you tell me how that process works at Facebook?
    Ms. Bickert. Yes, Congressman. We do work with third-party 
fact checkers. All of them are approved; they're Poynter 
approved, and they are signatories to the International Fact-
Checking Network code of principles. Although they meet these 
high standards, we also want to make sure that we are not--we 
know this is not a perfect process, so----
    Mr. Jordan. How many are there?
    Ms. Bickert. In the United States, I believe right now 
there are five currently. And that includes groups like the AP 
and The Weekly Standard.
    We are open for others. They can apply to----
    Mr. Jordan. So do you use all five of those?
    Ms. Bickert. The way the process works is if something is 
flagged that's potentially false, it is sent to all five--or 
all participating fact checkers. Right now there--I believe 
there are five.
    Mr. Jordan. Who flags it?
    Ms. Bickert. If one of them--who flags it? It could either 
be from people reporting that content is false or it could be 
from our technology noting that, for instance, in comments 
people are saying that this content is false.
    Mr. Jordan. So your system can flag it or someone can just 
send you some kind of message, say, hey, we think this is fake 
news.
    Ms. Bickert. And if content is flagged as potentially being 
false, it is, again, sent to these fact checkers. Now, if any one of 
the fact-checking organizations--organizations rates the 
content as true, then the content is not downranked. But if 
they agree that this content is----
    Mr. Jordan. So if all five say this is fake news, then what 
does Facebook do?
    Ms. Bickert. We don't remove it, but we do reduce the 
distribution. And we also will put related articles beneath the 
content so that people see other stories from around the 
internet that are on the same topic.
    Mr. Jordan. Are these five entities who are making this 
determination, are they the same all the--I mean, are they the 
same five entities or does it rotate? Or how does--and who are 
these five entities? Tell me that.
    Ms. Bickert. The entities include AP, The Weekly Standard, 
factcheck.org. And, by the way, this list is--we are open to 
receiving additional fact-checking organizations. They can 
apply and----
    Mr. Jordan. Associated Press, The Weekly Standard, 
factcheck.org, and who else? Who are the other two?
    Ms. Bickert. I--PolitiFact, and there's one that I'm 
forgetting. I can get these to you. These are public. We've 
listed these five publicly.
    And if others apply, they will also--and they meet the 
standards, they will also be added.
    Mr. Jordan. Okay. And then would it still be unanimous 
before this gets downgraded or flagged on your platform?
    Ms. Bickert. Right now our practice is, if any one 
organization flags it as true, then it will not be demoted. 
But, of course, over time, we are learning from this process. 
We know it's not perfect right now. We will continue to iterate 
and get better on it.
    Mr. Jordan. Okay. Ms. Downs, same question to you. How does 
it work for you guys? Same way?
    Ms. Downs. We briefly introduced a fact-check feature in 
our knowledge panels--the goal was to provide information 
about publishers so users have greater context in evaluating 
what they read. However, it was an experimental feature. We got 
some critical feedback that we felt was valid, so we put the 
feature on pause until we could fix those concerns and decide 
whether to----
    Mr. Jordan. And who were the organizations doing it for 
you?
    Ms. Downs. I think that--I would have to get back with 
you--to you on the details, but I believe----
    Mr. Jordan. Let me go to where Mr. Johnson was just a few 
minutes ago.
    Was the Southern Poverty Law Center one of those entities 
for part of this third-party fact-checking operation?
    Ms. Downs. I do not believe so, no.
    Mr. Jordan. Okay. All right.
    Thank you, Chairman. I yield back.
    Mr. Gaetz. The gentleman from Maryland, Mr. Raskin, is 
recognized for 5 minutes.
    Mr. Raskin. Thank you, Mr. Chairman.
    Yesterday, a day that will live in infamy, the President of 
the United States openly sided with Vladimir Putin, a despot, a 
kleptocrat, and a tyrant who freely orders the assassination of 
journalists and political adversaries, as opposed to the 
American intelligence community, the foreign policy community, 
and the law enforcement community.
    The President, in the face of Russia's clear aggression 
against our political sovereignty and democracy in the 2016 
election, instead apologized, which is the equivalent of 
America going out and apologizing to Japan after Pearl Harbor. 
This is an absolute outrage and a scandal. And, of course, this 
is what the Judiciary Committee in its right mind would be 
working on today. This is what we would be investigating.
    Mr. Chairman, it seems as if Facebook and Twitter and 
Google have been arraigned here on charges of completely 
fanciful and mythical anti-conservative bias. When you look at, 
for example, Facebook's community standards, which I've read 
through, they are completely viewpoint neutral. They ban things 
like the advocacy of violence and criminal behavior, hate 
speech from whatever political perspective, child pornography, 
and so on.
    What concerns me, and this is where I suppose I 
would depart from my friend and colleague, Mr. Lieu--what 
concerns me is the political pressure that's apparently being 
brought to bear now on all of these entities and the suggestion 
that they are buckling under to this myth of some kind of anti-
conservative conspiracy. So there is this article in this 
morning's Wall Street Journal saying publishing executives 
argue that Facebook is now overly deferential to conservatives. 
And so we know the dynamic of people working the refs and 
harassing and haranguing the entities.
    And I wanted to ask Ms. Bickert, who's here to represent 
Facebook, is it true that you chose a former Republican 
Senator, Senator Jon Kyl, and the right wing Heritage 
Foundation to do a study about bias on Facebook, and you 
chose no former Democratic Senator or independent or anybody 
else, and no liberal think tank, like the Center for American 
Progress, to participate in that review? Is that true?
    Ms. Bickert. Thank you, Congressman. We've hired Senator 
Kyl for a very specific purpose, which is to dig in and 
understand concerns about the way that our policies are applied 
and how that affects conservative voices.
    I want to emphasize, though, that we have similar efforts, 
not just in the U.S., but also around the world----
    Mr. Raskin. Well, let me just ask you about that. Because 
my colleague, Mr. Deutch, raised a very profound concern about 
the continuing demonization and vilification of families and 
children who are victimized in episodes of mass gun slaughter. 
And we know it's now an ideological fixture on the right wing 
in America to deny the existence of these atrocities, like what 
took place in Newtown, Connecticut, what took place in 
Parkland, and then to allege that they're the product of some 
kind of conspiracy or hoax. And, of course, the founding myth 
in this vein is Holocaust revisionism which claims that the 
Holocaust never took place.
    But so Mr. Deutch asks the question, what's being done 
about this. Have you appointed a committee to review the 
problem of bias against people who are victims of gun violence 
and the way that they are being treated on the internet and the 
way that there is voice being given to people who are denying 
the historical fact of their experience?
    Ms. Bickert. Certainly, if anybody alleges that the 
Parkland survivors are crisis actors, that violates our 
policies. We remove it.
    But to your question about----
    Mr. Raskin. You remove it. And then--and they're allowed to 
continue to keep putting it up in the future or are they banned 
from Facebook?
    Ms. Bickert. We would remove that content if they put it up 
again. And at a certain point, their account or their page 
would be removed.
    Mr. Raskin. So just explain, what's happened with Infowars? 
Because they've made a cottage industry out of this. What they 
do is they deny that these events have happened.
    Why are they still on Facebook?
    Ms. Bickert. We have removed content from the Infowars page 
to the extent that it's violated our policies. They have not 
reached the threshold at which their entire----
    Mr. Raskin. What's the threshold?
    Ms. Bickert. It depends, Congressman, on the nature of the 
violation. So there are sometimes more severe violations and--
--
    Mr. Raskin. All right.
    Well, look, I'm with Mr. Lieu, which is that you guys are 
private journalistic entities right now. But if you're going to 
be ideologically badgered and bulldozed to take positions in 
the interest of right wing politics, then we are going to have 
to look at what's happening there, because at that point 
there's not viewpoint neutrality.
    Would you agree that you should not be catering to 
particular interests as opposed to everybody who's concerned 
about what's taking place online?
    Mr. Gaetz. The gentleman's time has well expired.
    So we're going to now recognize Mr. Rutherford of Florida 
for 5 minutes.
    Mr. Raskin. Mr. Chairman, can she answer the question?
    Mr. Gaetz. Well, she--Mr. Raskin, she answered the first 
question that you asked after your time had expired, so we'll 
probably conclude it there.
    Mr. Rutherford, you're recognized for 5 minutes.
    Mr. Rutherford. Thank you, Mr. Chairman.
    Committee, I just want to go back to some earlier 
statements. My--one of our colleagues across the aisle who--
listen, I believe that this is an incredibly important hearing. 
The potential censorship of free speech, I think, goes to the 
core of our country's freedoms. And the suggestion that, 
because we're not talking about some other items that are in 
the news, this is somehow, quote, ridiculous, does not hold 
when considered in light of the balance between free speech and 
public safety.
    When we look at what went on with Backpage, the lives that 
were destroyed, the children that were trafficked, the 
prostitution that was rampant. That--that's why this hearing, I 
think, is so vitally important.
    And, Ms. Bickert, I'd like to ask you, today, citizens can 
hold newspapers and other media groups legally accountable if 
they knowingly lie, if they show indecent content, or if they 
use materials or photos that they are not authorized or did not 
pay to use and so on. Technical platforms are currently making 
in-depth decisions about what information users receive and how 
they receive it, often driven by financial and other unknown 
motives. And Mr. Zuckerberg himself has repeatedly said that 
his platform is, quote, responsible for the content that they 
host.
    Should the tech platforms be subject to the same content 
regulations and civil penalties as those who produce the 
content?
    Ms. Bickert. Thank you, Congressman. We do feel a sense of 
accountability and responsibility to make sure that Facebook is 
a place where people can come and be safe and express 
themselves. And that is at the core of everything we are trying 
to do.
    In terms of regulation, we are happy to talk to this 
committee and others. We think that there is a place for--for 
these conversations, and we hope that we could be a part of 
guiding any regulatory efforts.
    Mr. Rutherford. I'm very pleased to hear you say that. Your 
platform generates revenue based on ads, yet the content 
provided in some cases is illicit, which is--why is it 
acceptable to Facebook, Google, Twitter, Bing, and others to 
make money off of illegal content while these other media 
outlets are held accountable civilly and criminally without the 
protection of section 230?
    Ms. Bickert. Congressman, we do--if content is brought to 
our attention that violates our policies or is illegal, we do 
have measures for removing that. And I would also note that we 
were supportive of the act to protect sex trafficking victims. 
That's something we care a lot about.
    Mr. Rutherford. In fact, this Congress, and I believe the 
public, are beginning to question the full protections afforded 
under section 230. And as you just referenced, on April 11 of 
this year, the President signed into law an additional 
provision under 230 that declared that 230 does not limit a 
Federal civil claim for conduct that constitutes sex 
trafficking, a Federal criminal charge for conduct that 
constitutes sex trafficking, or a State criminal charge for 
conduct that promotes or facilitates prostitution.
    So my question is those are two examples, trafficking and 
prostitution, that are now exemptions to the protection under 
230. Do you see any other--and this is for the whole panel. Do 
you see any other areas where those kinds of exemptions to the 
protections under 230 should be examined?
    Let me throw out an example. How about sedition? Mr. 
Pickles, any comment?
    Mr. Pickles. Well, I think we've spent a lot of time 
talking about these issues today. And, firstly, you're right to 
highlight the work that's been done to tackle child 
trafficking. Before the passage of that bill, we already had a 
zero tolerance approach to this. We work very closely with the 
National Center for Missing and Exploited Children to help law 
enforcement bring to justice those people who are seeking to do 
harm to children.
    And we also take a range of actions under our rules that we 
think are the right thing to do. Our rules go far beyond what's 
required by law, because this is--these are our rules that we 
set.
    So I've highlighted previously terrorist content that we 
are proactively taking down at speed and scale, because we 
think it keeps our platform safe, and it's the right thing to 
do. And I think the demonstration you're seeing is of companies 
who are responsible and taking the right----
    Mr. Rutherford. Do you think Congress should look at 
codifying that?
    Mr. Pickles. Well, I think the balance of regulation that 
we see----
    Mr. Rutherford. To have a rule is one thing; to have a law 
is another.
    Ms. Downs, what do you think?
    Ms. Downs. YouTube remains a service provider that hosts 
user-generated content at an unprecedented scale. We have a 
natural incentive to protect our product from harmful content, 
and we invest a lot of resources in enforcing our policies 
using both technology and humans.
    Mr. Gaetz. The gentleman's time has expired.
    The gentleman from New York, Mr. Jeffries, is recognized 
for 5 minutes.
    Mr. Jeffries. Thank you very much. And good afternoon, 
everyone.
    Ms. Bickert, am I correct that it was in May that Facebook 
engaged the former Senator Jon Kyl to investigate political 
bias?
    Ms. Bickert. I believe it was around that time.
    Mr. Jeffries. And Senator Kyl was a Republican Member of 
the Congress. Is that correct?
    Ms. Bickert. Yes, that's my understanding.
    Mr. Jeffries. And has Facebook engaged any former 
Democratic members of the House or the Senate to participate in 
this exercise as it relates to determining political bias?
    Ms. Bickert. With regard to this--to Senator Kyl's inquiry, 
he is working with his law firm; my understanding is they're 
reaching out to many people. But I want to emphasize that we do 
have conversations on both sides of the aisle. We've engaged 
Laura Murphy, civil rights attorney, to look at how our 
policies are affecting different groups who might have concerns 
about civil rights. And we have conversations--through my team, 
we have conversations on a weekly basis with groups that care 
about these issues from both sides of the aisle.
    Mr. Jeffries. And are you aware that Senator Kyl is 
currently the White House designee to help the administration 
navigate through the confirmation process of arch conservative 
right wing justice nominee Kavanaugh?
    Ms. Bickert. I'm not aware of all of Senator Kyl's 
activities, nor was I a part of the Facebook team that is 
working on that audit. I'm just aware--or an audit assessment. 
I'm just aware of the assessment.
    Mr. Jeffries. Are you aware that Senator Kyl once stated on 
the floor of the Senate that 90 percent of Planned Parenthood's 
activities were abortion related?
    Ms. Bickert. Again, Congressman, I'm not aware of 
everything about Senator Kyl. I just know that we have engaged 
him and his law firm to carry out this assessment.
    Mr. Jeffries. And, actually, the figure is 3 percent of 
Planned Parenthood's activities are abortion related. So I 
guess I'm a little confused that we would embrace the notion of 
someone investigating political bias who himself, as a Member 
of the United States Congress, would broadcast such fake news. 
And that's just one example. He also has a close affiliation 
with the Center for Security Policy, which promotes the 
conspiracy theory that the Muslim Brotherhood has infiltrated 
the U.S. Government to threaten our democracy.
    It's amazing how we get concerned about all of these other 
entities, but somehow we can live through what we witnessed 
yesterday as it relates to the President of the United States 
continuing to play footsie with Vladimir Putin. Nobody's 
alarmed, apparently, at any of these other affiliations as it 
relates to Senator Kyl.
    Now, The Heritage Foundation has also been brought in to 
address the so-called conservative bias. Is that right?
    Ms. Bickert. We can follow up with you on the details of 
everybody that Senator Kyl's group is reaching out to.
    Mr. Jeffries. Okay. I'd be interested in just understanding 
the political leanings of every organization that's been 
brought in. But it's our understanding that The Heritage 
Foundation, which obviously leans very right, has been brought 
in as well.
    Now, Facebook has engaged third-party organizations to help 
filter what it calls fake news. Is that right?
    Ms. Bickert. Congressman, we use third-party fact checkers. 
In the United States we currently have five. We're open to 
more.
    Mr. Jeffries. Am I correct that those five are AP, one; 
PolitiFact, two; factcheck.org, three; Snopes, is that correct, 
four? And The Weekly Standard, five?
    Ms. Bickert. Yes, that's correct.
    Mr. Jeffries. Now, when I look at this list, I'm trying to 
figure out. So the Weekly Standard is a, again, right wing 
conservative leaning organization. I'm struggling to look at 
this list of five.
    Is there a left leaning progressive organization in this 
list of five fact-checking organizations?
    Ms. Bickert. Each of these five fact-checking organizations 
was selected because it is Poynter approved and it also adheres 
to the code of principles of the International Fact-Checking 
Network.
    But I also want to make clear----
    Mr. Jeffries. I guess the answer would be no, is that 
correct, to my question? Is there a progressive left leaning 
organization amongst these five different entities that are 
fact checking?
    Ms. Bickert. Congressman, I'm sure different people would 
characterize these organizations in different ways. We know that--
that people will have those opinions. And that's why we're not 
removing content if these organizations flag content as false. 
Instead, what we are doing is we are demoting content and we're 
providing relevant information about other articles on the 
internet about the same topics so that people have better 
information.
    Mr. Gaetz. The gentleman's time has expired.
    The gentleman from Texas, Mr. Poe, is recognized for 5 
minutes.
    Mr. Poe. Thank you, Chairman. Thank you all for being here.
    Just as a note, the whole idea of now we are going to have 
corporations censor speech based upon their definition of fake 
news, based on their definition of hate speech is opening up a 
Pandora's box. What one person may think is fake news somebody 
else believes is the gospel truth. And we're going to turn that 
over to a group of people to decide. It's going to be, I think, 
very chaotic.
    Are any of you familiar with the General Data Protection 
Regulation? Is anybody familiar?
    Ms. Downs, are you familiar with it?
    Ms. Downs. Yes, I'm familiar with GDPR.
    Mr. Poe. Okay. And, basically, if I'm correct, it's now the 
policy in the European Union that consumers must opt in to the 
dissemination of their private information that is carried by 
one of your organizations. Is that a fair statement, Ms. Downs?
    Ms. Downs. There are consent requirements built into GDPR, 
yes.
    Mr. Poe. All right. And what do you personally think of 
this regulation in Europe?
    Ms. Downs. We very much support the goal of protecting the 
privacy of our users, and we're--we are happy to continue to 
work with Congress on that here in the U.S. as we do with the 
European Union for Europe.
    Mr. Poe. I agree with you. I think the privacy of most 
Americans and people--consumers should be something looked at 
by, not only your companies, but Congress so that the consumer 
is protected because we all know and have heard all of the 
stories about how what we think is private information is not 
private at all. It's disseminated by your organizations 
to people we don't even know, and so that citizens, users, 
should at least know where that information is going to and 
have the ability to opt in to the dissemination of private 
information, not to mention all of the cyber attacks that take 
place daily by nefarious organizations.
    Mr. Pickles, did you want to say something?
    Mr. Pickles. Just to agree. I think privacy is, you're 
right, a defining public policy issue. We have a global privacy 
policy for this reason. So while Europe has passed GDPR, we 
think privacy is something important to all of our users all 
around the world.
    Mr. Poe. And you think that the United States Congress 
should look into that issue, working with all three of you all 
and other people that are providers, to come up with some 
privacy guidelines for consumers? Just your opinion, Mr. 
Pickles.
    Mr. Pickles. I think that conversation has already started. 
And you're absolutely right, I think it's one that Congress and 
industry can engage on to make sure that American citizens 
and, indeed, companies strike that right balance.
    Mr. Poe. All right.
    Ms. Bickert, let me ask you this. What--I agree with you, 
Mr. Pickles.
    What is your definition of fake news?
    Ms. Bickert. Congressman, we have a set of policies that 
are public that define everything that we are doing to counter 
fake news and how we----
    Mr. Poe. So what is fake news? Just tell me your definition 
of fake news.
    Ms. Bickert. Well, really, it depends how people use that 
term.
    Mr. Poe. I mean, it depends on what? What is fake news?
    You said that you're going to try to keep it off of all 
these platforms. I'm not arguing with you. But what is fake 
news?
    Ms. Bickert. No, we don't have a policy of removing fake 
news.
    Mr. Poe. You just--but you point it out to individuals.
    Ms. Bickert. What we do is, if people have----
    Mr. Poe. Excuse me. I'm just trying to understand. If you 
think something is fake news, you have one of these five 
organizations--and Snopes is a left wing organization, by the 
way. If you want to have one of these organizations tag it, 
what is it? What are we talking about is fake news?
    Ms. Bickert. Congressman, what we do is, if people flag 
content as being false or if our technology detects that 
comments or other signals suggest that content might be false, 
then we send it to these fact-checking organizations. If they 
rate the content as false and none of them rate it as true, 
then we will reduce the distribution of the content and add the 
related articles.
    Mr. Poe. So you let somebody else determine what fake news 
is and whatever their opinion of fake news is. But you don't 
have a definition of fake news?
    Ms. Bickert. We do--sharing information that is false does 
not violate our policies.
    Mr. Poe. All right. Thank you.
    I yield back to the chair.
    Mr. Gaetz. The gentlelady from Washington, Ms. Jayapal, is 
recognized for 5 minutes.
    Ms. Jayapal. Thank you, Mr. Chairman.
    Let me associate my remarks with some of my colleagues 
earlier in saying that there are so many things we should be 
discussing, particularly given the news of yesterday, given 
something that I've been fierce about, which is the separation 
of families. So many things. And I was pleased today--shocked 
but pleased today to see that Chairman Goodlatte had said that 
time and time again, this is his quote, that Russia will stop 
at nothing to interfere with and undermine our system of 
government.
    Just days ago, the Department of Justice announced more 
Russian nationals have been charged with attempting to 
interfere with the 2016 Presidential election. This is not a 
country that can be trusted.
    I would urge Chairman Goodlatte to hold hearings on that 
very important topic. He seems to think it's a problem, and yet 
the Judiciary Committee that has jurisdiction over these issues 
has yet to hold a single hearing on election security, on 
protecting our democracy, on Russian hacking of our elections. 
And so I really hope that we do that.
    All of that said, I do think that there are some important 
issues raised here. And I think that, in many ways, this 
hearing and the questions that it raises are a tribute to the 
success of social media platforms. That's what's happened.
    Mark Zuckerberg, when he started Facebook, I don't think 
had any idea that it would take off, or maybe he did. I think 
he didn't. But, you know, that it would take off in the way 
that it has.
    And so the questions that are before you are critical, and 
your responsibility and your actions and your timeliness around 
all of these issues is absolutely essential to making sure that 
these platforms aren't misused and don't actually contribute to 
the detriment of our democracy. And I appreciate that there has 
been some work that all of your companies have done in trying 
to find the right answers, and I don't think it's easy.
    I would like to just echo some of the comments that Mr. 
Raskin made and that Mr. Jeffries made about how Facebook 
ensures that it is not bending to the other side with the 
criticism that it gets.
    And I want to point out--I don't know if you're aware of 
this, Ms. Bickert, but I just saw a news article 2 days ago 
that Facebook has recently donated to Chairman Nunes, who, as 
you may know, is one of the leading voices that's fighting 
Special Counsel Mueller's investigation into Russian 
interference in the 2016 election.
    Now, I understand that you donate to Democrats and 
Republicans. I have a bill that I am working on now that would 
not allow donations to members of a committee where there is an 
interest at stake. Why? Because I think it's important for 
there to be transparency and for the American public to 
understand that those donations don't affect how we look at 
issues.
    But are you concerned about the fact that Facebook has just 
in the last few weeks given money to an individual who is--who 
is countering the Russian investigation when you and Facebook 
are so deeply tied into what access the Russian Government and 
Russian operatives were able to get to our elections?
    Ms. Bickert. Thank you for the question, Congresswoman. I 
know that the Facebook PAC does have bipartisan contributions. 
They are publicly disclosed.
    Ms. Jayapal. Were you aware that Facebook has--the Facebook 
PAC donated to Chairman Nunes just in the last month multiple 
times?
    Ms. Bickert. I don't keep up-to-date on the details----
    Ms. Jayapal. Are you concerned that that would--that would 
taint the notion that Facebook really is trying to come to good 
solutions around these questions?
    Ms. Bickert. Congresswoman, I know that we try to be very 
evenhanded in the way that we donate, and we also make sure 
that we're very open about our donations.
    I'd be happy to have a member of our team follow up with 
you on that.
    Ms. Jayapal. That would be great. Because, again, look, I 
know you donate to everybody. I don't think that's right. But I 
know you donate to everybody. But I would just encourage you to 
look at this question of whether Facebook is bending too much 
to appease some of our right wing interests that I think are 
undermining our democracy.
    Let me go to this question of false news, because I think 
the challenge here is that it is difficult to determine exactly 
what may qualify as false news. But the bigger problem to me is 
that we somehow get to a standard that truth is relative. Truth 
is not relative. An apple is an apple. It can't be a tomato 
tomorrow and a pear yesterday. It is an apple.
    And so the question for you is, in your strategy, you say 
that you do take steps to try to not share false news, and yet 
at the same time, you're saying you don't take down any pages. 
And I guess I just don't understand what the lines are here and 
how you're determining the broad guidance.
    Ms. Bickert. Yes. There's a couple of different things we 
do. One thing is we know that the majority--the biggest amount 
of false news that you see on social media tends to come from 
spammers, financially motivated actors. And so we have tactical 
means--that violates our policies, and we have tactical means 
of trying to detect those accounts and remove them. And we've 
made a lot of progress in the past few years.
    Then there is this content that people may disagree about 
or it may be widely alleged to be false. And we've definitely 
heard feedback that people don't want a private company in the 
business of determining what is true and what is false. But 
what we know we can do is counter virality by--if we think that 
there is--that there are signals like third-party fact checkers 
telling us the content is false, we can counter that virality 
by demoting a post, and we can provide additional information 
to people so that they can see whether or not this article is 
consistent with what other mainstream sources around the 
internet are also saying.
    Ms. Jayapal. Thank you. Mr. Chairman, I yield back.
    Mr. Gaetz. The gentlelady's time has expired.
    I'll now recognize myself for 5 minutes. And I'll begin by 
associating myself with some of the comments from Mr. Lieu and 
Mr. Raskin. When they indicate that the government should not 
foist upon the technology community, the--you know, the 
overregulation of the government, I completely agree.
    My question is, when you avail yourself of the protections 
of section 230, do you necessarily surrender some of your 
rights as a publisher or speaker?
    The way I read that statute now, it's pretty binary. It 
says that you have to be one or the other. You have to be 
section 230 protected or you're a speaker with a full 
complement of your First Amendment rights.
    I'm cool with that. I would love you guys to make the 
choice. I come from the Libertarian leaning segment of my 
party. I just think it's confusing when you try to have it both 
ways. When you try to say that, you know, we get these 
liability protections, but at the same time, we have the right 
to throttle content. We have the right to designate content. 
And in the most extreme examples, when you have a Twitter 
attorney saying in court, we would never do this, but we would 
have the right to ban people based on their gender or their 
sexual orientation. So I wanted to clear up those comments.
    But my question--my next question is for you, Ms. Bickert. 
I've provided to you a screenshot I've taken from content that 
was published on Facebook from a page that is Milkshakes 
Against the Republican Party. There are two posts. Would you 
read the first one? And there is one naughty word there that 
you're welcome to skip over.
    Would you read it aloud?
    Ms. Bickert. Congressman, this is a post, Milkshakes 
Against the Republican Party. It has a picture, and it says: 
Parents in the waiting area for today's school shooting in 
Florida.
    And then it says: You remember the shooting at the 
Republican baseball game? One of those should happen every week 
until those NRA--and then there are unpleasant words--and then 
there's--I'm not sure if this is another post beneath it or 
not.
    Mr. Gaetz. Yeah. That's a second post.
    Will you read that? That has no naughty words.
    Ms. Bickert. It says: Dear crazed shooters, the GOP has 
frequent baseball practice. You really want to be remembered? 
That's how you do it. Signed, Americans tired of our 
politicians bathing in the blood of the innocent for a few 
million dollars from the terrorist organization NRA.
    Mr. Gaetz. Do these posts violate your terms of service?
    Ms. Bickert. Any call for violence violates our terms of 
service.
    Mr. Gaetz. So why is Milkshakes Against the Republican 
Party still a live page on your platform?
    Ms. Bickert. I can't speak--I haven't reviewed this page. I 
can't speak to why any page is up or not up. But we can 
certainly follow up with it.
    Mr. Gaetz. So a member of my staff provided these comments 
to Facebook. And we said, based on our reading of your terms of 
service, and, frankly, based on your testimony today where you 
say we are committed to removing content that encourages real 
world harm, based on that, this would be a facial violation. 
But I received back what I've provided to you, and the 
highlighted portion of Facebook's message back to my staff 
includes: It doesn't go against one of our specific community 
standards.
    So do you see the tension between your public testimony 
today, your terms of service, and then your conduct when you're 
presented with violent calls to shoot people who are members of 
my party at baseball practice?
    Ms. Bickert. Congressman, there's no place for any calls 
for violence on Facebook. I will certainly follow up after the 
hearing and make sure that we're addressing content you bring 
to our attention.
    Mr. Gaetz. Thank you. Yeah. I mean, I brought it to your 
attention when I emailed it to you. And then I brought it to 
your attention when I went to Facebook with Mr. Ratcliffe. We 
went to California. We went to your corporate headquarters. I 
showed these posts to your executives. And the response I got 
from your executives is: Well, we removed those specific posts, 
but we're not going to remove the entire page.
    So I guess if a page hosts repeated content that threatens 
violence and that references the shooting of Republicans at a 
baseball game, why would you not remove the page?
    Ms. Bickert. Thank you, Congressman. Okay. So these posts 
were removed but the page has not been removed. Is that 
correct?
    Mr. Gaetz. Correct.
    Ms. Bickert. Okay. So we remove pages or groups or profiles 
when there is a certain threshold of violations that has been 
met. So--and this depends. If somebody, for instance, posts 
child sexual abuse imagery, their account will come 
down right away. But there are different thresholds depending 
on different violations. So I can follow up with you on that.
    Mr. Gaetz. Yeah. How many does a page have to encourage 
violence against Republican Members of Congress at baseball 
practice before you will ban the page?
    Ms. Bickert. Congressman, I'm happy to look into this and 
look at the page specifically and then come back to you with an 
answer.
    Mr. Gaetz. You agree this is a mistake, right?
    Ms. Bickert. These--the posts should not be on Facebook. I 
have to look at a specific page before--with my team before we 
can----
    Mr. Gaetz. Do you think that this page should be hosted on 
Facebook with these multiple calls for violence against people 
in my party?
    Ms. Bickert. Congressman, I personally have not seen the 
page on Facebook. But I will look into----
    Mr. Gaetz. You've seen these posts, though, right?
    Mr. Raskin. Would the chairman yield?
    Mr. Gaetz. Yeah.
    Mr. Raskin. I just had a question. I'm agreeing with the 
chairman about this. And I think we arrived at the exact same 
place when we were talking about at what threshold does 
Infowars have their page taken down after they repeatedly deny 
the historical reality of massacres of children in public 
schools.
    And so when you follow up with it, and obviously you want 
to look into specifics of the case, I would love it if you 
would follow up also about Alex Jones and Infowars. You're 
saying certain content has been taken down when they are 
taunting the students from Parkland, but at what point does the 
whole page get taken down?
    And I agree, certainly, that the--that these posts should 
be taken down that the chairman's talking about.
    I yield back.
    Mr. Gaetz. I thank the gentleman, and would concur with his 
sentiments.
    My time has expired.
    And seeing no further business before the committee, this 
concludes today's hearing.
    Thank you to the distinguished witnesses for attending.
    Without objection, all members will have 5 legislative days 
to submit additional written questions for the witnesses or 
additional materials for the record.
    The hearing is adjourned.
    [Whereupon, at 12:54 p.m., the committee was adjourned.]