
Fake news website: Difference between revisions

From Wikipedia, the free encyclopedia
Sagecandor (talk | contribs)
copy edit Germany
[[Chancellor of Germany|German Chancellor]] [[Angela Merkel]] lamented the problem of fraudulent news reports in a November 2016 speech, days after announcing her campaign for a fourth term as leader of her country.<ref name=merkelwarns>{{citation|url=https://www.yahoo.com/news/merkel-warns-against-fake-news-driving-populist-gains-110054526.html|accessdate=23 November 2016|date=23 November 2016|title=Merkel warns against fake news driving populist gains|work=[[Yahoo! News]]|agency=[[Agence France-Presse]]}}</ref> In a speech to the German parliament, Merkel was critical of such fake sites: "Something has changed -- as globalisation has marched on, (political) debate is taking place in a completely new media environment. Opinions aren't formed the way they were 25 years ago. Today we have fake sites, bots, trolls -- things that regenerate themselves, reinforcing opinions with certain algorithms and we have to learn to deal with them."<ref name=merkelwarns /> She warned that such fraudulent news websites were a force increasing the power of [[populism|populist]] extremism.<ref name=merkelwarns /> Merkel called fraudulent news a growing phenomenon that might need to be regulated in the future.<ref name=merkelwarns />


Germany's foreign intelligence agency [[Federal Intelligence Service (Germany)|Federal Intelligence Service]] chief, Bruno Kahl, warned of the potential for [[cyberattacks]] by Russia in the 2017 German election.<ref name=fisspychief>{{citation|accessdate=1 December 2016|url=http://www.ibtimes.co.uk/russian-hackers-may-disrupt-germanys-2017-election-warns-spy-chief-1594221|work=[[International Business Times|International Business Times UK edition]]|date=30 November 2016|title=Russian hackers may disrupt Germany's 2017 election warns spy chief|first=Jason|last=Murdock}}</ref> He said the cyberattacks would take the form of the intentional spread of misinformation.<ref name=fisspychief /> Kahl said the goal is to "elicit political uncertainty".<ref name=fisspychief /> Germany's domestic intelligence agency [[Federal Office for the Protection of the Constitution]] chief, [[Hans-Georg Maassen]], said: "The information security of German government, administrative, business, science and research institutions is under permanent threat. ... Russian intelligence agencies are also showing a readiness to [carry out] sabotage."<ref name=fisspychief />


===China===

Revision as of 22:17, 1 December 2016

Fake news websites deliberately publish hoaxes, propaganda, and disinformation, using social media to drive web traffic and amplify their effect. These sites are distinguished from news satire in that they mislead, and profit from, readers' gullibility.[1] Such sites have promoted political falsehoods in Germany,[2] Indonesia and the Philippines,[3] Sweden,[4] China,[5][6] Myanmar,[7][8] and the United States.[9][10][11] Many such sites are hosted in Russia,[9][10][12] Macedonia,[13][14] Romania,[15] and the United States.[16][17]

The Swedish newspaper The Local called fake news a form of psychological warfare.[4] Agence France-Presse noted that media analysts see it as "a threat to democracy itself."[2] In 2016, the European Parliament Committee on Foreign Affairs passed a resolution warning that the Russian government was using "pseudo-news agencies" and "internet trolls" to assail democracy.[12]

In 2015, Sweden's national security agency, the Swedish Security Service, concluded that Russia was using the tactic to inflame "splits in society".[4] Sweden's Ministry of Defence tasked its Civil Contingencies Agency with combating fake news from Russia.[4] Fraudulent news affected politics in Indonesia and the Philippines, where social media use was widespread but the means to check a story's veracity were limited.[3] German Chancellor Angela Merkel warned against "fake sites, bots, trolls" and lamented their societal impact.[2]

Fraudulent articles escalated during the 2016 U.S. presidential election.[9][10][11] U.S. Intelligence Community officials said Russia was engaged in spreading fake news.[18] The computer security company FireEye concluded that Russia used social media as a weapon of cyberwarfare.[19] Google and Facebook banned fake news sites from using their online advertising services.[20][21] U.S. President Barack Obama said a disregard for facts had created a "dust cloud of nonsense".[22] These concerns advanced bipartisan legislation in the U.S. Senate to authorize U.S. State Department action against foreign propaganda.[23] U.S. Senate Intelligence Committee member Ron Wyden said: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."[23]

Prominent sources

Prominent among fraudulent news sites are false propaganda outlets created by individuals in Russia,[9][10][12] Macedonia,[13][14] Romania,[15] and the United States.[16][17] Several of these websites are structured to fool visitors into believing they are genuine publications, mimicking the stylistic appearance of ABC News and MSNBC, while other pages are straightforward propaganda.[14]

Russia

Internet Research Agency

An aerial view of the Smolny Convent in Saint Petersburg
A Russian propaganda "troll farm" was traced back to Saint Petersburg.

Beginning in fall 2014, The New Yorker writer Adrian Chen performed a six-month-long investigation into Russian propaganda campaigns on the Internet orchestrated by a group that called itself the Internet Research Agency.[24] Evgeny Prigozhin, a close associate of Vladimir Putin, was behind the operation which hired hundreds of individuals to work in Saint Petersburg to support Russian government views online.[24]

The Internet Research Agency came to be regarded as a "troll farm", a term for a propaganda operation that controls many online accounts with the aim of artificially providing the semblance of a grassroots organization.[24] Chen reported that the Russian government adopted Internet trolling as a tactic largely after observing the organic social media organization of the 2011 protests against Putin.[24]

Chen interviewed reporters and political activists in Russia, and was told that the Russian government's end goal in using fake news was not to persuade particular readers that it was factual, but simply to sow discord and chaos online.[24] Chen wrote: "The real effect, the Russian activists told me, was not to brainwash readers but to overwhelm social media with a flood of fake content, seeding doubt and paranoia, and destroying the possibility of using the Internet as a democratic space."[24]

EU regulation of Russian fake news

Building of the European Union's Committee on Foreign Affairs
The European Parliament Committee on Foreign Affairs drew greater attention to the problem when it passed a resolution in November 2016 condemning "pseudo-news agencies ... social media and internet trolls" used by Russia.

In 2015, the Organization for Security and Co-operation in Europe released an analysis highly critical of disinformation campaigns in which Russia sought to appear as a source of legitimate news reporting.[25] These propaganda campaigns were intended to interfere with Ukraine's relations with Europe after the removal of former Ukrainian president Viktor Yanukovych from power.[25] According to Deutsche Welle, "The propaganda in question employed similar tactics used by fake news websites during the US elections, including misleading headlines, fabricated quotes and misreporting".[25] This propaganda motivated the European Union to create a special taskforce to deal with disinformation campaigns originating from Russia.[12][25][26]

Foreign Policy reported that the taskforce, called the East StratCom Team, "employs 11 mostly Russian speakers who scour the web for fake news and send out biweekly reviews highlighting specific distorted news stories and tactics."[27] The European Union voted to increase funding for the taskforce in November 2016.[27]

Deutsche Welle noted: "Needless to say, the issue of fake news, which has been used to garner support for various political causes, poses a serious danger to the fabric of democratic societies, whether in Europe, the US or any other nation across the globe."[25]

In November 2016, the European Parliament Committee on Foreign Affairs passed a resolution warning of Russia's use of tools including "pseudo-news agencies ... social media and internet trolls" as forms of propaganda and disinformation intended to weaken democratic values.[12] The resolution requested that media analysts within the European Union investigate, citing "the limited awareness amongst some of its member states, that they are audiences and arenas of propaganda and disinformation."[12] The resolution condemned Russian sources for publicizing "absolutely fake" news reports, and passed on 23 November 2016 by a margin of 304 votes to 179.[28]

Observations

Gleb Pavlovsky, who assisted in creating a propaganda program for the Russian government prior to 2008, told The New York Times in August 2016: "Moscow views world affairs as a system of special operations, and very sincerely believes that it itself is an object of Western special operations. I am sure that there are a lot of centers, some linked to the state, that are involved in inventing these kinds of fake stories."[29]

Anders Lindberg, a Swedish attorney and reporter, explained a common pattern of fake news distribution: "The dynamic is always the same: It originates somewhere in Russia, on Russia state media sites, or different websites or somewhere in that kind of context. Then the fake document becomes the source of a news story distributed on far-left or far-right-wing websites. Those who rely on those sites for news link to the story, and it spreads. Nobody can say where they come from, but they end up as key issues in a security policy decision."[29]

Counter-Disinformation Team

Logo of the United States Department of State
The United States Department of State spent eight months creating a unit to counter Russian disinformation campaigns against the U.S. before scrapping the program in September 2015.

The International Business Times reported that the United States Department of State had planned to deploy a unit formed specifically to fight back against disinformation from the Russian government, but that the unit was disbanded in September 2015 after department heads within the State Department failed to foresee the peril of the propaganda in the months immediately prior to the 2016 U.S. presidential campaign.[30] The State Department had put eight months of work into developing the counter-disinformation unit before deciding to scrap it.[30]

Titled the Counter-Disinformation Team, the program would have been a reboot of the Active Measures Working Group set up by the Reagan Administration, which had operated under the auspices of the U.S. State Department and the United States Information Agency.[31][32] The Counter-Disinformation Team was set up within the Bureau of International Information Programs of the State Department.[31][32] Work on the team began under the Obama Administration in 2014.[31][32] Its intention was to combat propaganda from Russian sources such as Russia Today.[31][32] A beta version of its website had been built and was ready to go live, and several staff members had been hired, before the program was canceled.[31][32] United States Intelligence Community officials explained to former National Security Agency analyst and counterintelligence officer John R. Schindler that the Obama Administration canceled the Counter-Disinformation Team because it feared antagonizing the Russian government.[31][32]

Under Secretary of State for Public Diplomacy and Public Affairs Richard Stengel was the point person at the U.S. State Department for the Counter-Disinformation Team before it was canceled.[31][32] Stengel had previous experience with the issue, having written publicly for the State Department about the disinformation campaign by the Russian government and Russia Today.[33] After United States Secretary of State John Kerry called Russia Today a "propaganda bullhorn" for Russian president Vladimir Putin,[34] Russia Today insisted that the State Department give an "official response" to Kerry's statement.[33][35] In his response, Stengel wrote for the State Department that Russia Today engaged in a "disinformation campaign".[33][35] Stengel spoke out against the spread of fake news and explained the difference between reporting and propaganda: "Propaganda is the deliberate dissemination of information that you know to be false or misleading in order to influence an audience."[33][35]

A representative for the U.S. State Department explained to the International Business Times in a statement after being contacted regarding the closure of the Counter-Disinformation Team: "The United States, like many other countries throughout Europe and the world, has been concerned about Russia's intense propaganda and disinformation campaigns. We believe the free flow of reliable, credible information is the best defense against the Kremlin's attack on the truth."[30]

Peter Kreko of the Hungary-based Political Capital Institute spoke to International Business Times about his work studying the disinformation initiatives by the Russian government, and said: "I do think that the American [Obama] administration was caught not taking the issue seriously enough and there were a lot more words than action."[30] Kreko recounted that employees within the U.S. government told him they were exasperated due to the "lack of strategy, efficiency and lack of taking it seriously" regarding the information warfare by the Russian government against the United States.[30]

Further role in 2016 U.S. presidential election

In December 2015, Adrian Chen observed a strange pattern: online accounts he had been monitoring for their support of Russia had suddenly become highly supportive of 2016 U.S. presidential candidate Donald Trump.[36] Chen said: "I created this list of Russian trolls. And I check on it once in a while, still. And a lot of them have turned into conservative accounts, like fake conservatives. I don’t know what’s going on, but they’re all tweeting about Donald Trump and stuff."[36] The Daily Beast reported in August 2016: "Fake news stories from Kremlin propagandists regularly become social media trends."[36] Writers Andrew Weisburd and Clint Watts observed: "The synchronization of hacking and social media information operations not only has the ability to promote a favored candidate, like Trump, but also has the potential to incite unrest amongst American communities."[36]

David DeWalt, the chairman of computer security company FireEye
David DeWalt, chairman of computer security company FireEye, concluded that the intelligence operation during the 2016 U.S. election by the Russian government was a new development in cyberwarfare by Russia.

On 24 November 2016, The Washington Post reported that two independent teams of researchers had confirmed that Russian-organized propaganda during the 2016 U.S. presidential election helped foment criticism of Democratic candidate Hillary Clinton and support for Republican candidate Donald Trump.[9][37] The Russian propaganda effort was conducted through the deliberate proliferation of fake news.[9][10][11] The strategy involved social media users, Internet trolls working for hire, botnets, and organized websites designed to cast Clinton in a negative light.[9][10][11] Foreign Policy Research Institute fellow Clint Watts, who monitored propaganda from Russia, stated that its tactics were similar to those used during the Cold War, only now spread deliberately through social media to more powerful effect.[9] Watts stated the goal of Russia was to "essentially erode faith in the U.S. government or U.S. government interests."[9] Watts's research, conducted with colleagues Andrew Weisburd and J.M. Berger, was published in November 2016.[9]

Separately, the group PropOrNot[a] came to similar conclusions about involvement by Russia in propagating fake news during the 2016 U.S. election.[9][10] PropOrNot analyzed data from Twitter and Facebook and tracked propaganda from the disinformation campaign by Russia to a national reach of 15 million people within the United States.[9][10] PropOrNot concluded that accounts belonging to both Russia Today and Sputnik News promoted "false and misleading stories in their reports", and additionally magnified other false articles found on the Internet to support their propaganda effort.[9] The executive director of PropOrNot told The Washington Post: "The way that this propaganda apparatus supported Trump was equivalent to some massive amount of a media buy. It was like Russia was running a super PAC for Trump’s campaign. ... It worked."[9] The conclusions of both the separate investigations by Foreign Policy Research Institute and PropOrNot were confirmed by prior research from the Elliott School of International Affairs at George Washington University and by the RAND Corporation.[9]

The Washington Post and PropOrNot received criticism from The Intercept,[38] Fortune,[39] and Rolling Stone.[40] Matthew Ingram of Fortune felt that PropOrNot cast too wide a net in identifying fake news websites.[39] The Intercept journalists Glenn Greenwald and Ben Norton were highly critical of the organization's inclusion of Naked Capitalism on its list.[38] The Intercept called the reporting by The Washington Post "shoddy",[38] and Fortune called the evidence "flimsy".[39] Writing for Rolling Stone, Matt Taibbi described the report as "astonishingly lazy" and questioned the methodology used by PropOrNot as well as the lack of information about who was behind the organization.[40] The Washington Post article was also criticized in an opinion piece in the paper itself by Katrina vanden Heuvel,[41] who wrote that the websites listed by PropOrNot "include RT and Sputnik News, which are funded by the Russian government, but also independent sites such as Naked Capitalism, Truthout and the right-wing Drudge Report."[41]

Ari Shapiro on the National Public Radio program All Things Considered interviewed Washington Post journalist Craig Timberg, who explained how the initiative operated: "There's legions of botnets and paid human trolls that collect information and tweet it to one another and amplify it online. And that makes these stories that in many cases are false or misleading look much bigger than they are. And they are more likely to end up trending on Google News or end up in your Facebook feed."[37] Timberg explained there were "thousands and thousands of social media accounts" working for Russia together that functioned as a "massive online chorus".[37] Timberg stated Russia had a vested interest in the 2016 U.S. election due to a dislike for Hillary Clinton over the 2011–13 Russian protests, and exhibited a "fondness for Donald Trump".[37] Timberg concluded, "Undermining [United States] democracy and our claims to having a clean democracy were important goals to the Russians."[37]

Multiple officials within the United States Intelligence Community told BuzzFeed News on 30 November 2016 that they believed the Russian government was actively engaged in spreading fake news.[18] One U.S. intelligence official stated: "They’re doing this continuously, that’s a known fact."[18] Another said: "This is beyond propaganda, that’s my understanding."[18] Mark Galeotti, a senior research fellow at the Institute of International Relations Prague and a scholar of Russian intelligence, explained the motivation behind the Kremlin's operations: "The most significant aspect of today’s Russian active measures is precisely that: undermining and fragmenting the West."[18]

Bloomberg News reported that the computer security company FireEye concluded the Russian government had used social media as a strategic weapon to sway perceptions of the U.S. election.[19] FireEye Chairman David DeWalt told Bloomberg News the 2016 intelligence operation was a new development in cyberwarfare by Russia.[19] DeWalt stated: "The dawning of Russia as a cyber power is at a whole other level than it ever was before. We’ve seen what I believe is the most historical event maybe in American democracy history in terms of the Russian campaign."[19] FireEye CEO Kevin Mandia stated that the tactics of Russian propaganda cyberwarfare changed significantly after fall 2014, from covert computer hacking to more overt tactics with decreased concern for operational security or for being revealed as an intelligence operation.[19] Mandia concluded: "That’s a change in the rules of engagement."[19]

Macedonia

The town of Veles in Macedonia
Fraudulent news stories during the 2016 U.S. election were traced by a BuzzFeed investigation to teenagers in the town of Veles, Macedonia.

A significant amount of fraudulent news during the 2016 United States election cycle came from teenagers in Macedonia seeking to profit quickly from readers who believed their falsehoods.[13][42] An investigation by BuzzFeed revealed that more than 100 websites spreading fraudulent articles supportive of Donald Trump were created by teenagers in the town of Veles, Macedonia.[43][44] The Macedonian teenagers experimented with fraudulent stories about Bernie Sanders and other articles with a politically left or liberal slant, but quickly found that their most popular fraudulent writings were about Donald Trump.[43]

The Guardian performed its own independent investigation and reached the same conclusion as BuzzFeed, tracing more than 150 fraudulent news sites to the same town of Veles, Macedonia.[13] One of the Macedonian teenagers, "Alex", interviewed by The Guardian during the election cycle in August 2016, said that fraudulent news websites would remain profitable regardless of whether Trump won or lost the election.[13] He explained that he often began his pieces by plagiarizing content copied and pasted directly from other websites.[13] Alex told The Guardian: "I think my traffic will be fine if Trump doesn’t win. There are too many haters on the net, and all of my audience hates Hillary."[13]

Craig Silverman of BuzzFeed News, one of the investigative journalists who exposed the ties between fraudulent websites and Macedonian teenagers, told Public Radio International that some false stories netted the teenagers a few thousand dollars per day, while most fake articles together earned them on average a few thousand dollars per month.[45] Public Radio International reported that after the 2016 election season the teenagers would likely return to making money from fraudulent medical advice websites, which Silverman noted was where most of them had garnered clickbait revenues before the election season.[45]

Romania

"Ending the Fed", a popular purveyor of fraudulent reports, was run out of Romania by a 24-year-old named Ovidiu Drobota, who boasted to Inc. magazine about being more popular than "the mainstream media".[15] "Ending the Fed" was responsible for a false story in August 2016 that incorrectly claimed Fox News had fired journalist Megyn Kelly; the story was briefly prominent in Facebook's "Trending News" section.[15] "Ending the Fed" was behind four of the 10 most popular fake articles on Facebook related to the 2016 U.S. election in the three months before the election.[15] The website's Facebook page, called "End the Feed", had 350,000 "likes" in November 2016.[15]

After being contacted by Inc. magazine, Drobota stated he was proud of the impact he had had on the 2016 U.S. election in favor of his preferred candidate, Donald Trump.[15] According to Alexa Internet, "Ending the Fed" garnered approximately 3.4 million views over a 30-day period in November 2016.[15] Drobota stated that the majority of incoming traffic was from Facebook.[15] He said his normal line of work before starting "Ending the Fed" had included web development and search engine optimization.[15]

United States

Homepage of the fake news website RealTrueNews, which states on its main page: "Everything on RealTrueNews Was A LIE".
RealTrueNews was intended to demonstrate reader gullibility; its fiction was widely believed to be factual.[46][47][48]

U.S. News & World Report warned readers to be wary of popular fraudulent news sites composed of either outright hoaxes or propaganda, and recommended the website Fake News Watch for a listing of such problematic sources.[49]

Marco Chacon created the fake news website RealTrueNews to show his alt-right friends "how ridiculous" their gullibility toward such websites was.[46][47] In one story, Chacon wrote a fake transcript of Hillary Clinton's leaked speeches in which Clinton explains bronies to Goldman Sachs bankers.[46][47] Chacon was shocked when his fake article was reported as factual by Fox News and he heard his own creation cited on The Kelly File, hosted by Megyn Kelly.[46][47] Trace Gallagher repeated Chacon's fiction verbatim, falsely reporting that Clinton had called Bernie Sanders supporters a "bucket of losers", a phrase made up and written by Chacon himself.[46] Megyn Kelly apologized and issued a public retraction after emphatic denials from representatives for Hillary Clinton.[46][47][48]

After his fabricated stories were believed as factual and shared and viewed tens of thousands of times, Chacon told Brent Bambury of the CBC Radio One program Day 6 that he was so shocked at Internet consumers' credulity that it felt like an episode of The Twilight Zone.[48] In an interview with ABC News, Chacon defended his site as an over-the-top parody of other fake news sites, meant to show their readers how ridiculous such sites were: "The only way I could think of to have a conversation with these people is to say, 'if you have a piece of crazy fake news, look I got one too, and it’s even crazier, it’s absurd.'"[50]

The Daily Beast reported on the popularity of Chacon's fiction being reported as if it were factual: "Chacon’s stories are regularly accepted as fact in the pro-Trump message board canon. YouTube videos with tens of thousands of views exist solely to reinforce sentences and ideas Chacon dreamed up on his laptop in the middle of the night."[46] In a follow-up piece Chacon wrote as a contributor for The Daily Beast after the 2016 U.S. election, he concluded: "When the only news you are willing to believe is partisan news, you are susceptible to stories written 'in your language' that are complete, obvious, utter fabrications."[47]

Jestin Coler of Los Angeles is the founder and CEO of Disinfomedia, a company that owns many fake news websites. He had previously given interviews about fake news to multiple media organizations under the pseudonym Allen Montgomery, in order to evade personal scrutiny.[16] With the help of technology-company engineer John Jansen, journalists from NPR uncovered Coler's identity. After being identified as Disinfomedia's owner, Coler agreed to an interview.[16] Coler explained how the original intent of his project backfired: "The whole idea from the start was to build a site that could kind of infiltrate the echo chambers of the alt-right, publish blatantly false or fictional stories and then be able to publicly denounce those stories and point out the fact that they were fiction."[16] He stated that his company had attempted to write fraudulent reports from a left-wing perspective, but found those articles were not shared nearly as much as fake news from a right-wing point of view.[16] Coler told NPR that consumers of information must be more skeptical of content in order to combat fake news: "Some of this has to fall on the readers themselves. The consumers of content have to be better at identifying this stuff. We have a whole nation of media-illiterate people. Really, there needs to be something done."[16]

Paul Horner, a creator of fraudulent news stories, stated in an interview with The Washington Post that he was making approximately US$10,000 a month through advertisements linked to the fraudulent news.[17][51][52] He claimed to have posted a fraudulent advertisement to Craigslist offering thousands of dollars in payment to protesters, and to have written a story based on this which was later shared online by Trump's campaign manager.[17][51][52] Horner believed that when the stories were shown to be false, this would reflect badly on Trump's supporters who had shared them, but concluded "Looking back, instead of hurting the campaign, I think I helped it. And that feels [bad]."[53]

In a follow-up interview with Rolling Stone, Horner said The Washington Post profile had spurred greatly increased interest, with over 60 interview requests from media outlets including ABC News, CBS News, and Inside Edition.[54] Horner explained that his articles were written to appear legitimate at the start and become increasingly absurd as the reader progressed: "Most of my stuff, starts off, the first paragraph is super legit, the title is super legit, the picture is super legit, but then the story just gets more and more ridiculous and it becomes obvious that none of it is true."[54] Horner told Rolling Stone that he always placed his own name in his fake articles as a fictional character.[54] He said he supported efforts to reduce fake news websites.[54]

Impacts by country

Fake news has influenced political discourse in multiple countries, including Germany,[2] Indonesia and the Philippines,[3] Sweden,[4] China,[5][6] Myanmar,[7][8] and the United States.[9][10][11]

Sweden

Logo of the Swedish Security Service
The Swedish Security Service's 2015 report identified Russian propaganda intended to "create splits in society."

The Swedish Security Service issued a report in 2015 identifying Russian propaganda infiltrating Sweden with the objective to "spread pro-Russian messages and to exacerbate worries and create splits in society."[4]

The Swedish Civil Contingencies Agency (MSB), part of the Ministry of Defence of Sweden, identified fake news reports targeting Sweden in 2016 which originated from Russia.[4] Swedish Civil Contingencies Agency official Mikael Tofvesson stated: "This is going on all the time. The pattern now is that they pump out a constant narrative that in some respects is negative for Sweden."[4]

The Local identified these tactics as a form of psychological warfare.[4] The newspaper reported the MSB identified Russia Today and Sputnik News as "important channels for fake news".[4] As a result of growth in this propaganda in Sweden, the MSB planned to hire six additional security officials to fight back against the campaign of fraudulent information.[4]

2016 U.S. presidential election

U.S. President Barack Obama
U.S. President Barack Obama said, "If we can't discriminate between serious arguments and propaganda, then we have problems."

Fraudulent stories popularized on Facebook during the 2016 U.S. presidential election included a viral post claiming that Pope Francis had endorsed Donald Trump, and another claiming that actor Denzel Washington "backs Trump in the most epic way possible".[55]

Donald Trump's son and campaign surrogate Eric Trump, top national security adviser Michael T. Flynn, and then-campaign managers Kellyanne Conway and Corey Lewandowski shared fake news stories during the campaign.[53][56][57][58]

U.S. President Barack Obama commented on the significant problem of fraudulent information on social networks impacting elections, in a speech the day before Election Day in 2016: "The way campaigns have unfolded, we just start accepting crazy stuff as normal. And people, if they just repeat attacks enough and outright lies over and over again, as long as it's on Facebook, and people can see it, as long as it's on social media, people start believing it. And it creates this dust cloud of nonsense."[22][59]

Shortly after the election, Obama again commented on the problem, saying in an appearance with German Chancellor Angela Merkel: "If we are not serious about facts and what’s true and what's not, and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can't discriminate between serious arguments and propaganda, then we have problems."[57][60]

One prominent fraudulent news story released after the election—that protesters at anti-Trump rallies in Austin, Texas, were "bused in"—started as a tweet by one individual with 40 Twitter followers.[61] Over the next three days, the tweet was shared at least 16,000 times on Twitter and 350,000 times on Facebook, and promoted in the conservative blogosphere, before the individual stated that he had fabricated his assertions.[61]

BuzzFeed called the problem an "epidemic of misinformation".[44] According to BuzzFeed's analysis, the 20 top-performing election news stories from fraudulent sites generated more shares, reactions, and comments on Facebook than the 20 top-performing stories from 19 major news outlets.[62][63]

Howard Kurtz, host of the Fox News media-analysis program Media Buzz, acknowledged fraudulent news was a serious problem, relying heavily upon the BuzzFeed analysis for his reporting on the controversy.[63] Kurtz wrote that "Facebook is polluting the media environment with garbage".[63] Citing the BuzzFeed investigation, Kurtz pointed out: "The legit stuff drew 7,367,000 shares, reactions and comments, while the fictional material drew 8,711,000 shares, reactions and comments."[63] Kurtz concluded that Facebook founder Mark Zuckerberg must admit the website is a media company: "But once Zuckerberg admits he's actually running one of the most powerful media brands on the planet, he has to get more aggressive about promoting real news and weeding out hoaxers and charlatans. The alternative is to watch Facebook's own credibility decline."[63]

Worries grew that fake news spread by the Russian government had swayed the outcome of the election, and representatives in the U.S. Congress took action to safeguard the national security of the United States by advancing legislation to monitor incoming propaganda from external threats.[23][64] On 30 November 2016, legislators approved a measure within the National Defense Authorization Act asking the U.S. State Department to take action against foreign propaganda through an interagency panel.[23][64] The legislation authorized funding of $160 million over a two-year period.[23]

The initiative was developed from a bipartisan bill written in March 2016 by U.S. Senators Chris Murphy and Rob Portman, titled the Countering Foreign Propaganda and Disinformation Act.[23] Portman stated: "This propaganda and disinformation threat is real, it's growing, and right now the U.S. government is asleep at the wheel. The U.S. and our allies face many challenges, but we must better counter and combat the extensive propaganda and disinformation operations directed against us."[23] Murphy was interviewed by The Washington Post about the legislation and said: "In the wake of this election, it's pretty clear that the U.S. does not have the tools to combat this massive disinformation machinery that the Russians are running."[23] United States Senate Select Committee on Intelligence member Senator Ron Wyden told The Washington Post: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."[23]

Indonesia and Philippines

Fraudulent news has been particularly problematic in Indonesia and the Philippines, where social media has an outsized political influence.[3] According to media analysts, "many developing countries with populations new to both democracy and social media" are particularly vulnerable to the influence of fraudulent news.[3] In some developing countries, "Facebook even offers free smartphone data connections to basic public online services, some news sites and Facebook itself — but limits access to broader sources that could help debunk fake news."[3]

Germany

German Chancellor Angela Merkel lamented the problem of fraudulent news reports in a November 2016 speech, days after announcing her campaign for a fourth term as leader of her country.[2] In a speech to the German parliament, Merkel was critical of such fake sites: "Something has changed -- as globalisation has marched on, (political) debate is taking place in a completely new media environment. Opinions aren't formed the way they were 25 years ago. Today we have fake sites, bots, trolls -- things that regenerate themselves, reinforcing opinions with certain algorithms and we have to learn to deal with them."[2] She warned that such fraudulent news websites were a force increasing the power of populist extremism.[2] Merkel called fraudulent news a growing phenomenon that might need to be regulated in the future.[2]

Bruno Kahl, chief of Germany's foreign intelligence agency, the Federal Intelligence Service, warned of the potential for Russian cyberattacks in the 2017 German election.[65] He said the cyberattacks would take the form of the intentional spread of misinformation.[65] Kahl said the goal is to "elicit political uncertainty".[65] Hans-Georg Maassen, chief of Germany's domestic intelligence agency, the Federal Office for the Protection of the Constitution, said: "The information security of German government, administrative, business, science and research institutions is under permanent threat. ... Russian intelligence agencies are also showing a readiness to [carry out] sabotage."[65]

China

The government of China used the growing problem of fake news as a rationale for increasing Internet censorship in China in November 2016.[66] China's Communist Party newspaper The Global Times published an editorial titled "Western Media's Crusade Against Facebook", criticizing the "unpredictable" political problems posed by the freedoms enjoyed by users of Twitter, Google, and Facebook.[5] Chinese government leaders meeting in Wuzhen at the third World Internet Conference in November 2016 said fake news in the U.S. election justified adding more curbs to free and open use of the Internet.[6] Ren Xianliang, deputy minister of the Cyberspace Administration of China, said increasing online participation led to additional "harmful information" and that "intimidation and fraud are more common than ever".[67] Kam Chow Wong, a former Hong Kong law enforcement official and criminal justice professor at Xavier University, said at the conference: "It's a good move that the U.S. is trying to regulate social media; it's overdue."[68] The Wall Street Journal noted that China's themes of Internet censorship gained fresh relevance at the World Internet Conference due to the outgrowth of fake news: "China's efforts to promote its concept of the internet had fresh resonance as Western minds now debate whether social media sites should screen out fake news".[69]

Myanmar

Fake news negatively affected individuals in Myanmar, leading to a rise in violence against Muslims in the country.[7][8] Online participation surged from one percent to 20 percent of Myanmar's total populace between 2014 and 2016.[7][8] Fake stories from Facebook grew so influential that, for those without Internet access, they were reprinted in paper periodicals called Facebook and The Internet, which simply regurgitated the website's newsfeed text, often without factual oversight.[8] False reporting about practitioners of Islam in the country was directly correlated with increased attacks on Muslims in Myanmar, as well as protests against them.[7][8]

BuzzFeed News journalist Sheera Frenkel reported: "there has also been an increase in articles that demonize the country’s minority Muslim community, with fake news claiming that vast hordes of Muslim worshippers are attacking Buddhist sites. These articles, quickly shared and amplified on social media, have correlated with a surge in anti-Muslim protests and attacks on local Muslim groups."[7][8] Frenkel noted countries that were relatively newer to Internet exposure were more susceptible to the problem, writing: "Countries like Myanmar, which come online quickly and without many government-backed programs to teach safe internet habits — like secure passwords and not revealing personal details online — rank among the lowest in digital literacy. They are the most likely to fall for scams, hacks, and fake news."[8]

Responses

Google CEO comment and actions

A screenshot of a fake news story, falsely claiming Donald Trump won the popular vote in the 2016 United States presidential election.[70][71]
Google CEO Sundar Pichai
Google CEO Sundar Pichai has said there should be "no situation where fake news gets distributed" and that it is possible fake news had some effect on the 2016 election.

In the aftermath of the 2016 U.S. presidential election, Google and Facebook faced increased scrutiny over the role of fake-news websites in the election.[72] The top Google result for the outcome of the race linked to a fraudulent news site.[73] "70 News" had fraudulently written a headline and article claiming that Donald Trump won the popular vote against Hillary Clinton in the 2016 U.S. election.[70][71][72] Regarding the false results posted on "70 News", Google later stated that the site's prominence in search results was a mistake: "In this case we clearly didn't get it right, but we are continually working to improve our algorithms."[74] By Monday, November 14, the "70 News" result was the second link people saw when searching for results of the race.[72]

When asked shortly after the election whether fraudulent news sites could have changed the election's results, Google CEO Sundar Pichai responded: "Sure" and went on to emphasize the importance of stopping the spread of fraudulent news sites: "Look, it is important to remember this was a very close election and so, just for me, so looking at it scientifically, one in a hundred voters voting one way or the other swings the election either way. ... From our perspective, there should just be no situation where fake news gets distributed, so we are all for doing better here."[75]

On 14 November 2016, Google responded to the growing problem of fraudulent news sites by banning such companies from profiting on advertising from traffic to false articles through its marketing program AdSense.[20][21][72] The company already had a policy of denying ads for dieting ripoffs and counterfeit merchandise.[76] Google stated upon the announcement: "We've been working on an update to our publisher policies and will start prohibiting Google ads from being placed on misrepresentative content. Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the web property."[77] This builds upon one of Google's existing better-advertisement policies, wherein misleading advertising is already banned from Google AdSense.[72][78] The ban is not expected to apply to news satire sites like The Onion, although some satirical sites may be inadvertently blocked under the new system.[72]

Facebook deliberations

Blocking fraudulent advertisers

Facebook CEO Mark Zuckerberg
Facebook CEO Mark Zuckerberg specifically recommended fact-checking website Snopes.com as a way to respond to fraudulent news on Facebook.

The day after Google acted, Facebook made a similar move, blocking fake news sites from advertising on its website.[21][72] Facebook explained its new policy: "We do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news. ... We have updated the policy to explicitly clarify that this applies to fake news. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance."[77] The steps by both Google and Facebook were intended to deny ad revenue to fraudulent news sites; neither company took action to prevent dissemination of false stories in search engine results pages or web feeds.[20][79]

Facebook CEO Mark Zuckerberg said, in a post to his website on the issue, that the notion that fraudulent news sites had impacted the 2016 election was a "crazy idea".[80][81] Zuckerberg rejected the suggestion that his website had played any role in the outcome of the election, describing the idea as "pretty crazy".[82] In a blog post, he stated that more than 99% of content on Facebook was authentic (i.e. neither fake news nor a hoax).[83] In the same blog post, he stated: "News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance."[84] Separately, Zuckerberg advised Facebook users to check the fact-checking website Snopes.com whenever they encounter fake news on Facebook.[85][86]

Top staff members at Facebook did not feel that simply blocking ad revenue from fraudulent sites was a strong enough response to the problem, and created a secret group to deal with the issue themselves.[80][81] In response to Zuckerberg's initial statement that fraudulent news did not impact the 2016 election, the secret Facebook group disputed the claim: "It's not a crazy idea. What's crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season."[80][81] BuzzFeed reported that the secret task force included "dozens" of Facebook employees.[80][81]

Response

Facebook faced mounting criticism in the days after its decision to solely revoke advertising revenue from fraudulent news providers without taking further action.[87][88] After a week of negative media coverage, including assertions that the proliferation of fraudulent news on Facebook had handed the 2016 U.S. presidential election to Donald Trump, Mark Zuckerberg published a second post on the issue on 18 November 2016.[87][88] The post reversed his earlier comments discounting the impact of fraudulent news.[88]

Zuckerberg said that filtering out fraudulent news was inherently difficult: "The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible."[87] The New York Times reported that measures being considered but not yet implemented by Facebook included "third-party verification services, better automated detection tools and simpler ways for users to flag suspicious content."[87] The 18 November post did not announce any concrete actions the company would definitively take, or when such measures would be put into use on the website.[87][88]

Many people commented positively under Zuckerberg's second post on fraudulent news.[89] National Public Radio observed that the changes being considered by Facebook to identify fraud constituted progress for the company: "Together, the projects signal another step in Facebook's evolution from its start as a tech-oriented company to its current status as a complex media platform."[89] On 19 November 2016, BuzzFeed advised Facebook users that they could report posts from fraudulent news websites by choosing the report option "I think it shouldn't be on Facebook", followed by "It's a false news story."[90]

Impact

Fake news proliferation on Facebook had a negative financial impact for the company. The Economist reported: "Brian Wieser of Pivotal Research recently wrote that the focus on fake news and the concerns over the measurement of advertising could well cut revenue growth by a couple of percentage points."[91]

Shortly after Mark Zuckerberg's second statement on fake news proliferation, The New York Times reported that Facebook would assist the government of China with a version of its software that would allow increased censorship by the government.[92] Barron's contributor William Pesek was highly critical of this move, writing: "By effectively sharing its fake news problem with the most populous nation, Facebook would be a pawn of [China's President Xi] Jinping's intensifying censorship push."[92]

Fact-checking websites and journalists

Fact-checking websites play a role as debunkers of fraudulent news reports.[93][94][95] Such sites saw large increases in readership and web traffic during the 2016 U.S. election cycle.[93][94] FactCheck.org,[b] PolitiFact.com,[c] Snopes.com,[d] and "The Fact Checker" section of The Washington Post[e] are prominent fact-checking websites that played an important role in debunking fraud.[85][93][95][101] The New Yorker writer Nicholas Lemann, writing on how to address fake news, called for increasing the roles of FactCheck.org, PolitiFact.com, and Snopes.com in the age of post-truth politics.[102] CNN media analyst Brian Stelter wrote: "In journalism circles, 2016 is the year of the fact-checker."[93]

Logo of PolitiFact
Fact-checking website PolitiFact.com was praised by rival fact-checking service FactCheck.org and recommended as a resource for readers to check before sharing a potentially fake story.

By the close of the 2016 U.S. election season, fact-checking websites FactCheck.org, PolitiFact.com, and Snopes.com, had each authored guides on how to respond to fraudulent news.[1][101][103] FactCheck.org advised readers to check the source, author, date, and headline of publications.[101] They recommended their colleagues Snopes.com, The Washington Post Fact Checker, and PolitiFact.com as important resources to rely upon before re-sharing a fraudulent story.[101] FactCheck.org admonished consumers to be wary of their own biases when viewing media they agree with.[101] PolitiFact.com announced they would tag stories as "Fake news" so that readers could view all fraudulent stories they had debunked.[103] Snopes.com warned readers: "So long as social media allows for the rapid spread of information, manipulative entities will seek to cash in on the rapid spread of misinformation."[1]

The Washington Post's "The Fact Checker" section, which is dedicated to evaluating the truth of political claims, grew greatly in popularity during the 2016 election cycle. Glenn Kessler, who runs the Post's "Fact Checker", wrote that "fact-checking websites all experienced huge surges in readership during the election campaign", with The Fact Checker drawing five times more unique visitors than during the 2012 cycle.[94] Kessler cited research showing that fact-checks are effective at reducing "the prevalence of a false belief."[94] Will Moy, director of the London-based UK fact-checking website Full Fact, said that debunking must take place over a sustained period of time to be truly effective.[94] Full Fact began a partnership with Google to develop multiple products to help automate fact-checking.[104]

FactCheck.org former director Brooks Jackson remarked that larger media companies had devoted increased focus to the importance of debunking fraud during the 2016 election: "It's really remarkable to see how big news operations have come around to challenging false and deceitful claims directly. It's about time."[93] FactCheck.org began a new partnership with CNN journalist Jake Tapper in 2016 to examine the veracity of reported claims by candidates.[93]

Angie Drobnic Holan, editor of PolitiFact.com, noted the circumstances warranted support for the practice: "All of the media has embraced fact-checking because there was a story that really needed it."[93] Holan was heartened that fact-checking garnered increased viewership for those engaged in the practice: "Fact-checking is now a proven ratings getter. I think editors and news directors see that now. So that's a plus."[93] Holan cautioned that heads of media companies must strongly support the practice of debunking, as it often provokes hate mail and extreme responses from zealots.[93]

On 17 November 2016, the International Fact-Checking Network (IFCN) published an open letter on the website of the Poynter Institute to Facebook founder and CEO Mark Zuckerberg, imploring him to utilize fact-checkers to help identify fraud on Facebook.[95][105] Created in September 2015, the IFCN is housed within the St. Petersburg, Florida-based Poynter Institute for Media Studies and aims to support the work of 64 member fact-checking organizations around the world.[106][107] Alexios Mantzarlis, co-founder of FactCheckEU.org and former managing editor of the Italian fact-checking site Pagella Politica, was named director and editor of IFCN in September 2015.[106][107] Signatories to the 2016 letter to Zuckerberg represented fact-checking groups from around the world, including Africa Check, FactCheck.org, PolitiFact.com, and The Washington Post Fact Checker.[95][105] The groups wrote that they were eager to assist Facebook in rooting out fraudulent news sources on the website.[95][105]

In his second post on the matter on 18 November 2016, Zuckerberg responded to the fraudulent news problem by suggesting usage of fact-checking websites.[85][86] He specifically identified fact-checking website Snopes.com, and pointed out that Facebook monitors links to such debunking websites in reply comments as a method to determine which original posts were fraudulent.[85][86] Zuckerberg explained: "Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread."[85][86]

Society of Professional Journalists president Lynn Walsh said in November 2016 that the society would reach out to Facebook to offer assistance with weeding out fake news.[108] Walsh said Facebook should evolve and admit that it functions as a large media company: "The media landscape has evolved. Journalism has evolved, and continues to evolve. So I do hope that while it may not be the original thought that Facebook had, I think they should be now."[108]

Proposed technology tools

New York magazine contributor Brian Feldman responded to an article by media communications professor Melissa Zimdars, and used her list to create a Google Chrome extension that would warn users about fraudulent news sites.[109] He invited others to use his code and improve upon it.[109]

Slate magazine senior technology editor Will Oremus wrote that fraudulent news sites were controversial, and that their prevalence was obscuring a wider discussion about the negative societal impact of people consuming media only from one tailored viewpoint, thereby perpetuating filter bubbles.[110]

Upworthy co-founder and The Filter Bubble author Eli Pariser launched an open-source initiative on 17 November 2016 to address false news, which he called "Design Solutions for Fake News".[111][112] Pariser began a Google Document to collaborate with others online on ways to lessen the phenomenon of fraudulent news.[111][112] The document included recommendations for a ratings organization analogous to the Better Business Bureau, and a database of media producers in a format like Wikipedia's.[111][112]

Writing for Fortune, Matthew Ingram agreed with the idea that Wikipedia could serve as a helpful model to improve Facebook's analysis of potentially fake news.[113] Ingram concluded: "If Facebook could somehow either tap into or recreate the kind of networked fact checking that Wikipedia does on a daily basis, using existing elements like the websites of Politifact and others, it might actually go some distance towards being a possible solution."[113]

Academic analysis

Writing for MIT Technology Review, Jamie Condliffe said that merely banning ad revenue from the fraudulent news sites was not enough action by Facebook to effectively deal with the problem.[42] He wrote: "The post-election furor surrounding Facebook’s fake-news problem has sparked new initiatives to halt the provision of ads to sites that peddle false information. But it’s only a partial solution to the problem: for now, hoaxes and fabricated stories will continue to appear in feeds."[42] Condliffe concluded: "Clearly Facebook needs to do something to address the issue of misinformation, and it’s making a start. But the ultimate solution is probably more significant, and rather more complex, than a simple ad ban."[42]

Indiana University informatics and computer science professor Filippo Menczer commented on the steps by Google and Facebook to deny fraudulent news sites advertising revenue: "One of the incentives for a good portion of fake news is money. This could cut the income that creates the incentive to create the fake news sites."[114] Menczer's research team developed an online tool, Hoaxy, to track the pervasiveness of unconfirmed assertions, as well as related debunking, on the Internet.[115]

Dartmouth College political scientist Brendan Nyhan has criticized Facebook for "doing so little to combat fake news... Facebook should be fighting misinformation, not amplifying it."[62]

Zeynep Tufekci, a writer and academic
Zeynep Tufekci wrote for The New York Times that Facebook "policies entrench echo chambers and fuel the spread of misinformation."

Zeynep Tufekci wrote critically about Facebook's stance on fraudulent news sites in a piece for The New York Times, pointing out that fraudulent websites in Macedonia had profited handsomely from false stories about the 2016 U.S. election: "The company's business model, algorithms and policies entrench echo chambers and fuel the spread of misinformation."[116]

Merrimack College assistant professor of media studies Melissa Zimdars wrote an article, "False, Misleading, Clickbait-y and Satirical 'News' Sources", advising readers how to identify fake news sites.[117] Zimdars identified strange domain names, lack of author attribution, poor website layout, the use of all caps, and URLs ending in "lo" or "com.co" as red flags.[117] In evaluating whether a website contains fake news, Zimdars recommended that readers check the website's "About Us" page and consider whether reputable news outlets are reporting on the story.[117]

Education and history professor Sam Wineburg of the Stanford Graduate School of Education and colleague Sarah McGrew authored a 2016 study analyzing students' ability to discern fraudulent news from factual reporting.[118][119] The year-long study gathered over 7,800 responses from middle school, high school, and university students in 12 U.S. states.[118][119] The researchers were "shocked" at the "stunning and dismaying consistency" with which students judged fraudulent news reports to be factual.[118][119] The study found that 82 percent of middle school students were unable to differentiate between an advertisement labeled as sponsored content and an actual online news article.[120] The authors concluded that the solution was to educate consumers of media on the Internet to behave like fact-checkers themselves, actively questioning the veracity of all sources they encounter online.[118][119]

Scientist Emily Willingham proposed applying the scientific method to fake news analysis.[121] She had previously written on differentiating science from pseudoscience, and applied that logic to fake news.[121] Her recommended steps were: observe, question, hypothesize, analyze data, draw a conclusion, and act on the results.[121] Willingham suggested starting from the hypothesis "This is real news" and then forming a strong set of questions to attempt to disprove it.[121] These tests included checking the URL and the article's date, evaluating reader and writer bias, double-checking the evidence, and verifying the sources cited.[121]

Media commentary

Full Frontal

Samantha Bee, host of the TV show Full Frontal
Samantha Bee went to Russia for her television show Full Frontal and met with individuals financed by the government of Russia to act as Internet trolls and attempt to manipulate the 2016 U.S. election in order to subvert democracy.

Samantha Bee went to Russia for her television show Full Frontal and met with individuals financed by the government of Russia to act as Internet trolls and attempt to manipulate the 2016 U.S. election in order to subvert democracy. The man and woman interviewed by Bee said they influenced the election by commenting on the websites of the New York Post, The Wall Street Journal, and The Washington Post, as well as on Twitter and Facebook.[122][123][124] They concealed their real Russian names behind cover identities, with the woman claiming in posts to be a housewife residing in Nebraska, and blamed consumers for believing everything they read online.[122][123][124]

Executive producers for Full Frontal told The Daily Beast that they relied on writer Adrian Chen, who had reported on Russian trolls for The New York Times Magazine in 2015, to contact trolls in Russia willing to be interviewed by Bee. The Russian trolls wore masks on camera and asked Full Frontal producers to keep all of their fake accounts confidential so they would not be publicly identified. The producers paid the trolls to use the Twitter hashtag #SleazySam to troll the show itself, so the production staff could verify that the trolls were indeed able to manipulate content online as they claimed.[124]

After researching in Russia for a second segment of Full Frontal, the production staff concluded that Russian leader Vladimir Putin supported Donald Trump for U.S. President in order to subvert democracy in the United States.[124] Television producer Razan Ghalayini explained to The Daily Beast: "Russia is an authoritarian regime and authoritarian regimes don’t benefit from the vision of democracy being the best version of governance." Producer Miles Kahn concurred, adding: "It’s not so much that Putin wants Trump. He probably prefers him in the long run, but he would almost rather the election be contested. They want chaos."[124]

Last Week Tonight

John Oliver commented on his comedy program Last Week Tonight, in one of his segments about Donald Trump, that the problem of fraudulent news sites fed into a wider issue of echo chambers in the media. Oliver lamented: "Fake facts circulate on social media to a frightening extent." He pointed out such sites often only exist to draw in profit from web traffic: "There is now a whole cottage industry specializing in hyper-partisan, sometimes wildly distorted clickbait."[49]

Other media

Critics contended that fraudulent news on Facebook may have been responsible for Donald Trump winning the 2016 U.S. presidential election, because most of the fake news stories Facebook allowed to spread portrayed him in a positive light.[83] Facebook is not liable for posting or publicizing fake content because, under the Communications Decency Act, interactive computer services cannot be held responsible for information provided by another internet entity. Some legal scholars, such as Keith Altman, argue that Facebook's huge scale creates so large a potential for fake news to spread that this law may need to be changed.[125] Writing in The Washington Post, Institute for Democracy in Eastern Europe co-director Eric Chenoweth cited "many 'fake news' stories that evidence suggests were generated by Russian intelligence operations".[126]

BBC News interviewed a fraudulent news site writer, who went by the pseudonym "Chief Reporter (CR)" and defended his actions and possible influence on elections: "If enough of an electorate are in a frame of mind where they will believe absolutely everything they read on the internet, to a certain extent they have to be prepared to deal with the consequences."[127]

See also

Footnotes

  1. ^ The Washington Post and the Associated Press described PropOrNot as a nonpartisan foreign policy analysis group composed of persons with prior experience in international relations, warfare, and information technology sectors.[9][10][11] PropOrNot received criticism from sources including The Intercept[38] and Fortune magazine for casting too wide a net in its identification list.[39]
  2. ^ FactCheck.org, a nonprofit organization and a project of the Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania,[96] won a 2010 Sigma Delta Chi Award from the Society of Professional Journalists.[97]
  3. ^ PolitiFact.com, run by the Tampa Bay Times,[98] received a 2009 Pulitzer Prize for National Reporting for its fact-checking efforts the previous year.[98]
  4. ^ Snopes.com, privately run by Barbara and David Mikkelson, was given "high praise" by FactCheck.org, another fact-checking website;[99] in addition, Network World gave Snopes.com a grade of "A" in a meta-analysis of fact-checking websites.[100]
  5. ^ "The Fact Checker" is a project by The Washington Post to analyze political claims.[93] Their colleagues and competitors at FactCheck.org recommended The Fact Checker as a resource to use before assuming a story is factual.[101]

References

  1. ^ a b c LaCapria, Kim (2 November 2016), "Snopes' Field Guide to Fake News Sites and Hoax Purveyors - Snopes.com's updated guide to the internet's clickbaiting, news-faking, social media exploiting dark side.", Snopes.com, retrieved 19 November 2016
  2. ^ a b c d e f g h "Merkel warns against fake news driving populist gains", Yahoo! News, Agence France-Presse, 23 November 2016, retrieved 23 November 2016
  3. ^ a b c d e f Paul Mozur and Mark Scott (17 November 2016), "Fake News on Facebook? In Foreign Elections, That's Not New", The New York Times, retrieved 18 November 2016
  4. ^ a b c d e f g h i j k "Concern over barrage of fake Russian news in Sweden", The Local, 27 July 2016, retrieved 25 November 2016
  5. ^ a b c Eunice Yoon and Barry Huang (22 November 2016), "China on US fake news debate: We told you so", CNBC, retrieved 28 November 2016
  6. ^ a b c Cadell, Catherine (19 November 2016), China says terrorism, fake news impel greater global internet curbs, retrieved 28 November 2016
  7. ^ a b c d e f Read, Max (27 November 2016), "Maybe the Internet Isn't a Fantastic Tool for Democracy After All", New York Magazine, retrieved 28 November 2016
  8. ^ a b c d e f g h Frenkel, Sheera (20 November 2016), "This Is What Happens When Millions Of People Suddenly Get The Internet", BuzzFeed News, retrieved 28 November 2016
  9. ^ a b c d e f g h i j k l m n o p q Timberg, Craig (24 November 2016), "Russian propaganda effort helped spread 'fake news' during election, experts say", The Washington Post, retrieved 25 November 2016, Two teams of independent researchers found that the Russians exploited American-made technology platforms to attack U.S. democracy at a particularly vulnerable moment
  10. ^ a b c d e f g h i j "Russian propaganda effort likely behind flood of fake news that preceded election", PBS NewsHour, Associated Press, 25 November 2016, retrieved 26 November 2016
  11. ^ a b c d e f "Russian propaganda campaign reportedly spread 'fake news' during US election", Nine News, Agence France-Presse, 26 November 2016, retrieved 26 November 2016
  12. ^ a b c d e f Lewis Sanders IV (11 October 2016), "'Divide Europe': European lawmakers warn of Russian propaganda", Deutsche Welle, retrieved 24 November 2016
  13. ^ a b c d e f g Dan Tynan (24 August 2016), "How Facebook powers money machines for obscure political 'news' sites - From Macedonia to the San Francisco Bay, clickbait political sites are cashing in on Trumpmania – and they're getting a big boost from Facebook", The Guardian, retrieved 18 November 2016
  14. ^ a b c Ben Gilbert (15 November 2016), "Fed up with fake news, Facebook users are solving the problem with a simple list", Business Insider, retrieved 16 November 2016, Some of these sites are intended to look like real publications (there are false versions of major outlets like ABC and MSNBC) but share only fake news; others are straight-up propaganda created by foreign nations (Russia and Macedonia, among others).
  15. ^ a b c d e f g h i j Townsend, Tess (21 November 2016), "Meet the Romanian Trump Fan Behind a Major Fake News Site", Inc. magazine, ISSN 0162-8968, retrieved 23 November 2016
  16. ^ a b c d e f g Sydell, Laura (23 November 2016), "We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned", All Things Considered, National Public Radio, retrieved 26 November 2016
  17. ^ a b c d THR staff (17 November 2016), "Facebook Fake News Writer Reveals How He Tricked Trump Supporters and Possibly Influenced Election", The Hollywood Reporter, retrieved 18 November 2016
  18. ^ a b c d e Ali Watkins and Sheera Frenkel (30 November 2016), "Intel Officials Believe Russia Spreads Fake News", BuzzFeed News, retrieved 1 December 2016
  19. ^ a b c d e f Strohm, Chris (1 December 2016), "Russia Weaponized Social Media in U.S. Election, FireEye Says", Bloomberg News, retrieved 1 December 2016
  20. ^ a b c "Google and Facebook target fake news sites with advertising clampdown", Belfast Telegraph, 15 November 2016, retrieved 16 November 2016
  21. ^ a b c Shanika Gunaratna (15 November 2016), "Facebook, Google announce new policies to fight fake news", CBS News, retrieved 16 November 2016
  22. ^ a b John Ribeiro (14 November 2016), "Zuckerberg says fake news on Facebook didn't tilt the elections", Computerworld, retrieved 16 November 2016
  23. ^ a b c d e f g h i Timberg, Craig (30 November 2016), "Effort to combat foreign propaganda advances in Congress", The Washington Post, retrieved 1 December 2016
  24. ^ a b c d e f Chen, Adrian (27 July 2016), "The Real Paranoia-Inducing Purpose of Russian Hacks", The New Yorker, retrieved 26 November 2016
  25. ^ a b c d e Lewis Sanders IV (17 November 2016), "Fake news: Media's post-truth problem", Deutsche Welle, retrieved 24 November 2016
  26. ^ European Parliament Committee on Foreign Affairs (23 November 2016), "MEPs sound alarm on anti-EU propaganda from Russia and Islamist terrorist groups" (PDF), European Parliament, retrieved 26 November 2016
  27. ^ a b Surana, Kavitha (23 November 2016), "The EU Moves to Counter Russian Disinformation Campaign", Foreign Policy, ISSN 0015-7228, retrieved 24 November 2016
  28. ^ "EU Parliament Urges Fight Against Russia's 'Fake News'", Radio Free Europe/Radio Liberty, Agence France-Presse and Reuters, 23 November 2016, retrieved 24 November 2016
  29. ^ a b MacFarquhar, Neil (29 August 2016), "A Powerful Russian Weapon: The Spread of False Stories", The New York Times, p. A1, retrieved 24 November 2016
  30. ^ a b c d e Porter, Tom (28 November 2016), "How US and EU failings allowed Kremlin propaganda and fake news to spread through the West", International Business Times, retrieved 29 November 2016
  31. ^ a b c d e f g Schindler, John R. (5 November 2015), "Obama Fails to Fight Putin's Propaganda Machine", New York Observer, retrieved 28 November 2016
  32. ^ a b c d e f g Schindler, John R. (26 November 2016), "The Kremlin Didn't Sink Hillary—Obama Did", New York Observer, retrieved 28 November 2016
  33. ^ a b c d LoGiurato, Brett (29 April 2014), "Russia's Propaganda Channel Just Got A Journalism Lesson From The US State Department", Business Insider, retrieved 29 November 2016
  34. ^ LoGiurato, Brett (25 April 2014), "RT Is Very Upset With John Kerry For Blasting Them As Putin's 'Propaganda Bullhorn'", Business Insider, retrieved 29 November 2016
  35. ^ a b c Stengel, Richard (29 April 2014), "Russia Today's Disinformation Campaign", Dipnote, United States Department of State, retrieved 28 November 2016
  36. ^ a b c d Weisburd, Andrew; Watts, Clint (6 August 2016), "Trolls for Trump - How Russia Dominates Your Twitter Feed to Promote Lies (And, Trump, Too)", The Daily Beast, retrieved 24 November 2016
  37. ^ a b c d e Shapiro, Ari (25 November 2016), "Experts Say Russian Propaganda Helped Spread Fake News During Election", All Things Considered, National Public Radio, retrieved 26 November 2016
  38. ^ a b c d Ben Norton; Glenn Greenwald (26 November 2016), "Washington Post Disgracefully Promotes a McCarthyite Blacklist From a New, Hidden, and Very Shady Group", The Intercept, retrieved 27 November 2016
  39. ^ a b c d Ingram, Matthew (25 November 2016), "No, Russian Agents Are Not Behind Every Piece of Fake News You See", Fortune magazine, retrieved 27 November 2016
  40. ^ a b Taibbi, Matt (28 November 2016), "The 'Washington Post' 'Blacklist' Story Is Shameful and Disgusting", Rolling Stone, retrieved 30 November 2016
  41. ^ a b vanden Heuvel, Katrina (29 November 2016), "Putin didn't undermine the election. We did.", The Washington Post, retrieved 1 December 2016
  42. ^ a b c d Jamie Condliffe (15 November 2016), "Facebook's Fake-News Ad Ban Is Not Enough", MIT Technology Review, retrieved 16 November 2016
  43. ^ a b Craig Silverman and Lawrence Alexander (3 November 2016), "How Teens In The Balkans Are Duping Trump Supporters With Fake News", BuzzFeed, retrieved 16 November 2016, As a result, this strange hub of pro-Trump sites in the former Yugoslav Republic of Macedonia is now playing a significant role in propagating the kind of false and misleading content that was identified in a recent BuzzFeed News analysis of hyperpartisan Facebook pages.
  44. ^ a b Ishmael N. Daro and Craig Silverman (15 November 2016), "Fake News Sites Are Not Terribly Worried About Google Kicking Them Off AdSense", BuzzFeed, retrieved 16 November 2016
  45. ^ a b Christopher Woolf (16 November 2016), "Kids in Macedonia made up and circulated many false news stories in the US election", Public Radio International, retrieved 18 November 2016
  46. ^ a b c d e f g Collins, Ben (28 October 2016), "This 'Conservative News Site' Trended on Facebook, Showed Up on Fox News—and Duped the World", The Daily Beast, retrieved 27 November 2016
  47. ^ a b c d e f Chacon, Marco (21 November 2016), "I've Been Making Viral Fake News for the Last Six Months. It's Way Too Easy to Dupe the Right on the Internet.", The Daily Beast, retrieved 27 November 2016
  48. ^ a b c Bambury, Brent (25 November 2016), "Marco Chacon meant his fake election news to be satire — but people took it as fact", Day 6, CBC Radio One, retrieved 27 November 2016
  49. ^ a b Rachel Dicker (14 November 2016), "Avoid These Fake News Sites at All Costs", U.S. News & World Report, retrieved 16 November 2016
  50. ^ Chang, Juju (29 November 2016), "When Fake News Stories Make Real News Headlines", ABC News, retrieved 29 November 2016
  51. ^ a b McAlone, Nathan (17 November 2016), "This fake-news writer says he makes over $10,000 a month, and he thinks he helped get Trump elected", Business Insider, retrieved 18 November 2016
  52. ^ a b Goist, Robin (17 November 2016), "The fake news of Facebook", The Plain Dealer, retrieved 18 November 2016
  53. ^ a b Dewey, Caitlin (17 November 2016), "Facebook fake-news writer: 'I think Donald Trump is in the White House because of me'", The Washington Post, ISSN 0190-8286, retrieved 17 November 2016
  54. ^ a b c d Hedegaard, Erik (29 November 2016), "How a Fake Newsman Accidentally Helped Trump Win the White House - Paul Horner thought he was trolling Trump supporters – but after the election, the joke was on him", Rolling Stone, retrieved 29 November 2016
  55. ^ Alyssa Newcomb (15 November 2016), "Facebook, Google Crack Down on Fake News Advertising", NBC News, NBC News, retrieved 16 November 2016
  56. ^ Drum, Kevin (17 November 2016), "Meet Ret. General Michael Flynn, the Most Gullible Guy in the Army", Mother Jones, retrieved 18 November 2016
  57. ^ a b Tapper, Jake (17 November 2016), "Fake news stories thriving on social media - Phony news stories are thriving on social media, so much so President Obama addressed it. CNN's Jake Tapper reports.", CNN, retrieved 18 November 2016
  58. ^ Masnick, Mike (14 October 2016), "Donald Trump's Son & Campaign Manager Both Tweet Obviously Fake Story", Techdirt, retrieved 18 November 2016
  59. ^ President Barack Obama (7 November 2016), Remarks by the President at Hillary for America Rally in Ann Arbor, Michigan, White House Office of the Press Secretary, retrieved 16 November 2016
  60. ^ Gardiner Harris and Melissa Eddy (17 November 2016), "Obama, With Angela Merkel in Berlin, Assails Spread of Fake News", The New York Times, retrieved 18 November 2016
  61. ^ a b Maheshwari, Sapna (20 November 2016), "How Fake News Goes Viral", The New York Times, ISSN 0362-4331, retrieved 20 November 2016
  62. ^ a b Craig Silverman (16 November 2016), "Viral Fake Election News Outperformed Real News On Facebook In Final Months Of The US Election", BuzzFeed, retrieved 16 November 2016
  63. ^ a b c d e f Kurtz, Howard, "Fake news and the election: Why Facebook is polluting the media environment with garbage", Fox News, archived from the original on 18 November 2016, retrieved 18 November 2016
  64. ^ a b Porter, Tom (1 December 2016), "US House of representatives backs proposal to counter global Russian subversion", International Business Times UK edition, retrieved 1 December 2016
  65. ^ a b c d Murdock, Jason (30 November 2016), "Russian hackers may disrupt Germany's 2017 election warns spy chief", International Business Times UK edition, retrieved 1 December 2016
  66. ^ Orlowski, Andrew (21 November 2016), "China cites Trump to justify 'fake news' media clampdown. Surprised?", The Register, retrieved 28 November 2016
  67. ^ Pascaline, Mary (20 November 2016), "Facebook Fake News Stories: China Calls For More Censorship On Internet Following Social Media's Alleged Role In US Election", International Business Times, retrieved 28 November 2016
  68. ^ Rauhala, Emily (17 November 2016), "After Trump, Americans want Facebook and Google to vet news. So does China.", The Washington Post, retrieved 28 November 2016
  69. ^ Dou, Eva (18 November 2016), "China Presses Tech Firms to Police the Internet - Third-annual World Internet Conference aimed at proselytizing China's view to global audience", The Wall Street Journal, retrieved 28 November 2016
  70. ^ a b Bump, Philip (14 November 2016), "Google's top news link for 'final election results' goes to a fake news site with false numbers", The Washington Post, retrieved 26 November 2016
  71. ^ a b Jacobson, Louis (14 November 2016), "No, Donald Trump is not beating Hillary Clinton in the popular vote", PolitiFact.com, retrieved 26 November 2016
  72. ^ a b c d e f g Wingfield, Nick; Isaac, Mike; Benner, Katie (14 November 2016), "Google and Facebook Take Aim at Fake News Sites", The New York Times, retrieved 28 November 2016
  73. ^ Sonam Sheth (14 November 2016), "Google looking into grossly inaccurate top news search result displayed as final popular-vote tally", Business Insider, retrieved 16 November 2016
  74. ^ "Google to ban fake news sites from its advertising network", Los Angeles Times, Associated Press, 14 November 2016, retrieved 16 November 2016
  75. ^ Avery Hartmans (15 November 2016), "Google's CEO says fake news could have swung the election", Business Insider, retrieved 16 November 2016
  76. ^ "Google cracks down on fake news sites", The Straits Times, 15 November 2016, retrieved 16 November 2016
  77. ^ a b Richard Waters (15 November 2016), "Facebook and Google to restrict ads on fake news sites", Financial Times, retrieved 16 November 2016
  78. ^ Sridhar Ramaswamy (21 January 2016), "How we fought bad ads in 2015", Google blog, Google, retrieved 28 November 2016
  79. ^ Paul Blake (15 November 2016), "Google, Facebook Move to Block Fake News From Ad Services", ABC News, retrieved 16 November 2016
  80. ^ a b c d Gina Hall (15 November 2016), "Facebook staffers form an unofficial task force to look into fake news problem", Silicon Valley Business Journal, retrieved 16 November 2016
  81. ^ a b c d Frenkel, Sheera (14 November 2016), "Renegade Facebook Employees Form Task Force To Battle Fake News", BuzzFeed, retrieved 18 November 2016
  82. ^ Shahani, Aarti (15 November 2016), "Facebook, Google Take Steps To Confront Fake News", National Public Radio, retrieved 20 November 2016
  83. ^ a b Cooke, Kristina (15 November 2016), Google, Facebook move to restrict ads on fake news sites, retrieved 20 November 2016
  84. ^ "Facebook's Fake News Problem: What's Its Responsibility?", The New York Times, Associated Press, 15 November 2016, retrieved 20 November 2016
  85. ^ a b c d e Ohlheiser, Abby (19 November 2016), "Mark Zuckerberg outlines Facebook's ideas to battle fake news", The Washington Post, retrieved 19 November 2016
  86. ^ a b c d Vladimirov, Nikita (19 November 2016), "Zuckerberg outlines Facebook's plan to fight fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
  87. ^ a b c d e Mike Isaac (19 November 2016), "Facebook Considering Ways to Combat Fake News, Mark Zuckerberg Says", The New York Times, retrieved 19 November 2016
  88. ^ a b c d Samuel Burke (19 November 2016), "Zuckerberg: Facebook will develop tools to fight fake news", CNNMoney, CNN, retrieved 19 November 2016
  89. ^ a b Chappell, Bill (19 November 2016), "'Misinformation' On Facebook: Zuckerberg Lists Ways Of Fighting Fake News", National Public Radio, retrieved 19 November 2016
  90. ^ a b Silverman, Craig (19 November 2016), "This Is How You Can Stop Fake News From Spreading On Facebook", BuzzFeed, retrieved 20 November 2016
  91. ^ "False news items are not the only problem besetting Facebook", The Economist, 26 November 2016, retrieved 28 November 2016
  92. ^ a b Pesek, William (27 November 2016), "Will Facebook be China's propaganda tool?", The Japan Times, Barron's newspaper, retrieved 28 November 2016
  93. ^ a b c d e f g h i j Stelter, Brian (7 November 2016), "How Donald Trump made fact-checking great again", CNNMoney, CNN, retrieved 19 November 2016
  94. ^ a b c d e f Kessler, Glenn (10 November 2016), "Fact checking in the aftermath of a historic election", The Washington Post, retrieved 19 November 2016
  95. ^ a b c d e Neidig, Harper (17 November 2016), "Fact-checkers call on Zuckerberg to address spread of fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
  96. ^ Hartlaub, Peter (24 October 2004), "Web sites help gauge the veracity of claims / Online resources check ads, rumors", San Francisco Chronicle, p. A1, retrieved 25 November 2016
  97. ^ "Fact-Checking Deceptive Claims About the Federal Health Care Legislation - by Staff, FactCheck.org", 2010 Sigma Delta Chi Award Honorees, Society of Professional Journalists, 2010, retrieved 25 November 2016
  98. ^ a b Columbia University (2009), "National Reporting - Staff of St. Petersburg Times", 2009 Pulitzer Prize Winners, retrieved 24 November 2016, For "PolitiFact," its fact-checking initiative during the 2008 presidential campaign that used probing reporters and the power of the World Wide Web to examine more than 750 political claims, separating rhetoric from truth to enlighten voters.
  99. ^ Novak, Viveca (10 April 2009), "Ask FactCheck - Snopes.com", FactCheck.org, retrieved 25 November 2016
  100. ^ McNamara, Paul (13 April 2009), "Fact-checking the fact-checkers: Snopes.com gets an 'A'", Network World, retrieved 25 November 2016
  101. ^ a b c d e f Lori Robertson and Eugene Kiely (18 November 2016), "How to Spot Fake News", FactCheck.org, retrieved 19 November 2016
  102. ^ Lemann, Nicholas (30 November 2016), "Solving the Problem of Fake News", The New Yorker, retrieved 30 November 2016
  103. ^ a b Sharockman, Aaron (16 November 2016), "Let's fight back against fake news", PolitiFact.com, retrieved 19 November 2016
  104. ^ Burgess, Matt (17 November 2016), "Google is helping Full Fact create an automated, real-time fact-checker", Wired magazine UK edition, retrieved 29 November 2016
  105. ^ a b c The International Fact-Checking Network (17 November 2016), "An open letter to Mark Zuckerberg from the world's fact-checkers", Poynter Institute, retrieved 19 November 2016
  106. ^ a b Hare, Kristen (September 21, 2015), Poynter names director and editor for new International Fact-Checking Network, Poynter Institute for Media Studies, retrieved 20 November 2016
  107. ^ a b About the International Fact-Checking Network, Poynter Institute for Media Studies, 2016, retrieved 20 November 2016
  108. ^ a b Klasfeld, Adam (22 November 2016), "Fake News Gives Facebook a Nixon-Goes-to-China Moment", Courthouse News Service, retrieved 28 November 2016
  109. ^ a b Brian Feldman (15 November 2016), "Here's a Chrome Extension That Will Flag Fake-News Sites for You", New York Magazine, retrieved 16 November 2016
  110. ^ Will Oremus (15 November 2016), "The Real Problem Behind the Fake News", Slate magazine, retrieved 16 November 2016
  111. ^ a b c d Morris, David Z. (27 November 2016), "Eli Pariser's Crowdsourced Brain Trust Is Tackling Fake News", Fortune magazine, retrieved 28 November 2016
  112. ^ a b c Burgess, Matt (25 November 2016), "Hive mind assemble! There is now a crowdsourcing campaign to solve the problem of fake news", Wired magazine UK edition, retrieved 29 November 2016
  113. ^ a b Ingram, Matthew (21 November 2016), "Facebook Doesn't Need One Editor, It Needs 1,000 of Them", Fortune magazine, retrieved 29 November 2016
  114. ^ "Google, Facebook move to curb ads on fake news sites", Kuwait Times, Reuters, 15 November 2016, retrieved 16 November 2016
  115. ^ Menczer, Filippo (28 November 2016), "Fake Online News Spreads Through Social Echo Chambers", Scientific American, The Conversation, retrieved 29 November 2016
  116. ^ Douglas Perry (15 November 2016), "Facebook, Google try to drain the fake-news swamp without angering partisans", The Oregonian, retrieved 16 November 2016
  117. ^ a b c Cassandra Jaramillo (15 November 2016), "How to break it to your friends and family that they're sharing fake news", The Dallas Morning News, retrieved 16 November 2016
  118. ^ a b c d Domonoske, Camila (23 November 2016), "Students Have 'Dismaying' Inability To Tell Fake News From Real, Study Finds", National Public Radio, retrieved 25 November 2016
  119. ^ a b c d McEvers, Kelly (22 November 2016), "Stanford Study Finds Most Students Vulnerable To Fake News", National Public Radio, retrieved 25 November 2016
  120. ^ Shellenbarger, Sue (21 November 2016), "Most Students Don't Know When News Is Fake, Stanford Study Finds", The Wall Street Journal, retrieved 29 November 2016
  121. ^ a b c d e Willingham, Emily (28 November 2016), "A Scientific Approach To Distinguishing Real From Fake News", Forbes magazine, retrieved 29 November 2016
  122. ^ a b "Samantha Bee Interviews Russian Trolls, Asks Them About 'Subverting Democracy'", The Hollywood Reporter, 1 November 2016, retrieved 25 November 2016
  123. ^ a b Holub, Christian (1 November 2016), "Samantha Bee interviews actual Russian trolls", Entertainment Weekly, retrieved 25 November 2016
  124. ^ a b c d e Wilstein, Matt (7 November 2016), "How Samantha Bee's 'Full Frontal' Tracked Down Russia's Pro-Trump Trolls", The Daily Beast, retrieved 25 November 2016
  125. ^ Rogers, James (11 November 2016), "Facebook's 'fake news' highlights need for social media revamp, experts say", Fox News, retrieved 20 November 2016
  126. ^ Chenoweth, Eric (25 November 2016), "Americans keep looking away from the election's most alarming story", The Washington Post, retrieved 26 November 2016
  127. ^ "'I write fake news that gets shared on Facebook'", BBC News, BBC, 15 November 2016, retrieved 16 November 2016

Further reading