Fake news website
''BuzzFeed News'' journalist Sheera Frenkel reported: "there has also been an increase in articles that demonize the country’s minority Muslim community, with fake news claiming that vast hordes of Muslim worshippers are attacking Buddhist sites. These articles, quickly shared and amplified on social media, have correlated with a surge in anti-Muslim protests and attacks on local Muslim groups."<ref name=maybetheinternet /><ref name=sheerafrenkel /> Frenkel noted countries that were relatively newer to Internet exposure were more susceptible to the problem, writing: "Countries like Myanmar, which come online quickly and without many government-backed programs to teach safe internet habits — like secure passwords and not revealing personal details online — rank among the lowest in digital literacy. They are the most likely to fall for scams, hacks, and fake news."<ref name=sheerafrenkel />
=== Italy ===
Ahead of the [[Italian constitutional referendum, 2016|Italian constitutional referendum of 2016]], five of the ten referendum-related stories most shared on social media were fake.<ref name=notizia>{{Citation|url=https://pagellapolitica.it/blog/show/148/la-notizia-pi%C3%B9-condivisa-sul-referendum-%C3%A8-una-bufala|title=La notizia più condivisa sul referendum? È una bufala|access-date=2 December 2016|language=[[Italian language|Italian]]|work=Pagella Politica|publisher=pagellapolitica.it}}</ref> Of the three most-shared stories overall, two were false.<ref name=notizia />
Revision as of 17:53, 2 December 2016
Fake news websites are sites that deliberately publish hoaxes, propaganda, and disinformation, using social media to drive web traffic and amplify their effect. Unlike news satire, which parodies real news or invents stories for comedic effect, fake news websites are designed to mislead readers and to profit from their believing the stories to be true.[1] Fake news websites have promoted misleading or factually incorrect information concerning the politics of several countries, including Germany,[2] Indonesia and the Philippines,[3] Sweden,[4] China,[5][6] Myanmar,[7][8] and the United States.[9] Many of the false news sites are hosted in Russia,[9][10] Macedonia,[11][12] Romania,[13] and the U.S.[14][15]
One Swedish newspaper, The Local, described the proliferation of fake news as a form of psychological warfare.[4] Agence France-Presse noted that media analysts see it as "a threat to democracy itself."[2] The European Parliament's Committee on Foreign Affairs called attention to the problem in 2016 when it passed a resolution warning that the Russian government was using "pseudo-news agencies" and "internet trolls" as forms of propaganda and disinformation in an attempt to weaken democratic values.[10]
In 2015, the Swedish Security Service, Sweden's national security agency, issued a report concluding Russia was utilizing the tactic to inflame "splits in society" through the proliferation of propaganda.[4] Sweden's Ministry of Defence tasked its Civil Contingencies Agency to combat fake news from Russia.[4] Fraudulent news affected politics in Indonesia and the Philippines, where there was simultaneously widespread usage of social media and limited resources to check the veracity of political claims.[3] German Chancellor Angela Merkel warned against "fake sites, bots, trolls" and lamented their societal impact.[2]
Fraudulent articles spread through social media during the 2016 U.S. presidential election.[16][17][18] Several officials within the United States Intelligence Community said that Russia was engaged in spreading fake news.[19] Computer security company FireEye concluded Russia used social media as a weapon of cyberwarfare.[20] Google and Facebook banned fake sites from using online advertising.[21][22] U.S. President Barack Obama said a disregard for facts created a "dust cloud of nonsense".[23] Concern over the issue advanced bipartisan legislation in the U.S. Senate to authorize U.S. State Department action against foreign propaganda.[24] U.S. Senate Intelligence Committee member Ron Wyden said: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."[24]
Prominent sources
Prominent among fraudulent news sites are false propaganda outlets created by individuals in Russia,[9][10] Macedonia,[11][12] Romania,[13] and the United States.[14][15] Several of these websites are structured to make visitors believe they are real publications, mimicking the stylistic appearance of ABC News and MSNBC, while other pages are outright propaganda.[12]
Russia
Internet Research Agency
Beginning in fall 2014, The New Yorker writer Adrian Chen performed a six-month-long investigation into Russian propaganda campaigns on the Internet orchestrated by a group that called itself the Internet Research Agency.[25] Evgeny Prigozhin, a close associate of Vladimir Putin, was behind the operation which hired hundreds of individuals to work in Saint Petersburg to support Russian government views online.[25]
The Internet Research Agency came to be regarded as a "troll farm", a term for a propaganda operation that controls many online accounts in order to artificially create the semblance of a grassroots organization.[25] Chen reported that Internet trolling came to be used by the Russian government as a tactic largely after it observed the organic social media organization of the 2011 protests against Putin.[25]
Chen interviewed reporters in Russia as well as political activists, and was told that the end goal of the Russian government's use of fake news was not to persuade readers that the stories were factual, but simply to sow discord and chaos online.[25] Chen wrote: "The real effect, the Russian activists told me, was not to brainwash readers but to overwhelm social media with a flood of fake content, seeding doubt and paranoia, and destroying the possibility of using the Internet as a democratic space."[25]
EU regulation of Russian fake news
In 2015, the Organization for Security and Co-operation in Europe released an analysis highly critical of Russian disinformation campaigns designed to appear as legitimate news reporting.[26] These propaganda campaigns were intended to interfere with Ukraine's relations with Europe after the removal of former Ukrainian president Viktor Yanukovych from power.[26] According to Deutsche Welle, "The propaganda in question employed similar tactics used by fake news websites during the US elections, including misleading headlines, fabricated quotes and misreporting".[26] This propaganda motivated the European Union to create a special taskforce to deal with disinformation campaigns originating out of Russia.[10][26][27]
Foreign Policy reported that the taskforce, called East StratCom Team, "employs 11 mostly Russian speakers who scour the web for fake news and send out biweekly reviews highlighting specific distorted news stories and tactics."[28] The European Union voted to add to finances for the taskforce in November 2016.[28]
Deutsche Welle noted: "Needless to say, the issue of fake news, which has been used to garner support for various political causes, poses a serious danger to the fabric of democratic societies, whether in Europe, the US or any other nation across the globe."[26]
In November 2016, the European Parliament Committee on Foreign Affairs passed a resolution warning of Russia's use of tools including "pseudo-news agencies ... social media and internet trolls" as forms of propaganda and disinformation in an attempt to weaken democratic values.[10] The resolution emphatically requested that media analysts within the European Union investigate, citing "the limited awareness amongst some of its member states, that they are audiences and arenas of propaganda and disinformation."[10] The resolution condemned Russian sources for publicizing "absolutely fake" news reports, and on 23 November 2016 it passed by a margin of 304 votes to 179.[29]
Observations
Gleb Pavlovsky, who assisted in creating a propaganda program for the Russian government prior to 2008, told The New York Times in August 2016: "Moscow views world affairs as a system of special operations, and very sincerely believes that it itself is an object of Western special operations. I am sure that there are a lot of centers, some linked to the state, that are involved in inventing these kinds of fake stories."[30]
Anders Lindberg, a Swedish attorney and reporter, explained a common pattern of fake news distribution: "The dynamic is always the same: It originates somewhere in Russia, on Russia state media sites, or different websites or somewhere in that kind of context. Then the fake document becomes the source of a news story distributed on far-left or far-right-wing websites. Those who rely on those sites for news link to the story, and it spreads. Nobody can say where they come from, but they end up as key issues in a security policy decision."[30]
Counter-Disinformation Team
The International Business Times reported that the United States Department of State had planned to deploy a unit formed specifically to counter disinformation from the Russian government, but that the unit was disbanded in September 2015 after department heads failed to foresee the peril of the propaganda in the months immediately prior to the 2016 U.S. presidential campaign.[31] The U.S. State Department had put eight months of work into developing the unit before deciding to scrap it.[31]
Titled the Counter-Disinformation Team, the program would have been a reboot of the Active Measures Working Group set up by the Reagan Administration, which had operated under the auspices of the U.S. State Department and the United States Information Agency.[32][33] The Counter-Disinformation Team was set up under the Bureau of International Information Programs of the U.S. State Department.[32][33] Work on the team began in the Obama Administration in 2014.[32][33] Its intended purpose was to combat propaganda from Russian sources such as Russia Today.[32][33] A beta version of its website had been built and was ready to go live, and several staff members had been hired by the U.S. State Department, before the project was canceled.[32][33] United States Intelligence Community officials explained to John R. Schindler, a former National Security Agency analyst and counterintelligence officer, that the Obama Administration decided to cancel the Counter-Disinformation Team because it was afraid of antagonizing the Russian government.[32][33]
Under Secretary of State for Public Diplomacy and Public Affairs Richard Stengel was the point person at the U.S. State Department for the Counter-Disinformation Team before it was canceled.[32][33] Stengel had prior experience with the matter, having written publicly for the U.S. State Department about the disinformation campaign waged by the Russian government and Russia Today.[34] After United States Secretary of State John Kerry called Russia Today a "propaganda bullhorn" for Russian president Vladimir Putin,[35] Russia Today insisted that the State Department give an "official response" to Kerry's statement.[34][36] In his response, Stengel wrote for the U.S. State Department that Russia Today engaged in a "disinformation campaign".[34][36] Stengel spoke out against the spread of fake news and explained the difference between reporting and propaganda: "Propaganda is the deliberate dissemination of information that you know to be false or misleading in order to influence an audience."[34][36]
A representative for the U.S. State Department explained to the International Business Times in a statement after being contacted regarding the closure of the Counter-Disinformation Team: "The United States, like many other countries throughout Europe and the world, has been concerned about Russia's intense propaganda and disinformation campaigns. We believe the free flow of reliable, credible information is the best defense against the Kremlin's attack on the truth."[31]
Peter Kreko of the Hungary-based Political Capital Institute spoke to International Business Times about his work studying the disinformation initiatives by the Russian government, and said: "I do think that the American [Obama] administration was caught not taking the issue seriously enough and there were a lot more words than action."[31] Kreko recounted that employees within the U.S. government told him they were exasperated due to the "lack of strategy, efficiency and lack of taking it seriously" regarding the information warfare by the Russian government against the United States.[31]
Further role in 2016 U.S. presidential election
Adrian Chen observed a strange pattern in December 2015 whereby online accounts he had been monitoring as supportive of Russia had suddenly additionally become highly supportive of 2016 U.S. presidential candidate Donald Trump.[9] Chen said: "I created this list of Russian trolls. And I check on it once in a while, still. And a lot of them have turned into conservative accounts, like fake conservatives. I don’t know what’s going on, but they’re all tweeting about Donald Trump and stuff."[9]
Andrew Weisburd and Clint Watts, a Foreign Policy Research Institute fellow and senior fellow at the Center for Cyber and Homeland Security at George Washington University,[37] wrote for The Daily Beast in August 2016: "Fake news stories from Kremlin propagandists regularly become social media trends."[9] They observed: "The synchronization of hacking and social media information operations not only has the ability to promote a favored candidate, like Trump, but also has the potential to incite unrest amongst American communities."[9] Weisburd and Watts documented how a disinformation campaign spread from Russia Today and Sputnik News, "the two biggest Russian state-controlled media organizations publishing in English", to pro-Russian accounts on Twitter.[9]
Citing the prior research by Adrian Chen, Weisburd and Watts observed: "This melding of Russian-friendly accounts and Trumpkins has been going on for some time."[9] The two writers compared the tactics used by Russia during the 2016 U.S. election to those previously utilized by the Soviet Union against the U.S. during the Cold War.[9] They referenced the 1992 United States Information Agency report to the United States Congress, which stated: "Active measures seek to use slogans, arguments, disinformation and selected true information to influence the attitudes and actions of foreign publics and governments."[9] Weisburd and Watts concluded these Russian propaganda campaigns, called "Active measures", became much easier for the intelligence agents with the advent of social media on the Internet: "Russia influence operations in social media represents a far more effective and efficient return to their 'Active Measures' campaign of the Cold War."[9]
Immediately prior to the U.S. election in November 2016, Weisburd and Watts collaborated with colleague J. M. Berger and published a follow-up to their Daily Beast article in the online magazine War on the Rocks, titled "Trolling for Trump: How Russia is Trying to Destroy Our Democracy".[37][38] The study detailed techniques used by Internet trolls to smear critics of the Russian government's efforts in Syria and to proliferate falsehoods about Hillary Clinton's health.[38] The three writers examined 7,000 social media accounts that promoted Donald Trump over a two-and-a-half-year period.[38] Weisburd, Watts, and Berger concluded that the accounts' promotion of Trump "isn’t the end of Russia’s social media and hacking campaign in America, but merely the beginning."[38]
Watts explained his and his colleagues' analysis in War on the Rocks to CNN: Russia's propaganda mechanisms primarily aim for the "alt-right and more traditional right-wing and fascist parties".[37] He stated the Russian propaganda effort was additionally "hitting across any group in the United States that is anti-government, or fomenting dissent or conspiracies against the US government and its institutions."[37]
The Guardian reported in November 2016: "Recent reports suggest that many of Donald Trump’s most fervent online supporters are not themselves Americans, but Russians being paid by their government to help him win."[39] The paper estimated the number of trolls engaged in the offense at "several thousand", and that their primary topics included: "Putin and Trump being great, the opposition being corrupt, the Nato conspiracy against Russia, the effeminacy of Barack Obama."[39]
Former Central Intelligence Agency case officer Patrick Skinner explained that the true goal of the propaganda operation was to spread uncertainty, regardless of whether or not a particular fake statement had been debunked.[40] Skinner stated: "That's that whole point of the Russian effort. Create enough doubt for everything so that when the proof comes it is washed in the same disdain for all alleged truth."[40] Aric Toler, an investigative analyst at Bellingcat, explained that fact-checking fake news in certain cases backfired, playing into the hands of its proliferators: "Sometimes when fake news is debunked, among certain circles it actually gives it more legitimacy. It's the, 'This is what they don't want you to know,' argument."[40]
The United States Intelligence Community devoted a great deal of effort to internally debating why Vladimir Putin chose summer 2016 to escalate "active measures" aimed at influencing domestic U.S. politics.[41] Director of National Intelligence James R. Clapper said that after the 2011–13 Russian protests, Putin's confidence in his long-term viability as a politician was damaged, and he responded with the propaganda intelligence operation.[41] Clapper explained: "I think that their approach is they believe that we are trying to influence political developments in Russia, we are trying to effect change, and so their natural responses is to retaliate and do onto us as they think we’ve done onto them."[41]
U.S. Congressman Adam Schiff, Ranking Member of the House Permanent Select Committee on Intelligence, commented on Putin's aims with the propaganda campaigns: "I think he has certainly succeeded in introducing additional discord into our political system. And he’s endeavored to weaken Secretary Clinton so if she is successful in the election she is a less formidable foe."[41] Schiff stated the U.S. intelligence agencies looked on "with great alarm" at the Russian propaganda campaign in the U.S.[41] Speaking about "disinformation websites" that had appeared previously in Hungary, Slovakia, the Czech Republic, and Poland, Schiff said: "We’re seeing a troubling escalation of that kind of conduct here in the United States."[41] Schiff concluded Russian propaganda intelligence operations would likely continue against the U.S. after the election: "Unless they pay an increasingly high price for this, they’ll continue to meddle the way they are."[41]
Prior to the election, U.S. national security officials told BuzzFeed News they were more anxious about Russia tampering with U.S. news than hacking the election itself.[42] A Washington, D.C. U.S. security official told BuzzFeed News: "Disinformation campaigns, creating doubt around elections results, this is something we’ve seen Russia do in the past. There is a pattern of Russia targeting the soft underbelly of the voting system."[42] The official explained: "Why go through all the work of changing official voting results when you can get a news agency to misreport the results of a key swing state, or create a viral fake news story claiming that a swing state has had its system rigged?"[42]
BuzzFeed News reported that paid Internet trolls financed by the Kremlin were open about having drafted and spread fake news stories to criticize Clinton and support Trump.[42] After each of the presidential debates, tens of thousands of bots were deployed on Twitter to proliferate hashtags such as #Trumpwon and artificially sway perceptions of Trump's debate performance.[42] The Federal Bureau of Investigation released a statement to BuzzFeed News saying it was looking into the matter: "The FBI is looking into what the actors are up to, what their activity involves, and what’s the scope of their actions. That work is ongoing."[42]
On 24 November 2016, The Washington Post reported that researchers had confirmed that Russian-organized propaganda during the 2016 U.S. presidential election helped foment criticism of Democratic candidate Hillary Clinton and support for Republican candidate Donald Trump.[16][43] The Russian propaganda effort was conducted through the deliberate proliferation of fake news.[16][17][18] The strategy involved social media users, Internet trolls working for hire, botnets, and organized websites in order to cast Clinton in a negative light.[16][17][18] Foreign Policy Research Institute fellow Clint Watts, who monitored propaganda from Russia, stated its tactics were similar to those used during the Cold War, only now spread deliberately through social media to more powerful effect.[16] Watts stated the goal of Russia was to "essentially erode faith in the U.S. government or U.S. government interests."[16] Watts's research, conducted with colleagues Andrew Weisburd and J. M. Berger, was published in November 2016.[16] These conclusions were confirmed by prior research from the Elliott School of International Affairs at George Washington University and by the RAND Corporation.[16]
Ari Shapiro on the National Public Radio program All Things Considered interviewed Washington Post journalist Craig Timberg, who explained how the initiative operated: "There's legions of botnets and paid human trolls that collect information and tweet it to one another and amplify it online. And that makes these stories that in many cases are false or misleading look much bigger than they are. And they are more likely to end up trending on Google News or end up in your Facebook feed."[43] Timberg explained there were "thousands and thousands of social media accounts" working for Russia together that functioned as a "massive online chorus".[43] Timberg stated Russia had a vested interest in the 2016 U.S. election due to a dislike for Hillary Clinton over the 2011–13 Russian protests, and exhibited a "fondness for Donald Trump".[43] Timberg concluded, "Undermining [United States] democracy and our claims to having a clean democracy were important goals to the Russians."[43]
In the same article, The Washington Post reported that the group PropOrNot[a] came to similar conclusions about involvement by Russia in propagating fake news during the 2016 U.S. election.[16][17] PropOrNot analyzed data from Twitter and Facebook and tracked propaganda from the Russian disinformation campaign to a national reach of 15 million people within the United States.[16][17] PropOrNot concluded that accounts belonging to both Russia Today and Sputnik News promoted "false and misleading stories in their reports", and additionally magnified other false articles found on the Internet to support their propaganda effort.[16]
The Washington Post and PropOrNot received criticism from The Intercept,[44] Fortune,[45] and Rolling Stone.[46] Matthew Ingram of Fortune magazine felt that PropOrNot cast too wide a net in identifying fake news websites.[45] The Intercept journalists Glenn Greenwald and Ben Norton were highly critical that the organization included Naked Capitalism on its list.[44] The Intercept called the reporting by The Washington Post "shoddy",[44] and Fortune magazine called the evidence "flimsy".[45] Writing for Rolling Stone, Matt Taibbi described the report as "astonishingly lazy" and questioned the methodology used by PropOrNot and the lack of information about who was behind the organization.[46] The Washington Post article was also criticized in an opinion piece in the paper itself, written by Katrina vanden Heuvel.[47] She wrote that the websites listed by PropOrNot "include RT and Sputnik News, which are funded by the Russian government, but also independent sites such as Naked Capitalism, Truthout and the right-wing Drudge Report."[47]

Multiple officials within the United States Intelligence Community told BuzzFeed News on 30 November 2016 that they believed the Russian government was actively engaged in spreading fake news.[19] One U.S. intelligence official stated: "They’re doing this continuously, that’s a known fact."[19] Another said: "This is beyond propaganda, that’s my understanding."[19] Mark Galeotti, a senior research fellow at the Institute of International Relations Prague and a scholar of Russian intelligence, explained the motivation behind the Kremlin's operations: "The most significant aspect of today’s Russian active measures is precisely this: undermining and fragmenting the West."[19]
Bloomberg News reported that computer security company FireEye came to the conclusion the Russian government utilized social media online as a strategic weapon with the intention of swaying perspectives regarding the U.S. election.[20] FireEye Chairman David DeWalt told Bloomberg News the intelligence operation by the Russian government in 2016 was a new development in cyberwarfare by Russia.[20] DeWalt stated: "The dawning of Russia as a cyber power is at a whole other level than it ever was before. We’ve seen what I believe is the most historical event maybe in American democracy history in terms of the Russian campaign."[20] FireEye CEO Kevin Mandia stated the tactics of Russian propaganda cyberwarfare changed significantly after fall 2014, from covert computer hacking to suddenly more overt tactics with decreased concerns for operational security or being revealed to the public as an intelligence operation.[20] Mandia concluded: "That’s a change in the rules of engagement."[20]
Macedonia
A significant amount of the fraudulent news during the 2016 United States election cycle came from teenagers in Macedonia attempting to profit quickly from readers who believed their falsehoods.[11][48] An investigation by BuzzFeed revealed that over 100 websites spreading fraudulent articles supportive of Donald Trump were created by teenagers in the town of Veles, Macedonia.[49][50] The Macedonian teenagers experimented with writing fraudulent news about Bernie Sanders and other articles with a politically left or liberal slant; they quickly found that their most popular fraudulent writings were about Donald Trump.[49]
The Guardian performed its own independent investigation and reached the same conclusion as BuzzFeed, tracing over 150 fraudulent news sites back to the same town of Veles, Macedonia.[11] One of the Macedonian teenagers, "Alex", was interviewed by The Guardian during the election cycle in August 2016 and stated that regardless of whether Trump won or lost the election, fraudulent news websites would remain profitable.[11] He explained that he often began his pieces by plagiarizing, copying and pasting content directly from other websites.[11] Alex told The Guardian: "I think my traffic will be fine if Trump doesn’t win. There are too many haters on the net, and all of my audience hates Hillary."[11]
Craig Silverman of BuzzFeed News, one of the investigative journalists who exposed the ties between the fraudulent websites and Macedonian teenagers, told Public Radio International that some false stories netted the Balkan teenagers a few thousand dollars per day, and that most fake articles together earned them on average a few thousand per month.[51] Public Radio International reported that after the 2016 election season the teenagers from Macedonia would likely turn back to making money off fraudulent medical advice websites, which Silverman noted was where most of the youths had garnered clickbait revenues before the election season.[51]
Romania
"Ending the Fed", a popular purveyor of fraudulent reports, was run out of Romania by a 24-year-old named Ovidiu Drobota, who boasted to Inc. magazine about being more popular than "the mainstream media".[13] "Ending the Fed" was responsible for a false story in August 2016 that incorrectly stated FOX News had fired journalist Megyn Kelly; the story was briefly prominent in Facebook's "Trending News" section.[13] "Ending the Fed" produced four of the 10 most popular fake articles on Facebook related to the 2016 U.S. election in the three months before the election itself.[13] The Facebook page for the website, called "End the Feed", had 350,000 "likes" in November 2016.[13]
After being contacted by Inc. magazine, Drobota stated he was proud of the impact he had had on the 2016 U.S. election in favor of his preferred candidate, Donald Trump.[13] According to Alexa Internet, "Ending the Fed" garnered approximately 3.4 million views over a 30-day period in November 2016.[13] Drobota stated that the majority of incoming traffic was from Facebook.[13] He said his normal line of work before starting "Ending the Fed" included web development and search engine optimization.[13]
United States
U.S. News & World Report warned readers to be wary of popular fraudulent news sites composed of either outright hoaxes or propaganda, and recommended the website Fake News Watch for a listing of such problematic sources.[55]
Marco Chacon created the fake news website RealTrueNews to show his alt-right friends "how ridiculous" their credulity toward such websites was.[52][53] In one of his stories, Chacon wrote a fake transcript of Hillary Clinton's leaked speeches in which Clinton explains bronies to Goldman Sachs bankers.[52][53] Chacon was shocked when his fake article was treated as factual by Fox News and he heard his own creation repeated on The Kelly File, hosted by Megyn Kelly.[52][53] Trace Gallagher repeated Chacon's fiction verbatim when he falsely reported that Clinton had called Bernie Sanders supporters a "bucket of losers", a phrase made up by Chacon himself.[52] Megyn Kelly apologized in the form of a public retraction after emphatic denials from representatives of Hillary Clinton.[52][53][54]
After his fabricated stories were believed as factual and shared and viewed tens of thousands of times, Chacon told Brent Bambury of the CBC Radio One program Day 6 that he was so shocked by Internet consumers' credulity that it felt like an episode of The Twilight Zone.[54] In an interview with ABC News, Chacon defended his site, saying it was only an over-the-top parody of other fake news sites meant to show readers how ridiculous they were: "The only way I could think of to have a conversation with these people is to say, 'if you have a piece of crazy fake news, look I got one too, and it’s even crazier, it’s absurd.'"[56]
The Daily Beast reported on the popularity of Chacon's fiction being reported as if it were factual: "Chacon’s stories are regularly accepted as fact in the pro-Trump message board canon. YouTube videos with tens of thousands of views exist solely to reinforce sentences and ideas Chacon dreamed up on his laptop in the middle of the night."[52] In a follow-up piece Chacon wrote as a contributor for The Daily Beast after the 2016 U.S. election, he concluded: "When the only news you are willing to believe is partisan news, you are susceptible to stories written 'in your language' that are complete, obvious, utter fabrications."[53]
Jestin Coler of Los Angeles is the founder and CEO of Disinfomedia, a company which owns many fake news websites. He had previously given interviews about fake news to multiple media organizations under the pseudonym Allen Montgomery, in order to evade personal scrutiny.[14] With the help of tech-company engineer John Jansen, journalists from NPR uncovered Coler's identity. After being identified as Disinfomedia's owner, Coler agreed to an interview.[14] Coler explained how the original intent of his project backfired: "The whole idea from the start was to build a site that could kind of infiltrate the echo chambers of the alt-right, publish blatantly or fictional stories and then be able to publicly denounce those stories and point out the fact that they were fiction."[14] He stated his company attempted to write fraudulent reports from a left-wing perspective, but found those articles were not shared nearly as much as fake news written from a right-wing point of view.[14] Coler told NPR that consumers of information must be more skeptical of content in order to combat fake news: "Some of this has to fall on the readers themselves. The consumers of content have to be better at identifying this stuff. We have a whole nation of media-illiterate people. Really, there needs to be something done."[14]
Paul Horner, a creator of fraudulent news stories, stated in an interview with The Washington Post that he was making approximately US$10,000 a month through advertisements linked to the fraudulent news.[15][57][58] He claimed to have posted a fraudulent advertisement to Craigslist offering thousands of dollars in payment to protesters, and to have written a story based on this which was later shared online by Trump's campaign manager.[15][57][58] Horner believed that when the stories were shown to be false, this would reflect badly on Trump's supporters who had shared them, but concluded "Looking back, instead of hurting the campaign, I think I helped it. And that feels [bad]."[59]
In a follow-up interview with Rolling Stone, Horner revealed that The Washington Post profile piece on him spurred a surge of interest, with over 60 interview requests from media including ABC News, CBS News, and Inside Edition.[60] Horner explained that his articles appeared legitimate at the top and became increasingly couched in absurdity as the reader progressed: "Most of my stuff, starts off, the first paragraph is super legit, the title is super legit, the picture is super legit, but then the story just gets more and more ridiculous and it becomes obvious that none of it is true."[60] Horner told Rolling Stone that he always placed his name as a fictional character in his fake articles.[60] He said he supported efforts to decrease fake news websites.[60]
Impacts by country
Fake news has influenced political discourse in multiple countries, including Germany,[2] Indonesia and the Philippines,[3] Sweden,[4] China,[5][6] Myanmar,[7][8] and the United States.[9]
Sweden
The Swedish Security Service issued a report in 2015 identifying propaganda from Russia infiltrating Sweden with the objective to: "spread pro-Russian messages and to exacerbate worries and create splits in society."[4]
The Swedish Civil Contingencies Agency (MSB), part of the Ministry of Defence of Sweden, identified fake news reports targeting Sweden in 2016 which originated from Russia.[4] Swedish Civil Contingencies Agency official Mikael Tofvesson stated: "This is going on all the time. The pattern now is that they pump out a constant narrative that in some respects is negative for Sweden."[4]
The Local identified these tactics as a form of psychological warfare.[4] The newspaper reported the MSB identified Russia Today and Sputnik News as "important channels for fake news".[4] As a result of growth in this propaganda in Sweden, the MSB planned to hire six additional security officials to fight back against the campaign of fraudulent information.[4]
2016 U.S. presidential election
Fraudulent stories popularized on Facebook during the 2016 U.S. presidential election included a viral post claiming that Pope Francis had endorsed Donald Trump, and another claiming that actor Denzel Washington "backs Trump in the most epic way possible".[61]
Donald Trump's son and campaign surrogate Eric Trump, top national security adviser Michael T. Flynn, and then-campaign managers Kellyanne Conway and Corey Lewandowski shared fake news stories during the campaign.[59][62][63][64]
U.S. President Barack Obama commented on the significant problem of fraudulent information on social networks impacting elections, in a speech the day before Election Day in 2016: "The way campaigns have unfolded, we just start accepting crazy stuff as normal. And people, if they just repeat attacks enough and outright lies over and over again, as long as it’s on Facebook, and people can see it, as long as it's on social media, people start believing it. And it creates this dust cloud of nonsense."[23][65]
Shortly after the election, Obama again commented on the problem, saying in an appearance with German Chancellor Angela Merkel: "If we are not serious about facts and what’s true and what's not, and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can't discriminate between serious arguments and propaganda, then we have problems."[63][66]
One prominent fraudulent news story released after the election—that protesters at anti-Trump rallies in Austin, Texas, were "bused in"—started as a tweet by one individual with 40 Twitter followers.[67] Over the next three days, the tweet was shared at least 16,000 times on Twitter and 350,000 times on Facebook, and promoted in the conservative blogosphere, before the individual stated that he had fabricated his assertions.[67]
BuzzFeed called the problem an "epidemic of misinformation".[50] According to BuzzFeed's analysis, the 20 top-performing election news stories from fraudulent sites generated more shares, reactions, and comments on Facebook than the 20 top-performing stories from 19 major news outlets.[68][69]
Howard Kurtz, host of the Fox News media-analysis television program Media Buzz, acknowledged fraudulent news was a serious problem.[69] Kurtz relied heavily on the BuzzFeed analysis for his reporting on the controversy, writing that "Facebook is polluting the media environment with garbage".[69] Citing the BuzzFeed investigation, he pointed out: "The legit stuff drew 7,367,000 shares, reactions and comments, while the fictional material drew 8,711,000 shares, reactions and comments."[69] Kurtz concluded that Facebook founder Mark Zuckerberg must admit the website is a media company: "But once Zuckerberg admits he’s actually running one of the most powerful media brands on the planet, he has to get more aggressive about promoting real news and weeding out hoaxers and charlatans. The alternative is to watch Facebook’s own credibility decline."[69]
Worries grew that fake news spread by the Russian government had swayed the outcome of the election, and representatives in the U.S. Congress took action to safeguard the national security of the United States by advancing legislation to monitor incoming propaganda from external threats.[24][70] On 30 November 2016, legislators approved a measure within the National Defense Authorization Act asking the U.S. State Department to take action against foreign propaganda through an interagency panel.[24][70] The legislation authorized funding of $160 million over a two-year period.[24]
The initiative was developed through a bipartisan bill written in March 2016 by US Senators Chris Murphy and Rob Portman titled: Countering Foreign Propaganda and Disinformation Act.[24] US Senator Rob Portman stated: "This propaganda and disinformation threat is real, it’s growing, and right now the U.S. government is asleep at the wheel. The U.S. and our allies face many challenges, but we must better counter and combat the extensive propaganda and disinformation operations directed against us."[24] US Senator Chris Murphy was interviewed by The Washington Post about the legislation and said: "In the wake of this election, it’s pretty clear that the U.S. does not have the tools to combat this massive disinformation machinery that the Russians are running."[24] United States Senate Select Committee on Intelligence member Senator Ron Wyden told The Washington Post: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."[24]
Members of the United States Senate Select Committee on Intelligence traveled to Ukraine and Poland in March 2016 and heard from officials in both countries on Russian operations to influence their affairs.[71] U.S. Senator Angus King told the Portland Press Herald that tactics used by Russia during the 2016 U.S. election cycle were analogous to those used against other countries as well.[71] King recalled: "We were told by various officials in both countries about the Russian standard practice of interfering with elections: planting fake news stories".[71] On 30 November 2016, King joined a letter in which seven members of the U.S. Senate Select Committee on Intelligence asked President Obama to publicize more information from the intelligence community on Russia's role in the U.S. election.[71][72]
Indonesia and Philippines
Fraudulent news has been particularly problematic in Indonesia and the Philippines, where social media has an outsized political influence.[3] According to media analysts, "many developing countries with populations new to both democracy and social media" are particularly vulnerable to the influence of fraudulent news.[3] In some developing countries, "Facebook even offers free smartphone data connections to basic public online services, some news sites and Facebook itself — but limits access to broader sources that could help debunk fake news."[3]
Germany
German Chancellor Angela Merkel lamented the problem of fraudulent news reports in a November 2016 speech, days after announcing her campaign for a fourth term as leader of her country.[2] In a speech to the German parliament, Merkel was critical of such fake sites: "Something has changed -- as globalisation has marched on, (political) debate is taking place in a completely new media environment. Opinions aren't formed the way they were 25 years ago. Today we have fake sites, bots, trolls -- things that regenerate themselves, reinforcing opinions with certain algorithms and we have to learn to deal with them."[2] She warned that such fraudulent news websites were a force increasing the power of populist extremism.[2] Merkel called fraudulent news a growing phenomenon that might need to be regulated in the future.[2]
Germany's foreign intelligence agency Federal Intelligence Service Chief, Bruno Kahl, warned of the potential for cyberattacks by Russia in the 2017 German election.[73] He said the cyberattacks would take the form of the intentional spread of misinformation.[73] Kahl said the goal is to "elicit political uncertainty".[73] Germany's domestic intelligence agency Federal Office for the Protection of the Constitution Chief, Hans-Georg Maassen, said: "The information security of German government, administrative, business, science and research institutions is under permanent threat. ... Russian intelligence agencies are also showing a readiness to [carry out] sabotage."[73]
China
The government of China used the growing problem of fake news as a rationale for increasing internet censorship in China in November 2016.[74] China took the opportunity to publish an editorial in its Communist Party newspaper The Global Times titled "Western Media's Crusade Against Facebook", criticizing the "unpredictable" political problems posed by freedoms enjoyed by users of Twitter, Google, and Facebook.[5] Chinese government leaders meeting in Wuzhen at the third World Internet Conference in November 2016 said fake news in the U.S. election justified adding more curbs to free and open use of the Internet.[6] Ren Xianliang, deputy minister at the Cyberspace Administration of China, said increasing online participation led to additional "harmful information" and that "intimidation and fraud are more common than ever".[75] Kam Chow Wong, a former Hong Kong law enforcement official and criminal justice professor at Xavier University, said at the conference: "it's a good move that the U.S. is trying to regulate social media; it’s overdue."[76] The Wall Street Journal noted China's themes of Internet censorship became more relevant at the World Internet Conference due to the outgrowth of fake news: "China’s efforts to promote its concept of the internet had fresh resonance as Western minds now debate whether social media sites should screen out fake news".[77]
Myanmar
Fake news negatively affected individuals in Myanmar, leading to a rise in violence against Muslims in the country.[7][8] Online participation in the country surged from one percent of Myanmar's population in 2014 to 20 percent in 2016.[7][8] Fake stories from Facebook grew so influential in the country that they were reprinted, for those without Internet access, in paper periodicals titled Facebook and The Internet, which simply regurgitated the website's newsfeed text, often without factual oversight.[8] False reporting about practitioners of Islam in the country was directly correlated with increased attacks on Muslims in Myanmar, and with protests against Muslims.[7][8]
BuzzFeed News journalist Sheera Frenkel reported: "there has also been an increase in articles that demonize the country’s minority Muslim community, with fake news claiming that vast hordes of Muslim worshippers are attacking Buddhist sites. These articles, quickly shared and amplified on social media, have correlated with a surge in anti-Muslim protests and attacks on local Muslim groups."[7][8] Frenkel noted countries that were relatively newer to Internet exposure were more susceptible to the problem, writing: "Countries like Myanmar, which come online quickly and without many government-backed programs to teach safe internet habits — like secure passwords and not revealing personal details online — rank among the lowest in digital literacy. They are the most likely to fall for scams, hacks, and fake news."[8]
Response
Google CEO comment and actions
In the aftermath of the 2016 U.S. presidential election, Google, along with Facebook, faced increased scrutiny over the role of fake-news websites in the election.[80] The top Google search result for the election results linked to a fraudulent news site, "70 News", which had falsely reported that Donald Trump won the popular vote against Hillary Clinton.[78][79][80][81] Regarding the false results posted on "70 News", Google later stated that the site's prominence in search results was a mistake: "In this case we clearly didn't get it right, but we are continually working to improve our algorithms."[82] By Monday, November 14, the "70 News" result was the second link that people saw when searching for results of the race.[80]
When asked shortly after the election whether fraudulent news sites could have changed the election's results, Google CEO Sundar Pichai responded: "Sure" and went on to emphasize the importance of stopping the spread of fraudulent news sites: "Look, it is important to remember this was a very close election and so, just for me, so looking at it scientifically, one in a hundred voters voting one way or the other swings the election either way. ... From our perspective, there should just be no situation where fake news gets distributed, so we are all for doing better here."[83]
On 14 November 2016, Google responded to the growing problem of fraudulent news sites by banning such companies from profiting on advertising from traffic to false articles through its marketing program AdSense.[21][22][80] The company already had a policy of denying ads for dieting ripoffs and counterfeit merchandise.[84] Google stated upon the announcement: "We’ve been working on an update to our publisher policies and will start prohibiting Google ads from being placed on misrepresentative content. Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose of the web property."[85] This builds upon one of Google's existing advertising policies, under which misleading advertising was already banned from Google AdSense.[80][86] The ban is not expected to apply to news satire sites like The Onion, although some satirical sites may be inadvertently blocked under the new system.[80]
Facebook deliberations
Blocking fraudulent advertisers
Facebook followed Google's lead the next day, blocking fake news sites from advertising on its platform.[22][80] Facebook explained its new policy: "We do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news. ... We have updated the policy to explicitly clarify that this applies to fake news. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance."[85] The steps by both Google and Facebook were intended to deny ad revenue to fraudulent news sites; neither company took actions to prevent the dissemination of false stories in search engine results pages or web feeds.[21][87]
Facebook CEO Mark Zuckerberg, in a post on his website, called the notion that fraudulent news sites impacted the 2016 election a "pretty crazy idea", rejecting the suggestion that his website played any role in the outcome of the election.[88][89][90] In a blog post, he stated that more than 99% of content on Facebook was authentic (i.e. neither fake news nor a hoax).[91] In the same blog post, he stated that "News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance."[92] Separately, Zuckerberg advised Facebook users to check the fact-checking website Snopes.com whenever they encounter fake news on Facebook.[93][94]
Top staff members at Facebook did not feel that simply blocking ad revenue from fraudulent sites was a strong enough response to the problem, and on their own initiative they formed a secret group to deal with the issue.[88][89] In response to Zuckerberg's first statement that fraudulent news did not impact the 2016 election, the secret Facebook response group disputed the claim: "It’s not a crazy idea. What’s crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season."[88][89] BuzzFeed reported that the secret task force included "dozens" of Facebook employees.[88][89]
Response
Facebook faced mounting criticism in the days after its decision to revoke advertising revenues from fraudulent news providers without taking any further action.[95][96] After a week of negative coverage in the media, including assertions that the proliferation of fraudulent news on Facebook had handed the 2016 U.S. presidential election to Donald Trump, Mark Zuckerberg published a second post on the issue on 18 November 2016.[95][96] The post reversed his earlier comments discounting the impact of fraudulent news.[96]
Zuckerberg said that filtering out fraudulent news was inherently difficult: "The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible."[95] The New York Times reported that measures being considered but not yet implemented by Facebook included "third-party verification services, better automated detection tools and simpler ways for users to flag suspicious content."[95] The 18 November post did not announce any concrete actions the company would definitively take, or when such measures would be implemented on the website.[95][96]
Many people commented positively under Zuckerberg's second post on fraudulent news.[97] National Public Radio observed the changes being considered by Facebook to identify fraud constituted progress for the company into a new medium: "Together, the projects signal another step in Facebook's evolution from its start as a tech-oriented company to its current status as a complex media platform."[97] On 19 November 2016, BuzzFeed advised Facebook users they could report posts from fraudulent news websites.[98] Users could do so by choosing the report option: "I think it shouldn't be on Facebook", followed by: "It’s a false news story."[98]
In November 2016, Facebook began assessing use of warning labels on fake news.[99] The rollout was at first only available to a few users in a testing phase.[99] A sample warning read: "This website is not a reliable news source. Reason: Classification Pending".[99] TechCrunch analyzed the new feature during the testing phase and surmised it may have a tendency towards false positives.[99]
Impact
Fake news proliferation on Facebook had a negative financial impact for the company. The Economist reported: "Brian Wieser of Pivotal Research recently wrote that the focus on fake news and the concerns over the measurement of advertising could well cut revenue growth by a couple of percentage points."[100]
The New York Times reported shortly after Mark Zuckerberg's second statement on fake news proliferation on his website, that Facebook would engage in assisting the government of China with a version of its software in the country to allow increased censorship by the government.[101] Barron's newspaper contributor William Pesek was highly critical of this move, writing: "By effectively sharing its fake news problem with the most populous nation, Facebook would be a pawn of [China’s President Xi] Jinping's intensifying censorship push."[101]
Fact-checking websites and journalists
Fact-checking websites play a role as debunkers of fraudulent news reports.[102][103][104] Such sites saw large increases in readership and web traffic during the 2016 U.S. election cycle.[102][103] FactCheck.org,[b] PolitiFact.com,[c] Snopes.com,[d] and "The Fact Checker" section of The Washington Post[e] are prominent fact-checking websites that played an important role in debunking fraud.[93][102][104][110] The New Yorker writer Nicholas Lemann wrote on how to address fake news, and called for increasing the roles of FactCheck.org, PolitiFact.com, and Snopes.com in the age of post-truth politics.[111] CNN media correspondent Brian Stelter wrote: "In journalism circles, 2016 is the year of the fact-checker."[102]
By the close of the 2016 U.S. election season, fact-checking websites FactCheck.org, PolitiFact.com, and Snopes.com, had each authored guides on how to respond to fraudulent news.[1][110][112] FactCheck.org advised readers to check the source, author, date, and headline of publications.[110] They recommended their colleagues Snopes.com, The Washington Post Fact Checker, and PolitiFact.com as important resources to rely upon before re-sharing a fraudulent story.[110] FactCheck.org admonished consumers to be wary of their own biases when viewing media they agree with.[110] PolitiFact.com announced they would tag stories as "Fake news" so that readers could view all fraudulent stories they had debunked.[112] Snopes.com warned readers: "So long as social media allows for the rapid spread of information, manipulative entities will seek to cash in on the rapid spread of misinformation."[1]
The Washington Post's "The Fact Checker" section, which is dedicated to evaluating the truth of political claims, greatly increased in popularity during the 2016 election cycle. Glenn Kessler, who runs the Post's "Fact Checker", wrote that "fact-checking websites all experienced huge surges in readership during the election campaign", and that The Fact Checker had five times more unique visitors than during the 2012 cycle.[103] Kessler cited research showing that fact-checks are effective at reducing "the prevalence of a false belief."[103] Will Moy, director of the London-based Full Fact, a UK fact-checking website, said that debunking must take place over a sustained period of time to truly be effective.[103] Full Fact began work to develop multiple products in a partnership with Google to help automate fact-checking.[113]
FactCheck.org former director Brooks Jackson remarked that larger media companies had devoted increased focus to the importance of debunking fraud during the 2016 election: "It's really remarkable to see how big news operations have come around to challenging false and deceitful claims directly. It's about time."[102] FactCheck.org began a new partnership with CNN journalist Jake Tapper in 2016 to examine the veracity of reported claims by candidates.[102]
Angie Drobnic Holan, editor of PolitiFact.com, noted the circumstances warranted support for the practice: "All of the media has embraced fact-checking because there was a story that really needed it."[102] Holan was heartened that fact-checking garnered increased viewership for those engaged in the practice: "Fact-checking is now a proven ratings getter. I think editors and news directors see that now. So that's a plus."[102] Holan cautioned that heads of media companies must strongly support the practice of debunking, as it often provokes hate mail and extreme responses from zealots.[102]
On 17 November 2016, the International Fact-Checking Network (IFCN) published an open letter on the website of the Poynter Institute to Facebook founder and CEO Mark Zuckerberg, imploring him to utilize fact-checkers to help identify fraud on Facebook.[104][114] Created in September 2015, the IFCN is housed within the St. Petersburg, Florida-based Poynter Institute for Media Studies and aims to support the work of 64 member fact-checking organizations around the world.[115][116] Alexios Mantzarlis, co-founder of FactCheckEU.org and former managing editor of the Italian fact-checking site Pagella Politica, was named director and editor of IFCN in September 2015.[115][116] Signatories to the 2016 letter to Zuckerberg represented fact-checking groups from around the world, including Africa Check, FactCheck.org, PolitiFact.com, and The Washington Post Fact Checker.[104][114] The groups wrote they were eager to assist Facebook in rooting out fraudulent news sources on the website.[104][114]
In his second post on the matter on 18 November 2016, Zuckerberg responded to the fraudulent news problem by suggesting usage of fact-checking websites.[93][94] He specifically identified fact-checking website Snopes.com, and pointed out that Facebook monitors links to such debunking websites in reply comments as a method to determine which original posts were fraudulent.[93][94] Zuckerberg explained: "Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread."[93][94]
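Zuckerberg's description amounts to combining several noisy signals (user "false" reports, links to debunking sites in comments) into a decision to down-rank a story. The following is a deliberately simplified sketch of that idea; all function names, weights, and the threshold are invented for illustration, and Facebook's actual classifier is not public:

```python
# Hypothetical sketch of combining the signals Zuckerberg describes:
# user reports that a link is false, plus comment links to debunking
# sites such as Snopes. Weights and threshold are illustrative only.
DEBUNKING_DOMAINS = {"snopes.com", "politifact.com", "factcheck.org"}

def misinformation_score(false_reports: int, comment_domains: list[str],
                         total_shares: int) -> float:
    """Score a shared link from 0.0 (no signal) upward."""
    if total_shares == 0:
        return 0.0
    # Count comment links that point at known debunking sites.
    debunk_links = sum(d in DEBUNKING_DOMAINS for d in comment_domains)
    # Normalize both signals by how widely the link was shared, so a
    # handful of reports on a viral post does not trigger a penalty.
    return (false_reports + 2.0 * debunk_links) / total_shares

def should_penalize(score: float, threshold: float = 0.05) -> bool:
    """Down-rank the story in the feed once the score passes a threshold."""
    return score >= threshold
```

A story shared 1,000 times with 40 "false" reports and ten Snopes links in its comments would score 0.06 and be penalized under these made-up parameters, while a story with a single stray report would not.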
Society of Professional Journalists president Lynn Walsh said in November 2016 that the society would reach out to Facebook in order to provide assistance with weeding out fake news.[117] Walsh said Facebook should evolve and admit that it functioned as a large media company: "The media landscape has evolved. Journalism has evolved, and continues to evolve. So I do hope that while it may not be the original thought that Facebook had. I think they should be now."[117]
Proposed technology tools
New York magazine contributor Brian Feldman responded to an article by media communications professor Melissa Zimdars, and used her list to create a Google Chrome extension that would warn users about fraudulent news sites.[118] He invited others to use his code and improve upon it.[118]
Slate magazine senior technology editor Will Oremus wrote that while fraudulent news sites were controversial, their prevalence was obscuring a wider discussion about the negative impact on society of consuming media only from one particular tailored viewpoint, which perpetuates filter bubbles.[119]
Upworthy co-founder and The Filter Bubble author Eli Pariser launched an open-source model initiative on 17 November 2016 to address false news.[120][121] Pariser began a Google Document to collaborate with others online on how to lessen the phenomenon of fraudulent news.[120][121] Pariser called his initiative: "Design Solutions for Fake News".[120] Pariser's document included recommendations for a ratings organization analogous to the Better Business Bureau, and a database on media producers in a format like Wikipedia.[120][121]
Writing for Fortune, Matthew Ingram agreed with the idea that Wikipedia could serve as a helpful model to improve Facebook's analysis of potentially fake news.[122] Ingram concluded: "If Facebook could somehow either tap into or recreate the kind of networked fact checking that Wikipedia does on a daily basis, using existing elements like the websites of Politifact and others, it might actually go some distance towards being a possible solution."[122]
Academic analysis
Writing for MIT Technology Review, Jamie Condliffe said that merely banning ad revenue from the fraudulent news sites was not enough action by Facebook to effectively deal with the problem.[48] He wrote: "The post-election furor surrounding Facebook’s fake-news problem has sparked new initiatives to halt the provision of ads to sites that peddle false information. But it’s only a partial solution to the problem: for now, hoaxes and fabricated stories will continue to appear in feeds."[48] Condliffe concluded: "Clearly Facebook needs to do something to address the issue of misinformation, and it’s making a start. But the ultimate solution is probably more significant, and rather more complex, than a simple ad ban."[48]
Indiana University informatics and computer science professor Filippo Menczer commented on the steps by Google and Facebook to deny fraudulent news sites advertising revenue: "One of the incentives for a good portion of fake news is money. This could cut the income that creates the incentive to create the fake news sites."[123] Menczer's research team developed an online tool, Hoaxy, to track the spread of unconfirmed assertions and related debunking on the Internet.[124]
Dartmouth College political scientist Brendan Nyhan has criticized Facebook for "doing so little to combat fake news... Facebook should be fighting misinformation, not amplifying it."[68]
Zeynep Tufekci wrote critically about Facebook's stance on fraudulent news sites in a piece for The New York Times, pointing out that fraudulent websites in Macedonia profited handsomely from false stories about the 2016 U.S. election: "The company's business model, algorithms and policies entrench echo chambers and fuel the spread of misinformation."[125]
Merrimack College assistant professor of media studies Melissa Zimdars wrote the article "False, Misleading, Clickbait-y and Satirical 'News' Sources", in which she advised readers how to identify fake news sites.[126] Zimdars identified strange domain names, lack of author attribution, poor website layout, use of all caps, and URLs ending in "lo" or "com.co" as red flags.[126] In evaluating whether a website contains fake news, Zimdars recommends that readers check the site's "About Us" page and consider whether reputable news outlets are reporting the same story.[126]
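The domain-based red flags above lend themselves to a simple automated check. The following is a minimal sketch, not part of Zimdars's article; the suffix list reflects only the endings she names, and the function name is an assumption for illustration:

```python
from urllib.parse import urlparse

# Suffixes Zimdars flags as red flags ("lo", "com.co" endings).
# This list is illustrative, not her complete criteria.
SUSPECT_SUFFIXES = (".lo", ".com.co")

def has_suspect_domain(url: str) -> bool:
    """Return True if the URL's host ends in one of the flagged suffixes."""
    host = urlparse(url).netloc.lower()
    return host.endswith(SUSPECT_SUFFIXES)

print(has_suspect_domain("http://abcnews.com.co/story"))   # True
print(has_suspect_domain("http://abcnews.go.com/story"))   # False
```

Note that the check must be applied to the hostname, not the full URL, since a path component would otherwise mask the suspect ending.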
Education and history professor Sam Wineburg of the Stanford Graduate School of Education at Stanford University and colleague Sarah McGrew authored a 2016 study analyzing students' ability to discern fraudulent news from factual reporting.[127][128] The study, conducted over the course of a year, drew on a sample of over 7,800 responses from university, secondary, and middle school students in 12 U.S. states.[127][128] The researchers were "shocked" at the "stunning and dismaying consistency" with which students judged fraudulent news reports to be factual.[127][128] The study found that 82 percent of middle school students were unable to distinguish an advertisement labeled as sponsored content from an actual online news article.[129] The authors concluded that the solution was to teach consumers of online media to behave like fact-checkers themselves, actively questioning the veracity of all sources they encounter.[127][128]
Scientist Emily Willingham proposed applying the scientific method to fake news analysis.[130] She had previously written about differentiating science from pseudoscience, and applied the same logic to fake news.[130] Her recommended steps were: observe, question, hypothesize, analyze data, draw a conclusion, and act on the results.[130] Willingham suggested starting from the hypothesis "This is real news", then posing a rigorous set of questions to attempt to disprove it.[130] These tests included checking the URL and the article's date, evaluating reader and writer bias, double-checking the evidence, and verifying the cited sources.[130]
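Willingham's procedure of trying to disprove the hypothesis "this is real news" can be sketched as a small checklist. This is an illustrative sketch only; the `Article` fields and the particular checks are assumptions standing in for her questions, not her wording:

```python
from dataclasses import dataclass
from urllib.parse import urlparse

@dataclass
class Article:
    url: str
    date: str             # publication date, e.g. "2016-11-15"
    sources_cited: list   # URLs of cited sources

def checks(article: Article) -> dict:
    """Disproof-oriented tests; any False result challenges the hypothesis."""
    host = urlparse(article.url).netloc.lower()
    return {
        "plausible_domain": not host.endswith((".lo", ".com.co")),
        "has_date": bool(article.date),
        "cites_sources": len(article.sources_cited) > 0,
    }

def hypothesis_survives(article: Article) -> bool:
    # "This is real news" survives only if no test disproves it.
    return all(checks(article).values())
```

In this framing, each check is an attempt at falsification rather than confirmation, which mirrors the scientific-method structure Willingham describes.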
Media commentary
Full Frontal
Samantha Bee traveled to Russia for her television show Full Frontal and met with individuals paid by the Russian government to act as Internet trolls and attempt to subvert the 2016 U.S. election. The man and woman interviewed by Bee said they influenced the election by commenting on the websites of the New York Post, The Wall Street Journal, and The Washington Post, as well as on Twitter and Facebook.[131][132][133] They kept their identities covert and maintained cover identities separate from their real Russian names, with the woman claiming in posts to be a housewife living in Nebraska. They blamed consumers for believing everything they read online.[131][132][133]
Executive producers for Full Frontal told The Daily Beast that they relied upon writer Adrian Chen, who had previously reported on Russian trolls for The New York Times Magazine in 2015, as a resource for contacting Russians willing to be interviewed by Bee. The Russian trolls wore masks on camera and asked Full Frontal producers to keep all of their fake accounts confidential so they would not be publicly identified. Full Frontal producers paid the Russian trolls to use the Twitter hashtag #SleazySam to troll the show itself, so the production staff could verify that the trolls were indeed able to manipulate content online as they claimed.[133]
After their research in Russia for a second Full Frontal segment, the production staff concluded that Russian leader Vladimir Putin supported Donald Trump for U.S. President in order to subvert American democracy.[133] Television producer Razan Ghalayini explained to The Daily Beast: "Russia is an authoritarian regime and authoritarian regimes don’t benefit from the vision of democracy being the best version of governance." Television producer Miles Kahn concurred, adding: "It’s not so much that Putin wants Trump. He probably prefers him in the long run, but he would almost rather the election be contested. They want chaos."[133]
Last Week Tonight
John Oliver commented on his comedy program Last Week Tonight, in one of his segments about Donald Trump, that the problem of fraudulent news sites fed into a wider issue of echo chambers in the media. Oliver lamented: "Fake facts circulate on social media to a frightening extent." He pointed out such sites often only exist to draw in profit from web traffic: "There is now a whole cottage industry specializing in hyper-partisan, sometimes wildly distorted clickbait."[55]
Other media
Critics contended that fraudulent news on Facebook may have been responsible for Donald Trump winning the 2016 U.S. presidential election, because most of the fake news stories Facebook allowed to spread portrayed him in a positive light.[91] Facebook is not liable for posting or publicizing fake content because, under the Communications Decency Act, interactive computer services cannot be held responsible for information provided by another internet entity. Some legal scholars, like Keith Altman, think that Facebook's huge scale creates such a large potential for fake news to spread that this law may need to be changed.[134] Writing for The Washington Post, Institute for Democracy in Eastern Europe co-director Eric Chenoweth pointed to "many 'fake news' stories that evidence suggests were generated by Russian intelligence operations".[135]
BBC News interviewed a fraudulent news site writer using the pseudonym "Chief Reporter (CR)", who defended his actions and possible influence on elections: "If enough of an electorate are in a frame of mind where they will believe absolutely everything they read on the internet, to a certain extent they have to be prepared to deal with the consequences."[136]
See also
- 2016 Democratic National Committee email leak
- Post-truth politics
- Clickbait
- Confirmation bias
- Cyberwarfare by Russia
- Democratic National Committee cyber attacks
- Disinformation
- Echo chamber (media)
- Fancy Bear
- Filter bubble
- Guccifer 2.0
- Hybrid warfare
- List of satirical news websites
- Russian espionage in the United States
- Russian propaganda
- Selective exposure theory
- Spiral of silence
- State-sponsored Internet propaganda
- Tribe (internet)
- Trolls from Olgino
- Web brigades
Footnotes
- ^ The Washington Post and the Associated Press described PropOrNot as a nonpartisan foreign policy analysis group composed of persons with prior experience in international relations, warfare, and information technology sectors.[16][17][18] PropOrNot received criticism from sources including The Intercept[44] and Fortune magazine for casting too wide a net in its identification list.[45]
- ^ FactCheck.org, a nonprofit organization and a project of the Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania,[105] won a 2010 Sigma Delta Chi Award from the Society of Professional Journalists.[106]
- ^ PolitiFact.com, run by the Tampa Bay Times,[107] received a 2009 Pulitzer Prize for National Reporting for its fact-checking efforts the previous year.[107]
- ^ Snopes.com, privately run by Barbara and David Mikkelson, was given "high praise" by FactCheck.org, another fact-checking website;[108] in addition, Network World gave Snopes.com a grade of "A" in a meta-analysis of fact-checking websites.[109]
- ^ "The Fact Checker" is a project by The Washington Post to analyze political claims.[102] Their colleagues and competitors at FactCheck.org recommended The Fact Checker as a resource to use before assuming a story is factual.[110]
References
- ^ a b c LaCapria, Kim (2 November 2016), "Snopes' Field Guide to Fake News Sites and Hoax Purveyors - Snopes.com's updated guide to the internet's clickbaiting, news-faking, social media exploiting dark side.", Snopes.com, retrieved 19 November 2016
- ^ a b c d e f g h "Merkel warns against fake news driving populist gains", Yahoo! News, Agence France-Presse, 23 November 2016, retrieved 23 November 2016
- ^ a b c d e f Paul Mozur and Mark Scott (17 November 2016), "Fake News on Facebook? In Foreign Elections, That's Not New", The New York Times, retrieved 18 November 2016
- ^ a b c d e f g h i j k "Concern over barrage of fake Russian news in Sweden", The Local, 27 July 2016, retrieved 25 November 2016
- ^ a b c Eunice Yoon and Barry Huang (22 November 2016), "China on US fake news debate: We told you so", CNBC, retrieved 28 November 2016
- ^ a b c Cadell, Catherine (19 November 2016), China says terrorism, fake news impel greater global internet curbs, retrieved 28 November 2016
- ^ a b c d e f Read, Max (27 November 2016), "Maybe the Internet Isn't a Fantastic Tool for Democracy After All", New York Magazine, retrieved 28 November 2016
- ^ a b c d e f g h Frenkel, Sheera (20 November 2016), "This Is What Happens When Millions Of People Suddenly Get The Internet", BuzzFeed News, retrieved 28 November 2016
- ^ a b c d e f g h i j k l m Weisburd, Andrew; Watts, Clint (6 August 2016), "Trolls for Trump - How Russia Dominates Your Twitter Feed to Promote Lies (And, Trump, Too)", The Daily Beast, retrieved 24 November 2016
- ^ a b c d e f Lewis Sanders IV (11 October 2016), "'Divide Europe': European lawmakers warn of Russian propaganda", Deutsche Welle, retrieved 24 November 2016
- ^ a b c d e f g Dan Tynan (24 August 2016), "How Facebook powers money machines for obscure political 'news' sites - From Macedonia to the San Francisco Bay, clickbait political sites are cashing in on Trumpmania – and they're getting a big boost from Facebook", The Guardian, retrieved 18 November 2016
- ^ a b c Ben Gilbert (15 November 2016), "Fed up with fake news, Facebook users are solving the problem with a simple list", Business Insider, retrieved 16 November 2016,
Some of these sites are intended to look like real publications (there are false versions of major outlets like ABC and MSNBC) but share only fake news; others are straight-up propaganda created by foreign nations (Russia and Macedonia, among others).
- ^ a b c d e f g h i j Townsend, Tess (21 November 2016), "Meet the Romanian Trump Fan Behind a Major Fake News Site", Inc. magazine, ISSN 0162-8968, retrieved 23 November 2016
- ^ a b c d e f g Sydell, Laura (23 November 2016), "We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned", All Things Considered, National Public Radio, retrieved 26 November 2016
- ^ a b c d THR staff (17 November 2016), "Facebook Fake News Writer Reveals How He Tricked Trump Supporters and Possibly Influenced Election", The Hollywood Reporter, retrieved 18 November 2016
- ^ a b c d e f g h i j k l Timberg, Craig (24 November 2016), "Russian propaganda effort helped spread 'fake news' during election, experts say", The Washington Post, retrieved 25 November 2016,
Two teams of independent researchers found that the Russians exploited American-made technology platforms to attack U.S. democracy at a particularly vulnerable moment
- ^ a b c d e f "Russian propaganda effort likely behind flood of fake news that preceded election", PBS NewsHour, Associated Press, 25 November 2016, retrieved 26 November 2016
- ^ a b c d "Russian propaganda campaign reportedly spread 'fake news' during US election", Nine News, Agence France-Presse, 26 November 2016, retrieved 26 November 2016
- ^ a b c d e Ali Watkins and Sheera Frenkel (30 November 2016), "Intel Officials Believe Russia Spreads Fake News", BuzzFeed News, retrieved 1 December 2016
- ^ a b c d e f Strohm, Chris (1 December 2016), "Russia Weaponized Social Media in U.S. Election, FireEye Says", Bloomberg News, retrieved 1 December 2016
- ^ a b c "Google and Facebook target fake news sites with advertising clampdown", Belfast Telegraph, 15 November 2016, retrieved 16 November 2016
- ^ a b c Shanika Gunaratna (15 November 2016), "Facebook, Google announce new policies to fight fake news", CBS News, retrieved 16 November 2016
- ^ a b John Ribeiro (14 November 2016), "Zuckerberg says fake news on Facebook didn't tilt the elections", Computerworld, retrieved 16 November 2016
- ^ a b c d e f g h i Timberg, Craig (30 November 2016), "Effort to combat foreign propaganda advances in Congress", The Washington Post, retrieved 1 December 2016
- ^ a b c d e f Chen, Adrian (27 July 2016), "The Real Paranoia-Inducing Purpose of Russian Hacks", The New Yorker, retrieved 26 November 2016
- ^ a b c d e Lewis Sanders IV (17 November 2016), "Fake news: Media's post-truth problem", Deutsche Welle, retrieved 24 November 2016
- ^ European Parliament Committee on Foreign Affairs (23 November 2016), "MEPs sound alarm on anti-EU propaganda from Russia and Islamist terrorist groups" (PDF), European Parliament, retrieved 26 November 2016
- ^ a b Surana, Kavitha (23 November 2016), "The EU Moves to Counter Russian Disinformation Campaign", Foreign Policy, ISSN 0015-7228, retrieved 24 November 2016
- ^ "EU Parliament Urges Fight Against Russia's 'Fake News'", Radio Free Europe/Radio Liberty, Agence France-Presse and Reuters, 23 November 2016, retrieved 24 November 2016
- ^ a b MacFarquhar, Neil (29 August 2016), "A Powerful Russian Weapon: The Spread of False Stories", The New York Times, p. A1, retrieved 24 November 2016
- ^ a b c d e Porter, Tom (28 November 2016), "How US and EU failings allowed Kremlin propaganda and fake news to spread through the West", International Business Times, retrieved 29 November 2016
- ^ a b c d e f g Schindler, John R. (5 November 2015), "Obama Fails to Fight Putin's Propaganda Machine", New York Observer, retrieved 28 November 2016
- ^ a b c d e f g Schindler, John R. (26 November 2016), "The Kremlin Didn't Sink Hillary—Obama Did", New York Observer, retrieved 28 November 2016
- ^ a b c d LoGiurato, Brett (29 April 2014), "Russia's Propaganda Channel Just Got A Journalism Lesson From The US State Department", Business Insider, retrieved 29 November 2016
- ^ LoGiurato, Brett (25 April 2014), "RT Is Very Upset With John Kerry For Blasting Them As Putin's 'Propaganda Bullhorn'", Business Insider, retrieved 29 November 2016
- ^ a b c Stengel, Richard (29 April 2014), "Russia Today's Disinformation Campaign", Dipnote, United States Department of State, retrieved 28 November 2016
- ^ a b c d Dougherty, Jill (2 December 2016), "The reality behind Russia's fake news", CNN, retrieved 2 December 2016
- ^ a b c d "U.S. officials defend integrity of vote, despite hacking fears", WITN-TV, 26 November 2016, retrieved 2 December 2016
- ^ a b Benedictus, Leo (6 November 2016), "Invasion of the troll armies: from Russian Trump supporters to Turkish state stooges", The Guardian, retrieved 2 December 2016
- ^ a b c Schatz, Bryan, "The Kremlin Would Be Proud of Trump's Propaganda Playbook", Mother Jones, retrieved 2 December 2016
- ^ a b c d e f g "Vladimir Putin Wins the Election No Matter Who The Next President Is", The Daily Beast, 4 November 2016, retrieved 2 December 2016
- ^ a b c d e f Frenkel, Sheera (4 November 2016), "US Officials Are More Worried About The Media Being Hacked Than The Ballot Box", BuzzFeed News, retrieved 2 December 2016
- ^ a b c d e Shapiro, Ari (25 November 2016), "Experts Say Russian Propaganda Helped Spread Fake News During Election", All Things Considered, National Public Radio, retrieved 26 November 2016
- ^ a b c d Ben Norton; Glenn Greenwald (26 November 2016), "Washington Post Disgracefully Promotes a McCarthyite Blacklist From a New, Hidden, and Very Shady Group", The Intercept, retrieved 27 November 2016
- ^ a b c d Ingram, Matthew (25 November 2016), "No, Russian Agents Are Not Behind Every Piece of Fake News You See", Fortune magazine, retrieved 27 November 2016
- ^ a b Taibbi, Matt (28 November 2016), "The 'Washington Post' 'Blacklist' Story Is Shameful and Disgusting", Rolling Stone, retrieved 30 November 2016
- ^ a b vanden Heuvel, Katrina (29 November 2016), "Putin didn't undermine the election. We did.", The Washington Post, retrieved 1 December 2016
- ^ a b c d Jamie Condliffe (15 November 2016), "Facebook's Fake-News Ad Ban Is Not Enough", MIT Technology Review, retrieved 16 November 2016
- ^ a b Craig Silverman and Lawrence Alexander (3 November 2016), "How Teens In The Balkans Are Duping Trump Supporters With Fake News", BuzzFeed, retrieved 16 November 2016,
As a result, this strange hub of pro-Trump sites in the former Yugoslav Republic of Macedonia is now playing a significant role in propagating the kind of false and misleading content that was identified in a recent BuzzFeed News analysis of hyperpartisan Facebook pages.
- ^ a b Ishmael N. Daro and Craig Silverman (15 November 2016), "Fake News Sites Are Not Terribly Worried About Google Kicking Them Off AdSense", BuzzFeed, retrieved 16 November 2016
- ^ a b Christopher Woolf (16 November 2016), "Kids in Macedonia made up and circulated many false news stories in the US election", Public Radio International, retrieved 18 November 2016
- ^ a b c d e f g Collins, Ben (28 October 2016), "This 'Conservative News Site' Trended on Facebook, Showed Up on Fox News—and Duped the World", The Daily Beast, retrieved 27 November 2016
- ^ a b c d e f Chacon, Marco (21 November 2016), "I've Been Making Viral Fake News for the Last Six Months. It's Way Too Easy to Dupe the Right on the Internet.", The Daily Beast, retrieved 27 November 2016
- ^ a b c Bambury, Brent (25 November 2016), "Marco Chacon meant his fake election news to be satire — but people took it as fact", Day 6, CBC Radio One, retrieved 27 November 2016
- ^ a b Rachel Dicker (14 November 2016), "Avoid These Fake News Sites at All Costs", U.S. News & World Report, retrieved 16 November 2016
- ^ Chang, Juju (29 November 2016), "When Fake News Stories Make Real News Headlines", ABC News, retrieved 29 November 2016
- ^ a b McAlone, Nathan (17 November 2016), "This fake-news writer says he makes over $10,000 a month, and he thinks he helped get Trump elected", Business Insider, retrieved 18 November 2016
- ^ a b Goist, Robin (17 November 2016), "The fake news of Facebook", The Plain Dealer, retrieved 18 November 2016
- ^ a b Dewey, Caitlin (17 November 2016), "Facebook fake-news writer: 'I think Donald Trump is in the White House because of me'", The Washington Post, ISSN 0190-8286, retrieved 17 November 2016
- ^ a b c d Hedegaard, Erik (29 November 2016), "How a Fake Newsman Accidentally Helped Trump Win the White House - Paul Horner thought he was trolling Trump supporters – but after the election, the joke was on him", Rolling Stone, retrieved 29 November 2016
- ^ Alyssa Newcomb (15 November 2016), "Facebook, Google Crack Down on Fake News Advertising", NBC News, NBC News, retrieved 16 November 2016
- ^ Drum, Kevin (17 November 2016), "Meet Ret. General Michael Flynn, the Most Gullible Guy in the Army", Mother Jones, retrieved 18 November 2016
- ^ a b Tapper, Jake (17 November 2016), "Fake news stories thriving on social media - Phony news stories are thriving on social media, so much so President Obama addressed it. CNN's Jake Tapper reports.", CNN, retrieved 18 November 2016
- ^ Masnick, Mike (14 October 2016), "Donald Trump's Son & Campaign Manager Both Tweet Obviously Fake Story", Techdirt, retrieved 18 November 2016
- ^ President Barack Obama (7 November 2016), Remarks by the President at Hillary for America Rally in Ann Arbor, Michigan, White House Office of the Press Secretary, retrieved 16 November 2016
- ^ Gardiner Harris and Melissa Eddy (17 November 2016), "Obama, With Angela Merkel in Berlin, Assails Spread of Fake News", The New York Times, retrieved 18 November 2016
- ^ a b Maheshwari, Sapna (20 November 2016), "How Fake News Goes Viral", The New York Times, ISSN 0362-4331, retrieved 20 November 2016
- ^ a b c d e f Kurtz, Howard, "Fake news and the election: Why Facebook is polluting the media environment with garbage", Fox News, archived from the original on 18 November 2016, retrieved 18 November 2016
- ^ a b Porter, Tom (1 December 2016), "US House of representatives backs proposal to counter global Russian subversion", International Business Times UK edition, retrieved 1 December 2016
- ^ a b c d Miller, Kevin (1 December 2016), "Angus King: Russian involvement in U.S. election 'an arrow aimed at the heart of democracy'", Portland Press Herald, retrieved 2 December 2016
- ^ Staff report (30 November 2016), "Angus King among senators asking president to declassify information about Russia and election", Portland Press Herald, retrieved 2 December 2016
- ^ a b c d Murdock, Jason (30 November 2016), "Russian hackers may disrupt Germany's 2017 election warns spy chief", International Business Times UK edition, retrieved 1 December 2016
- ^ Orlowski, Andrew (21 November 2016), "China cites Trump to justify 'fake news' media clampdown. Surprised?", The Register, retrieved 28 November 2016
- ^ Pascaline, Mary (20 November 2016), "Facebook Fake News Stories: China Calls For More Censorship On Internet Following Social Media's Alleged Role In US Election", International Business Times, retrieved 28 November 2016
- ^ Rauhala, Emily (17 November 2016), "After Trump, Americans want Facebook and Google to vet news. So does China.", The Washington Post, retrieved 28 November 2016
- ^ Dou, Eva (18 November 2016), "China Presses Tech Firms to Police the Internet - Third-annual World Internet Conference aimed at proselytizing China's view to global audience", The Wall Street Journal, retrieved 28 November 2016
- ^ a b Bump, Philip (14 November 2016), "Google's top news link for 'final election results' goes to a fake news site with false numbers", The Washington Post, retrieved 26 November 2016
- ^ a b Jacobson, Louis (14 November 2016), "No, Donald Trump is not beating Hillary Clinton in the popular vote", PolitiFact.com, retrieved 26 November 2016
- ^ a b c d e f g Wingfield, Nick; Isaac, Mike; Benner, Katie (14 November 2016), "Google and Facebook Take Aim at Fake News Sites", The New York Times, retrieved 28 November 2016
- ^ Sonam Sheth (14 November 2016), "Google looking into grossly inaccurate top news search result displayed as final popular-vote tally", Business Insider, retrieved 16 November 2016
- ^ "Google to ban fake news sites from its advertising network", Los Angeles Times, Associated Press, 14 November 2016, retrieved 16 November 2016
- ^ Avery Hartmans (15 November 2016), "Google's CEO says fake news could have swung the election", Business Insider, retrieved 16 November 2016
- ^ "Google cracks down on fake news sites", The Straits Times, 15 November 2016, retrieved 16 November 2016
- ^ a b Richard Waters (15 November 2016), "Facebook and Google to restrict ads on fake news sites", Financial Times, retrieved 16 November 2016
- ^ Sridhar Ramaswamy (21 January 2016), "How we fought bad ads in 2015", Google blog, Google, retrieved 28 November 2016
- ^ Paul Blake (15 November 2016), "Google, Facebook Move to Block Fake News From Ad Services", ABC News, retrieved 16 November 2016
- ^ a b c d Gina Hall (15 November 2016), "Facebook staffers form an unofficial task force to look into fake news problem", Silicon Valley Business Journal, retrieved 16 November 2016
- ^ a b c d Frenkel, Sheera (14 November 2016), "Renegade Facebook Employees Form Task Force To Battle Fake News", BuzzFeed, retrieved 18 November 2016
- ^ Shahani, Aarti (15 November 2016), "Facebook, Google Take Steps To Confront Fake News", National Public Radio, retrieved 20 November 2016
- ^ a b Cooke, Kristina (15 November 2016), Google, Facebook move to restrict ads on fake news sites, retrieved 20 November 2016
- ^ "Facebook's Fake News Problem: What's Its Responsibility?", The New York Times, Associated Press, 15 November 2016, retrieved 20 November 2016
- ^ a b c d e Ohlheiser, Abby (19 November 2016), "Mark Zuckerberg outlines Facebook's ideas to battle fake news", The Washington Post, retrieved 19 November 2016
- ^ a b c d Vladimirov, Nikita (19 November 2016), "Zuckerberg outlines Facebook's plan to fight fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
- ^ a b c d e Mike Isaac (19 November 2016), "Facebook Considering Ways to Combat Fake News, Mark Zuckerberg Says", The New York Times, retrieved 19 November 2016
- ^ a b c d Samuel Burke (19 November 2016), "Zuckerberg: Facebook will develop tools to fight fake news", CNNMoney, CNN, retrieved 19 November 2016
- ^ a b Chappell, Bill (19 November 2016), "'Misinformation' On Facebook: Zuckerberg Lists Ways Of Fighting Fake News", National Public Radio, retrieved 19 November 2016
- ^ a b Silverman, Craig (19 November 2016), "This Is How You Can Stop Fake News From Spreading On Facebook", BuzzFeed, retrieved 20 November 2016
- ^ a b c d Taylor Hatmaker and Josh Constine (1 December 2016), "Facebook quietly tests warnings on fake news", TechCrunch, retrieved 2 December 2016
- ^ "False news items are not the only problem besetting Facebook", The Economist, 26 November 2016, retrieved 28 November 2016
- ^ a b Pesek, William (27 November 2016), "Will Facebook be China's propaganda tool?", The Japan Times, Barron's newspaper, retrieved 28 November 2016
- ^ a b c d e f g h i j Stelter, Brian (7 November 2016), "How Donald Trump made fact-checking great again", CNNMoney, CNN, retrieved 19 November 2016
- ^ a b c d e f Kessler, Glenn (10 November 2016), "Fact checking in the aftermath of a historic election", The Washington Post, retrieved 19 November 2016
- ^ a b c d e Neidig, Harper (17 November 2016), "Fact-checkers call on Zuckerberg to address spread of fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
- ^ Hartlaub, Peter (24 October 2004), "Web sites help gauge the veracity of claims / Online resources check ads, rumors", San Francisco Chronicle, p. A1, retrieved 25 November 2016
- ^ "Fact-Checking Deceptive Claims About the Federal Health Care Legislation - by Staff, FactCheck.org", 2010 Sigma Delta Chi Award Honorees, Society of Professional Journalists, 2010, retrieved 25 November 2016
- ^ a b Columbia University (2009), "National Reporting - Staff of St. Petersburg Times", 2009 Pulitzer Prize Winners, retrieved 24 November 2016,
For "PolitiFact," its fact-checking initiative during the 2008 presidential campaign that used probing reporters and the power of the World Wide Web to examine more than 750 political claims, separating rhetoric from truth to enlighten voters.
- ^ Novak, Viveca (10 April 2009), "Ask FactCheck - Snopes.com", FactCheck.org, retrieved 25 November 2016
- ^ McNamara, Paul (13 April 2009), "Fact-checking the fact-checkers: Snopes.com gets an 'A'", Network World, retrieved 25 November 2016
- ^ a b c d e f Lori Robertson and Eugene Kiely (18 November 2016), "How to Spot Fake News", FactCheck.org, retrieved 19 November 2016
- ^ Lemann, Nicholas (30 November 2016), "Solving the Problem of Fake News", The New Yorker, retrieved 30 November 2016
- ^ a b Sharockman, Aaron (16 November 2016), "Let's fight back against fake news", PolitiFact.com, retrieved 19 November 2016
- ^ Burgess, Matt (17 November 2016), "Google is helping Full Fact create an automated, real-time fact-checker", Wired magazine UK edition, retrieved 29 November 2016
- ^ a b c The International Fact-Checking Network (17 November 2016), "An open letter to Mark Zuckerberg from the world's fact-checkers", Poynter Institute, retrieved 19 November 2016
- ^ a b Hare, Kristen (21 September 2015), Poynter names director and editor for new International Fact-Checking Network, Poynter Institute for Media Studies, retrieved 20 November 2016
- ^ a b About the International Fact-Checking Network, Poynter Institute for Media Studies, 2016, retrieved 20 November 2016
- ^ a b Klasfeld, Adam (22 November 2016), "Fake News Gives Facebook a Nixon-Goes-to-China Moment", Courthouse News Service, retrieved 28 November 2016
- ^ a b Brian Feldman (15 November 2016), "Here's a Chrome Extension That Will Flag Fake-News Sites for You", New York Magazine, retrieved 16 November 2016
- ^ Will Oremus (15 November 2016), "The Real Problem Behind the Fake News", Slate magazine, retrieved 16 November 2016
- ^ a b c d Morris, David Z. (27 November 2016), "Eli Pariser's Crowdsourced Brain Trust Is Tackling Fake News", Fortune magazine, retrieved 28 November 2016
- ^ a b c Burgess, Matt (25 November 2016), "Hive mind assemble! There is now a crowdsourcing campaign to solve the problem of fake news", Wired magazine UK edition, retrieved 29 November 2016
- ^ a b Ingram, Matthew (21 November 2016), "Facebook Doesn't Need One Editor, It Needs 1,000 of Them", Fortune magazine, retrieved 29 November 2016
- ^ "Google, Facebook move to curb ads on fake news sites", Kuwait Times, Reuters, 15 November 2016, retrieved 16 November 2016
- ^ Menczer, Filippo (28 November 2016), "Fake Online News Spreads Through Social Echo Chambers", Scientific American, The Conversation, retrieved 29 November 2016
- ^ Douglas Perry (15 November 2016), "Facebook, Google try to drain the fake-news swamp without angering partisans", The Oregonian, retrieved 16 November 2016
- ^ a b c Cassandra Jaramillo (15 November 2016), "How to break it to your friends and family that they're sharing fake news", The Dallas Morning News, retrieved 16 November 2016
- ^ a b c d Domonoske, Camila (23 November 2016), "Students Have 'Dismaying' Inability To Tell Fake News From Real, Study Finds", National Public Radio, retrieved 25 November 2016
- ^ a b c d McEvers, Kelly (22 November 2016), "Stanford Study Finds Most Students Vulnerable To Fake News", National Public Radio, retrieved 25 November 2016
- ^ Shellenbarger, Sue (21 November 2016), "Most Students Don't Know When News Is Fake, Stanford Study Finds", The Wall Street Journal, retrieved 29 November 2016
- ^ a b c d e Willingham, Emily (28 November 2016), "A Scientific Approach To Distinguishing Real From Fake News", Forbes magazine, retrieved 29 November 2016
- ^ a b "Samantha Bee Interviews Russian Trolls, Asks Them About 'Subverting Democracy'", The Hollywood Reporter, 1 November 2016, retrieved 25 November 2016
- ^ a b Holub, Christian (1 November 2016), "Samantha Bee interviews actual Russian trolls", Entertainment Weekly, retrieved 25 November 2016
- ^ a b c d e Wilstein, Matt (7 November 2016), "How Samantha Bee's 'Full Frontal' Tracked Down Russia's Pro-Trump Trolls", The Daily Beast, retrieved 25 November 2016
- ^ Rogers, James (11 November 2016), "Facebook's 'fake news' highlights need for social media revamp, experts say", Fox News, retrieved 20 November 2016
- ^ Chenoweth, Eric (25 November 2016), "Americans keep looking away from the election's most alarming story", The Washington Post, retrieved 26 November 2016
- ^ "'I write fake news that gets shared on Facebook'", BBC News, BBC, 15 November 2016, retrieved 16 November 2016
Further reading
- Jamie Condliffe (15 November 2016), "Facebook's Fake-News Ad Ban Is Not Enough", MIT Technology Review, retrieved 16 November 2016
- Cassandra Jaramillo (15 November 2016), "How to break it to your friends and family that they're sharing fake news", The Dallas Morning News, retrieved 16 November 2016
- Craig Silverman and Lawrence Alexander (3 November 2016), "How Teens In The Balkans Are Duping Trump Supporters With Fake News", BuzzFeed, retrieved 16 November 2016
- Ishmael N. Daro and Craig Silverman (15 November 2016), "Fake News Sites Are Not Terribly Worried About Google Kicking Them Off AdSense", BuzzFeed, retrieved 16 November 2016
- Craig Silverman (16 November 2016), "Viral Fake Election News Outperformed Real News On Facebook In Final Months Of The US Election", BuzzFeed, retrieved 16 November 2016
- Timberg, Craig (24 November 2016), "Russian propaganda effort helped spread 'fake news' during election, experts say", The Washington Post, retrieved 25 November 2016
External links
- Kim LaCapria (2 November 2016), "Snopes' Field Guide to Fake News Sites and Hoax Purveyors", Snopes.com, snopes.com, retrieved 16 November 2016
- Rachel Dicker (14 November 2016), "Avoid These Fake News Sites at All Costs", U.S. News & World Report, retrieved 16 November 2016
- Lori Robertson and Eugene Kiely (18 November 2016), "How to Spot Fake News", FactCheck.org, Annenberg Public Policy Center, retrieved 19 November 2016
- Jared Keller (19 November 2016), "This Critique of Fake Election News Is a Must-Read for All Democracy Lovers", Mother Jones, retrieved 19 November 2016
- Lance Ulanoff (18 November 2016), "7 signs the news you're sharing is fake", Mashable, retrieved 19 November 2016
- Laura Hautala (19 November 2016), "How to avoid getting conned by fake news sites - Here's how you can identify and avoid sites that just want to serve up ads next to outright falsehoods.", CNET, retrieved 19 November 2016
- Sreenivasan, Hari (17 November 2016), "How online hoaxes and fake news played a role in the election", PBS NewsHour (video), retrieved 29 November 2016