Shifting Media: Where the 2016 US Election and Russia’s Hybrid Warfare in Ukraine Collide

The Skype window split from my profile photo into a live feed of my face, and my interlocutor – Oleksandra Tsekhanovska – popped onto her side of the screen shortly after. I noticed her red hair first. Her face appeared Barbie-like under an angelic glow, her features obscured. Tsekhanovska apologized for the blurred screen and I laughed. “Don’t worry, I’ve seen worse.”

I thought it fitting. I was speaking with a research officer from the Ukraine Crisis Media Center’s recently established Hybrid Warfare Analytical Group (HWAG). Intentional or not, the video’s quality obscured Tsekhanovska’s identity; there would have been no way to share her face even if I’d wanted to.

We would be discussing propaganda, disinformation and cyber aggression in Ukraine. Since at least 2014, Ukraine has been the site of Russia’s information and cyber warfare experimentation.[1] If we can track what has been happening in Ukraine, we might better understand the hybrid strategies Russia now deploys in what has become a global contest.

Tsekhanovska wanted me to know that she had only recently begun working with HWAG, but her expertise belied her modesty. During our conversation, she would guide me through the knowns and unknowns of Russia’s assault on Ukraine. The West views Russia’s tactics as malicious, though Russia would claim all’s fair in protecting its interests, particularly in a country like Ukraine. Dmitry Peskov, press secretary to Vladimir Putin, has claimed that Russia did not choose this war; rather, his government has acted defensively in response to American NGO involvement in the color revolutions that ousted pro-Russian governments across post-Soviet states.

The former Soviet Union’s farthest borders – the “near abroad” made up of the fourteen post-Soviet states – are considered vital to Russia’s foreign policy. According to a 2018 RAND report, affecting political outcomes in the “near abroad” countries through information and cyber tactics helps Russia create a barrier to protect itself against Western influence.[2]

Tsekhanovska noted Russia’s tendency to target even the personal lives of their enemies. “I don’t know if this is widely known, but in Ukraine we had such an example of Russian cyber warfare when the relatives of our soldiers received emails, and they received SMS messages on their phones with threats from the Russian side.” Texts sent to Ukrainian soldiers likely came from fake cellphone towers called cell site simulators, and the threatening messages may be unattributed, or ostensibly from Ukrainian soldiers themselves. Families have received texts that their sons have been killed in action.

Hybrid happens, and this means war. But what kind of war?

Western countries have tended to view cyber and information operations as two distinct phenomena. In his book “The Perfect Weapon,” journalist David Sanger wrote that for the Americans, “cyber war was one thing, information war was another.” He noted the United States’ focus on the physical implications of cyber attacks, a focus that ignored Russian war theory and has made it difficult for the West to respond to the country’s varied and evolving attacks.

Russia has long relied on its own definition of cyber and information warfare, with roots in the 2013 Gerasimov Doctrine. For Russia, these operations were “all on a spectrum. At one end was pure propaganda. Then came fake news, manipulated election results, the publication of stolen emails. Physical attacks on infrastructure marked the far end.”[3]

This is contemporary Russian military strategy, where “the approach is guerilla, and waged on all fronts with a range of actors and tools – for example, hackers, media, businessmen, leaks and, yes, fake news, as well as conventional and asymmetric military means.”[4] Russia finds itself in an ongoing “information confrontation,”[5] and “information” encompasses both “Internet and military policies” with technical and psychological dimensions.[6] In the West today, this is known as hybrid warfare.

Tsekhanovska of HWAG agreed that information and cyber warfare often intersect as two components of hybrid warfare. These strategies include the use of both misinformation, which she defines as unintentional, and disinformation, the intentional spread of false ideas, facts or knowledge. Information warfare might combine the distribution of mis- and disinformation, but “cyber warfare is a more aggressive tactic… like hacking someone’s emails, like leaking personal data.”

Russia’s disinformation campaigns have also been long-term. On December 10, the European media platform EURACTIV published a report on statements by EU Security Commissioner Julian King, who asserted that Russia had been disseminating fake news for a year preceding the recent Sea of Azov crisis, in which Russian special forces seized three Ukrainian ships.

The Russian media had claimed all sorts of offenses: Ukraine was preparing to welcome a NATO fleet into the sea, and so had begun dredging the seabed; Ukraine had poisoned the sea’s waters with cholera; there were suggestions of long-term US preparations for Ukrainian-Russian clashes in Azov; and one report implied nuclear aggression, with Ukraine planning to partner with British secret services to blow up a bridge to Crimea.

Across the pond

In the United States, we’ve been stumbling over similarly confusing issues for years. When the Obama administration finally released a statement on October 7, 2016 alleging that the Russians had hacked the emails of the Democratic National Committee, news of Donald Trump’s pussy-grabbing Access Hollywood comments first drowned out the statement, followed – only an hour after the Trump video surfaced – by reports of the WikiLeaks Podesta email dump. These distractions served as one arena for America’s own ideological and “fake news” skirmishes: Are the Russians involved or not? Is it interference or not? Collusion or not?

These questions have distracted from what now needs to be addressed: the many tactics and strategies of cyber and information operations that Russia uses to sow discord in target states.

We know from the Mueller indictment that twelve Russian operatives associated with the GRU (Russia’s military intelligence agency) were the main perpetrators of the DNC cyber hacks. These individuals used spear phishing to weasel out passwords to DNC email accounts and hacking to access internal documents; they created false personas like “DC Leaks” and “Guccifer 2.0” to release the emails; and their operations were not cordoned off in Russia, but rather perpetrated using a system of computers across the world.

Their strategy was varied and complex. According to the official indictment released in July 2018, these agents planted X-Agent malware onto the network of the Democratic Congressional Campaign Committee (DCCC), which “transmitted information from the victims’ computers to a GRU-leased server located in Arizona.” The hackers could “capture keystrokes entered by DCCC employees” and even “take pictures of the DCCC employees’ computer screens.”

Beyond the hacks, Russia’s operations in the 2016 US elections might be divided into two components: the creation of messaging that could achieve Russia’s goal of disturbing the United States electoral base, and the manipulation of this messaging through online means. Russian operatives used their understanding of America’s politics coupled with insight into the public’s psychology to reach vulnerable audiences.

The Russians could follow US mainstream news to find hot-button topics, and their messaging – targeted at specific voting demographics – relied on sparking strong sentiment.[7] Often, the appeal was to fear. Social media posts might define an enemy, which “cast the electoral choice in ‘us-versus-them’ zero-sum-game terms ready-made to harness fears of cultural change.”[8]

Some of this messaging targeted audiences receptive to xenophobic appeals. Professor of Communication Kathleen Hall Jamieson wrote that “the Russian desire to fan discord while also priming topics central to Trump’s issue agenda is clear in the names adopted by troll Facebook accounts, which included ‘Fed-up with Illegals,’ and ‘Infidels against Islam.’”[9] Other messages were intended for different demographics – such as supporters of the Black Lives Matter movement – in order to demobilize them, effectively suppressing their votes.

But the strategy was in the systems. Although leaking hacked content had its explosive effects, much of what the Russians did was simple exploitation of social connection capabilities across the social media universe. Social media is riddled with influencing devices, from “sharing” to advertising. Something “trending” on social media, for example, will gain access to a wider audience.

The strength of a trending subject is its ability to attract people quickly – an “agenda-setting effect” that works because journalists treat what is trending as newsworthy. One Twitter account, tracked by Lt. Col. Jarred Prier of the US Air Force, changed its name from “FanFan” to “Deplorable Lucy” right after Hillary Clinton called Trump supporters a “basket of deplorables.” The troll account’s following “went from just over 1,000 to 11,000 within a few days.”[10]
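The mechanics of that “agenda-setting effect” can be illustrated with a toy model. Real trending algorithms are proprietary; the sketch below only assumes the widely reported principle that platforms rank topics by how sharply their mention rate spikes above a baseline – which is exactly the signal a coordinated burst of bot posts can manufacture.

```python
from collections import Counter

def trending_scores(recent_posts, baseline_counts, min_mentions=5):
    """Score hashtags by how far their recent mention count exceeds a baseline.

    recent_posts: list of posts, each a list of hashtags.
    baseline_counts: typical mention counts per hashtag over a past window.
    Velocity (a sudden burst), not raw volume, is what gets rewarded.
    """
    recent = Counter(tag for post in recent_posts for tag in post)
    scores = {}
    for tag, count in recent.items():
        if count < min_mentions:      # ignore tiny samples
            continue
        baseline = baseline_counts.get(tag, 1)  # avoid division by zero
        scores[tag] = count / baseline
    return dict(sorted(scores.items(), key=lambda kv: -kv[1]))

# 30 bot accounts each posting one hashtag outrank a larger organic topic,
# because the organic topic's volume is close to its normal baseline.
posts = [["#deplorables"]] * 30 + [["#weather"]] * 12
print(trending_scores(posts, {"#weather": 10, "#deplorables": 2}))
# -> {'#deplorables': 15.0, '#weather': 1.2}
```

The point of the sketch: a small, synchronized group can dominate a velocity-based ranking even when genuine conversation has far more total volume.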

Other tactics included posting incendiary videos on YouTube, but no site has been so cleverly exploited as Facebook. Some Facebook pages spread fake news; others propagated advertisements, free posts, and event notices linked to Russia’s Internet Research Agency. Facebook accounts publicized or financed protests and marches in the United States, and at least 22 of 60 “actually took place.”[11]

The danger of Facebook lies in its identifying capabilities. It is simple for “advertisers” to identify the ideologies of users, to understand their “political affiliation, political activity, a sensitivity issue (e.g., gun control), news consumptions, county, ZIP code, location within a five-mile radius, personal profile, demographics, and interests…”[12] This way, Russian operatives were able to pinpoint American audiences that would be most swayed by what they read, heard or saw.
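In essence, ad targeting is just filtering a user database on the kinds of attributes Jamieson lists. The sketch below is a hypothetical illustration – the user schema and field names are invented, not Facebook’s actual API – but it shows how trivially a buyer can slice an audience down to, say, one ZIP code plus one sensitivity issue.

```python
def match_audience(users, criteria):
    """Return users whose profile satisfies every criterion.

    Hypothetical schema: each user is a dict of self-declared traits of the
    kind ad platforms expose to buyers (location, interests, affiliation).
    List-valued fields match if they contain the wanted value.
    """
    def matches(user):
        for field, wanted in criteria.items():
            value = user.get(field)
            if isinstance(value, (list, set)):
                if wanted not in value:
                    return False
            elif value != wanted:
                return False
        return True
    return [u for u in users if matches(u)]

users = [
    {"id": 1, "zip": "48202", "interests": ["gun control", "fishing"]},
    {"id": 2, "zip": "48202", "interests": ["gardening"]},
    {"id": 3, "zip": "10001", "interests": ["gun control"]},
]
segment = match_audience(users, {"zip": "48202", "interests": "gun control"})
print([u["id"] for u in segment])  # -> [1]
```

With criteria intersected like this, each additional attribute narrows the segment, which is why micro-targeted messaging can be tailored so precisely to a vulnerable audience.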

What should we be asking?

While America’s focus has been the 2016 election hacking, the broader applications of cyber and information warfare are being discussed in other countries. Conversations, forums, government bodies and NGOs across Europe are focused on solving these problems.

In November 2018, Thomas Graham – a former Foreign Service Officer now affiliated with Yale – spoke to a group of academics, diplomats and students about “Relations Between Russia and the Rest of Europe Within a Shifting Global Environment.” The participants had gathered for the two-day conference in London’s Royal Horseguards Hotel, and Graham, casually leaning into the side of his chair, seemed too relaxed for the airs of the formally decorated space. He, the panel’s moderator and his interlocutor – a Russian diplomat – traded ideas beneath tall ceilings and chandeliers, oil paintings of handsome military men and beautifully dressed young women.

When asked about soft power and the internet, Graham dwelt on the rules of the social media space. Government influence on social media should be fair game, he claimed: “You can’t really control that.” However, he was firm that “a line had been crossed” when Russia hacked the DNC and released the information for political purposes. What he would recommend, then, would be for governments to spend a great deal of time educating the public about being “played” on social media, and about how to deal with it.

Hannah Smith, the Director of Strategic Planning of Responses at the European Centre of Excellence for Countering Hybrid Threats and a fellow conference participant, reflected on Graham’s comments the following day. “I think he did kind of take it a little bit steps backwards in the second answer, because then he said that obviously in case companies like Facebook or Twitter detect [someone] using the platforms wrong, they could close their accounts. So perhaps governments cannot do so much, but private companies could do much more to put rules on the platforms.”

She agreed, too, that governments would need to invest more “resources to the education side, to the way how people read social media,” and they would be wise to partner with private companies to educate their own staff about the hazards of cyber and information operations in the social media space.

Social media advertising would be another area to address, potentially with legislation on the national level. Companies could be required to make sure “that they can clearly identify who is buying advertising, and who, perhaps, is setting up certain pages. If you have a group page, for example, then you need to identify who’s the one who keeps up the group page.”

The message was clear: we need to be focusing on social media to understand modern warfare. Where better to start than Ukraine, the “petri dish,” Russia’s laboratory, Putin’s playground for information and cyber operations?

Russian cyber testing in Ukraine

In Ukraine, all types of computer- and information-related operations have been flexible tactics for Russian military strategy. The country has been a site for attacks with purely physical implications, a target for mass disinformation, the center of rampant social media trolling and a place where demoralizing text messages have been used to terrify troops. “Social media are also used to spread fake rumors to undermine the morale of Ukrainian troops or discredit army leadership.”[13] It’s combinatorial play with psychological turns.

In May 2014, Russia combined cyber operations with data manipulation to undermine the Ukrainian presidential elections. Hackers both removed data from Ukrainian voting systems after the ballots had been cast and altered the count in the reporting system of the country’s television network, so that the results would show that candidate Petro Poroshenko’s ultranationalist rival, Dmytro Yarosh, had won the election. At the last minute, “Ukrainian officials detected the attack, and corrected the results a nail-biting forty minutes before the network aired them.”[14]

Russia struck Ukraine’s electric grid the following year. The operation began months before the December 2015 blackout, with stolen passwords giving the attackers access to the grid’s control systems. Around 225,000 Ukrainian citizens lost power, but the attack was not limited to the grid. Just as trolls flood online comment sections with messaging, “for good measure, a ‘call center’ for customers to dial to report an outage was flooded with automated calls, to maximize frustration and anger.”[15]

Another purely “physical” operation took place in June 2017, when the malware known as “NotPetya” slammed Ukraine’s computer systems first, then jumped to others all over the world. In Ukraine, television stations could not broadcast, ATMs ceased to dispense cash, and even systems at the Chernobyl nuclear power plant were hit.[16] Elsewhere – in the United States, Tasmania, France, even Russia – companies lost control of their computers. In 2018, the CIA concluded that the Russians were behind the attack.

Of the three attacks described, only the election hacking can be said to have contained elements of disinformation. Sanger noted that even the Russian media had been in on the farce – “Russia’s own television networks, apparently unaware that the cyberattack had been detected, announced the phony results, with Yarosh as the victor.”[17] And the attack on Ukraine’s power grid was no simple matter, with the addition of those automated calls flooding help stations a neat trick of hybrid warfare. But Ukraine has been facing another crisis akin to what happened in the 2016 US election hacking: alongside the “cyber,” disinformation and social media disruptions reign freely.

Looking at patterns: social media strategies in Ukraine

Like in the 2016 US election operations, Russia has used targeted messaging in both traditional and social media to influence populations. According to a RAND report, “Russia appears to actively synchronize social media products with those of various other information outlets, including Russian-branded TV broadcasts and web news, proxy civil society agencies, and web outlets.”[18]

Oleksandra Tsekhanovska of HWAG explained that her organization’s most recent research has focused on television messaging. However, she also confirmed that the messaging found on traditional TV is not so far off from what can be read on social media. “Because television is a more controlled, safer environment for promoting any kind of political agenda, these narratives can be more easily detected, and they are presented in a more clear manner. But social media usually copies them.”

HWAG monitored Russian television messages from July 1, 2014 to December 30, 2017. The messages usually run along one of six narratives: about the US, the EU, Ukraine, NATO, the Russian president, and Russia’s role in WWII. The United States is Russia’s greatest enemy; Europe is decaying; NATO countries are under the thumb of the United States; Russia was a grand hero in WWII.

Tsekhanovska noted that while the most widely promoted Russian narrative today is that Ukraine is a failed state, another popular idea is that Poroshenko and Ukrainian politicians are corrupt. The public is bombarded with the message that politicians don’t really care about changing the situation in the country for the better; “they are only motivated for elections. Elections, elections, elections. All the talk is about it, because the elections are in four months.”

To complicate matters, there is a black hole of information regarding how, exactly, information operations are being conducted in Ukraine’s social media space. This is the current focus of Ukraine’s Hybrid Warfare Analytical Group, though only preliminary data has been gathered on the subject. However, other researchers have documented some of Russia’s social media operations in Ukraine.

Russia’s methodologies for media influence include operating a multilingual television channel, Kremlin-backed news sites and civil society organizations focused on social media. “To conduct these campaigns, Russia experts argue, Russia employs a synchronized mix of media that varies from attributed TV and news website content to far-right blogs and websites (with unclear attribution), as well as nonattributed social media accounts in the form of bots and trolls.”[19]

According to The Washington Post, Ukrainian president Petro Poroshenko began communicating with Facebook in 2015 in an attempt to stem fake news allegedly of Russian origin. His government claims to have traced many accounts back to Russia. The 2014 revolution saw unrelenting posts distorting Ukraine’s image: the country was a Nazi haven, or home to Chechen and ISIS terrorists. There were suspicions that Russian bots had begun reporting pro-democracy Facebook pages popular in Ukraine, in both 2014 and 2017, to have them banned; another operative – Russia still the suspect – imitated Ukrainian political figures, including Poroshenko, to post disinformation and divisive comments.

Twitter is another site where conflict between Russia and Ukraine can be easily monitored. Official Russian and Ukrainian Twitter profiles have engaged in tête-à-têtes over political assertions. In May 2017, Russian president Vladimir Putin claimed that Ukraine’s Kievan princess Anna Yaroslavna was ‘Russian’ – and the Ukrainian Twitter account wasted no time schooling Putin on the princess’s history. The tweet, published in English, established Princess Anna as a French queen, as she was married to Henry I; further messaging denigrated Moscow as an undeveloped expanse of forest while Kiev was already active in European political affairs.[20]

Russia’s account countered that the great cathedral of St. Sophia was built in Veliky Novgorod during the same time period and urged Ukraine and Belarus (via flag emoji symbolizing the countries) to remember the common history “which should unite our nations, not divide us.” Ukraine edged in with the last word, snarking and throwing out a Simpsons cartoon reminding viewers of Russia’s former USSR identity. Sarcasm and sass characterized the exchange, a rebuttal tactic that has become popular with pro-Ukrainian Twitter accounts.

Twitter trolls and bots engage, and engage again: they wait for an online victim, they bait them, they enrage them, they lead them astray. Sometimes, trolls will “operate in teams of three on a given forum: one to disparage the authorities and the other two to disagree, creating the appearance of genuine engagement and debate.”[21] There are also Twitter bots – those same insidious accounts used to create trending topics in the 2016 US elections – and honeypots, known (not unlike trolls) to pull in a victim, and then attempt to extort them.

In October 2017, Twitter cooperated with the United States Congress to identify bot accounts on the social network. The site released gigabytes of tweets thought to be associated with Russia’s Internet Research Agency and terminated the bot accounts – all 2,752 of them. The accounts were geared to a variety of audiences, imitating US news outlets and activists or using tags that would appeal to Russian-speaking readers. Newsweek revealed the handles of two Ukraine-oriented accounts: @OdessaNews24 and @KievNovini.

These bots flaunt the power of imitation. Where one troll might engage an individual in a comments section in some social media space, a Twitter bot can simulate the exchange of a large group of people. One post may soon become the opinion of masses.[22]
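That “opinion of masses” effect can be made concrete with a toy simulation – not drawn from the article’s reporting, simply an illustration of the arithmetic. One operator posting the same line through many automated accounts swamps whatever genuine disagreement exists in a thread.

```python
import random

def apparent_agreement(organic_users, bot_count, bot_message="agree"):
    """Toy model of a comment thread.

    Organic users split randomly between 'agree' and 'disagree';
    bot accounts all post the operator's message. Returns the share
    of the thread that appears to agree.
    """
    thread = [random.choice(["agree", "disagree"])
              for _ in range(organic_users)]
    thread += [bot_message] * bot_count
    return thread.count("agree") / len(thread)

random.seed(0)
# 20 real users split roughly evenly; 80 bot replies all say "agree".
ratio = apparent_agreement(20, 80)
print(f"{ratio:.0%} of the thread appears to agree")
```

Since the bots alone account for 80 of 100 comments, apparent agreement can never fall below 80 percent, no matter what the real users think – the illusion of consensus is built into the arithmetic.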

The 2014 crash of Malaysia Airlines flight MH17 blew up the Twitter environment. The day after a surface-to-air missile downed the plane, Twitter accounts spammed the site with 44 thousand messages about the crash; the next day, more than 25 thousand.[23] These tweets implied Ukraine’s involvement in the crash, with hashtags like #ProvocationOfKiev (#ПровокацияКиева), #KievShotDownBoeing (#КиевСбилБоинг) and #KievTellTheTruth (#КиевСкажиПравду).

A fake account – @spainbuca – proposed that a Ukrainian fighter plane had shot down MH17, a theory that ended up promoted on several Russian television channels, including RT.[24] Four years later, an international joint investigation team tracked down the origins of the missile that downed the flight: they claimed proof that it belonged to the Russian military, and that the BUK system carrying the weapon “had crossed the border into eastern Ukraine from Russia and returned after the plane had been shot down.”

Oleksandra Tsekhanovska of HWAG also explained the concept of a dump tank, another space where Russian operatives and pro-Russian activists might originate fake news. “Dump tanks are web resources that focus on news, and they mix the content. Sometimes they post good-quality content with a certain percent of truth to it, and this way they earned their readerships’ trust. But then, later, because there is no transparency, they may be used to push forward Russian narratives.”

She later wrote with more information on dump tanks, sharing material on the subject published by Internews, another Ukrainian nonprofit organization that she referred to as her “colleagues.” Tsekhanovska described dump tanks as shady news sources with attribution issues. It is hard to know who owns or finances these sites. The dump tanks take news from well-established web resources “to gain readership’s trust,” and then combine it with “manipulative news.”

“These manipulative articles usually have scandalous, highly emotional titles and are aimed at evoking emotional reaction from the readers,” Tsekhanovska wrote. “It is my assumption that resources like these may utilize ‘useful idiots’ who don’t do fact-checking and further spread distorted information or messages with certain agenda.” The articles are “potentially harmful to the reputation of people involved,” and sometimes the dump tanks “demand money for taking these materials off the site.”

Money has been a common lubricant in the disinformation machine. In a Russian-language BBC report, journalist Tse In Lee wrote that, especially towards the beginning of the 2014 Ukraine crisis, many Ukrainian sites were willing to publish just about anything – real news or fake – for the right price.[25] Once propaganda has been published on a site like a dump tank, Russian operatives distribute the information via social media in whatever context might be useful at the moment.

What’s next: dealing with information and cyber operations

Twitter, Facebook, YouTube, fake news, a focus on messaging: Russia’s overall social media strategy in Ukraine is an obvious precedent to the operations in the 2016 US elections. So what’s next?

Beyond educating the public on how to navigate social media and corporations on how to respond to cyber threats, experts today urge global leaders to engage collectively on these issues. Hannah Smith of the Centre of Excellence acknowledged that cyber operations echo the espionage of the past. “It is in every state’s nature, basically, this spying issue. But it is a fully different story to get information and then use it inside of your own state for strategic planning, or possibly if you manage to get some kind of technological secret and you use it in your own country.”

But governments must now come together and agree on how they view these issues, and once the problem has been defined, states must respond. Smith offered some recourse: “If you hack us, make sure that you only use it for yourself and are not leaking it!” She laughed at herself shortly after, acknowledging the absurdity of the comment, but insisted that when attempting to respond to something this new, “this is how it is.”

As for countering propaganda, states and individuals can take a lesson from Ukraine. Oleksandra Tsekhanovska acknowledged that the prerequisite is public awareness, but that responding to social media messaging is key. Although there has been the perception that “if you dignify some kind of wild propagandistic story with a reply, you give it credibility,” Ukraine has shown that the social media space simply doesn’t work this way.

“You have to provide an answer,” Tsekhanovska insisted. “You have to react. And of course it puts you in a very uncomfortable position, when you feel like you’re always losing. Because you’re always in a responsive position, kind of a passive position, where you always have to defend. But I don’t really think that there is any other choice to this.”


[1] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[2] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[3] Sanger, David E. The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age. Crown. New York, 2018: p. 158.

[4] McKew, Molly K. “The Gerasimov Doctrine.” Politico Magazine, September/October 2017. Accessed 12/11/2018.

[5] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[6] Thomas, Timothy. “Russia’s Information Warfare Strategy: Can the Nation Cope in Future Conflicts?” The Journal of Slavic and Military Studies, 27:1, 2014. Pp. 101-130.

[7] Jamieson, Kathleen Hall. Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Oxford University Press, New York, 2018, p. 131.

[8] Jamieson, Kathleen Hall. Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Oxford University Press, New York, 2018, p. 88.

[9] Jamieson, Kathleen Hall. Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Oxford University Press, New York, 2018, p. 85.

[10] Jamieson, Kathleen Hall. Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Oxford University Press, New York, 2018, pp. 42-43.

[11] Jamieson, Kathleen Hall. Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Oxford University Press, New York, 2018, p. 73.

[12] Jamieson, Kathleen Hall. Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Oxford University Press, New York, 2018, p. 140.

[13] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[14] Sanger, David E. The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age. Crown. New York, 2018: p. 155.

[15] Sanger, David E. The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age. Crown. New York, 2018: pp. 167-168.

[16] Sanger, David E. The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age. Crown. New York, 2018: p. 155.

[17] Sanger, David E. The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age. Crown. New York, 2018: p. 155.

[18] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[19] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[20] “As the British Metro writes, the post hinted that ‘Moscow was an uninhabited forest when Ukraine was already actively involved in European politics.’”

[21] Bodine-Baron, Elizabeth et al. “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND National Defense Research Institute. Santa Monica, California, 2018. Kindle Edition.

[22] “They can be fully or partially automated. Thanks to them, a single person can create the illusion of a unified opinion held by many people. They are also used to suppress criticism – attacks on individuals or the discrediting of hashtags…”

[23] “A true ‘Twitter storm’ broke out on July 18, 2014, the day after the MH17 catastrophe. That day the accounts ‘tweeted’ more than 44 thousand messages, and the next day more than 25 thousand. 297 accounts promoted claims of Ukraine’s supposed guilt in the downing of the Boeing using the hashtags #ПровокацияКиева (22.3 thousand mentions), #КиевСбилБоинг (22.1 thousand) and #КиевСкажиПравду (21.9 thousand),” the researchers write.

[24] “According to one ‘conspiracy theory,’ the plane was shot down in the air by a Ukrainian fighter jet. This theory arose from a Twitter message by the Spanish air traffic controller Carlos (@spainbuca), who in reality does not exist. The message from @spainbuca was spread by other users, then by the television company RT, as well as by other Russian news agencies. Ukraine’s Ministry of Information Policy later stated that pro-Russian messages had been spread through this account as early as the beginning of the year.”

[25] “First, ‘fake’ reports are published in an online periodical or a private blog. This is not difficult, since ‘many Ukrainian online outlets… are ready to publish anything for money.’”

Published by

Haley Bader
