Mis- and disinformation moved up the news agenda over the last 12 months as researchers, journalists and the public faced unprecedented problems navigating the online news ecosystem. Information Disorder: Year in Review hears from experts around the world on what to expect and how to be more resilient in 2020.

Renée DiResta is the technical research manager at the Stanford Internet Observatory and a 2019 Mozilla Fellow in Media, Misinformation and Trust. She investigates malign narratives across social networks and helps policymakers understand and respond to the problem.

First Draft: What was the biggest development for you in terms of disinformation and media manipulation this year?

Renée DiResta: A few of us at the Stanford Internet Observatory spent a large part of 2019 investigating GRU influence operations worldwide, from 2014 to 2019, based on a data set that Facebook provided to the Senate Select Committee on Intelligence. One key finding that stood out is the extent to which their social media operation failed, even for the dissemination of sensational hacked materials like the DNC emails. Although they tried to create Facebook Pages to dump the documents, ultimately they gained no traction via that channel. What gave them lift was reaching out to Wikileaks and DMing journalists to entice them to cover the dumps.

Our perception of the operations in aggregate is that social media success was actually not the point. It was a very distinct strategy, markedly different from the Internet Research Agency’s simultaneous effort (which was strongly social-first, memetic propaganda). Media manipulation, of both mainstream as well as independent media, was instrumental to achieving their operational goals.

“Social platforms have recognised that they are a target: it’s important that the media does so as well.” – Renée DiResta

Social platforms have recognised that they are a target; it’s important that the media does so as well, and plans in advance for how it will cover and contextualise a significant hack-and-leak in the 2020 election.

Another significant thing we saw was the franchising of influence operations to locals. This took the form of entities linked to Evgeniy Prigozhin hiring locals in several countries in Africa to run their influence operations. Researchers had expected to see this kind of activity, because detection is significantly more difficult when local administrators are running the Facebook Pages. Having it confirmed reinforces the challenge of dealing with evolving adversaries, and the importance of platforms taking a multi-faceted, behaviour-based approach to addressing coordinated inauthentic activity.

“My new essay on information war – ‘The Digital Maginot Line’ – is up. It looks at how we’re still mistakenly treating global influence operations as disparate attacks, and how this may be pushing us toward digital security theater as we fumble for a solution.”

What was the biggest story of the year when it comes to these issues of technology and society?

There was remarkable breadth of issues related to technology and society; it’s kind of impossible to pick just one. I think, actually, that the aggregate amount of coverage of tech and society topics was remarkable in itself. There were numerous disinformation-related findings. There were significant platform policy shifts related to health misinformation: as someone who’s worked in that space since 2015, I found the pronounced steps to reduce the spread of anti-vaccine content and health quackery a bright spot (with the caveat that there’s more implementation refinement needed). But there was also a much broader shift towards actually having critically important public conversations about the balance between privacy and security, the costs and benefits of the integration of AI into our daily lives, and the distinction between free speech and mass dissemination when content is mediated and ranked by algorithmic curators. There’s a lot of nuance in these topics, and the public is far better informed about them now than they were even just a year ago.

What is the biggest threat journalists in your part of the world are facing in 2020 in terms of information disorder?

Referencing the GRU operation I alluded to in the first question: journalists need to recognise that they are a target. The goal of the GRU operation was to reach an audience, and media manipulation – of both mainstream as well as niche media – was instrumental to achieving that reach.