Dr. Fiona Hill is a former official at the U.S. National Security Council specializing in Russian and European affairs. She was a lead witness in the November 2019 House hearings regarding the impeachment of President Trump. She currently serves as a senior fellow in the Center on the United States and Europe in the Foreign Policy program at the Brookings Institution.
On October 6th, Taylor Fairless and I had the pleasure of interviewing Dr. Fiona Hill. My portion of the interview drew on her expertise to explore the role that social media plays in Russian meddling in the U.S. electoral process. My question centered on how the United States can effectively thwart Russian disinformation on social media when many argue that Russia simply exploits vulnerabilities inherent to the platforms themselves. Dr. Hill agreed with this assessment, calling it an issue she believes we will be grappling with for “some time to come.”
In a recent article asking whether social media can undermine democracy, I noted that these vulnerabilities include the platforms’ content curation algorithms, targeted digital advertising, and much of the toxic psychology that guides engagement on them. These facets enable foreign actors like Russia to engineer content that compounds polarization and saturates feeds with misleading or false information. Facebook estimates that Russian actors generated 80,000 posts that reached nearly 126 million people in the United States over a two-year span.
Information has always been a blessing and a curse
Dr. Hill noted that information, regardless of how it is delivered, has always been susceptible to deception, falsehoods, and lies. She argues that these problems are part of human nature: “Humans have always had a propensity for spreading rumors.” In preliterate times, this occurred via word of mouth and town criers, then pamphlets, print news, and eventually the digital realm. To her, this dynamic is simply part of a larger information commons, making it unsurprising that the same phenomena we have seen over the sweep of history are now playing out on Facebook, Twitter, and other platforms.
Dr. Hill believes the main difference is that social media platforms are “speeding up” the personal interactions we have as human beings. Indeed, much of this information is distributed through highly sophisticated algorithms and is instantly engaged with by millions of people each day. Furthermore, no two people receive the same information experience. Each person is served curated content that reflects what these platforms and actors believe that individual will be receptive to. This is what can make deception, falsehoods, and lies so effective and formidable in the digital realm.
As I noted earlier, the advent of targeted digital advertising enables actors, malicious or not, to engineer content that they believe their target audience is likely to react and engage with. In 2018, the House Intelligence Committee released 3,500 Russian Facebook ads that were designed to target and polarize specific segments of the population. In some cases, Russian-owned pages created and promoted political rallies where they hoped people would violently clash. Russia’s sophisticated and broad use of data-centered advertising demonstrates the dystopian implications social media can have for the political process when users are reduced to data points for sale.
The Path Forward
Dr. Hill believes that self-regulation is the most practical way to thwart Russian disinformation on these platforms. She is careful to acknowledge the flip side of social media, stating that the “leveling out and leveling up” of information has enabled more people than ever to access critical information. Her reasoning for self-regulation is that a draconian, heavy-handed government approach to regulating these highly complicated platforms runs the risk of quashing free speech. “We thrive on free speech, free engagement, and freedom of assembly,” she noted, and in our societies those activities increasingly take place on the internet and on these platforms.
Considering this, disinformation is something we will have to accept and grapple with for some time. “We will have to be creative in how we handle it for the foreseeable future because what is a strength is also often a vulnerability,” Dr. Hill says. She argues that this balancing act will largely rely on self-regulation and on close working partnerships among the government, the private sector, and society at large. “We ourselves as consumers and users of the platforms […] have to also be aware that part of the responsibility is ours to be more careful of the information that we propagate, verify, and cross-reference.” She concludes: “We have to be active consumers, not just passive recipients of information.”