Facebook has a problem (okay, not just one).
According to a recent investigative article in The Daily Beast, a Russian “troll factory” played as much of a role (if not more) in swaying the minds of American voters in the run-up to the 2016 election as our earlier (and dire) suspicions had suggested.
Somewhere over 10,000,000 Americans saw and were possibly influenced by these ads, many of them targeting hot-topic and (sadly) divisive issues like LGBT rights, gun control, and the like. And, of course, the Clinton / Trump divide.
“Being Patriotic” was the account name and Twitter handle for one of the more popular Russian-backed groups on Facebook. According to The Daily Beast, some 200,000 Americans joined the group which then went on to organize at least a dozen potential marches and rallies in Trump’s favour during the last election cycle.
The sad part is that it’s not hard to understand why the 200,000 Americans who followed that group did so. The implications are clear: supporting Donald Trump meant you were patriotic, while supporting Clinton meant the exact opposite: being unpatriotic.
No one wants to be labelled “unpatriotic,” least of all in the United States, where patriotism is a currency, a test that every new contender seems to face.
The crushing irony is that there was nothing patriotic about this group, since it was being secretly backed and organized by Russian operatives. In effect, Russia was telling Americans how they should understand patriotism. Never mind what it says about electoral integrity, campaign laws, and voting fairness (not every candidate is fortunate enough to have Russian backers sponsoring them from the sidelines like daddy); it’s a pretty damning look at the power and perniciousness of present-day social media.
Facebook’s CEO Mark Zuckerberg loves to talk about how his platform helps people connect, makes the world a better place, and a whole other spool of yarn. While some of that may be partially true, it’s becoming more and more obvious that the opposite is taking place as well.
In this sense, Facebook’s simplicity, ease of access, and droves of content are both a blessing and a curse. They help more and more people get online and “connect” with others, while also subjecting them to more information at a faster rate than most of them were ever used to before.
The problem is that people don’t seem to know what to do with all this information or how to take it. Perhaps people are too trusting and accepting of whatever they read (no matter the source or the headline) and are inclined to take it either at face value or on an emotional level.
Information (and data) is meant to be taken critically and analytically, not based on one’s emotional attachment to the keywords in a headline. If people aren’t trained or taught to be skeptical of everything they read (or at the very least, of suspicious claims), then we have a problem: people will make uninformed decisions.
It would seem, then, that thanks to Facebook, people are being exposed to information faster than they are able to properly process it. This information in turn leads to real-world events and actions. Some can be big and elaborate (like organizing rallies based on a Russian sneak’s claims to patriotism), others more mundane.
It even happens that something a person casually sees on Facebook (“hot water causes cancer!”) might seep into casual conversation afterward (“did you know cold water is better for you?”). When it comes to mundane things like water temperature, it’s not so bad, but when it comes to making an informed decision about who should run the country, it’s another matter altogether, and a more frightening one at that.
Since the article ran in The Daily Beast, Facebook has come forward with more details about Russia’s use of its platform. In tandem, the social media company has promised greater oversight and moderation, but the most obvious damage, it seems, has already been done. The less obvious damage (people’s inclination to ingest whatever they read on the internet) isn’t something Facebook can patch with a few quick fixes. It’s systemic.