As a fan of science fiction, I recently watched (actually, binged on) the second season of “Stranger Things”, a Netflix show in which a central plot element is “The Upside Down”. This is a world, existing in an alternate dimension, which mirrors our world in many ways but which is controlled by a malevolent being. The Upside Down is not a place one would want to live–it feeds on the darkest human emotions and consumes all light and true knowledge. Back in our world, a few weeks later, I read several news articles–in the context of the recent revelations about Facebook and Cambridge Analytica–about Facebook’s policy for political advertisements during the 2016 US Presidential election. In short, the allegation made by a former Facebook employee is that which advertisements run on Facebook is not just a matter of how much the client is willing to spend, but also of how much “engagement” the ad is likely to generate.
Rather than simply award that ad position to the highest bidder, though, Facebook uses a complex model that considers both the dollar value of each bid as well as how good a piece of clickbait (or view-bait, or comment-bait) the corresponding ad is. If Facebook’s model thinks your ad is 10 times more likely to engage a user than another company’s ad, then your effective bid at auction is considered 10 times higher than a company willing to pay the same dollar amount. — The Atlantic

The model is not only complex, but it’s proprietary, meaning that an outside observer will never know how it works in detail. But let’s assume, for the sake of argument, that the algorithm does exactly what it is intended to do: assign “value” to an ad placement based on the ad’s predicted engagement. What’s the issue with that? Well, if engagement is driven, at least in part, by how far and wide the ad will spread within Facebook users’ social networks, then advertisers will game the system as best they can. One such way to make an ad “provocative” is to be creative in its subject matter or presentation. We’ve all seen such ads on television.
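The mechanism described above can be illustrated with a minimal sketch. The function, the numbers, and the advertiser names below are all hypothetical illustrations of an engagement-weighted auction, not Facebook's actual (proprietary) algorithm: each bid is multiplied by a predicted-engagement score, and the highest effective bid wins.

```python
# Hypothetical sketch of an engagement-weighted ad auction.
# Facebook's real model is proprietary; this only illustrates the idea
# that effective bid = dollar bid x predicted engagement.

def run_auction(bids):
    """Return the bid entry with the highest effective bid."""
    def effective_bid(entry):
        advertiser, dollars, engagement = entry
        return dollars * engagement
    return max(bids, key=effective_bid)

# Two advertisers bid the same dollar amount, but one ad is predicted
# to be 10x more engaging, so its effective bid is treated as 10x higher.
bids = [
    ("Advertiser A", 1.00, 1.0),   # $1.00 bid, baseline engagement
    ("Advertiser B", 1.00, 10.0),  # $1.00 bid, 10x predicted engagement
]
winner = run_auction(bids)
print(winner[0])  # Advertiser B, despite bidding the same dollar amount
```

Under this scheme, a more "provocative" ad effectively buys more reach for the same money, which is the incentive the rest of this piece is concerned with.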
When consumers see or hear an advertisement, whether it’s on the Internet, radio or television, or anywhere else, federal law says that ad must be truthful, not misleading, and, when appropriate, backed by scientific evidence. — Federal Trade Commission “Truth in Advertising” summary

In the US, however, such rules often do not apply (or apply with much less force) to political advertising, where the product being sold is a politician, and where opinions often have more weight than facts. How this applies to the 2016 election and Facebook’s algorithm for advertising costs was made clear by the same former Facebook employee.
During the run-up to the election, the Trump and Clinton campaigns bid ruthlessly for the same online real estate in front of the same swing-state voters. But because Trump used provocative content to stoke social media buzz, and he was better able to drive likes, comments, and shares than Clinton, his bids received a boost from Facebook’s click model, effectively winning him more media for less money.

All this, in itself, is perhaps not a bad thing–it’s actually an ideal example of what advertisers have sought for decades: the ability to determine what advertising mechanisms work best so that their client’s money can be well spent. The client pays not just for eyeballs, but for actions that result in sales of their product or service. If the algorithms are more adept at providing that engagement, so much the better. But add in another observation, and this idea of an engaged potential customer might have some downsides. A recent study published in Science and led by a data scientist at MIT, Soroush Vosoughi, finds troubling consequences in the current, widespread use of social media for information gathering by end users.
The massive new study analyzes every major contested news story in English across the span of Twitter’s existence—some 126,000 stories, tweeted by 3 million users, over more than 10 years—and finds that the truth simply cannot compete with hoax and rumor. By every common metric, falsehood consistently dominates the truth on Twitter, the study finds: Fake news and false rumors reach more people, penetrate deeper into the social network, and spread much faster than accurate stories. — The Grim Conclusions of the Largest-Ever Study of Fake News

How bad is this tendency for falsehood to outperform truth? The study provides ample details.
A false story is much more likely to go viral than a real story, the authors find. A false story reaches 1,500 people six times quicker, on average, than a true story does. And while false stories outperform the truth on every subject—including business, terrorism and war, science and technology, and entertainment—fake news about politics regularly does best. — The Grim Conclusions of the Largest-Ever Study of Fake News

This observation adds a distorting element to the mix. Provide an advertisement that targets a controversial topic, and you will likely see more rapid, and more widespread, dissemination of your message. Further, provide information that is misleading or downright false, and you virtually guarantee that your ad will gain a large audience, and gain it rapidly. (It should be noted that the algorithms used by social media companies are not solely at fault here–as the Science study shows, humans’ tendency to prefer false stories over others contributes strongly to this phenomenon. But, clearly, social media amplifies this behavior in ways that were not possible before its arrival.) A recent study by the Pew Research Center makes it clear where people get their news today.
As of August 2017, two-thirds (67%) of Americans report that they get at least some of their news on social media – with two-in-ten doing so often, according to a new survey from Pew Research Center. — News Use Across Social Media Platforms 2017

Among social media sites and channels, Facebook is by far the most commonly used to retrieve news.
Looking at the population as a whole, Facebook by far still leads every other social media site as a source of news. – News Use Across Social Media Platforms 2017

On Facebook, as well as other social media platforms, ads vie with “found news” and filtered sources of information for the user’s attention, blurring the lines between well-curated, factually accurate material and advertisements that may not be entirely truthful. So we have a combination of a platform–Facebook and similar social media sites–and algorithms that “prefer” provocative sources of information, which feeds the all-too-human desire for engaging, if not entirely truthful, subject matter in a manner that distorts what we see and likely what we believe. Add in the tendency for political ads to look all too much like “news”, and the problem is evident. The Upside Down appears to be real, and it’s aided by the very technologies we once thought would set us free. Perhaps that is something we, as citizens best served by understanding the truth about the world, should consider before it’s too late.