Selective Advertising: How Facebook Moved People to the Polls
by Grace Huang
I will discuss a specific example concerning selective advertising and how Facebook used it to influence voter turnout. I assume we are all familiar with the basics of U.S. elections, so let’s proceed.
The “I Voted” Experiment: 2010 and 2012 Elections
Facebook ran an experiment during the 2010 and 2012 U.S. elections to encourage voting. It placed "I voted" messages throughout users' news feeds, often showing that their friends had clicked the "I voted" button. When friends used the button, the notification was propagated to many other people, possibly millions, though the exact reach is unclear.
The result: Facebook increased voter turnout by approximately 340,000 people, a number large enough to sway entire states. For context, George W. Bush's 2000 victory hinged on a margin of just 537 votes in Florida. In a closely contested race, this many voters could shift the outcome, particularly if they were nudged toward one party or candidate. If Facebook had selectively targeted Republican or Democratic supporters to mobilize them, the results could have been tilted in favor of one side.
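To make the arithmetic behind such a study concrete, here is a minimal sketch of how a randomized message experiment estimates a turnout lift and extrapolates it to the full audience. All numbers are invented for illustration; they are not the actual figures from Facebook's study.

```python
# Sketch: estimating turnout lift from a two-arm randomized experiment.
# Every number below is hypothetical, chosen only to illustrate the math.

def turnout_lift(treated_voters, treated_total, control_voters, control_total):
    """Difference in turnout rate between the treatment and control arms."""
    return treated_voters / treated_total - control_voters / control_total

# Hypothetical arms: users shown the "I voted" banner vs. users shown nothing.
lift = turnout_lift(treated_voters=6_130_000, treated_total=15_000_000,
                    control_voters=6_070_000, control_total=15_000_000)

# Extrapolate the per-user effect to everyone who saw the message.
audience = 60_000_000          # hypothetical audience size
extra_voters = lift * audience
print(f"lift per user: {lift:.4%}, extra voters: {extra_voters:,.0f}")
```

Even a per-user effect of a fraction of a percentage point, multiplied across tens of millions of users, produces a six-figure change in turnout, which is how a small nudge can exceed the margin of a close election.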
Emotional Influence through Altered News Feeds
Two years later, Facebook data scientist Solomon Messing altered the news feeds of roughly two million politically engaged users. Instead of the usual personal content, such as travel photos or graduation videos, their feeds surfaced more political news posted by friends. The team also used linguistic-analysis software to measure how the emotional tone of these posts affected user behavior.
They discovered that when political news carried strong emotions, such as anger toward a candidate, users who saw it were strongly influenced: they went on to publish similar content with a comparable emotional tone, often within the same day or week. The experiment demonstrated that emotional contagion can shape political expression, and potentially voting behavior.
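The general family of technique behind rating the emotional tone of posts is lexicon-based scoring: count how many words in a post appear in curated emotion word lists. The sketch below is a toy illustration of that idea; the word lists are invented and this is not the actual software the researchers used.

```python
# Toy lexicon-based emotion scoring. The word lists here are invented
# examples, not any real research lexicon.

ANGER = {"furious", "outraged", "corrupt", "disgraceful", "liar"}
POSITIVE = {"proud", "hopeful", "great", "win", "support"}

def emotion_score(post: str) -> dict:
    """Return the fraction of words in a post that hit each emotion lexicon."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    n = max(len(words), 1)
    return {
        "anger": sum(w in ANGER for w in words) / n,
        "positive": sum(w in POSITIVE for w in words) / n,
    }

post = "Outraged by this corrupt liar. Furious!"
print(emotion_score(post))
```

A researcher could score each post a user sees and each post they subsequently write, then test whether exposure to high-anger content predicts higher anger scores in the user's own posts, which is the emotional-contagion effect described above.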
The Risk of Deliberate Emotional Manipulation
A major concern is that platforms can deliberately shape emotional responses in the run-up to Election Day. The key point is that these emotions can be manufactured: it is possible to stir voters' feelings strongly enough to change their perspectives. As in the earlier example, large populations could be nudged toward a particular stance.
The risk is even greater when platforms show not just general political news, but specifically slanted content—for instance, posts expressing strong opposition to one candidate. If users are repeatedly exposed to such messages, they might change their vote, even if they originally supported the candidate.
Implications for Electoral Integrity
The first major implication is the potential damage to electoral fairness. We were fortunate that Facebook did not use this tool for outright political manipulation—for example, flooding all news feeds with content favoring one candidate. That would have been technically easy, but the company limited itself to scientific research. Still, the possibility remains troubling.
Broader Threat: Selective Advertising by Tech Platforms
The deeper issue is selective advertising itself. If Facebook can do this, so can other major platforms—Twitter, Google, Amazon, and WeChat, for example. WeChat, in particular, recommends videos and content to users who scroll through its channels. Many people engage with this content, making them susceptible to influence.
Companies can use this capability to target people when they are most vulnerable. For example, if someone expresses a desire to travel to a tropical island on their social feed, they may immediately begin receiving ads for luxury travel and tropical resorts.
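The mechanics of this kind of targeting can be as simple as matching keywords in a user's posts against advertiser-supplied trigger rules. The sketch below is a deliberately simplified stand-in for the proprietary systems platforms actually run; the keywords and ad names are invented.

```python
# Simplified keyword-based interest targeting. Real ad platforms use far
# richer signals; the rules and creatives here are invented examples.

AD_RULES = {
    "tropical": ["Island resort deals", "Luxury beach packages"],
    "graduation": ["Photo-book offers"],
}

def match_ads(post: str) -> list:
    """Return ad creatives whose trigger keyword appears in the user's post."""
    text = post.lower()
    ads = []
    for keyword, creatives in AD_RULES.items():
        if keyword in text:
            ads.extend(creatives)
    return ads

print(match_ads("Dreaming of a tropical island getaway!"))
```

The same matching logic that serves a resort ad to someone dreaming of travel can just as easily key on signals of grief, debt, or low self-esteem, which is what makes the practice described below so concerning.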
This idea is echoed in the documentary Coded Bias: online algorithms can identify when a user is vulnerable and tempt them with exactly what they are susceptible to. While this may be harmless for travel ads, it becomes harmful when targeting users with offerings from for-profit colleges or other potentially detrimental services, especially when the user is unaware of the risks.
This practice often preys on vulnerable groups. Employees trained in these advertising systems have been instructed to target people who recently experienced the death of a relative, who have low self-esteem, who are poor, or who are divorced—in short, those most susceptible to influence.
The Challenge of Regulation
This leads to an essential question: how can we regulate social media companies, or any other company, so that they use their advertising power responsibly? It is a longstanding question, but given the stakes, it remains urgently relevant.

