
Selective Advertising: How Facebook Moved People to the Polls

老A讲跨境
2025-09-10
Introduction: a case study from The Ethical Algorithm, the second book of the first "Tech Ethics" reading group. It examines social media's ability to influence user behavior through selective advertising and content feeds, with Facebook's election experiments as the central example.
Coming up: the fourth book of the second "Platform Economy" reading group is Data Grab: The New Colonialism of Big Tech and How to Fight Back; the discussion is scheduled for September 20.
The third reading group, on the theme "Beyond Human: AI Futures in Film & Fiction," begins during the 2025 National Day holiday and covers four science-fiction novels and their film adaptations: 2001: A Space Odyssey, I, Robot, Do Androids Dream of Electric Sheep?, and The Minority Report. Everyone is welcome to join!


This article is a case study from The Ethical Algorithm, the second book of the first "Tech Ethics" reading group.
The session discussed social media's ability to influence user behavior through selective advertising and curated content, with Facebook's election experiments as the main example.
By pushing "I Voted" messages and emotionally charged political news, Facebook measurably changed users' voting behavior and moods, and may even have been in a position to alter election outcomes. The same targeting techniques apply to commercial advertising: they can aim high-margin products or predatory services precisely at people who are psychologically vulnerable, poor, or in crisis, deepening social inequity.
This raises a key question: how should we regulate social media and other technology companies to ensure they use their advertising power responsibly? It is a longstanding question, but in high-stakes settings it remains urgently relevant.

Selective Advertising: How Facebook Moved People to the Polls

by Grace Huang

I will discuss a specific example concerning selective advertising and how Facebook used it to influence voter turnout. I assume we are all familiar with the basics of U.S. elections, so let’s proceed.

The “I Voted” Experiment: 2010 and 2012 Elections

Facebook ran an experiment during the 2010 and 2012 elections to encourage voting. It placed "I Voted" messages throughout users' news feeds, often showing which of their friends had clicked the "I Voted" button. The notification went out to an enormous audience; the published study of the 2010 experiment alone covered roughly 61 million users.

The result was that Facebook increased voter turnout by approximately 340,000 people—a number large enough to sway entire states. For context, George W. Bush's 2000 victory turned on a margin of just 537 votes in Florida. In a closely contested race, this many voters could shift the outcome, particularly if they were influenced to support one party or candidate. If Facebook had selectively targeted Republican or Democratic supporters to motivate them, the results could have been tilted in favor of one side.
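The underlying design was a randomized experiment: show the banner to a treatment group, withhold it from a control group, and compare turnout rates between the two. A minimal simulation of that lift calculation (all numbers here are hypothetical, chosen only to mirror the scale of the reported effect):

```python
import random

random.seed(42)

def simulate_turnout(n, base_rate, lift):
    """Simulate how many of n users vote, given a base turnout
    probability plus an (optional) lift from seeing the banner."""
    return sum(random.random() < base_rate + lift for _ in range(n))

N = 1_000_000                               # users per group (hypothetical)
control = simulate_turnout(N, 0.40, 0.0)    # no "I Voted" banner
treated = simulate_turnout(N, 0.40, 0.004)  # banner adds ~0.4 pp (assumed)

lift_pp = (treated - control) / N * 100
print(f"Estimated turnout lift: {lift_pp:.2f} percentage points")
```

Even a lift of a fraction of a percentage point, applied to tens of millions of users, produces hundreds of thousands of extra voters—which is why the 340,000 figure is plausible from such a subtle nudge.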


Emotional Influence through Altered News Feeds

Two years later, a Facebook researcher named Solomon Messing altered the news feeds of 2 million politically engaged users. Instead of showing the usual personal content—such as travel or graduation videos—their feeds displayed more political news posted by friends. The team also used linguistic software to analyze how the emotional tone of these posts affected user behavior.

They discovered that when political news contained strong emotions, such as anger toward a candidate, users who saw these posts were highly influenced. These individuals went on to publish similar content with comparable emotion, often within the same day or week. The experiment demonstrated that emotional contagion can shape political expression—and potentially, voting behavior.
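The "linguistic software" mentioned above presumably worked along the lines of lexicon-based sentiment analysis: count emotionally charged words in each post and score its tone. A toy sketch of that idea follows—the word lists and the `emotion_score` helper are invented for illustration and are not the study's actual method:

```python
# Toy lexicon-based emotion scoring, loosely in the spirit of
# word-counting tools like LIWC. The word lists are illustrative only.
NEGATIVE = {"angry", "corrupt", "disgrace", "failure", "liar"}
POSITIVE = {"hope", "proud", "support", "trust", "win"}

def emotion_score(post: str) -> float:
    """Return a score in [-1, 1]: below zero = angry tone, above = upbeat."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    if not words:
        return 0.0
    neg = sum(w in NEGATIVE for w in words)
    pos = sum(w in POSITIVE for w in words)
    return (pos - neg) / len(words)

angry_post = "This candidate is a corrupt liar and a disgrace!"
upbeat_post = "Proud to support a campaign full of hope."
print(emotion_score(angry_post))   # negative score: angry tone
print(emotion_score(upbeat_post))  # positive score: upbeat tone
```

With per-post scores like these, researchers can then test whether users exposed to high-anger posts go on to produce high-anger posts themselves—the emotional-contagion effect the experiment reported.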


The Risk of Deliberate Emotional Manipulation

A major concern is that platforms can deliberately shape emotional responses in the run-up to Election Day. The key word is deliberately: it is entirely possible to engineer voters' emotions enough to change their perspectives, and, as the earlier examples show, to nudge large populations toward a particular stance.

The risk is even greater when platforms show not just general political news, but specifically slanted content—for instance, posts expressing strong opposition to one candidate. If users are repeatedly exposed to such messages, they might change their vote, even if they originally supported the candidate.

Implications for Electoral Integrity

The first major implication is the potential damage to electoral fairness. We were fortunate that Facebook did not use this tool for outright political manipulation—for example, flooding all news feeds with content favoring one candidate. That would have been technically easy, but the company limited itself to scientific research. Still, the possibility remains troubling.

Broader Threat: Selective Advertising by Tech Platforms

The deeper issue is selective advertising itself. If Facebook can do this, so can other major platforms—Twitter, Google, Amazon, and WeChat, for example. WeChat, in particular, recommends videos and content to users who scroll through its channels. Many people engage with this content, making them susceptible to influence.

Companies can use this capability to target people when they are most vulnerable. For example, if someone expresses a desire to travel to a tropical island on their social feed, they may immediately begin receiving ads for luxury travel and tropical resorts.

This idea is echoed in the documentary Coded Bias: online algorithms can identify when a user is vulnerable and tempt them with exactly what they are susceptible to. While this may be harmless for travel ads, it becomes harmful when targeting users with offerings from for-profit colleges or other potentially detrimental services—especially when the user is unaware of the risks.

This practice often preys on vulnerable groups. Employees trained in these advertising systems have been instructed to target people who recently experienced the death of a relative, who have low self-esteem, who are poor, or who are divorced—in short, those most susceptible to influence.

The Challenge of Regulation

This leads to an essential question: how can we regulate social media companies or any other company to responsibly use their advertising power? It is a longstanding question, but given the high stakes, it remains urgently relevant.

      


If you are interested in technology-and-society topics, you are welcome to join our reading group to explore the opportunities and challenges of the digital age, and to keep thinking independently and clearly amid the digital flood.

The fourth book of the second "Platform Economy" reading group is Data Grab: The New Colonialism of Big Tech and How to Fight Back. The first discussion is on September 20; if you would like the e-book to preview its contents, leave a comment below!

Students with an IELTS score of 8 or a TOEFL score of 110 or above may contact us directly at vx: logicx001.

Students who have passed CAE, or earned a merit-or-above result on FCE, and want to learn about reading and debate are welcome to join the main group.



Video link:

[Disclaimer] Content sourced from the internet.