Facebook emotional manipulation test turns users into 'lab rats'

Anger grows even as Facebook researcher posts apology for causing users anxiety

Users and analysts were in an uproar over the news that Facebook manipulated users' News Feeds to conduct a weeklong psychological study that affected about 700,000 people.

News reports said that Facebook allowed researchers to manipulate the positive and negative information that users saw on the social network in order to test the users' emotional responses to that content. The study was conducted from Jan. 11 to Jan. 18, 2012. Its findings were published in the Proceedings of the National Academy of Sciences.

For the past several days, media outlets, bloggers, social commentators and industry analysts have been venting their anger over what they see as Facebook's exercise in emotional manipulation.

"I think this violates the trust of Facebook users who rely on their protection," said Patrick Moorhead, an analyst with Moor Insights & Strategy. "There were two lines that were crossed that violated trust. First, Facebook's News Feed was manipulated for the sake of the experiment. Secondly, it involved a third party who published results externally. Facebook users are more than public lab rats."

In the experiment, Facebook temporarily influenced the kind of posts and photos that about 700,000 of its English-speaking users would see in their News Feeds, making it possible for researchers to show either mostly positive comments and posts or mostly negative comments and posts in order to see if the nature of the content influenced users' emotions.

As a consequence, users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed.

The study found that people who saw more positive comments made more positive comments themselves, while users who saw more negative comments became more negative.

Facebook did not respond to a request for comment, though Adam Kramer, a data scientist at Facebook who participated in the study, apologized for upsetting users in a post on his Facebook page.

"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

He pointed out that the research affected only about 0.04% of Facebook's users, or 1 in 2,500. The social network today has more than 1 billion users.

Some users who commented on his Facebook post said they appreciated the fact that he addressed the issue, but not everyone was so understanding.

"I appreciate the statement, but emotional manipulation is still emotional manipulation, no matter how small of a sample it affected," wrote Kate LeFranc in a comment.

Andrew Baron wrote, "This is the nail in the coffin for my concern that Facebook is the kind of company that Google talks about when they say don't be evil.... There is no turning back from this."

Hadley Reynolds, an analyst at NextEra Research, said that while he finds Facebook's move "reprehensible," he also doesn't find it surprising.

"Web-based businesses have been 'experimenting' on their users since the beginning of the commercial Internet, so it's hardly surprising that Facebook would be testing the effect of various content algorithms on groups of their members," Reynolds said. "Facebook is simply gauging its ability to manipulate its customers' emotions for its own private gain. This incident won't break Facebook's franchise, but it will add another straw to the growing pile that eventually will erode user's perception of value enough to put the business in long-term decline."

Jeff Kagan, an independent analyst, called the experiment an abuse of users' trust.
