Facebook emotional manipulation test turns users into 'lab rats'
Anger grows even as Facebook researcher posts apology for causing users anxiety
Computerworld - Users and analysts were in an uproar over the news that Facebook manipulated users' News Feeds to conduct a weeklong psychological study that affected about 700,000 people.
News reports said that Facebook allowed researchers to manipulate the positive and negative information that users saw on the social network in order to test the users' emotional responses to that content. The study was conducted from Jan. 11 to Jan. 18, 2012. Its findings were published in the Proceedings of the National Academy of Sciences.
For the past several days, reporters, bloggers, social commentators and industry analysts have been venting their anger over what they see as Facebook's exercise in emotional manipulation.
"I think this violates the trust of Facebook users who rely on their protection," said Patrick Moorhead, an analyst with Moor Insights & Strategy. "There were two lines that were crossed that violated trust. First, Facebook's News Feed was manipulated for the sake of the experiment. Secondly, it involved a third party who published results externally. Facebook users are more than public lab rats."
In the experiment, Facebook temporarily influenced the kind of posts and photos that about 700,000 of its English-speaking users would see in their News Feeds, making it possible for researchers to show either mostly positive comments and posts or mostly negative comments and posts in order to see if the nature of the content influenced users' emotions.
As a consequence, users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed.
The study found that people who saw more positive comments made more positive comments themselves, while users who saw more negative comments became more negative.
Facebook did not respond to a request for comment, though Adam Kramer, a data scientist at Facebook who participated in the study, apologized for upsetting users in a post on his Facebook page.
"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
He pointed out that the research affected only about 0.04% of Facebook's users, or 1 in 2,500. The social network today has more than 1 billion users.
Some users who commented on his Facebook post said they appreciated the fact that he addressed the issue, but not everyone was so understanding.
"I appreciate the statement, but emotional manipulation is still emotional manipulation, no matter how small of a sample it affected," wrote Kate LeFranc in a comment.
Andrew Baron wrote, "This is the nail in the coffin for my concern that Facebook is the kind of company that Google talks about when they say don't be evil.... There is no turning back from this."
Hadley Reynolds, an analyst at NextEra Research, said that while he finds Facebook's move "reprehensible," he also doesn't find it surprising.
"Web-based businesses have been 'experimenting' on their users since the beginning of the commercial Internet, so it's hardly surprising that Facebook would be testing the effect of various content algorithms on groups of their members," Reynolds said. "Facebook is simply gauging its ability to manipulate its customers' emotions for its own private gain. This incident won't break Facebook's franchise, but it will add another straw to the growing pile that eventually will erode users' perception of value enough to put the business in long-term decline."
Jeff Kagan, an independent analyst, called the experiment an abuse of users' trust.