7 things you need to know about Facebook's mood experiment

Is your News Feed normally manipulated? Is it legal? Is it fair? Get your questions answered


He pointed out that the research affected only about 0.04% of Facebook's users, or 1 in 2,500. The social network today has more than 1 billion users.

5. Is this legal?

The short answer is yes.

The study's authors noted in their research paper that users accept Facebook's right to manipulate their News Feeds when they click on the site's terms and conditions of use.

Jeff Kagan, an independent analyst, told Computerworld that users do agree to information manipulation when they accept the site's terms of use. "If users didn't know that, it's their own fault," he said.

Patrick Moorhead, an analyst at Moor Insights & Strategy, noted that Facebook's user agreement states clearly that the company can do research on people who agree to the site's terms.

"Legally, even if people never read the terms, they are still bound to them," Moorhead added. "There are exceptions in the U.S. on things like bank loans and insurance documents, but this is not one of them."

6. Is it ethical or fair?

The question of whether the experiment was ethical or fair is another matter altogether from the question of whether it was legal, according to industry analysts.

"The biggest complaint is Facebook's mind frame -- their lack of care about customers and protecting their privacy," Kagan said. "It's about the sneaky approach that Facebook continually seems to take and not caring about the concerns of the users."

Rob Enderle, an analyst at Enderle Group, said the drawback of any free site or service is that the people who use them may lose control over their privacy.

"A big part of the cost of 'free' is that companies often don't value customers who don't pay them for their services," Enderle said. "There is a growing elitism in the technology market likely connected to the massive power and wealth imbalance between the people who control social media properties and those that invest in and use them."

7. What options do users have?

While analysts say they'd be surprised if there wasn't a class-action lawsuit filed over this move, they also note there's a very easy step that users can take: Quit. Just stop using Facebook.

The problem with that advice is that Facebook has engaged in activities that have raised privacy concerns and angered users in the past, yet there has never been a mass exodus.

So will users leave Facebook this time? Doubtful.

"Whatever your view, it is the responsibility of each of us to think through the implications of behaviors such as Facebook's recent experiment and come to a position about what kinds of intrusions we are willing to accept," said Hadley Reynolds, an analyst at NextEra Research. "The conclusions we come to will impact our perspective on the companies we choose to patronize on the Web."

Moorhead added that Facebook has shown that it will play fast and loose with users, and now it's up to users to decide what to do about it.

"Facebook is not trustworthy, as they step on users' privacy routinely," he said. "The reason why there isn't a mass exodus is mixed. Some users just don't care about privacy. Some stay because there isn't an alternative. Others don't know or understand the downside of an invasion of privacy."

This article, "7 Things You Need to Know About Facebook's Mood Experiment," was originally published on Computerworld.com.

Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter, at @sgaudin, and on Google+, or subscribe to Sharon's RSS feed. Her email address is sgaudin@computerworld.com.

See more by Sharon Gaudin on Computerworld.com.

Copyright © 2014 IDG Communications, Inc.
