Facebook COO Sheryl Sandberg, speaking about her company's controversial psychological experiment for the first time, apologized for upsetting users.
Sandberg, who is second only to co-founder and CEO Mark Zuckerberg at the social network, discussed the experiment assessing users' emotional responses to Facebook content while in New Delhi to meet with advertisers on Wednesday.
"We clearly communicated really badly about this, and that we really regret," Sandberg said in an interview aired on India's NDTV. "We are in communication with regulators all over the world, and this will be OK and we will continue to make sure users understand that we care about their privacy."
"This was one week, and it was a small experiment," she added.
Sandberg said she was sorry that users were upset. She did not, however, say she was sorry the research was conducted.
"It's not exactly what it was," she responded when asked about what the interviewer referred to as the emotional experiment. "It was an experiment in showing people different things to see how it worked. And again, what really matters here is that we take people's privacy incredibly seriously... I want to be clear. Facebook can't control emotions. And cannot and will not try to control emotions."
Patrick Moorhead, an analyst at Moor Insights & Strategy, said he was surprised at the words Sandberg used in the interview.
"Facebook can't and won't control emotions? They did," he told Computerworld. "This was confusing. There appears to be a difference between what happened and what she is saying. Facebook did manipulate emotions, but Sandberg is saying they can't and won't."
He added that Facebook doesn't seem to have qualms about conducting this kind of research on its users.
"This apology doesn't help them in the public eye, but I think it communicates what Facebook thinks about it," Moorhead said.
Dan Olds, an analyst at Gabriel Consulting Group, said it was smart for a Facebook executive to take on the issue.
"Facebook's intentions were probably only to see how to make ads more effective," Olds said. "However, they were explicitly trying to alter user moods -- both positively and negatively. It seems like Facebook was looking to see if they could alter people's feeds to impact their moods, which is something that an advertiser might want to take advantage of."
A study published in the Proceedings of the National Academy of Sciences noted that Facebook allowed researchers to manipulate the positive and negative information that users saw on their News Feeds in order to test users' emotional responses to that content. The study was conducted from Jan. 11 to Jan. 18, 2012.
In the experiment, Facebook temporarily influenced the kind of posts and photos that about 700,000 of its English-speaking users saw in their News Feeds, making it possible for researchers to show either mostly positive comments and posts or mostly negative comments and posts.
That means those users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed. It also means that half of them were purposefully shown certain types of content to see whether it would make them sad.
The study found that people who saw more positive comments made more positive comments themselves, while users who saw more negative comments became more negative.
Users, analysts and bloggers have been in an uproar over what they see as intentional manipulation of people's emotions.
While analysts generally agreed that the experiment was not illegal, they also said it unfairly manipulated the company's users.
Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld.