Down with Facebook, we say!!! Hang social media, how dare they manipulate us emotionally? Well… did you know that they issued a heartfelt apology? From the COO, even. Guess what, folks, everything is fine and dandy now.
…wait a minute… did they promise to never ever pull that stunt again? No, they did not? Just blamed everything on poor communication?? And these are experts in social media???? Hang social media, we say. Down with Facebook.
Apologising for bad communication on Facebook’s news feed experiment,
visiting Chief Operating Officer Sheryl Sandberg today said it was like
any other test aimed at improving service quality for over 1.2 billion users.
Facebook conducted a study in January 2012 to see how the placement of
positive or negative words in the news feeds of users affected their emotions.
The results of the study, conducted on 700,000 users, were published in
an article in the journal ‘Proceedings of the National Academy of Sciences’.
This stirred a global debate on privacy and protection of user data.
“This was an experiment done for one week, this was communicated
terribly and for that communication, we have apologised. This is part
of the ongoing research that companies do to test different products,”
Sandberg told reporters here.
She added that Facebook, which is the world’s largest social network
with over 1.2 billion users, takes privacy and security very seriously.
In India, Facebook has over 100 million users.
Previously, Facebook data scientist Adam Kramer, in a post, had also
said the goal of Facebook’s research is to learn how to provide a better service.
A UK regulator has begun an inquiry into the experiment to determine if
the company broke data protection laws applicable in that country.
“We work very closely with the regulators all over the world. We are
fully compliant (with regulations). (respect for data privacy) is the
hallmark of our service, it is the underpinning of our service,” Sandberg said.
A study detailing how Facebook secretly manipulated the news feed of
some 700,000 users to study “emotional contagion” has prompted anger on
social media.
For one week in 2012, Facebook tampered with the algorithm used to place
posts into user news feeds to study how this affected their mood.
The study, conducted by researchers affiliated with Facebook, Cornell University,
and the University of California at San Francisco, appeared in the June
17 edition of the Proceedings of the National Academy of Sciences.
The researchers wanted to see if the number of positive, or negative,
words in messages they read affected whether users then posted positive
or negative content in their status updates.
Indeed, after the exposure the manipulated users began to use negative
or positive words in their updates, depending on what they were exposed to.
Results of the study spread when the online magazine Slate and The Atlantic website wrote about it yesterday.
“Emotional states can be transferred to others via emotional contagion,
leading people to experience the same emotions without their awareness,”
the study authors wrote.
“These results indicate that emotions expressed by others on Facebook
influence our own emotions, constituting experimental evidence for
massive-scale contagion via social networks.”
While other studies have used metadata to study trends, this one appears to
be unique because it manipulated what users saw to see if there was a reaction.
The study was legal under Facebook’s terms of service, but was it ethical?
“#Facebook manipulated user feeds for massive psych experiment… Yeah, time to close FB acct!” read one Twitter posting.
Other tweets used words like “super disturbing,” “creepy” and “evil,” as well as angry expletives, to describe the experiment.
Susan Fiske, a Princeton University professor who edited the report for publication, told The Atlantic that she was concerned about the research and contacted the authors.