Facebook Basically Shrugs Off User Outrage Over 'Emotional' Experiment


The social network wanted to find out what influences our emotions. Unfortunately, everyone is angry now.

OK, Facebook, here we go again.

The big blue social network came under fire this weekend following reports about a published study for which Facebook manipulated the content seen by more than 600,000 people. The reason: to find out whether the changes would affect people’s emotional states.

People’s emotional states are most definitely running high now that news of the experiment has gotten out. People are calling it outrageous. Some news sites have likened Facebook users to lab rats.

And, so far, Facebook’s response has been underwhelming.


The report, published recently in the Proceedings of the National Academy of Sciences (PNAS), is called "Experimental evidence of massive-scale emotional contagion through social networks." In it, Adam Kramer, a member of Facebook’s core data science team, along with other researchers, explained how Facebook altered the number of positive or negative terms seen by a select group of people. The researchers wanted to see whether people would respond with increased positivity or negativity of their own.

The experiment was conducted over a single week in early 2012. The researchers found that affected users produced an average of one fewer emotional word per 1,000 words over the following week.

In a public message posted to Facebook, Kramer attempted to explain the logic behind the experiment. “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” he wrote. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.”


Kramer’s note fails, however, to address the fact that the experiment was conducted without users’ knowledge, which is the very thing people are upset about.

Sure, Facebook tweaks its algorithm all the time without announcing it. So why are people so upset over this experiment?

It’s one thing to observe behavior. It is another to try to manipulate it, as Facebook did. Even the PNAS editor who handled the study has her doubts about it.

“It's ethically okay from the regulations perspective, but ethics are kind of social decisions,” Princeton University psychology professor Susan Fiske, who edited the study for PNAS, told The Atlantic. “There's not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn't have been done...I'm still thinking about it and I'm a little creeped out, too.”

A Facebook representative did not immediately respond to a request for comment this weekend on the widespread negative reaction to the experiment.

