
Ethical Standards in Research (331)

Answer the following prompts using at least 250 words in your initial post:

In 2012, Facebook conducted an experiment on roughly 700,000 Facebook users without these users knowing about the research. Facebook divided these users into two groups and displayed more negative content in the news feed of one group and more positive content in the news feed of the other group. Facebook found that users who saw more negative content in their news feed were more likely to make negative posts themselves, while users who saw more positive content made more positive posts.

You can read more about the study in the news article excerpt below.

Considering what you have learned about ethics in research this week and throughout the course, what ethical issues are raised by this Facebook study? What consent issues were present in this study? Could there have been any issues with regard to vulnerable populations? How would you feel if you found out that you had unknowingly been part of this study? Finally, considering the hypothesis that more negative content would lead to more negative posts and vice versa, could Facebook have tested this hypothesis without manipulating the news feed content of these 700,000 users?

To Facebook, we are all lab rats.

Facebook routinely adjusts its users’ news feeds — testing out the number of ads they see or the size of photos that appear — often without their knowledge. It is all for the purpose, the company says, of creating a more alluring and useful product.

But last week, Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users to change the number of positive and negative posts they saw. It was part of a psychological study to examine how emotions can be spread on social media.

The company says users consent to this kind of manipulation when they agree to its terms of service. But in the quick judgment of the Internet, that argument was not universally accepted.

“I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible,” the privacy activist Lauren Weinstein wrote in a Twitter post.

On Sunday afternoon, the Facebook researcher who led the study, Adam D. I. Kramer, posted a public apology on his Facebook page.

“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he wrote.

Facebook is hardly the only Internet company that manipulates and analyzes consumer data. Google and Yahoo also watch how users interact with search results or news articles to adjust what is shown; they say this improves the user experience. But Facebook’s most recent test did not appear to have such a beneficial purpose.

“Facebook didn’t do anything illegal, but they didn’t do right by their customers,” said Brian Blau, a technology analyst with Gartner, a research firm. “Doing psychological testing on people crosses the line.”

In an academic paper published in conjunction with two university researchers, the company reported that, for one week in January 2012, it had altered the number of positive and negative posts in the news feeds of 689,003 randomly selected users to see what effect the changes had on the tone of the posts the recipients then wrote.
