Facebook's News Feed (the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone) is not a perfect mirror of the world.
But few users would expect Facebook to change their News Feed in order to manipulate their emotional state.
We now know that's exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; others were shown content analyzed as sadder than average. And when the week was over, these manipulated users were themselves more likely to post especially positive or especially negative words.
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine 'emotional contagion,' as this one did. This study is different because, where those earlier studies merely observed Facebook user data, this one set out to manipulate it.
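According to the paper, posts were scored with the Linguistic Inquiry and Word Count (LIWC) tool: a post counted as positive or negative if it contained at least one word from the corresponding list, and a fraction of posts carrying the targeted emotion were then omitted from a user's feed. A minimal sketch of that kind of filtering follows; the word lists, function names, and suppression rate are hypothetical stand-ins, not Facebook's actual code or the real LIWC2007 dictionaries:

```python
# Illustrative sketch of LIWC-style feed filtering, under the assumptions
# stated above. Word lists and the suppression rate are made up.
import random
import re

POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "sweet"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "hurt", "angry"}

def classify(post):
    """Label a post 'positive' or 'negative' if it contains at least one
    word from the matching list (the paper's criterion), else 'neutral'."""
    words = set(re.findall(r"[a-z']+", post.lower()))
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, rate=0.5):
    """Randomly omit a fraction (`rate`) of posts carrying the targeted
    emotion, which is how the experiment reportedly skewed each feed."""
    return [p for p in posts
            if classify(p) != suppress or random.random() >= rate]

feed = ["I love this sweet photo!", "What an awful, sad day.", "Meeting at noon."]
print(filter_feed(feed, suppress="negative"))  # likely drops the negative post
```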
The experiment is almost certainly legal. In the company's current terms of service, Facebook users relinquish the use of their data for 'data analysis, testing, [and] research.' Is it ethical, though? Since news of the study first emerged, I've seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.
In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook 'transmission of anger' experiment is terrifying.
- Clay Johnson (@cjoh) June 28, 2014
Get off Facebook. Get your family off Facebook. If you work there, quit. They're fucking awful.
- Erin Kissane (@kissane) June 28, 2014
We're tracking the ethical, legal, and philosophical response to this Facebook experiment here.