Monday, June 30, 2014

Facebook Doesn't Understand The Fuss About Its Emotion Manipulation Study

On Facebook, you may be a guinea pig and not know it.

This weekend, the Internet discovered a study published earlier this month in an academic journal that recounted how a Facebook data scientist, along with two university researchers, made 689,003 users' News Feeds more positive or negative to see whether it would elate or depress them. The purpose was to find out whether emotions are 'contagious' on social networks. (They are, apparently.) The justification for subjecting users to these psychological mind games was that everyone who signs up for Facebook agrees to the site's 'Data Use Policy,' which includes a little line about how your information could be used for 'research.'

Some people are pretty blasé about the study, their reaction along the lines of, 'Dude. Facebook and advertisers manipulate us all the time. NBD.' Others, especially in the academic world, are horrified that Facebook thinks the little clause in the 9,045-word ToS counts as 'informed consent' to being part of a psychological experiment, and that an ethics board reportedly gave that interpretation a thumbs up.

The larger debate is about what companies can do to their users without asking them first or telling them about it after.


I asked Facebook yesterday what the review process was for conducting the study in January 2012, and its response reads as a bit tone-deaf. The focus is on whether the data use was appropriate, rather than on the ethics of emotionally manipulating users into having a crappy day for science. That may be because Facebook was responding to a privacy reporter:


'This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account,' says a Facebook spokesperson. 'We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely.'


It's particularly fascinating to me that Facebook puts this in the 'research to improve our services' category, as opposed to the 'research for academic purposes' category. That makes me wonder what other kinds of psychological manipulation users are subjected to but never learn about, because the results aren't published in an academic journal. This gives more fodder to academic Ryan Calo, who has argued that companies need to get their psychological studies of users vetted in some way that echoes what happens in the academic context.


Before this story broke, Betsy Haibel wrote a relevant post that linguistically raised the stakes by describing companies' assumption of consent from their users as corporate rape culture. 'The tech industry does not believe that the enthusiastic consent of its users is necessary,' wrote Haibel. 'The tech industry doesn't even believe in requiring affirmative consent.'


When I signed up for 23andMe - a genetic testing service - it asked if I was willing to be part of '23andWe,' which would allow my genetic material to be used in research studies. I had to affirmatively check a box to say I was okay with that. As I suggested when I wrote about this yesterday, I think Facebook should have something similar. While many users may already expect and be willing to have their behavior studied - and while that may be covered by 'research' being one of the 9,045 words in the data use policy - they don't expect Facebook to actively manipulate their environment in order to see how they react. That's a new level of experimentation, turning Facebook from a fishbowl into a petri dish, and it's why people are flipping out about this.

