Monday, June 30, 2014

Facebook And The Ethics Of User Manipulation [Updated]


A recent, partially Army-funded study conscripted Facebook users as unwitting participants in a weeklong experiment in direct emotional manipulation. The study set out to discover whether the emotional tone of a user's News Feed content had an impact on their own emotional makeup, measured through the tone of what they posted to the social service after viewing the skewed material.


Nearly 700,000 Facebook users were shown either more positive or more negative content. The study found that users who were given more positive News Feeds posted more positive things, and users who were given more negative News Feeds posted more negative things.


Surprising? Doubtful. Unethical? Yes.


Bear in mind that the impact of the study wasn't contained merely to those it directly manipulated. The study notes that around 155,000 users from the positive and negative groups each 'posted at least one status update during the experimental period.' So, hundreds of thousands of status updates were posted by the negatively induced user group. Those negative posts likely caused more posts of a similar ilk.


Contagion, after all, doesn't end at the doorstep.


We won't know whether the experiment did any more than darken the days of a few hundred thousand users for a week in 2012. But it could have. And that's enough to make a call on this: allowing your users to be unwitting test subjects of emotional manipulation is beyond creepy. It's a damn disrespectful and dangerous choice.


Not everyone is in a good emotional spot. At any given moment, a decent chunk of Facebook's users are emotionally fragile. We know that because at any given moment, a decent chunk of humanity is emotionally fragile, and Facebook has a massive number of active users. That means that among the negatively influenced were the weak, the vulnerable, and potentially the young. I've reached out to Facebook asking whether the study excluded users between the ages of 13 and 18, but haven't yet heard back.


Adding extraneous, unneeded emotional strain to a person of good mental health is an unkindness. Doing so to a person who needs encouragement and support is cruel.


The average Facebook user has something akin to an unwritten social contract with the company: I use your product, and you serve ads against the data I've shared. Implicit in that is expected polite behavior, the idea that Facebook won't abuse your data or your trust. In this case, Facebook did both, using a user's social graph against them with intent to cause emotional duress.


We're all manipulated by corporations. Advertising is among the more blatant examples of it. There's far more of it out there than we realize. The pervasiveness of the manipulation undoubtedly makes us slightly inured to it. But that doesn't mean we can't point out things that are over the line when we are shown what's going on behind the curtain. If Facebook was willing to allow this experiment - the lead author of which, according to the study itself, is a Facebook employee working on its Core Data Science Team - what else might it allow in the future?


I am not arguing that Facebook has a moral imperative to make News Feed content more positive on average. That would render the service intolerable - not all life events are positive, and the ability to commiserate with friends and loved ones digitally is now part of the human experience. And Facebook certainly tweaks its News Feed over time for myriad reasons to improve its experience.


That's all perfectly reasonable. Deliberately looking to skew the emotional makeup of its users - spreading negativity for no purpose other than curiosity, without user assent or practical safeguards - is different. It's irresponsible.

