Facebook Deliberately Experimented on Your Emotions
Imagine a company that records every detail of your biography, can recognize your face, tries to listen in on you through your phone, and stores countless messages between you and your friends. Now imagine this same company intentionally making you sad. It happened.
The findings of Facebook's psychological experiment, which turned 689,003 users into test subjects without their consent, were recently published in the Proceedings of the National Academy of Sciences. The paper, a horror-show of anti-ethics, presents an online reality that's ominous to the point of parody:
We show, via a massive experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.
[...]
The experiment manipulated the extent to which people were exposed to emotional expressions in their News Feed.
In short, nearly 700,000 users had their News Feeds artificially (and arbitrarily) tweaked in an attempt to alter their moods enough that they'd go on to share similarly happy or sad posts of their own. Facebook wanted to see if it could spread feelings ("e.g., depression, happiness") like the NIH spreads pathogens among monkeys.
The experiment was a success: Facebook's research concludes that it can indeed make you feel what Facebook wants you to feel, and, to a certain extent, post what Facebook wants you to post.
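For a concrete sense of what "manipulating the extent to which people were exposed to emotional expressions" can mean in practice, here is a minimal, entirely hypothetical sketch in Python: a feed filter that quietly drops posts of one emotional flavor before a user ever sees them. None of the names or parameters below (Post, filter_feed, the sentiment labels, the omission probability) come from Facebook's actual systems or the PNAS paper; they are assumptions made purely for illustration.

    import random
    from dataclasses import dataclass

    # Purely hypothetical post object; Facebook's real pipeline is not public.
    @dataclass
    class Post:
        text: str
        sentiment: str  # "positive", "negative", or "neutral"

    def filter_feed(posts, suppress="positive", omit_probability=0.5):
        """Return a feed with some emotionally loaded posts quietly withheld."""
        filtered = []
        for post in posts:
            # Randomly drop posts of the targeted emotional flavor.
            if post.sentiment == suppress and random.random() < omit_probability:
                continue
            filtered.append(post)
        return filtered

    # A test-group user gets a feed scrubbed of positivity; the researchers then
    # watch whether that user's own posts turn gloomier.
    feed = [
        Post("Best vacation ever!", "positive"),
        Post("Worst commute of my life.", "negative"),
        Post("Bought new shoes.", "neutral"),
    ]
    print(filter_feed(feed, suppress="positive", omit_probability=1.0))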
Naturally, the internet threw a fit over the weekend upon hearing the news; our collective sense of privacy has rarely been fucked with this brazenly. How could we all be so naive, again? But as The Verge points out, Facebook didn't break any rules by putting people into a psychological experiment without their permission:
As the researchers point out, this kind of data manipulation is written into Facebook's Terms of Use. When users sign up for Facebook, they agree that their information may be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."
When you signed up for Facebook, did you think you were consenting to have your brain used as a test bed for online advertising algorithms? Probably not! And although the Facebook researcher behind the project has expressed some vague remorse ("I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused"), Facebook itself doesn't see what the big deal is. They've already trotted out the corporate PR-speak in a statement to The Guardian:
A Facebook spokeswoman said the research...was carried out "to improve our services and to make the content people see on Facebook as relevant and engaging as possible".
She said: "A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."
So, there it is: in order to "improve [its] services", which is to say sell advertising more efficiently than ever, Facebook is willing to toy with your feelings. Facebook is willing to make your interactions with friends part of a study. Facebook is willing to use you to make money. This is what Facebook is willing to do. Or at least this is what Facebook is willing to openly admit to doing in an academic journal.
The most profound takeaway here isn't that Facebook can make us feel things (we've known that for years). It's not that Facebook lacks respect for you as a person, or lacks even an adolescent's grasp of ethics. We've known that for years, too. The most valuable lesson for the company might be that it can keep creeping us out and violating its customers, over and over again, and none of us will ever delete our accounts. I'd love to read that study.