Social network Facebook is under fire after its covert 'emotional contagion' study left some users outraged.
THERE'S a sign up behind the counter in a cafe in Adelaide that chirpily suggests "If you see someone without a smile, give them one of yours".
The point of the old adage is that moods can be contagious, and we all have the power to spread a bit of cheer. Or gloom. But no-one likes a party pooper.
Our power to manipulate moods in real life — ours and other people's — is something we've always understood innately.
But now, scientists working with US social media behemoth Facebook have revealed research they say proves our emotions can be toyed with in the virtual world, on social media, where so many of us spend so much of our time.
And their experiment has caused a hellbroth of trouble.
For one week in 2012, without telling them, researchers from Facebook, the University of California, San Francisco, and Cornell University manipulated the News Feeds of 689,000 users, filtering content to tone down updates containing words considered "negative" for some users and updates containing words considered "positive" for others.
The point of the study? To see whether the emotions people were exposed to would influence the mood of their own subsequent status updates.
They found that they did. They had proved the possibility of "emotional contagion".
"..for people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive. When negativity
was reduced, the opposite pattern occurred," the study's authors wrote.
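In rough terms, the measurement side of the experiment comes down to counting sentiment-bearing words (the researchers used the LIWC word-counting software). As a minimal sketch only, using made-up stand-in word lists rather than the real LIWC dictionaries, the percentages behind a finding like the one quoted above could be computed along these lines, in Python:

# Illustrative stand-in word lists; the study itself used LIWC, not these.
POSITIVE = {"happy", "great", "love", "wonderful", "smile"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "gloom"}

def sentiment_percentages(status_updates):
    # Flatten all updates into lowercase words, stripping basic punctuation.
    words = [w.strip(".,!?").lower()
             for update in status_updates
             for w in update.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    # Percentage of all words that are positive, and that are negative.
    return 100 * pos / len(words), 100 * neg / len(words)

# Hypothetical updates written after positive content was filtered out:
updates = ["Feeling sad about today", "What an awful week"]
print(sentiment_percentages(updates))  # positive share falls, negative rises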
The findings of the study, published in the journal Proceedings of the National Academy of Sciences earlier this month, have sent shockwaves around the world — and not just because of the results themselves, but because the guinea pigs, real people like you, had no idea they were taking part.
The researchers maintain the experiment was legal because users must agree to Facebook's Data Use Policy when they sign up for an account.
But legal experts, academics and politicians around the world say that while the study may have been legal, it was nonetheless unethical, because users were not able to give "informed consent".
Internet expert Clay Johnson, who co-founded the firm that managed Barack Obama's online campaign for the 2008 presidential election, described the study as "terrifying" and called for someone to launch a class action against Facebook.
"In the wake of both the Snowden stuff and the Cuba Twitter stuff, the Facebook 'transmission of anger' experiment is terrifying", Mr Johnson said in a series of tweets.
"Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal?"
Rachel Spencer, from the School of Law at the University of South Australia, said that, the question of legality aside, the research appeared not to have followed the basic standards of ethical university research.
"Normally, when you conduct research into human subjects, you have to get permission from the Ethics Committee of a university and I would be very surprised if an Ethics Committee would approve of such a methodology," Ms Spencer said.
She said forcing people to feel happier or sadder, or withholding potentially important personal news, such as the death of a friend or loved one, simply because it was negative, clearly could have serious and worrying consequences.
"Finding out that you had been manipulated like that could also cause adverse consequences too — people with mental health issues or a fragile state of mind, for whatever reason, could be quite severely affected," she said.
"Just the thought of a company monitoring what I'm being told is a little bit creepy."
Terry O'Gorman, president of the Australian Council for Civil Liberties, agreed, saying Facebook's argument that the study was above board was "legalistic".
"Do you know anyone who's read 2000 words of Facebook's privacy policy?" Mr O'Gorman said.
"People genuinely feel they've been used by Facebook.
"If you're one of the users, and you find that your Facebook usage has been subject to some kind of emotional analysis, I think you're entitled to say that's a breach of privacy."
He said the fact that tens of thousands of people still might not be aware their News Feeds had been manipulated was irrelevant.
"Because people do not realise that in using Facebook, they're handing over data to be analysed by anonymous people who will never tell you what they're using it for, and you'll never know the end result."
But if the whole thing feels creepy and Orwellian, what's in it for Facebook? Merely to add to the canon of psychology research?
Hardly, says Dr Karen Nelson-Field, Senior Research Fellow at the Ehrenberg-Bass Institute at UniSA.
It's all about marketing, she says, adding that she personally thinks the study adds to our understanding of the kinds of stimuli people respond to emotionally and that they share socially.
"What they've done is amazing, because it adds to the psychology literature on such a large scale," Dr Nelson-Field said.
"It would have been largely to see what content gets talked about, it would have been related around emotions.
"They've done it to understand a benchmark reaction (from users) to brands and reaction to sentiment to work out, in the future, where advertisers are best placed in terms of sending out positive and negative messages.
"It's not rocket science. If you watch a scary movie, you feel scared."
Dr Nelson-Field said people who argued the study was deceptive and unethical were "overreacting".
"If you sign up for a platform, you sign up to their privacy statements.
"You're exposing yourself to third party (agreements) constantly, so unless you turn your cookies off, you give Facebook 100 per cent rights to serve everything you do — when I say serve, I mean they can on-sell their data to a third party of their can use it themselves."
She said she accepted there may be more ethical ways to conduct the same kinds of research, particularly to filter out people who identified as having mental health issues or who were on medication for a mental illness.
But ultimately, people should know what they're getting into from the outset.
"Take some responsibility," she said.
"Sure, there's always going to be those that are less tech-savvy than others, but at the end of the day, what do you expect? And it hasn't done any harm."
Originally published as Can we still be friends with Facebook?