You know how you imagine that those guinea pigs used in lab experiments have no idea what’s going on, and you might feel bad for them and think, “aww, those poor defenseless creatures”? Yeah, well… you might have unknowingly been that guinea pig.

Chaos ensued throughout the online world when people found out about Facebook’s social experiment in which some users’ newsfeeds were manipulated – without direct consent – in order to find out whether these manipulations would directly affect their emotions.

In early 2012, Facebook conducted an experiment on almost 700,000 of its users to find out whether the psychological phenomenon known as “emotional contagion” can occur online through social networks. Emotional contagion is the idea that emotions spread from person to person. For example, when you’re constantly around a bubbly, outgoing, happy person, you may in turn feel happier yourself. Conversely, when you’re around a pessimistic or depressed person, you may find yourself feeling rather down as well. Pretty straightforward and logical. Facebook researchers, along with researchers from Cornell University and the University of California, San Francisco, decided to find out whether emotional contagion happens online as readily as it does in our everyday physical lives. So, for one week, an algorithm tweaked some users’ newsfeeds to show them less positive content, and others’ to show less negative content. Can you guess what the results were? Surprise, surprise… positive content affects our emotions positively and negative content affects our emotions negatively!
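To give a rough idea of what “tweaking a newsfeed” could look like, here is a minimal, purely illustrative sketch. This is not Facebook’s actual code; the post structure, the sentiment labels, and the filtering rule are all assumptions made up for the example. The idea is simply that posts of one emotional flavor get randomly withheld from a user’s feed.

```python
import random

# Purely illustrative sketch -- not Facebook's actual algorithm.
# Assume each post already carries a sentiment label ("positive",
# "negative", or "neutral"), e.g. from a word-counting tool.
posts = [
    {"text": "Had the best day ever!", "sentiment": "positive"},
    {"text": "Everything is terrible.", "sentiment": "negative"},
    {"text": "Making pasta tonight.", "sentiment": "neutral"},
]

def filter_feed(posts, reduce_sentiment, omit_probability=0.5, seed=None):
    """Return a feed in which posts matching `reduce_sentiment` are
    randomly omitted with the given probability; everything else stays."""
    rng = random.Random(seed)
    feed = []
    for post in posts:
        if post["sentiment"] == reduce_sentiment and rng.random() < omit_probability:
            continue  # withhold this post from the user's feed
        feed.append(post)
    return feed

# One hypothetical group sees fewer positive posts, another fewer negative ones.
print(filter_feed(posts, reduce_sentiment="positive", seed=1))
print(filter_feed(posts, reduce_sentiment="negative", seed=1))
```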

Not only does the experiment seem to have had an inevitable outcome right from the get-go, but it was also conducted without the direct consent of the users… although apparently, when signing up for Facebook, we tick a box agreeing that Facebook can freely do such things. Even Susan Fiske, the Princeton University psychology professor who edited the study, said:

“I was concerned until I queried the authors and they said their local institutional review board had approved it — and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

Let’s be real here, though: the people over at Facebook surely know that hardly anyone actually reads, let alone understands, the legal terms they agree to when signing up for the social network.

When the results of the experiment were finally published this week in the scientific journal Proceedings of the National Academy of Sciences, people were outraged by the possibility that they had been one of the guinea pigs of this experiment.

Ethical and legal questions started to arise, and people were greatly unsettled by the thought of having had their emotions unknowingly manipulated for the sake of an experiment.

Many people expressed thoughts and concerns about the issue. When you stop to think about it, though, hasn’t the media always manipulated us in some way or another? No matter how much we may dislike the thought of being manipulated by people “behind the scenes,” it has undoubtedly been done over and over again. How else would advertisers know what to do and how to keep us coming back for more?

Let’s face it… these days we are being watched, closely. Creepy, yes, but true.

Just the other day, a friend of mine was telling me how Facebook somehow placed just the right advertisements at just the right time when he was looking for a class he needed in his area. But it’s not just Facebook… Google is all sorts of creepy too. Google Maps alone should make you stop and think about all those mysterious cameras in the sky. But Google is its own separate story; those guys will probably end up ruling the world.

In response to the storm of negative feedback all over the internet following the publication of the experiment, Adam D. I. Kramer, one of the co-authors of the study and a member of Facebook’s Core Data Science Team, released a statement on Facebook in which he says, in part:

“[O]ur goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.”

That’s nice, but too little too late perhaps?

Well, here are my two cents: the experiment was uncalled for (did they seriously not expect those results?), but the reactions were also a little over the top. Don’t get me wrong: when I first read about it, I was pretty angry myself. I thought about that one person who could already have been having the worst day ever and then opened Facebook only to find themselves feeling even more down. But then I also thought about that one person who could have been feeling upset, gone on Facebook, and felt just a little better.

The truth is that we are obliviously manipulated like this every day by all sorts of media outlets.

This “emotional contagion” phenomenon is inevitable in our everyday interactions, but when it comes through the media, there is one small way we can push back: by choosing what we let manipulate us, and when. It may not be as easy as it sounds, but now that this Facebook experiment controversy has made us more wary of such things, we could at least try, right?

And if, in the end, you can’t let this go and can’t stand Facebook anymore, just head to your profile and let Facebook know “what’s on your mind.”

The published study

The Kramer guy’s Facebook apology statement
