Facebook Study That Manipulated Users: Why You Should Be Concerned

June 29th, 2014 | 3 Comments

When you signed up for Facebook, you probably didn't realize that one of the things you agreed to was letting the company run psychological tests on you, but that's what Facebook claims you did. The company contends that by accepting its terms of service, you also accepted its data use policy, which says Facebook can run experiments with your account. On the surface, this doesn't seem like a big deal, but it became a much bigger one with the recent release of a study in which Facebook concluded it has the power to change your emotions through what it places in your newsfeed.

In January 2012, Facebook ran an experiment on nearly 700,000 (689,003) of its English-speaking users for the sole purpose of seeing whether it could manipulate how those people felt and acted online simply by changing what information appeared in their daily newsfeeds. Facebook concluded that it does have the power to make you feel better or worse during the day through the news it feeds you.

The experiment was set up to study what would happen if users were exposed primarily to positive posts or primarily to negative posts in their newsfeeds. Facebook did not alter any of the posts users published; what it changed was what users saw when they looked at their newsfeeds. The newsfeed is controlled by a Facebook algorithm, so what shows up isn't everything your friends post, but what Facebook decides would be of most interest to you. For this experiment, Facebook changed the algorithm to favor certain types of posts over others to see whether this could manipulate users' emotions.

The users were split into two groups: the first received a newsfeed with primarily positive posts, while the second received one with primarily negative content. The study found that users were influenced by the type of emotion they were reading in their newsfeeds. Those whose newsfeeds were primarily positive appeared to become more positive themselves and reflected that positivity in their own posts. The same effect occurred with negative newsfeeds: those users appeared to become more negative, and that negativity showed up in what they posted on Facebook.

The fact that Facebook is able to manipulate your emotions should concern anyone on Facebook (as should the fact that Facebook ran this psychological experiment in the first place). It's reasonable to assume Facebook will use this information to help it make more money. As the saying goes, "If you're not paying for something, you're not the customer; you're the product being sold." It doesn't take a leap of faith to see that Facebook could begin tailoring your newsfeed to encourage you to buy certain products and services from advertisers, making both the advertisers and Facebook a lot of money.

Even those who never buy anything through Facebook should be concerned. When you check your newsfeed each morning, Facebook could be subtly determining how you're going to feel that day simply by deciding which friends' posts you read. Even if you don't buy anything, you'll still be manipulated into seeing only the type of posts Facebook wants you to see.

Some will argue this isn't a big deal because advertisers do the same thing daily in every aspect of our lives. If everyone does it, why should Facebook doing it be of any more concern? The difference is that most advertisers aren't using your friends, and which side of your friends you see, to do the manipulating.

My guess is that many Facebook users already expect their behavior to be studied, but most probably weren't expecting Facebook to actively manipulate what they see in their newsfeeds to gauge how they react and to see whether their emotions can be changed in subtle ways. That's psychological experimentation, something that normally must be approved by an ethics review board before it can be conducted.

(Photo courtesy of Marco Pakoeningrat)



  • Lynn Cee says:

    “Dr.” Zucky Strangelove … anything for another Billion. Just say NO. Life without Farcebook is really quite nice.

    • Edward says:

      Indeed, especially since my emotions are not manipulated by its research and sneaky advertising.

      Good riddance, let's see the FTC pick its bones once its users leave in droves.

  • sunny says:

    Zuckerberg is the Devil !!! More proof !!!


Copyright © 2018 SavingAdvice.com. All Rights Reserved.