Brandon Sun - PRINT EDITION

Grasping social media's influence

Facebook, the social networking company, has shown it can play with the emotions of its millions of users by tweaking the algorithm that selects the content of their news feeds. The report of the company’s experiment, published this month in a scientific journal, casts light on the power of social-media managers over their customers. It also raises questions about ethics and accountability in the control of social media.

The experiment, reported in the Proceedings of the National Academy of Sciences, was conducted by Adam D.I. Kramer, a researcher who works for Facebook, together with two Cornell University scientists, Jamie E. Guillory and Jeffrey T. Hancock. In the week of Jan. 11 to 18, 2012, they kept negative messages out of the news feeds of selected Facebook users and kept positive messages out of the feeds of other selected users. Then they monitored the messages from those users to see whether their mood was affected by the news feed content they received.

The news feed is a filtering device within Facebook that keeps each user’s message volume within manageable limits. Most people’s friends put out far more messages than one person could ever read. The news feed notices which kinds of messages a user most often reads and selects messages accordingly. The researchers narrowed that selection to strip out messages containing emotionally negative words for some users and emotionally positive words for others. Then they watched the results.
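In spirit, the manipulation described here amounts to dropping posts that contain words from a positive or negative word list before the feed is shown. The sketch below is a purely illustrative toy, not Facebook's actual system; the word lists, post format and function names are assumptions made for the example.

    # Illustrative sketch only: a toy version of the word-based filtering
    # the editorial describes. Word lists and post structure are assumptions,
    # not Facebook's code.

    NEGATIVE_WORDS = {"sad", "angry", "awful", "hate", "terrible"}
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}

    def contains_any(text, words):
        """Return True if the post text contains any word from the given set."""
        tokens = text.lower().split()
        return any(token.strip(".,!?") in words for token in tokens)

    def filter_feed(posts, suppress="negative"):
        """Drop posts whose text contains words from the suppressed emotion list."""
        words = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
        return [post for post in posts if not contains_any(post, words)]

    # One group of users would see feeds with negative posts removed,
    # another group with positive posts removed.
    feed = ["I love this show!", "What an awful day.", "Meeting at 3 pm."]
    print(filter_feed(feed, suppress="negative"))  # drops "What an awful day."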

The experimenters’ conclusion, given in the journal article, was: “We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”

When Facebook users learned of this experiment and complained about the dirty trick played on them, Facebook chief operating officer Sheryl Sandberg apologized for upsetting people and said the company should have communicated better. She gave no clue she saw a problem with an ethically challenged organization holding and using power to sway the emotions of millions of people by secretly tweaking an algorithm.

Politicians, entertainers, publicists and advertisers are trying to affect our emotional states all the time, sometimes for innocent reasons, sometimes to advance their own interests. In a free society, different speakers are tugging us this way and that, and we make our own choices about which voices we will listen to. Social media takes this a step further by sending us messages from our friends, whose feelings naturally affect us more directly.

Adam Kramer’s experiment, however, takes us into a world where emotional states can be transferred to others through emotional contagion, leading people to experience the same emotions without their awareness. Facebook or any social medium can show us messages from our friends who loved the show and weed out the messages from friends who hated the show. We will then love the show — and we won’t even know what happened.

The implications for marketing and for political decision-making are obvious: Facebook has the power to manipulate the selection of messages in its news feed. It did manipulate that selection in January 2012 and still today sees nothing wrong with that. Marketers and political campaigners will pay handsomely to get their fingers on that kind of persuasive power. Now that Facebook has done that once on a massive scale for a week, why would it or another social medium refuse to do it again? How would we know if they were doing it today?

For social media users, the best defence may be an alert and critical mind: That seems to be your friends talking to you, but it’s really a giant corporation sending you a selection of messages from your friends — and there’s no knowing how the choice was made.

» This editorial was recently published in the Winnipeg Free Press.

Republished from the Brandon Sun print edition July 14, 2014
