
The bright side of Facebook’s social experiments on users


Facebook’s disclosure last week that it had tinkered with about 700,000 users’ news feeds as part of a psychology experiment conducted in 2012 inadvertently laid bare what too few tech companies acknowledge: that they possess vast powers to closely monitor, test and even shape our behavior, often while we’re in the dark about their capabilities.

The publication of the study, which found that showing people slightly happier messages in their feeds caused them to post happier updates, and sadder messages prompted sadder updates, sparked a torrent of outrage from people who found it creepy that Facebook would play with unsuspecting users’ emotions.

Because the study was conducted in partnership with academic researchers, it also appeared to violate long-held rules protecting people from becoming test subjects without providing informed consent. Several European privacy agencies have begun examining whether the study violated local privacy laws.

But there may be other ways to look at the Facebook study and its publication. For one thing, studying how we use social media may provide important insights into some of the deepest mysteries of human behavior.

Facebook and much of the rest of the web are thriving petri dishes of social contact, and many social science researchers believe that by analyzing our behavior online, they may be able to figure out why and how ideas spread through groups, how we form our political views and what persuades us to act on them, and even why and how people fall in love.

Most web companies perform extensive experiments on users for product testing and other business purposes, but Facebook, to its credit, has been unusually forward in teaming with academics interested in researching questions that aren’t immediately pertinent to Facebook’s own business. Already, those efforts have yielded several important social science findings.

But there’s another benefit in encouraging research on Facebook: It is only by understanding the power of social media that we can begin to defend against its worst potential abuses. Facebook’s latest study proved it can influence people’s emotional states; aren’t you glad you know that? Critics who have long argued that Facebook is too powerful and that it needs to be regulated or monitored can now point to Facebook’s own study as evidence.

After the outcry against the Facebook research, we may see fewer of these studies from the company and the rest of the tech industry. That would be a shame.

“It would be kind of devastating,” said Tal Yarkoni, a psychology researcher at the University of Texas at Austin who works on methods for studying large sets of data. “Until now, if you knew the right person at Facebook and asked an interesting question, a researcher could actually get collaborators at Facebook to work on these interesting problems. But Facebook doesn’t have to do that. They have a lot to lose and almost nothing to gain from publishing.”

If you’ve been cast in a Google or Facebook experiment, you’ll probably never find out. Users who are put into experimental groups are selected at random, generally without their knowledge or express permission. While Facebook says people agree to such tests when they sign up for the site, users aren’t given any extra notice when they’re included in a study.

One problem is that obtaining consent may complicate experimental results.

“Facebook could throw up a bubble asking people to opt-in to each test, but it would totally mess up the results, because people would be selecting themselves into the test,” Yarkoni said. (Offline social-science and medical researchers face a similar problem.)

Over the last few years, Facebook has expanded what it calls its Data Science team to conduct a larger number of public studies. The company says the team’s mission is to alter our understanding of human psychology and communication by studying the world’s largest meeting place. So far, it has produced several worthy insights.

In 2012, the Data Science team published a study that analyzed more than 250 million users; the results shot down the theory of “the filter bubble,” the long-held fear that online networks show us news that reinforces our beliefs, locking us into our own echo chambers. Like the new study on people’s emotions, that experiment also removed certain posts from people’s feeds.
