
Facebook: We’re still experimenting on users, but now it’s less creepy


NEW YORK (CNNMoney) — Facebook says it’s making changes after an uproar over its mood manipulation experiment.

Not among those changes? Ending experiments.

The announcement marks Facebook’s fullest public acknowledgment yet of problems with the study, conducted on some 690,000 unwitting users for one week in early 2012.

Some people in this group were shown a higher number of positive posts in their News Feeds, while others were shown more negative posts, in an attempt to gauge their emotional responses. But the study generated a backlash when it was published earlier this year, with Facebook accused of manipulating its users’ emotions without consent.

The experiment also drew the attention of regulators in Europe, who questioned whether it had broken data protection laws.

“Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” Facebook chief technology officer Mike Schroepfer said in a blog post Thursday. “It is clear now that there are things we should have done differently.”

But Schroepfer defended the importance of user experimentation generally, saying it “helps us build a better Facebook.”

“Like most companies today, our products are built based on extensive research, experimentation and testing,” he wrote. “It’s important to engage with the academic community and publish in peer-reviewed journals, to share technology inventions and because online services such as Facebook can help us understand more about how the world works.”

Schroepfer said Facebook is introducing a series of reforms to its research process in response to the study, though his post was short on specifics.

The reforms, he said, include expanded research training for new engineers, as well as new guidelines and an enhanced review process for potentially sensitive research.

“We want to do this research in a way that honors the trust you put in us by using Facebook every day,” Schroepfer wrote.

User experiments are commonplace on the Web, where companies routinely conduct tests to improve their services.

The most common kind of user experiment is called an “A/B” test. That’s when a company provides a different Web experience for a small subset of customers. If you’re part of an A/B test, your screen may look different than your neighbor’s even when you’re both on the same website.

By their very nature, A/B tests are manipulative. If you’re part of a test, you might click on something you otherwise would have ignored, buy something you wouldn’t otherwise have purchased or feel something you wouldn’t otherwise have felt.
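The mechanics of an A/B test are simple. Here is a minimal sketch, in Python, of the deterministic bucketing commonly used to split users between variants; the function and experiment names are illustrative assumptions, not Facebook’s actual system:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into a test variant.

    Hashing the user ID together with the experiment name gives a
    stable, roughly uniform split without storing per-user state:
    the same user always lands in the same variant for a given test.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: which version of a hypothetical feature does this user see?
print(assign_variant("user-42", "news-feed-ranking"))
```

Because assignment depends only on the user and experiment identifiers, repeat visits show the same variant, and each experiment reshuffles users independently.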

But the fact that some users in the Facebook experiment were deliberately made to feel less happy, without their explicit consent, raises ethical questions for many observers.

Schroepfer said Facebook wanted to conduct the experiment in response to studies published the previous year suggesting that people felt worse after seeing positive posts from their friends.

Facebook’s own research concluded that users who saw positive posts were likely to respond positively themselves.

“We thought it was important to look into this, to see if this assertion was valid and to see if there was anything we should change about Facebook,” Schroepfer said.
