Facebook Won’t Stop Experimenting on You. It’s Just Too Lucrative
Recently, the Facebook fee hoax started circulating on, yes, Facebook, and you didn’t have to be an investigative journalist to debunk the thing. You just had to look at the company’s revenue numbers. Facebook’s 1.3 billion users are so valuable as advertising targets that the company would never risk cutting any of them off with a paywall.
But, as it turns out, Facebook is willing to risk alienating its users in other ways. It also sees tremendous value in using its social network to experiment on those 1.3 billion souls—so much value that it’s still worth losing a few here and there.
If anything in recent memory comes close to validating oft-repeated conspiracy theories about the motives of Facebook, it was the company’s now infamous “emotional contagion” study published over the summer. In the study, Facebook researchers tweaked the News Feeds of nearly 700,000 users—without their knowledge—to see if more positive or negative updates from friends induced the same emotions in the users themselves. The outcry was swift and loud, and now, several months later, Facebook says it’s being more careful in how it conducts its research. But there’s no sign that it’s stopping.
In a blog post Thursday, Facebook Chief Technology Officer Mike Schroepfer acknowledged missteps in the emotional contagion study. “We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” he wrote. “It is clear now that there are things we should have done differently.”

Schroepfer said Facebook should have considered other ways to do the study, and that the research should have been vetted more carefully by more and higher-ranking people. Over the past three months, Facebook has put clearer research guidelines into place, along with a more thorough review process and more training, Schroepfer said.
But nowhere did he say that Facebook plans to stop experimenting on users. On the contrary, by setting up a system to undertake research more carefully, Facebook is giving itself cover to conduct more such research. All of which should come as a surprise to exactly no one.
Not Evil, Just Business
That’s not because Facebook is somehow evil, but because Facebook is a business—albeit a business that is perpetually misunderstood. The idea that Facebook isn’t a content-neutral communication medium like the phone or email seems to generate constant surprise and outrage. To be fair to the outraged, Facebook doesn’t go out of its way to remind users that the News Feed is gamed, and it specifically does not reveal how it is gamed.

So we’ll spell it out: Facebook has every reason to manipulate the News Feed to optimize for whatever user engagement metrics correspond to the best returns for advertisers, which in turn correspond to the best returns for Facebook. And it has every reason to use other experiments in an effort to improve other parts of its operation. This is the way so many online companies work.
“Facebook does research in a variety of fields, from systems infrastructure to user experience to artificial intelligence to social science,” Schroepfer said. “We do this work to understand what we should build and how we should build it, with the goal of improving the products and services we make available each day.”
These efforts are particularly valuable to Facebook because the reach of its service is so large. It has nearly as many test subjects as China has people—a competitive advantage it’s not about to sacrifice just because its manipulations make some users uncomfortable. Most conspicuously absent from Schroepfer’s post is any suggestion that users can opt into or out of experiments like the emotional contagion study. The lack of transparency and consent is exactly what outraged users in the first place. But it’s easy to see why Facebook wouldn’t consider traditional informed consent an option.
Facebook’s user base gives it access to one of the largest, most revealing random samples of human behavior ever assembled. Offering users the option not to participate would undermine the quality of Facebook’s results by compromising their randomness. The reactions and behaviors of a self-selecting group that knows it’s being watched pale in value compared to 1.3 billion people un-self-consciously going about the drama of their daily lives.
Little Incentive to Change
Monitoring, manipulating, and packaging users for advertisers are among the practices that are purportedly driving 50,000 would-be users per hour to jump on the wait list for Ello, the new ad-free social network. But even if that number were in the millions, Facebook would have little incentive to do things differently.

A few weeks after the emotional contagion scandal erupted in late June, Facebook reported record revenues and profits for its most recent quarter, and expectations are high that this quarter Facebook will once again top itself. One user behavior Facebook would no doubt have little trouble measuring is whether news of its maligned research project correlated with an uptick in defections from the service or a drop in logins. If it had seen such a correlation, Facebook might be expected to do something more drastic to curb such projects in the future.
But however more careful Facebook promises to be, its experiments aren’t going away. “We believe in research, because it helps us build a better Facebook,” Schroepfer wrote. And judging by Facebook’s bottom line, that research seems to be working. Did you hear the one about Facebook charging $2.99 per month for access?