Facebook management is desperately trying to tamp down a scandal after allowing users’ moods to be controlled by an app called Moody. The whole incident has made ..., the app’s creator, the target of ire among Facebook users who resent having their strings pulled like puppets. People on Facebook (1.3 billion and counting) are outraged that the social network would allow its own users to be so blatantly manipulated. The kerfuffle began earlier this year when ... dreamed up a side project to investigate how users are affected by the content they see and share.

The hypothesis was that being exposed to “happy” people on Facebook, with all their group photos and countless likes and friends, may cause users to sink into depression as they contrast that cheer with their own sadder reality. ... remembers the genesis of the now-controversial project. “I thought to myself, how many times have I checked my Facebook newsfeed only to find that my life seems more pathetic than everyone else’s in my social network? What if I could somehow filter out all that happiness so that I only saw people worse off than myself? How great would that feel? I began to suspect that there might actually be a way for Facebook to make me happy.” ... presented the novel idea to Facebook, which decided to fund an app designed to manipulate users’ moods by screening the content they were exposed to.

Working with Facebook developers, ... used the Linguistic Inquiry and Word Count (LIWC) software to filter user feeds in order to induce particular emotions. ... had few qualms about the ethical ramifications of such tinkering. “Facebook users are already being manipulated in ways they can’t even imagine—why not mess with their emotional state as well?” Although ... claims he designed the app to empower users to take control of their own emotional well-being, the final outcome was quite the opposite. ...’s mood-reinforcement technology proved far stronger than even he had imagined. “It’s hard not to get sentimental thinking about it, but damn that app was good. It could wreck an interior life better than any actual high-impact event.” The software induced mood states so powerful that they enslaved many users’ behavior to sentiments Facebook could fabricate at will. In secret beta tests, users could be made irrationally exuberant, clinically depressed, neurotically anxious, or violently angry. Those whose filter was set to sad were conditioned by a stream of heart-breaking content to wallow in ever more grief, pain and self-pity. Those whose filter was set to aggressive flew completely off the rails and began threatening those around them. ... himself was shocked by just how effective his app proved to be. “I had no idea it would get bootstrapped to their minds and snowball into such extremes,” said ....
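For the technically curious: LIWC works by counting what fraction of a text’s words fall into hand-built category dictionaries (sadness, anger, positive emotion, and so on). A minimal Python sketch of the kind of mood-based feed filtering described above might look like the following. Everything here is illustrative: the tiny lexicon, the Post class, and the threshold are invented stand-ins, since LIWC’s real dictionaries are proprietary and nothing is publicly known about Moody’s actual internals.

    from dataclasses import dataclass

    # Toy emotion lexicon standing in for LIWC's proprietary category
    # dictionaries. These word lists are invented for illustration.
    LEXICON = {
        "sad": {"lonely", "miserable", "grief", "crying", "loss"},
        "happy": {"celebrate", "wonderful", "joy", "laughing", "love"},
        "angry": {"furious", "hate", "outraged", "rage", "fight"},
    }

    @dataclass
    class Post:
        author: str
        text: str

    def mood_score(text: str, mood: str) -> float:
        # LIWC-style score: fraction of words landing in the mood category.
        words = text.lower().split()
        if not words:
            return 0.0
        return sum(1 for w in words if w in LEXICON[mood]) / len(words)

    def filter_feed(feed: list[Post], target_mood: str,
                    threshold: float = 0.05) -> list[Post]:
        # Keep only posts likely to reinforce the target mood.
        return [p for p in feed if mood_score(p.text, target_mood) >= threshold]

    feed = [
        Post("alice", "celebrate with us on this wonderful day of joy"),
        Post("bob", "feeling lonely and miserable after another loss"),
    ]
    print([p.author for p in filter_feed(feed, "sad")])  # prints ['bob']

A filter like this is crude, but the feedback loop it creates is the point of the story: a feed that only ever shows mood-matching content keeps reinforcing that mood.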

When news of the mood-hacking became public, a mass of lawsuits from both naturally and artificially distraught users flooded Facebook’s offices. Privacy advocates and the general public agreed: Facebook had overstepped its boundaries. “We have come to expect companies like Facebook to mine our data, but nobody expects them to deliberately data-warp our minds,” one leading advocate said. Facebook has since removed the app and rewritten its privacy rules to specifically ban such practices in the future. Looking back on the scandal, ... feels profoundly disappointed. “Every time I try to do something that should make everyone happy, it traps everyone in unhealthy emotional states beyond their control,” he said, adding that “it’s Facebook’s fault.”