The following is a work of satire. I’m leading with this disclaimer, because many of these examples of Facebook’s attempts at mind control sound a little too believable.
Phase 2 of Facebook’s Emotional Manipulation Study
This week, Phase 1 of Facebook’s emotional manipulation experiments came to light. Having altered its data use policy to include “research,” Facebook performed a study to test its influence on users’ psychology.
Positioning positive posts in the first test group’s feeds, the social network manipulated users into making merry messages of their own. Steeping some in sullen cynicism, they found these users were prone to mope and moan. Inspirational influencers led to delighted updaters, while pensive peers led to cocky contributors.
In his article “Digital Market Manipulation,” Ryan Calo argues that companies “will increasingly be able to trigger irrationality or vulnerability in consumers.”
As the copywriter in the film Roger Dodger says, “You can’t sell a product without first making people feel bad… you convince them that your product is the only thing that can fill the void.”
There’s speculation Facebook implemented these studies to appease its shareholders. These suspicions would make sense, had evidence of Facebook’s second study not surfaced. It turns out these early experiments were the tip of the iceberg.
Phase 2 Experiments:
The Relationship Status Randomizer
Toying with eagle-eyed ex-lovers and potential stalkers, Facebook implemented the Relationship Status Randomizer, listing married users as single, making their private phone numbers public, then posting “Feeling lonely” as their status on the hour, every hour.
The Bogus Baby Broadcaster
Since baby announcements get the most engagement, Facebook posted pregnancy news on behalf of couples who weren’t expecting, pulling random ultrasounds from Google image search. The Bogus Baby Broadcaster asked family friends to vote on children’s names. The most popular choices were: Link McFly Skywalker, for boys, and Buffy Ripley Croft, for girls.
Open House Mode
Taking advantage of their Oculus Rift acquisition, Facebook started mapping real spaces for Virtual Reality. Rift owners have reported early access to a feature called Open House Mode. Stitching architecture together from users’ pictures, Open House Mode allowed beta testers to go on virtual tours of their friends’ homes. Rendering intimate living spaces, complete with exteriors from Google Street View, Open House Mode points out structural vulnerabilities like flimsy locks and windows that can be pushed open. When pressed for comment, Facebook’s lawyers said this feature was for users who wanted to throw surprise parties for one another.
The Celebrity Death Generator
Attempting to stir up grief, Facebook filled users’ feeds with links that falsely reported celebrity deaths. A candlelight vigil for actor Steve Buscemi caused a twenty-block traffic jam in downtown Atlantic City. The showrunners of Boardwalk Empire had already hired Digital Domain to create a CGI stand-in by the time the real Buscemi appeared on set, hungover but still breathing.
Promoting posts containing the words “hand soap,” “linen towels,” and “quilted tissue,” Facebook found an uptick in geotags from “home thrones.” Once users were in their bathrooms, Facebook blasted them with footage of kayakers going over waterfalls, three-story fountains, and animated GIFs of lemonade flowing from bottles. This drew criticism from the American Society of Plumbing Engineers, which feared the effect a mass flushing incident would have on the nation’s sewer systems.
Manufacturing outrage, Facebook posted updates as ESPN, tricking users into believing the Washington Redskins were changing their name to the Washington Yellowskins, replacing their Native American logo with that of a crude, cartoonish samurai. Soon after, the hashtag #YesAllShoguns started trending.
A petition to ban penicillin emerged after Facebook made an article linking the antibiotic to childhood obesity trend. Medical authorities flooded the net to refute the claim, taking over the conversation in a matter of hours, but not soon enough to prevent media personality Jenny McCarthy from endorsing the original findings. In the aftermath of the incident, Orange County reported an outbreak of typhoid fever.
The Title Lengthening System
Some users awoke to find the phrase “You Won’t Believe What Happens Next” tacked onto every link in their newsfeed; others saw “… is the worst kind of discrimination.” Some reported seeing each link wrapped in the phrase “What… did is genius.” Everyone exposed to the Title Lengthening System reported feeling disturbed by the trend, as if they were the only ones noticing it.
The Phantom Zuckerberg
Businesses, sports teams, and families reported finding phantom images of Mark Zuckerberg, Facebook’s Chief Executive Officer, in their photos. In each image, Zuckerberg appears to be interacting with people: bringing his hands in for a team-building seminar, hitting a beer bong at a kegger, even wrapping his arms around someone else’s grandmother. Those who noticed the phantom CEO said he appeared immediately after they uploaded their pictures, as if he’d been there all along. One group experimented with the feature, pointing at a campfire in mock horror; when they posted the photo, they found Zuckerberg emerging from the flames.
Facebook’s Milgram Experiment
Members of the psychoanalytic community were horrified when the social network conducted its own interpretation of the infamous Milgram Experiment.
Testing blind obedience, the Milgram Experiment urged subjects to commit actions at the expense of their conscience. Subjects took on the role of a teacher administering electric shocks to a learner, an actor who was in no real danger. Every time the learner failed to answer a question, a man in a lab coat would instruct the teacher to hit them with a shock. Ignoring the actor’s cries, this authority figure would tell the teacher to up the voltage. The goal was to see how many of the subjects would protest, halting the experiment before the lethal jolt was given.
Facebook introduced a virtual version of this experiment. Believing they were administering electric shocks to prison inmates, users became executioners by way of an application. The app gave users a video stream of both a researcher, commanding them to move forward, and a prisoner writhing in agony.
Stanley Milgram found that 65 percent of his participants administered the lethal dose. Facebook, on the other hand, had a 100 percent success rate. In fact, the only user to report distress was a man in Texas who claimed to be “bummed out” when the app disappeared from the service.
As social networks become more prevalent in our virtual lives, their effects will be felt in the real world. If the cost of connecting means surrendering control of our bowels, most of us will pay it. If the price of admission is submitting to a full-body scan, most of us will jump right in. We’ll accept that if Facebook wants us to be happy, we’ll be happy, and if we’re sad, it’s because Facebook willed us to be. The social network works in mysterious ways.
We’re just guinea pigs, hitting ‘Like’ to get more food pellets, wandering through this maze of messages, looking for meaning. The all-seeing eye of Zuckerberg watches us share pictures of our plates on first dates, engage in political debates, and, when we think our cameras are off, he watches us masturbate.
Ours is not to question his reasoning, but to trust in his plan. We must open our minds and accept his influence.
Check out my April Fools’ post Facebook Buys DrewChial.com and my article on how The Facebook Bait and Switch is already affecting authors.
11 thoughts on “Phase 2 of Facebook’s Emotional Manipulation Study”
Excellent, excellent. Very funny, and not THAT far away from what is happening! When someone read out to me what really IS going on, I said, ‘but surely people will notice, and not respond?’, and he replied ‘you’re talking about the 10%, which includes me and thee. Alas, 90% of people are sheep’. Hmmmm….
I feel the same way. Facebook wasn’t just toying with people’s emotions to see if they could influence them. They were working on a new, twisted way to raise their bottom line. Even though these are supposed to be jokes, I wouldn’t be too surprised if one of them turned into a reality, either.
Thanks so much for reading and commenting.
😀 Brilliant! And as usual, scary and unnerving. I’d like to think I’m wise and immune to it all but I sense the manipulation social media uses is more subtle than that. Now please stop giving them ideas Drew!
I’m enabling them, aren’t I? Although I do think the Phantom Zuckerberg feature might be kind of cool, in a creepy sort of way.
I’m glad you dug this.
Excellent. I also wrote an article on Facebook’s psychology experiment at my blog The Kaleidoscope. It takes a different look into the matter rather than taking the side of a data scientist or a law advisor. Please do have a look at the blog and the article at: http://wp.me/p4Czjd-31
One of the big reasons people are up in arms is that Facebook performed their experiment before altering their terms of service to include the word “Research,” so no one they toyed with had actually given their consent.
The reason I’m upset has less to do with how they published their data than with why they collected it in the first place. My hunch is they’re planning on using these techniques to make users more susceptible to advertisements or posts from their partners. It also speaks to a cumulative betrayal of users’ trust: making private photos public and allowing advertisers to use them without paying, or even informing, the original user.
Obviously, the rest of my post is satire, but each little entry is informed by a practice I’ve seen, just taken to the next logical extreme.
Yeah, you are very accurate in your observations. The concluding lines of my article resonate with your anxiety: after all, FB is a business, and it will continue to push boundaries to the extreme for its own benefit.
You’re right. It was a good article, btw.
I wish Facebook would increase their bottom line by offering enhanced features, subscriptions, or premium services, rather than taking these extreme measures.
This is brilliant, of course. A few years ago, a friend who is a professor of media suggested to his students an experiment by which they would change their FB relationship status (e.g., from “married” to “divorced”) abruptly, without explanation, and gauge the reaction. Ironically, FB users seem to do this anyway, to a predictable barrage of responses (“What’s wrong?” “OMG, what happened?” etc.). We’ve let Facebook do this to us by assigning too much emotional weight to it in the first place.
Agreed. I have this conversation with my face-to-face friends every day. It’s not that our Facebook friends are hyper-emotional, jaded, or superficial; that’s just the content they’re sharing. I know people in real life who are nothing like their online identities. Some portray themselves as more guarded, others as more vulnerable, than they really are. We need to interpret these interactions differently.
Glad you liked the piece. Thanks for commenting.