Why the freaking out about Facebook's "psychological experiment" is BS (article) - TennisForum.com

Old Jul 3rd, 2014, 09:02 PM   #1
pov
Senior Member
 
Join Date: Feb 2008
Posts: 12,444
Why the freaking out about Facebook's "psychological experiment" is BS (article)

http://www.vice.com/read/martin-robb...experiment-342
Stop Freaking Out About Facebook's 'Psychological Experiments'
By Martin Robbins


The last time I wrote about Facebook it was under an image depicting Mark Zuckerberg as a fleece-wearing Satan. Since then, the company’s image has taken a bit of a blow. The "revelation" last weekend that the business performed "psychological experiments" to manipulate the moods of its users led to an outpouring of emotion on Facebook walls up and down the land.

But the truth is there’s nothing new or even particularly interesting about the now infamous study, titled “Experimental evidence of massive-scale emotional contagion through social networks.” Researchers tweaked the algorithm that chooses which stories are displayed in people's feeds, so some users were shown more posts deemed to be "negative" by virtue of the words they contained—sad stuff, break-up stuff, war, famine... that kind of thing, I guess. The subjects—by a barely significant amount—then tended to use the same sad language in their own posts. This is the "emotional contagion" the researchers were looking for.
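To make the mechanics concrete, here's a rough sketch in Python of that kind of tweak: withhold a share of posts that a crude word list flags (flip the list around and you get the opposite condition). The word list, the posts, and the omit rate are all invented for illustration; the actual study used a proper sentiment lexicon and Facebook's real ranking system, neither of which is reproduced here.

[code]
import random

# Toy "negative" word list: a stand-in for whatever lexicon the
# researchers actually used. Purely illustrative.
NEGATIVE_WORDS = {"sad", "hate", "war", "famine", "breakup", "awful"}

def is_negative(post):
    """Crude check: a post counts as negative if it contains any listed word."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return bool(words & NEGATIVE_WORDS)

def filtered_feed(posts, omit_fraction=0.5, seed=0):
    """Return the feed with a share of negative posts withheld.

    Mimics the general shape of the manipulation described above;
    the real experiment worked inside Facebook's ranking system.
    """
    rng = random.Random(seed)
    feed = []
    for post in posts:
        if is_negative(post) and rng.random() < omit_fraction:
            continue  # withhold this negative story from the user's feed
        feed.append(post)
    return feed

posts = [
    "Great day at the beach!",
    "So sad about the breakup",
    "More war news, awful",
    "New job, very excited",
]
print(filtered_feed(posts, omit_fraction=0.9))
[/code]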

You could extrapolate this to mean that if you show people their friends’ more depressing posts, they get depressed themselves. Whether you would sign up to that theory depends on how much faith you put in sentiment analysis, which tries to mark pieces of text according to how many positive- or negative-sounding words they contain—love, hate, good, bad, etc.

It’s the sort of thing PR people do to impress bosses who aren’t very tech savvy. It also tends to be less accurate the shorter the text being analyzed is, which means it's not great for things like tweets and Facebook updates.
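For the record, the simplest version of sentiment analysis really is just word counting. The toy scorer below (tiny, made-up word lists; real tools use much larger, weighted lexicons) also shows why short texts are a problem: with only a handful of words, a single match swings the score wildly.

[code]
POSITIVE_WORDS = {"love", "good", "great", "happy", "win"}
NEGATIVE_WORDS = {"hate", "bad", "sad", "awful", "lose"}

def sentiment_score(text):
    """Score text as (positive hits minus negative hits) over word count."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return (pos - neg) / len(words)

# Tweet-sized texts: one word decides the whole score.
print(sentiment_score("ugh, monday"))               # 0.0, no hits at all
print(sentiment_score("ugh, monday. hate it"))      # -0.25, one word flips it
print(sentiment_score("love love love this show"))  # 0.6
[/code]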

The media response to this is neatly summed up by the headline of Laurie Penny’s New Statesman article: “Facebook can manipulate your mood. It can affect whether you vote. When do we start to worry?” Of course, the trite answer is that Laurie Penny can manipulate your mood and that Laurie Penny can affect whether you vote, so when do we start to worry about her, or all the other journalists out there doing the same? That’s maybe a little unfair, but it gets to the heart of the issue, which is that Facebook is simply one player in a far larger game of algorithms and data, the implications of which Penny and other non-technical pundits are only just beginning to grasp.

Most online media brands worth anything are doing something similar all the time. Huffington Post was a pioneer of automated A/B testing in the 00s, testing different versions of headlines in real time to see which would gain more traction with audiences. “But Martin,” someone who types in all caps is about to email me, “that’s nothing like Facebook’s experiment!” Except it is. Huffington Post experimented on its users, exposing them to different content in an attempt to find the text that created the strongest emotional response. The two major differences are that HuffPo used the results to boost profits in real time, and that nobody really gave a shit.
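For anyone who hasn't seen it from the inside, that kind of headline testing amounts to very little code. Here's a generic two-variant sketch (not HuffPo's actual system, whose details aren't public in this piece): serve each visitor one headline at random, count the clicks, and keep whichever earns the higher click-through rate.

[code]
import random

class HeadlineTest:
    """Minimal two-variant headline A/B test. Illustrative only."""

    def __init__(self, headline_a, headline_b):
        self.headlines = {"A": headline_a, "B": headline_b}
        self.shown = {"A": 0, "B": 0}
        self.clicks = {"A": 0, "B": 0}

    def serve(self):
        """Randomly assign an incoming visitor to variant A or B."""
        variant = random.choice(["A", "B"])
        self.shown[variant] += 1
        return variant, self.headlines[variant]

    def record_click(self, variant):
        self.clicks[variant] += 1

    def click_rates(self):
        return {v: self.clicks[v] / self.shown[v] if self.shown[v] else 0.0
                for v in self.headlines}

# Simulated traffic in which the (made-up) emotive headline draws more clicks.
test = HeadlineTest("Study finds mild effect on mood",
                    "Facebook is TOYING with your EMOTIONS")
for _ in range(10_000):
    variant, _headline = test.serve()
    click_prob = 0.02 if variant == "A" else 0.05
    if random.random() < click_prob:
        test.record_click(variant)
print(test.click_rates())
[/code]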

Should they have? The Coding Conduct blog has a fascinating post looking at the study and the ethical approval the researchers may have needed to obtain. Dave Gorski at Science Based Medicine pointed out that the policies of PNAS—the journal that published the research—require that, “Research involving Human and Animal Participants and Clinical Trials must have been approved by the author’s institutional review board.” Since Facebook (like most companies) doesn’t have an IRB, that leaves everyone in a bit of an ambiguous state.

The problem is, it just doesn’t make sense for this kind of study to have to go through ethical approval processes. A vast amount of what you see on the internet is controlled in some way by algorithms similar to those Facebook uses to dictate what appears on your wall, from suggested videos on YouTube to the headlines on Google News. Much of the rest is controlled by humans—the front page of VICE, for example. It’s hard to imagine how you’d bring Facebook’s research into an ethics regime without dragging half the internet into a farce.
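To underline how ordinary that algorithmic control is: the heart of a feed ranker or a "suggested videos" list is often just a scoring function like the sketch below. The fields and weights are entirely invented; no real site's formula is being described.

[code]
import time

def rank_feed(items, now=None):
    """Order items by a toy score: engagement, discounted by age.

    Each item is a dict with 'title', 'likes', 'comments' and 'posted_at'
    (a Unix timestamp). The weights are made up; the point is only that
    a few lines of arithmetic decide what a user sees first.
    """
    now = now or time.time()

    def score(item):
        age_hours = max((now - item["posted_at"]) / 3600.0, 0.0)
        engagement = item["likes"] + 3 * item["comments"]
        return engagement / (age_hours + 2) ** 1.5

    return sorted(items, key=score, reverse=True)

now = time.time()
feed = rank_feed([
    {"title": "Old viral post", "likes": 900, "comments": 40, "posted_at": now - 48 * 3600},
    {"title": "Fresh post", "likes": 30, "comments": 10, "posted_at": now - 3600},
], now=now)
print([item["title"] for item in feed])  # the fresh post outranks the stale hit
[/code]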

Not that Facebook really needs the scientific community, in any case. Back in February I wrote about deep learning, warning that companies like Facebook seizing a monopoly over data and expertise would sideline scientists. Here we have a case in point—Facebook can live without PNAS, but the academic community doesn’t have an alternative source of vital social network data. The power in this situation is entirely one-sided.

That’s the real story here. The idea that Facebook should have sought ethical approval to tweak the ranking of stories is a technologically illiterate fantasy; but people are absolutely right to be skeptical about the sheer brute power that Facebook and its peers wield over our data, and the means of analyzing it.

There’s another big problem with the study that, to my knowledge, nobody else has raised, and that’s repeatability. If you wanted to test the findings by doing the same research yourself, pretty much your only option would be to go back to Facebook and ask them nicely. That should tell you everything you need to know about the health of an information economy, where one company holds all the data and scientists are left begging for scraps.
__________________
“Nonviolence means avoiding not only physical violence but also internal violence of spirit. You not only refuse to shoot a man, but you refuse to hate him.” – MLK

Old Jul 3rd, 2014, 09:05 PM   #2
pov
Senior Member
 
Join Date: Feb 2008
Posts: 12,444
Re: Why the freaking out about Facebook's "psychological experiment" is BS (article)

http://www.forbes.com/sites/dandiamo...udy-proves-it/
The Outrage Over Facebook's 'Creepy' Experiment Is Out-Of-Bounds -- And This Study Proves It


Two years before Facebook began a study that critics are branding as unethical and manipulative, another researcher used the social media site to do a similar experiment. And his story pokes holes in many of the arguments that Facebook did something wrong.

In Facebook’s experiment, researchers found that manipulating the emotional content of the posts in users’ Facebook feeds had a mildly contagious effect: It caused people to use slightly more or less positive language when writing their own updates.
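Measuring a "mildly contagious" effect like that boils down to comparing word-usage rates between the treatment and control groups. The sketch below uses fabricated per-user numbers and a plain difference in means with a rough standard error; it is not the paper's data or its actual analysis.

[code]
from math import sqrt
from statistics import mean, stdev

def difference_in_means(treated, control):
    """Gap between two groups' average positive-word rates,
    plus a rough standard error for that gap."""
    diff = mean(treated) - mean(control)
    se = sqrt(stdev(treated) ** 2 / len(treated) +
              stdev(control) ** 2 / len(control))
    return diff, se

# Hypothetical per-user % of positive words in status updates (made up).
treated = [5.0, 4.8, 5.2, 4.9, 5.1, 4.7]   # shown fewer positive posts
control = [5.2, 5.0, 5.3, 5.1, 5.2, 4.9]
diff, se = difference_in_means(treated, control)
print(f"difference = {diff:+.3f} percentage points (standard error ~ {se:.3f})")
[/code]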

But since Facebook voluntarily published its findings in PNAS, the company’s been under attack, with dozens of news outlets featuring stories critical of Facebook’s “secret mood manipulation study.”

Critics say that Facebook needed to get informed consent from users, as one would in a medical experiment, before messing with their Facebook pages in ways that might affect their mood. Others argue that the research should have been reviewed by an institutional review board, a panel of experts that reviews the ethics and safety of research on human subjects at an academic institution—especially because Facebook’s data scientist partnered with university academics and published the research in a journal.

But here’s the thing. Facebook isn’t alone in running academic-minded experiments on its users. Previous studies have faced the same issues of informed consent without receiving even a modicum of public pressure.

Take Timothy Ryan, a political scientist who will be on the faculty at UNC-Chapel Hill this fall, and who authored a study called “What Makes Us Click? Demonstrating Incentives for Angry Discourse with Digital-Age Field Experiments.”

Ryan’s study was published in the Journal of Politics in October 2012. It was peer-reviewed. It was cheered by fellow academics.

“My study is four years old,” Ryan says. “It’s not a secret—I’ve talked to a number of different people about it. And I don’t think I’ve ever gotten pushback.”

And unlike Facebook’s much-debated study, Ryan’s research did receive approval from an institutional review board—a fact that he double-checked this week. “All this discussion led me to go back and make sure I got IRB approval,” says Ryan. “It was a relief.”

Ryan’s study was built around a clever question: Could changing the language and images in Facebook’s ads affect user behavior? The short answer—yes. By running his own series of ad tests that played on users’ emotions, Ryan found that liberal voters were more likely to click on ads that actively angered them.
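The statistics behind a finding like that can be as simple as comparing click-through rates between ad variants. Below is a generic two-proportion z-test with invented counts; it is not Ryan's data or his actual analysis.

[code]
from math import sqrt

def two_proportion_z(clicks_a, shown_a, clicks_b, shown_b):
    """z-statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / shown_a, clicks_b / shown_b
    pooled = (clicks_a + clicks_b) / (shown_a + shown_b)
    se = sqrt(pooled * (1 - pooled) * (1 / shown_a + 1 / shown_b))
    return (p_a - p_b) / se

# Hypothetical counts: an ad written to anger viewers vs. a neutral one.
z = two_proportion_z(clicks_a=420, shown_a=100_000,   # "angering" variant
                     clicks_b=310, shown_b=100_000)   # neutral variant
print(f"z = {z:.2f}")  # |z| > 1.96 is roughly significant at the 5% level
[/code]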

(Ryan shared some sample ads with me, and I’ve pasted them below.)

[Sample ads not reproduced.]
One important point: Ryan didn’t get people’s informed consent before actively angering them…and his IRB was OK with that.

Informed consent would have been “unnecessary and impossible,” Ryan told me. “There were something like 14 million people in my study.”

Ryan notes that other academics have used Facebook to experiment with even more fraught issues, like driving people to vote.

He also says that the study Facebook is getting flak for is well within the standards of his field.

The risk to users from Facebook was quite low – a mild change in the composition of their News Feeds – and within the realm of what they’d otherwise experience, Ryan argues. The benefit: “We better understand what can make people happy or sad.”

So why did Facebook’s study upset so many people, while Ryan’s study—which was designed to actually anger users—went overlooked?

Maybe because people feel betrayed. “It comes from people whose fears about Facebook’s manipulation of what they see on the News Feed have been confirmed,” says Alex Howard, a well-known research analyst who’s written about these issues.

“I was trying to manipulate emotions through advertisements,” Ryan says, but Facebook “was trying to do it through the News Feed. [And] I think people have different expectations on ads versus News Feeds.”