Karen Hamilton wasn’t surprised to learn that Facebook manipulated the news feeds of 689,003 users to see whether positive or negative posts would create “emotional contagion.”

The 52-year-old Portland woman said she often feels manipulated by the social network’s data mining and sharing. And sometimes Facebook’s algorithms get it wrong.

When she recently posted a few items criticizing Monsanto, advertisements for the controversial agrochemical giant started showing up in her news feed.

“Facebook is manipulating what users see all the time, and they don’t always get it right,” Hamilton said. “Facebook owns Facebook. If you post something, it’s theirs. We have to remember that. Facebook is also public. If I want something to remain private, I don’t post it in the public domain.”

Few Mainers may be shocked to learn that, for a week in January 2012, Facebook conducted research to determine whether emotional states can be transferred via social media, according to a research paper published in the June 17 issue of Proceedings of the National Academy of Sciences.

Randomly selected Facebook users had posts containing at least one positive or one negative word filtered from their news feeds, and their subsequent status updates were analyzed to see if they showed signs of emotional transference.

“If most people are saying, ‘I don’t care anymore,’ that’s the story,” said Rich Brooks, a social media consultant who is president of flyte new media in Portland.

“We’ve already given (Facebook) so much information and they’ve already stepped over the line so many times,” Brooks said. “At some point, we’re complicit.”

Concern about the 2012 experiment has focused on how Facebook researchers gathered and analyzed the data; the extent of involvement by collaborators at Cornell University and the University of California, San Francisco, remains unclear.

“If they just watch my activity, that’s one thing,” Brooks said. “But if you manipulate my experience to see how I respond, that’s very different.”

Federal law requires institutions that accept federal funding to meet certain research standards, said Ross Hickey, assistant provost for research integrity at the University of Southern Maine. Most universities have institutional review boards to ensure that all research projects meet those standards.

In particular, federal law requires researchers to obtain informed consent from human subjects to protect their rights, Hickey said. If researchers can justify withholding prior consent because it would taint the results, they must inform subjects immediately afterward.

“You have to tell them after the fact why you deceived them,” Hickey said. “It doesn’t appear that (the Facebook study) did that.”

Some news reports indicate that Facebook may have avoided getting prior consent by claiming to use pre-existing data. However, Facebook lead researcher Adam D. I. Kramer posted a status update Sunday describing how subjects’ news feeds were actively manipulated: posts by “friends” containing certain emotional words were withheld from some news feed loads.

“Nobody’s posts were ‘hidden,’ they just didn’t show up on some loads of (the news feed),” Kramer wrote. “Those posts were always visible on friends’ timelines, and could have shown up on subsequent news feed loads.”

Hickey, who helps to review more than 300 research applications annually at USM, said it appears that the Facebook study called for actively manipulating what research subjects saw.

“The question is, is that really pre-existing data?” Hickey asked rhetorically.

While the study’s methodology may call the research into question, the results were far from surprising.

“In one way, the results are like, ‘Duh!’ ” said Brooks, the social media consultant.

Researchers concluded that “when positive expressions in posts were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

Facebook declined to answer specific questions about its research practices, including whether it’s currently manipulating users’ information for research purposes.

In a prepared statement, Facebook explained that it does “research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Facebook’s data-use policy informs users that their content may be accessed for “internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Isidora Mayer, a 19-year-old psychology major at Southern Maine Community College, said she would be surprised if Facebook wasn’t manipulating users’ news feeds for research or promotional purposes.

“(The media) do that on a daily basis,” Mayer said Monday while having lunch in Portland’s Monument Square. “If it’s Facebook or Fox News, does it really matter? You have to keep your eyes open. Nothing about Facebook is private. It’s called social media for a reason.”
