
Twitter experiment with favourited tweets reveals dark side of social networking

In my best Carrie Bradshaw voiceover, I have to ask: are we all just social network test-subjects?

A lot of criticism has been levelled at Twitter over its new, stealthy timeline experiment.


In an effort to engage newbies, Twitter has been surfacing users’ ‘favourite’ tweets in the timelines of their followers. The offending tweets appear as retweets, further adding to the confusion.

Let’s get this straight: we choose to follow people on Twitter because we have an interest in them. We enjoy reading their tweets, along with any retweets they make. We don’t expect tweets from accounts we have never followed to start appearing in our timeline.

I don’t really see the benefit of favouriting tweets; why favourite when a retweet proves more effective? But I appreciate there are occasions when all you want to do is stick a pin in something (or acknowledge a tweet). For these reasons alone, Twitter’s favourite function is tailored more towards private use rather than consumption by the masses.

Of course, by their very nature social networks need to adapt and grow. Changes are inevitable, but many users have been left puzzled as to why Twitter didn’t provide a heads-up out of courtesy. In fact, the last time Twitter spoke about its experiments was back in 2013: https://blog.twitter.com/2013/experiments-twitter

At the time of writing there is no discernible way to turn this ‘feature’ off. It also spans all Twitter-supported platforms, meaning you’ll encounter it whether you’re on mobile, tablet, or desktop.

Facebook’s psychology experiments

“I’m not a lab rat!” Just a couple of months back, the Internet was awash with similar cries from Facebook users after it was revealed that Zuckerberg and company had secretly carried out experiments on a sample of nearly 700,000 users.

The experiments involved manipulating users’ news feeds to control the emotional expressions they were subjected to.

It’s possible to view this as part of a wider ethical debate – is Facebook really allowed to play with our emotions and purposefully make us feel sad?


In its defence, Facebook did speak up about the experiments. Adam Kramer, a data scientist at Facebook, revealed:

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.”

He went on to explain the rationale behind the study: “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.”

“At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”

Social networks are fragmenting over time

You could argue that Facebook, Twitter, LinkedIn (and other networks of their ilk) are now so far removed from their origins that they’re almost unrecognisable.

When Facebook changed the algorithm that determined how often posts by pages were shown in news feeds, many page owners felt they were being unfairly penalised.

The change saw Facebook prioritising content it deems more relevant to users, but it also means page owners now need to pay should they want to reach the audience they once reached.

Sometimes less is more. This is especially true when it comes to the unending torrent of crap that spews forth daily in our news feeds. Important updates from your nearest and dearest are now punctuated by promoted or sponsored messages.

In some ways this barrage of promoted posts/tweets/statuses is like the pop-up adverts of old. The difference being that this scourge proves resilient to your ad-blocking software…

Acting as Community Editor for Procurious, I should point out that we take the utmost care when it comes to our users’ privacy. If you want to be part of an ethical online network that values its members, and won’t needlessly force experimental changes on you, then you should really stick with us.