In a recent academic paper from Facebook, researchers described how their news feed algorithm presents users with content related to their ideological standpoint, and removes some “cross-cutting content” from sources they are less likely to agree with. (Cross-cutting content means stories that are more likely to have been shared by people strongly committed to a different ideology than yours.) The paper has raised concerns about the role that algorithms play in the kind of content that Facebook users are being exposed to.

One general issue with the algorithms that control the selection and display of content in our social network news feeds is that we do not actually know what else these algorithms select our stories on, or how widespread their effects are. For example, there was the experiment carried out by Facebook in 2012 and published last year, in which they manipulated the display of happy and sad stories to 150,000 users to see whether those users would in turn share happy or sad content. It may have been an isolated test, but the attitude behind carrying out such a study caused me to stop using my own Facebook personal profile.

Some argue that algorithmic ranking is much the same as an editor choosing what we see in a newspaper: most people know when they pick up a certain newspaper that they are going to see stories aligning with the ideology of that newspaper and its readers. However, an editor can also decide on any one day that it is in the interests of the newspaper’s readers to see a more diverse range of news stories around an important breaking topic.

With social networks such as Facebook, by contrast, there is often an assumption of neutrality: many people think they are being shown the same types of content that anyone else would see from their own sets of friends. That is, everyone would see a balanced set of content items overall, perhaps reordered based on “Likes” but not so much on one’s own profile characteristics (apart maybe from the ads in the sidebar, which a lot of people do realise are tailored). This was apparent after the aforementioned emotion manipulation study, when a lot of people stated that they hadn’t realised the Facebook news feed was filtered at all. In reality, most social networks (and non-social services such as search) try to personalise your content and make it more relevant, creating the so-called “Filter Bubble” as a result; in this respect Facebook is similar to many other platforms.

As regards the other findings from this study, some of the results made sense, albeit from a sample group whose selection had issues. What the researchers called “hard content” – national or world news and politics – was very polarised in how it was shared: liberals shared stories from liberal news sources, conservatives from conservative ones. What you click on determines what you will see, although the researchers in this paper seem to treat the news feed selection algorithm and user choices as semi-independent yet comparable factors: in fact, one actually drives the other.
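To make that last point concrete, here is a minimal sketch of the feedback loop in Python. This is emphatically not Facebook’s actual news feed algorithm, which is unpublished; the story data, weights, and function names are all invented for illustration. It shows a toy engagement-based ranker in which the stories a simulated user clicks on update the profile the “platform” holds, which in turn biases which stories are shown next:

```python
# A minimal, illustrative sketch of the click-to-ranking feedback loop
# discussed above. NOT Facebook's actual algorithm; every name, weight,
# and data structure here is invented for illustration only.

import random

random.seed(42)

# Each story gets an ideological leaning score in [-1, 1]
# (say -1 = strongly liberal, +1 = strongly conservative).
STORIES = [{"id": i, "leaning": random.uniform(-1, 1)} for i in range(200)]


def rank_feed(stories, inferred_profile, top_k=50):
    """Toy ranker: surface the stories closest to the user's inferred leaning."""
    return sorted(stories, key=lambda s: abs(s["leaning"] - inferred_profile))[:top_k]


def simulate_click(feed, true_leaning):
    """A user is more likely to click stories near their actual leaning."""
    weights = [max(0.01, 1 - abs(s["leaning"] - true_leaning)) for s in feed]
    return random.choices(feed, weights=weights, k=1)[0]


def mean_leaning(feed):
    return sum(s["leaning"] for s in feed) / len(feed)


true_leaning = 0.6       # the user's actual (hidden) ideology
inferred_profile = 0.0   # the platform starts from a neutral estimate

print(f"initial feed mean leaning: {mean_leaning(rank_feed(STORIES, inferred_profile)):+.2f}")

# The feedback loop: each click nudges the inferred profile toward the
# clicked story, which in turn biases the next feed shown to the user.
for _ in range(100):
    feed = rank_feed(STORIES, inferred_profile)
    clicked = simulate_click(feed, true_leaning)
    inferred_profile = 0.8 * inferred_profile + 0.2 * clicked["leaning"]

print(f"final feed mean leaning:   {mean_leaning(rank_feed(STORIES, inferred_profile)):+.2f}")
```

Even with these crude assumptions, the feed’s mean leaning drifts from neutral towards the simulated user’s own leaning over repeated interactions, and cross-cutting stories are gradually squeezed out of the top of the ranking: user choice drives the algorithm, and the algorithm then constrains the choices on offer.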