
Isabelle Helmich
3/31/14
CAS138T, Veena Raman
Persuasive Essay

Personalize the Newsfeed

The Facebook Newsfeed is home to images, articles, status updates, and events, presenting what looks like a balanced stream of stories posted and shared by friends, groups, and pages. What most Facebook users don't know (or don't grasp the consequences of) is that of the roughly 1,500 stories posted by the average user's friends per day, only about 300 ever appear on the Newsfeed (Szoldra). The rest are filtered out by algorithms that decide which stories to prioritize.

The stories that do appear vary from user to user and are determined by three factors: affinity score, edge weight, and time decay (EdgeRank). The affinity score of a story reflects the poster's relevance to, or relationship with, the account owner. Edge weight is determined by how many likes, comments, and shares a post receives; in addition, photos and videos are typically weighted more heavily than text and links. Time decay captures how a post's relevance fades as it ages. Since the implementation of EdgeRank, Facebook's Newsfeed algorithm, organic posts from companies (posts that are not paid advertisements or promoted by Facebook), unpopular posts (in terms of likes, comments, and shares), and posts the user would ideologically disagree with have all become less visible. In effect, Facebook decides which posts to show each user based on these criteria, because 1,500 posts per day is simply too many to display.
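Facebook has never published the exact formula, but the publicly described factors suggest a simple multiplicative model. The following is a minimal sketch under that assumption; the field names, weights, and decay constant are illustrative, not Facebook's actual code:

```python
import math
import time

# Illustrative sketch of an EdgeRank-style ranking. The multiplicative form,
# the field names, and the decay constant are assumptions for this essay;
# Facebook has never published its actual formula.

def edgerank_score(story, now=None):
    """Score one story: affinity x edge weight x time decay."""
    now = now if now is not None else time.time()
    affinity = story["affinity"]      # strength of the poster-viewer relationship
    weight = story["weight"]          # engagement plus media type (photos > text)
    age_hours = (now - story["posted_at"]) / 3600.0
    time_decay = math.exp(-0.1 * age_hours)  # relevance fades as the post ages
    return affinity * weight * time_decay

def build_newsfeed(candidate_stories, limit=300):
    """Rank the ~1,500 candidate stories and surface only the top ~300."""
    ranked = sorted(candidate_stories, key=edgerank_score, reverse=True)
    return ranked[:limit]
```

Every candidate story receives a score of this kind, and only the few hundred highest-scoring ones survive.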

Initially, this appears to be a positive change for Facebook users. After all, Facebook is a social network where people log on to communicate with friends, family, and those they identify with, and most use it for basic communication and light-hearted exchanges. In fact, half of the American population never discusses political affairs with others online (Smith). On the other hand, heavy filtering of the Newsfeed is potentially dangerous when people use social networks to discuss politics and other important issues, because each account owner is shown mostly posts expressing ideologies similar to their own. As Internet use has grown, online political discussion has grown with it. A study of the civic behaviors of Internet users found that in 2008, 11% of social networking site users said they used these sites to post political news for others to read; by 2012, 28% said they posted links to political stories or articles and 33% said they reposted other types of political content (Smith). With a third of American users reposting political material, concerns are growing about algorithmic filtering and the way this kind of limited exposure affects informed discourse.

The main problem with hiding stories from the Newsfeed is that users are not exposed to alternative points of view. Eli Pariser, chief executive of Upworthy.com and president of MoveOn.org, describes the result of this kind of online filtering as a Filter Bubble. When websites and companies observe the Internet-related behaviors of users, they use that information to their advantage, whether to market a product or, in Facebook's case, to optimize the social networking experience. Essentially, Facebook hides posts that it believes, or calculates, are irrelevant to each user (EdgeRank). This effectively requires users to click "like" on something in order to signal that they value it and want similar posts to keep appearing; yet people may appreciate posts on the Newsfeed that they disagree with, even if they never interact with them. After the algorithm has been applied, people are left with a multitude of posts that affirm their already established beliefs (Pariser).

A second study, by the legal scholar Cass Sunstein, showed that when people are surrounded by, and discuss issues with, like-minded people, each participant leaves the discussion holding a more extreme version of their original position. This mechanism is called group polarization, and it applies to the online sphere as well. Given that political posting online more than doubled between 2008 and 2012, one can assume the Internet will become an even more popular medium for discussing political issues. In political conversation, exposure to a range of opinions can help correct errors, encourage the consideration of alternative viewpoints, and expand the public discourse to include a wider range of participants in the public sphere (Himelboim). If Facebook continues to filter the Newsfeed with the same algorithm, users will see fewer conflicting ideas, potentially triggering the group polarization mechanism in a country where polarization is already considered a political problem.

To address this issue, Facebook should first inform users, explicitly and transparently, that the Newsfeed is altered in this way. Heavy filtering is one problem; it is a worse one when account owners are unaware of the tailoring and blindly assume that their Newsfeed is an accurate representation of their friends' daily activities and ideas. Informing users of the filtering algorithm would be entirely feasible: when Facebook makes notable updates, a note sometimes appears at the top of the home screen for a period of time. Once the user has seen and understood the update, they can close it and continue socializing with a more accurate, discerning perspective. Alongside this notification should be a list of the factors (such as the number of likes, comments, and shares) that influence the algorithm. With this information, users can more easily connect their own activity to what shows up on the Newsfeed, and can exert more authority over their calculated preferences.
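To make the proposal concrete: the same factors the algorithm already computes could be echoed back to the user alongside each story. The following is a hypothetical sketch; the wording, field names, and breakdown format are invented for this essay, not a feature Facebook offers:

```python
def explain_ranking(story, score):
    """Hypothetical per-story transparency note; the wording and the factor
    breakdown are invented here, not anything Facebook provides."""
    return (
        f"Shown because: affinity {story['affinity']:.2f} "
        f"(your relationship with the poster), "
        f"edge weight {story['weight']:.2f} "
        f"(likes, comments, shares, and media type), "
        f"final score {score:.2f}."
    )
```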

One instance where Facebook did inform the public of a change in the Newsfeed algorithm was its announcement that "high quality articles you or others read may show up a bit more prominently in your News Feed, and meme photos may show up a bit less prominently" (D'Onfro). This change was made because people click more frequently on articles and links of higher quality, from which Facebook assumed that account owners no longer wanted to see Internet memes in the Newsfeed. Again, though, a lack of activity on a photo or post does not necessarily discount its relevance (Szoldra). The blog post announcing the change, written by Varun Kacholia, an engineering manager, and Minwen Ji, a software engineer, was shared a whopping ten times (Kacholia). Publishing this information on a separate blog rather than directly to the Facebook community seems ineffectual. Perhaps it was posted separately because explicitly notifying account owners of the algorithm's nature would raise concerns about equal access to information and ideas -- as it should. Surely more than ten people read the post, considering that other articles have been written about the change; still, the vast majority of Facebook users remain uninformed when the Newsfeed algorithm is updated.

The blog post also noted that more people are getting news from Facebook than ever: according to its authors, traffic from Facebook to media sites increased by 170% in the previous year, justifying their extraction of cat photos from the Newsfeed. This is great news for those who seek to be informed through social media and who prioritize high-quality stories posted by friends. The problem is that this news is typically catered to each user's pre-existing beliefs and interests, so exposure to new, unexpected, or ideologically controversial stories is further depleted, potentially polarizing the user's beliefs and limiting exposure to alternative information.

As a counterpoint, there is a real-time feed that gives a chronological update of all current activity at any given time; however, it sits in a small box in the upper right-hand corner of the home page, and it is overwhelming in sheer quantity of information, which itself validates the need for some sort of filtering.

Facebook can address the complications of obtaining news through the site by giving users the authority to choose what gets filtered and what does not. For example, Facebook prioritizes photos over article links, based on a series of assumptions about which types of stories account owners enjoy; ideally, users could select which type or types of story they want prioritized. As mentioned before, organic posts from company or special-interest pages (such as those of small businesses or science pages) are appearing less and less, to the dismay of many (Szoldra), so the options made available to the account owner should include the ability to select specific pages and companies whose posts are desired. Another option could be to disregard the popularity of a given post in the Newsfeed calculation: often the most commented-on, shared, and liked posts appear, and for good reason, but other, equally interesting posts are deemed unimportant, creating something of a barrier to entry in the Facebook marketplace of ideas. Lastly, account owners should have the power to choose which data sets are used in the Newsfeed calculation; users may want to tell Facebook not to consider their browsing or shopping history, or their previous likes and shares. This would increase the appearance of stories that a user would typically disagree with or be uninformed about, resulting in a less biased and less limited Newsfeed. Facebook is fully capable of providing a checklist of elements that users can select or deselect to tailor the Newsfeed.
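A checklist like this could plug directly into the ranking step. The following is a minimal sketch under that assumption; the option names, story fields, and adjustment factors are all invented for illustration, not real Facebook settings:

```python
from dataclasses import dataclass, field

@dataclass
class FeedPreferences:
    """Hypothetical per-user checklist; every option name here is invented."""
    prioritize_photos: bool = True       # let media type boost edge weight
    use_popularity: bool = True          # count likes, comments, and shares
    use_browsing_history: bool = False   # consumed upstream, where affinity is estimated
    always_show_pages: set = field(default_factory=set)  # pages the user opts into

def apply_preferences(story, base_score, prefs):
    """Adjust a story's rank according to the user's own selections."""
    if story.get("page") in prefs.always_show_pages:
        return float("inf")              # opted-in pages are never filtered out
    score = base_score
    if not prefs.use_popularity:
        score /= max(story.get("engagement", 1), 1)  # strip the popularity boost
    if not prefs.prioritize_photos and story.get("is_photo"):
        score *= 0.8                     # undo the automatic media bump (illustrative)
    return score
```

The point is not the particular numbers but the locus of control: the same calculation happens either way, only now the user sets the terms.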

Facebook already does essentially the same thing with its privacy options. Privacy preferences are so specific that account owners can control everything from who sees tagged photos to a custom list of friends who may view only a limited profile (the contents of which the user can also control). Account owners can already alter the Newsfeed to a very small extent, in that they can choose to hide stories they wish to see less of; one must remember, though, that this option applies only to the stories Facebook has already revealed, while roughly 1,200 other stories remain involuntarily hidden. Eli Pariser, who conceived the Filter Bubble, has compiled a list of things one can do to shrink one's own filter bubble. One of his suggestions is to contact Facebook and indicate that this is an important issue: he argues that developers avoid making changes they are perfectly capable of making for the sole reason that users show no concern (Pariser). If users were made aware of the Newsfeed filter, they would be more likely to express interest and to request the opportunity to adjust it to their true preferences. Facebook wants to personalize and optimize the social networking experience, and providing this list of options would do exactly that, in a more productive way, by letting the user choose what is hidden.

Granted, there is value in Facebook's filtering algorithm. Most users are unable and unwilling to view 1,500 posts per day, most of which have little to no personal relevance. Some method of reducing the size of the Newsfeed is necessary, but it should not allow statistical assumptions to dictate, and seriously limit, exposure to alternative ideas. By informing users of the algorithm and then allowing them to alter it themselves, account owners are made aware of their limitations and can choose to be exposed to more alternative messages. The value of personalizing the Newsfeed increases when the person is the one doing the personalizing.

The algorithm makes assumptions based on the activity of specific account owners as well as of Facebook users as a whole, and neither observational perspective yields an accurate set of preferences. This matters because, as noted above, political discussions are occurring more and more frequently on social media, and social media users risk polarization. While the Internet has great potential for cross-ideological exposure, it also allows users to tune out individuals and information sources with whom they disagree (Himelboim). It is therefore crucial for people to obtain information and discuss pressing issues on various platforms and from a multitude of sources, online and off. Facebook is not a news site, and people who want to be informed from myriad perspectives already know this. But Facebook is capable of making the suggested changes, broadening each user's exposure to a variety of posts on the Newsfeed and further optimizing the personalized user experience without obscuring its methods for doing so.

Works Cited

D'Onfro, Jillian. "Facebook Wants To Banish Low-Quality Photos - Like LOLCats - From Your News Feed." Business Insider. Business Insider, Inc., 2 Dec. 2013. Web. 30 Mar. 2014. <http://www.businessinsider.com/facebook-news-feed-update-banishes-memes-2013-12>.

"EDGERANK Responsible Search & Internet Marketing Solutions." EdgeRank. N.p., n.d. Web. 30 Mar. 2014. <http://www.edgerank.net/>.

Himelboim, Itai, Marc Smith, and Ben Shneiderman. "Tweeting Apart: Applying Network Analysis to Detect Selective Exposure Clusters in Twitter." Communication Methods and Measures 7.3-4 (2013): 195-223. Print.

Kacholia, Varun. "News Feed FYI: Helping You Find More News to Talk About." Facebook Newsroom. N.p., 2 Dec. 2013. Web. 30 Mar. 2014. <http://newsroom.fb.com/news/2013/12/news-feed-fyi-helping-you-find-more-news-to-talk-about/>.

Pariser, Eli. "10 Ways to Pop Your Filter Bubble." The Filter Bubble. N.p., n.d. Web. 30 Mar. 2014. <http://www.thefilterbubble.com/10-things-you-can-do>.

Smith, Aaron. "Civic Engagement in the Digital Age." Pew Research. N.p., 25 Apr. 2013. Web. 30 Mar. 2014. <http://www.pewinternet.org/2013/04/25/civic-engagement-in-the-digital-age/>.

Sunstein, Cass R. "Group Dynamics." Cardozo Studies in Law and Literature 12.1 (2000): 129-39. Print.

Szoldra, Paul. "The Major Problem with Facebook's Newsfeed." Slate Magazine. N.p., n.d. Web. 30 Mar. 2014. <http://www.slate.com/blogs/business_insider/2014/01/20/veritasium_why_facebook_s_news_feed_changes_are_bad_for_users.html>.
