Who’s in Control – The Algorithmic Trap
By Ivana Lopez Espinosa ’19
October 2010 was an important month for the world of social media – Kevin Systrom and Mike Krieger launched Instagram, an application initially exclusive to Apple users. The once-independent platform was bought by Facebook for $1 billion in 2012, and Systrom and Krieger announced their departure from the company in September 2018 (Isaac 2018). Naturally, the application undergoes constant updates to improve the user experience; however, in 2016 Instagram feeds changed from chronological content to posts sorted by algorithms (Constein 2018; Pollock 2018). This shift caused a commotion among users because they were no longer simply seeing content from the people they followed; instead, they were shown the “best” posts (Pollock 2018). Changing content to increase user engagement serves to further capitalist tendencies in the United States. Instagram’s algorithm was updated again earlier this year – the update is not too different from the one in 2016; however, it requires users to continually engage with their followers in order to appear “relevant” to other users. Wild4Games, a popular Twitch.tv streamer, released a video in January 2018 to inform followers of Instagram’s new algorithm, stating that users will only see ten percent of posts from the people they follow. Instagram users will be unconsciously competing against one another to rank high enough to appear in others’ feeds. These personalized feeds create filter bubbles, trapping users in endless loops of content most relevant to their past interests (Van den Bulck and Moe 2018:878-879).
To understand the effects of these algorithms, I asked for volunteers during the class I co-taught on Tuesday, October 30 to pick a PG-13 topic – such as dogs, travel, the beach, or coffee – and spend 30 minutes liking tagged posts and pages on that topic to see how much it could affect their Instagram feeds. One student reported a week later, on November 1, that his feed was still infiltrated with posts of dogs – the topic he had chosen to look for on Instagram.
Filter Bubbles
Filter bubbles created by these algorithms can be detrimental to users because their preconceived notions of freedom may lead them to believe that they are in control of the content they are receiving. To an extent, they can control what they like, comment on, share, and follow, as well as their own content; however, the Instagram algorithm dictates the posts that appear once users’ interests are fed into these mathematical functions – eventually displaying content more relevant to the user than they could have imagined. Whether the user is a humanitarian or a right-wing activist, these filter bubbles narrow content until posts from outside perspectives become distant, reinforcing a single perspective and creating a never-ending cycle of thought.
The purpose of social media algorithms is to keep users engaged with the application for as long as possible by producing relevant content to entertain the user while displaying product advertising to increase revenue for various companies (Wheeler 2017). Van den Bulck and Moe (2018:878) define media content personalization through explicit and implicit actions meant to help users stay well informed. Instagram depends on implicit personalization because it relies on “previous choices…for future personalization” (Van den Bulck and Moe 2018:878) – in contrast to explicit personalization, which requires a thoughtful choice of media content. Gourarie (2016) argues that algorithms are meant to approximate our lives in three steps: 1) the user information that goes in, 2) the “black box,” and 3) the outcome. The black box is the algorithmic process that calculates user information to produce the most efficient and applicable result. Instagram’s black box requires implicit information gathered from users’ interests, obtained through individuals’ past and present likes, comments, shares, and follows. However, it is Instagram’s Explore feed that is most affected by the algorithms.
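The three steps Gourarie (2016) describes can be made concrete with a minimal sketch. This is a hypothetical illustration, not Instagram’s actual code: the topic labels and the simple frequency-based scoring are invented here purely to show how implicit history (step 1) passes through a scoring “black box” (step 2) to produce a ranked outcome (step 3).

```python
from collections import Counter

def black_box(interaction_history, candidate_posts):
    """Rank candidate posts by how often their topic appears in the
    user's implicit history of likes, comments, and follows."""
    topic_weights = Counter(interaction_history)           # step 1: user information in
    scored = [(topic_weights[post["topic"]], post)         # step 2: opaque scoring
              for post in candidate_posts]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored]                    # step 3: the outcome

# Invented example data: a user who mostly likes dog content.
history = ["dogs", "dogs", "travel", "dogs", "coffee"]
posts = [{"topic": "travel", "id": 1},
         {"topic": "dogs", "id": 2},
         {"topic": "news", "id": 3}]
ranked = black_box(history, posts)
# The dog post rises to the top because it matches the implicit history.
```

The user never explicitly asked for dog content; the ranking emerges entirely from past behavior, which is what makes the personalization implicit in Van den Bulck and Moe’s (2018) sense.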
Instagram has a simple, user-friendly layout that is the same on Apple and Android products. The bottom navigation bar contains, from left to right: 1) the Home feed, 2) the Explore feed, 3) new posts, 4) “general statistics” (your activity and the activity of your followers), and 5) your page and settings options. Unless one adds new content to one’s Instagram page, it remains static. Users can control who comments on their posts and whether to make the account private or keep it public – if the account is private, only followers can access its information.1 Public accounts are open for all Instagram users to view, follow, and like – some account holders also leave their direct messages open to the public, which allows for universal contact, though there are ways to filter messages. The Home feed displays content from pages that one follows – occasionally advertisements appear, but this is a more recent development.
I argue that the Explore page is influenced the most by algorithms, since it is meant to show users new content. In his January video, Wild4Games (2018) elaborates on the new Instagram guidelines that impact the Home and Explore feeds. Currently, users receive content from only ten percent of the accounts they follow – a figure determined by the level of interaction between users. Users can request to follow private pages, but acceptance is not guaranteed; even if the request is accepted, the private user is not required to follow the requestor back. Granting this access allows the requestor to view the content of the page.
Prior to 2016, Instagram did not rely on algorithms to produce content; rather, the Home feed chronologically displayed posts from the people one followed (Pollock 2018). After the March 2016 updates, Instagram focused on displaying the newest posts – once the user caught up on posts from people they followed, top posts were selected by an algorithm that determined the most interesting content; however, it was not yet known how one could compete for the “best post” (Pollock 2018). Agrawal (2016) noted in Forbes that the algorithm was a result of the increased number of users, which would otherwise leave content buried for different users. Also in 2016, Instagram increased the number of business tools, which permit advertisers to see post insights – statistical information gathered from user interaction with a certain post (Agrawal 2016; Kim 2017; Pollock 2018). Businesses can use this information to better customize their content by seeing live updates.
Algorithm Updates
In-Class Activity
To understand the impact of the Instagram algorithms, I asked several students in the class to focus on one topic to “like” in their Instagram accounts. The purpose of this activity was to show my peers the pervasive nature of algorithms; the activity was conducted during my ten-minute presentation. I did not monitor the topics picked or how often posts were “liked” or commented on, so results varied and not all students shared their experiences. One student shared his topic – dogs – and commented that he liked over 50 posts within the first five minutes of the presentation; for the remaining time, he sparingly liked posts as they appeared in his feed. During the first five minutes, he remained on the thread of posts created by one image in the Explore feed. He shared that his Explore feed turned into only pictures and videos of dogs. A week later, the same individual approached me to report that his Explore feed was “messed up” thanks to my class activity. From a previous discussion with this individual, I know that he does not use his account very often, which explains why his feed remained infiltrated with dog content – had he increased his personal activity after the class, his Explore feed would be less filled with dogs. Regardless of his level of usage, we can see the pervasiveness of the algorithm. This student unknowingly trapped himself in an online world of only dogs by interacting with content whose images, tags, comments, and descriptions were about or related to dogs. If this activity had a significant impact on this student’s account immediately after ten minutes, then we can assume that there are greater ramifications. In fact, these issues extend beyond the capitalist market and into social problems. The longer one stays on the application, the more one is exposed to advertisements catered to the interests of the user.
Not only are algorithms dictating the content, but they are also pairing interests with relevant advertisements that narrow the filter bubble in which one becomes trapped.
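The feedback loop at work in the class activity – liking dog posts causes more dog posts to appear, which invites more dog likes – can be sketched as a small simulation. Everything here is hypothetical and invented for illustration (the topic names, the feed size, the proportional-fill rule); it is not Instagram’s algorithm, only a demonstration of how a modest initial preference can come to dominate a feed.

```python
from collections import Counter

def simulate_bubble(initial_likes, rounds=5, feed_size=10):
    """Each round, fill the feed in proportion to current interest weights,
    then assume the user likes everything shown, feeding the likes back in."""
    weights = Counter(initial_likes)
    for _ in range(rounds):
        total = sum(weights.values())
        feed = []
        for topic, count in weights.most_common():
            slots = round(feed_size * count / total)   # proportional allocation
            feed.extend([topic] * slots)
        weights.update(feed[:feed_size])               # likes reinforce the weights
    return weights

# Invented starting point: a mild preference for dogs over three other topics.
start = ["dogs"] * 5 + ["travel", "coffee", "news"]
final = simulate_bubble(start)
# After a few rounds, "dogs" holds a larger share of the weights than it
# began with, while the minority topics barely grow.
```

The point of the sketch is that no single step looks coercive – each round just shows the user more of what they already liked – yet the loop steadily narrows the feed, which is the trap the student experienced.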
Danger of Algorithms
Many scholars claim (Alang 2017; Gourarie 2016; Hochman and Lev 2013; Thurman and Schifferes 2012) that algorithms are dangerous to the consumption of information on Instagram. Alang (2017) and Gourarie (2016) address the racist nature of algorithms by reminding readers that algorithms require human interaction – the algorithms themselves are not racist but rather reflect users’ beliefs. Often we focus on negative social impacts, but it is important to remember that algorithms keep users in bubbles of content that they themselves choose. If individuals only like content with dogs, then they will remain in a bubble with only dogs. If one is a right-wing extremist, then the content produced will reflect those interests. The current Instagram algorithms will keep users in a thread of similar narratives. Users blindly fall into this continuous loop of content because that is what the algorithm assumes is best for the individual.
Algorithms do not go unnoticed by users – there is some acknowledgment, but it is often ignored because the algorithm does a great job of producing relevant content. Although the class activity gave my peers only a small insight into the effects of the algorithm used by Instagram, we can assume that there are large-scale effects. Algorithms are not meant to have positive or negative impacts on society; rather, they are a reflection of human thought, since they require people to implicitly enter data. If we created a filter bubble during a ten-minute presentation, then I argue that personalized feeds produce a sense of freedom that traps individuals in a narrow world. Instagram argues that it listens to users, but users are only aware of what they see, and if they do not mind what they see, then they will remain in the bubble. As long as users see only what they are interested in, they are not in control, since algorithms learn more about a person than we can imagine.