An applied reflection paper will be due at the end of each unit. In these short essays (1,200–1,500 words), your task will be to apply key concepts from that unit’s texts and lectures to a subject of your choosing. There will be a review and question-and-answer session on the final class meeting of each unit, and your papers will be due the following Sunday. After you submit the paper, you will receive feedback from your peers (and give feedback to others); you will then have the opportunity to revise the paper before it is graded by the instructor and/or TA. The feedback mechanism encourages students to support each other.
For ARP1, please select a topic and use concepts from at least two academic texts that we have read in this unit to analyze that topic. Topics may range from algorithmic systems (social media platforms, risk assessment tools, facial recognition, financial algorithms, etc.) to non-algorithmic ways of sorting and evaluating people and things (Yelp!, IQ tests, SAT tests, debates around identity-sorting mechanisms such as race and gender, and so on). The academic texts from this unit include the following:
How Algorithmic Systems Spoil User Experience
Algorithmic systems are among the most important concepts discussed in class. After their introduction and careful analysis, I have developed a grasp of the concept that goes beyond the familiar use of determining each individual's search engine results. An algorithmic system is made up of more than one algorithm working together in software to collect and analyze data and draw conclusions, all as part of a process designed to solve a pre-defined problem. In other words, such systems are responsible for the "personalized" user experience that nearly all social media platforms have now adopted. The word "algorithm" is most often associated with Facebook, as are the various scandals over misused user data and spying allegations.
Social media has transformed dramatically since the days of MySpace; today, nearly every inch of most platforms is filled with advertisements and "personalized" suggestions meant to improve the user experience. Ironically, those suggestions are often invalid or inaccurate, especially after a change of device or location. Search results on Google, like suggestions on any social media platform, vary from one individual to another. Several factors drive this variation, discussed below, but its purpose is to create the "best results" for each individual user (Seaver, 2013). Two people in the same office may run the same Google search and get different results. That creates confusion, and it would be better for individuals to be exposed to more information rather than information restricted by their location.
Reasons for Variation Across Social Media Platforms
Physical Location: Results and suggestions are tailored to the device's physical location. This can be prevented by turning location services off, but most of the time the device's location is on. Traveling from one country to another therefore changes one's social media feed. Speaking from experience, this is not pleasant, because it is hard to get accustomed to the "personalization" that algorithms impose in the new location. For instance, if I regularly shop at a particular store, it becomes hard to find that same store online after I move to another location or country. It would be better if the same results were available regardless of location.
Search History: Personalization is also shaped by search history. This becomes a problem when using a different device, or a device shared by several people. Based on past searches, social media platforms establish "your favorites" and suggest them constantly. This is troubling because communication professionals now work mostly online, and it is dangerous to get a false picture of whatever one searches. For instance, if you search for your own website to check its ranking, it will show a top position based on your search history, whereas the ranking others see is determined by factors such as keywords and traffic (Gillespie, 2016).
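The two factors above can be illustrated with a toy re-ranking function. This is a hypothetical sketch, not any platform's actual code: results score higher when they match a user's stored location or search history, so two users issuing the identical query receive different orderings.

```python
# Hypothetical sketch: how location and search history can make the
# same query return different rankings for different users.

def personalized_rank(results, user_location, search_history):
    """Re-rank results, boosting items that match the user's
    location or past search topics."""
    def score(item):
        s = 0
        if item["location"] == user_location:
            s += 2  # a location match weighs heavily
        if item["topic"] in search_history:
            s += 1  # past searches bias future results
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"name": "Store A", "location": "Nairobi", "topic": "shoes"},
    {"name": "Store B", "location": "Boston", "topic": "shoes"},
]

# Same query, two different users, two different "top" results.
user1 = personalized_rank(results, "Nairobi", {"shoes"})
user2 = personalized_rank(results, "Boston", {"books"})
print(user1[0]["name"])  # Store A ranks first for the Nairobi user
print(user2[0]["name"])  # Store B ranks first for the Boston user
```

The point of the sketch is that neither ordering is "the" result; each user sees a ranking contingent on data the platform has stored about them.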
Algorithmic systems have become integral to social media platforms, and for that reason they are hard to get rid of. They have their perks, but not for the user. Algorithms benefit the companies at the expense of the user, and that is why they spoil the user experience.
Real-World Examples of How Algorithmic Systems Have Failed
Our heavy reliance on technology has led us to trust algorithms to be objective, but they cannot be. Algorithms are human creations, and machine learning alone cannot overcome the biases their creators and data carry into them. This is clear from the many people who have suffered because algorithms were trusted to do the job.
Google Search Racial Bias: Latanya Sweeney, a Harvard professor, hopped on an office device and searched her own name; the Google ads she received were related to criminal activity. When colleagues with "white-sounding" names searched on the same device, the ad "Have you ever been arrested?" did not come up. Sweeney concluded that names associated with Black people produced crime-related ads (BBC, 2013). This happened because of how Google's ad algorithms are set to target their audience: placing ads on Google requires keywords that trigger the ads, which may explain the racial profiling. Additionally, a user's search history and habits play an important role in which ads Google AdSense suggests.
Google Play Recommender System: As mentioned before, organizations deploy algorithms for profit; they are not designed to make the user's life better. Google Play's recommender fell into the same pit when it suggested that users of Grindr should also download a sex-offender location-tracking app. Grindr is a social networking tool used by gay men, so Google's algorithm was effectively insinuating that gay men are sex offenders. Just as Black people have been culturally associated with criminal activity, here homosexuality was stereotyped as connected to sex offenses. Such events leave a sour taste because they are deeply insensitive.
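A hypothetical sketch can show how this kind of failure arises. A naive "users who installed X also installed Y" recommender treats raw co-occurrence counts as relatedness, with no understanding of what the apps mean or whether the pairing is appropriate (the app names and logs below are invented for illustration):

```python
# Hypothetical sketch: a naive co-installation recommender.
# It recommends whatever apps most often co-occur with the target app,
# treating bare co-occurrence counts as "relatedness" -- no semantics,
# no review of whether the resulting pairing is offensive.

from collections import Counter

def co_install_recommendations(install_logs, target_app, top_n=1):
    """Count how often other apps appear alongside target_app
    and return the most frequent co-installs."""
    counts = Counter()
    for apps in install_logs:
        if target_app in apps:
            for app in apps:
                if app != target_app:
                    counts[app] += 1
    return [app for app, _ in counts.most_common(top_n)]

# Invented example logs: whichever app happens to co-occur most with
# the target gets recommended, however inappropriate the pairing.
logs = [
    {"AppX", "MapsApp"},
    {"AppX", "TrackerApp"},
    {"AppX", "TrackerApp"},
]
print(co_install_recommendations(logs, "AppX"))  # ['TrackerApp']
```

Because the system optimizes only for statistical co-occurrence, any spurious or prejudiced pattern in the underlying behavior is faithfully reproduced as a "recommendation."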
Algorithms in Law Enforcement and Other Corporations: There have been many scenarios in which people have suffered because of the biases built into algorithmic systems. Voters have been removed from registration rolls, responsible parents have been flagged as deadbeats, and other people have been profiled as terrorists. Many of these incidents stem from the new roles assigned to algorithms disguised as automated systems. Governments and corporations looking to cut costs immediately decided that algorithms were the way to go; yet time and again these automated systems have made users' lives harder instead.
Algorithms have become a menace across the internet, from social media platforms to Google and the many other organizations that have embraced their power. An individual's bias is understandable and will not affect many people; an algorithm that serves millions of people while carrying that same bias is catastrophic. Algorithmic systems have become a perfect example of the downsides of technology, which is supposed to improve our lives. They are also an illustration of a capitalist society run by greedy corporations that will stop at nothing to make profits. When people talk about "Facebook's algorithm," it sounds like a separate entity that even Facebook is unaware of. The truth is that it is a tool placed in our social media platforms deliberately, with full knowledge of the biases it carries.
Conclusion
At this point, it is clear how algorithmic systems spoil the user experience. The problem is the current heavy reliance on automated systems, and that should not be the case. Organizations that fully trust algorithmic systems should temper that trust with logic and human judgment.