Redefining Societal Progress for Engineers

Abstract

Dr. Erin Cech’s critique of the National Academy of Engineering’s Grand Challenges focuses heavily on the pitfalls of technological determinism. This paper supports Cech’s argument with current examples of those pitfalls, drawn from the surveillance of the Black Lives Matter movement, the non-technical remedies of the Me Too movement, medication accessibility, and facial recognition technology. Though the consequences of an overzealous engineering mentality are commonplace in our society, there are also recent cases of tech companies declining to develop a technology because of its possible negative consequences.


The Grand Challenges highlight fourteen of the most important issues that engineers are working to address for the sake of humanity. However, Erin Cech, then a sociology professor at Rice University, critiques the National Academy of Engineering’s approach to the challenges, including its perspective on societal progress. Cech claims that the Grand Challenges adopt a technologically deterministic view: they treat society as a timeline of inventions, with progress defined by the birth of new technologies. She is explicit that this perspective is problematic because it ignores non-technological solutions and the negative consequences of technology [1]. Cech’s position on technological determinism is justified, and it challenges the premise running throughout the Grand Challenges that technology is the driving force behind societal progress. Modern times have shaped a more sympathetic and socially aware audience that agrees with Cech’s emphasis on social justice issues, making her points even more relevant. Although her arguments are valid, Cech’s views leave room for expansion and for specific examples, both of which this paper provides.

Cech states that her critique of technological determinism (or technocracy) is the most “potent” of her four critiques. This self-proclaimed potency is questionable because there are different interpretations of what makes an argument potent. The simplest definition is persuasive: the argument effectively convinces its audience of its points. Cech succeeds at presenting a multitude of strong points, but their effectiveness depends on the viewpoints of the audience. When Cech published her paper in 2012, its claims were more provocative because they brought fresh ideas into view. Since then, however, there has been a societal shift around social justice. Social media has made social justice movements far more prominent, and the late 2010s were defined by movements such as Black Lives Matter and Me Too. The prevalence of social justice in daily life has made Cech’s argument feel ordinary. When an audience is already closely aligned with an argument, the argument’s potency decreases, because little persuasion is required for it to accomplish its task [1]. This leaves the audience room to look for the specific details that Cech’s argument lacks. When her 2012 audience was less agreeable, the sheer number of points Cech made contributed to the potency of her argument. Time, however, has diminished that potency.

“Technologies have no inherent agency or meaning” is a strong point by Cech, though it requires some clarification. Revolutionary technology loses its value if those who control it do not have the right intentions. Technology is usually created with a purpose in mind and aims to solve a problem, but its usefulness can only go so far if it is mishandled after its invention. The methods of distribution are often just as important as the technology itself, and technological determinism does not acknowledge that. A relevant example is the current situation with insulin in the medical industry. Insulin is a life-saving drug for diabetic patients whose bodies cannot produce enough of it on their own. It was a revolutionary discovery that let diabetic patients live normal, healthy lives after diagnosis [2]. Recently in the United States, however, large pharmaceutical companies have sharply increased what they charge for insulin. Insulin costs only a few dollars to manufacture, but in Maine it is sold for $865 for a 40-day supply [3]. Families often have to take on extra jobs or ration their insulin in order to survive. The Grand Challenges introduction briefly mentions the desire for accessibility: “So in pursuing the century’s great challenges, engineers must frame their work with the ultimate goal of universal accessibility in mind” [4]. However, this mindset is not reinforced anywhere else in the Grand Challenges, which ignore making technology easy and practical to obtain rather than merely possible. Most pharmacies stock insulin, for example, but its inflated cost puts it out of reach for many individuals. It is treated as a luxury rather than a necessity. So there is an important question to consider: can we deem a technology “societal progress” if those in power gatekeep the product from its base consumers?
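To make the scale of that markup concrete, a quick back-of-the-envelope calculation with the reported figures is enough (the $5 production cost is an assumption standing in for “a few dollars”):

```python
# Back-of-the-envelope insulin cost calculation using the figures
# reported in [3]. The $5 production cost is an illustrative
# assumption standing in for "a few dollars to manufacture."
retail_price = 865        # dollars for a 40-day supply (Maine, per [3])
supply_days = 40
production_cost = 5       # assumed dollars to manufacture that supply

annual_cost = retail_price * 365 / supply_days
markup = retail_price / production_cost

print(f"Annual cost to the patient: ${annual_cost:,.0f}")        # ~ $7,893
print(f"Retail price is roughly {markup:.0f}x production cost")  # ~ 173x
```

At nearly $8,000 a year, a drug that is cheap to make becomes a recurring financial crisis for the families who depend on it.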

The Grand Challenges push the idea of technological determinism, a mindset that deems technology the solution to all the world’s problems. Cech briefly argues that failing to consider low-tech or technology-free solutions is problematic, but this idea needs support from examples. If engineers worked with politicians and non-engineers on social justice issues, it would protect millions of Americans trying to stand up for a cause. Consider two of the biggest movements of recent years: Black Lives Matter and Me Too. Black Lives Matter, founded in 2013, surged in 2020 after George Floyd, a Black man, was killed at the hands of police. Millions across the United States stood together to fight racism and police brutality against the Black community. When peaceful protests were being organized, the organizers would promote the events and share content on social media advocating for BLM. Through what is called “big data surveillance,” authorities had access to information ranging from what activists were saying on social media to where they lived. This information was used in Cookeville, Tennessee, where BLM protest organizers were tracked down and questioned by the FBI about their involvement [5]. Authorities have even arrested individuals over social media posts criticizing the police. All of this was possible because the laws surrounding data privacy and usage in the United States are flimsy. A straightforward remedy would be stronger privacy laws, which could be crafted through collaboration between engineers and politicians. Engineers could bring their knowledge of big data to the table, including how data surveillance works and how it can be exploited, as the sketch below illustrates. Politicians could then use that knowledge of the weak points to draft new laws that protect users. When technology is part of the problem, engineers may need to take a step back to make room for other avenues of societal progress.
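The sketch below is a hypothetical illustration, not any agency’s actual tooling; every name, coordinate, and field in it is invented. It shows how trivially public, geotagged posts can be aggregated to single out likely protest organizers:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical public posts scraped from a social platform. In a real
# "big data surveillance" pipeline these would number in the millions.
posts = [
    {"user": "organizer_a", "lat": 36.163, "lon": -85.505, "text": "March at noon #BLM"},
    {"user": "organizer_a", "lat": 36.162, "lon": -85.504, "text": "Bring water #BLM"},
    {"user": "bystander_b", "lat": 36.170, "lon": -85.520, "text": "Traffic is heavy today"},
]

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

PROTEST_SITE = (36.1628, -85.5016)  # invented coordinates for the example

# Count posts per user that mention the movement near the protest site.
# Frequent posters surface as presumed "organizers" -- no warrant needed,
# because every input here was publicly shared.
counts = {}
for p in posts:
    near = km_between(p["lat"], p["lon"], *PROTEST_SITE) < 1.0
    if near and "#BLM" in p["text"]:
        counts[p["user"]] = counts.get(p["user"], 0) + 1

print(max(counts, key=counts.get))  # -> "organizer_a"
```

The point of the sketch is that nothing in it is sophisticated; the privacy failure lies in how freely such data can be collected and joined, which is exactly the gap that stronger privacy laws would close.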

Cech’s call to open the engineering mindset to non-technical solutions is becoming more common in conversations among engineers. The Me Too movement began in 2017, with individuals sharing their stories of sexual harassment and abuse. It was largely spearheaded by women whose sexual assault cases either went unreported out of fear or were ignored by authorities. A lawsuit was filed against Activision Blizzard over unequal treatment and sexual harassment that went unaddressed. Although the company’s leadership denied the accusations, employees still demanded change. They organized a global walkout with four demands and received an unexpected result: the resignation of Blizzard president J. Allen Brack, who conceded that his response had been insensitive and hoped that new leadership would encourage change [6]. Employees at Activision Blizzard acted on a non-technological solution to a sexual harassment problem and were successful. In the future, engineers should consider all types of solutions to a problem, as some may be more productive than technological ones.

The strongest point Cech makes is that “[technological determinism] closes down any room for questions about whether these endeavors should be undertaken in the first place” [1]. Analyzing the consequences of a technology can prevent more severe problems from emerging down the line. To support Cech’s argument, it is worth examining technological determinism in social media and its consequences. Social media grew significantly in the 2010s and became a preferred method of communication because it makes maintaining relationships easy. Most consumers have only a surface-level idea of how social media apps operate, but in “The Social Dilemma,” engineers and executives describe how much data collection social media involves. Tristan Harris, a former Google employee, explains that “[at] a lot of tech companies, there are three main goals. There’s the engagement goal: to drive up your usage, to keep you scrolling. There’s the growth goal: to keep you coming back and inviting as many friends and getting them to invite more friends. And then there’s the advertising goal: to make sure that is all happening, we’re making as much money as possible from advertising” [7]. Figuring out the “ideal strategy” for these goals meant using the public as guinea pigs and slowly introducing addiction. Developers’ focus on profit and on perfecting social media to achieve these results has left users suffering from mental health effects, addiction, cyberbullying, and more. Today, many of those developers acknowledge their mistakes, but several of the negative effects are irreversible. This underscores why evaluations must take place before a technology is released to the public.
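To see the mechanism, consider a toy feed ranker, invented here purely to illustrate Harris’s three goals and not modeled on any company’s actual system. Its objective function rewards engagement and nothing else:

```python
# Toy feed ranker illustrating an engagement-only objective.
# All weights and predicted values are invented for the example; note
# that the scoring function contains no term for user wellbeing.

posts = [
    {"id": 1, "pred_watch_secs": 45, "pred_shares": 0.10},  # calm, informative
    {"id": 2, "pred_watch_secs": 90, "pred_shares": 0.40},  # outrage bait
    {"id": 3, "pred_watch_secs": 20, "pred_shares": 0.02},
]

def engagement_score(post):
    # Engagement goal: keep you scrolling (predicted watch time).
    # Growth goal: get you sharing, pulling in more users.
    # The advertising goal follows automatically: more time, more ad slots.
    return post["pred_watch_secs"] + 100 * post["pred_shares"]

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])  # -> [2, 1, 3]
```

Whatever content maximizes those predicted numbers rises to the top; if outrage holds attention longer than accuracy does, outrage wins.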

Facial recognition technology suffered a fate similar to social media’s. The very creation of facial recognition simultaneously violated human rights and set a precedent of vague consent, rendering the violation difficult to fix. The technology is now commonplace; most iPhone users rely on it to unlock their phones. Facial recognition was made possible by compiling the faces of millions of people into data sets and designing algorithms around them. However, much of that data was collected without the consent of the people featured, an abuse of human rights. In 2019, backlash led to many of those data sets being taken down. There was some improvement, but the damage had been done: vague consent had already become a trend. A 2018 study offers an example: “The study, published in 2018, had trained algorithms to distinguish faces of Uyghur people, a predominantly Muslim minority ethnic group in China, from those of Korean and Tibetan ethnicity” [8]. The researchers reported obtaining “consent” from over 300 students, but there were suspicions that the students were not well informed about the research. Though the technology behind the research advances the field of artificial intelligence, Uyghurs face heavy racism in China and are being targeted and thrown into internment camps. If this research were deployed in China, Uyghur populations would become easier to target, and the researchers would be partly responsible for a human rights crisis. Researchers demanded that the publisher, Wiley, retract the paper for its unethical applications, to which Wiley replied, “This article is about a specific technology and not an application of that technology.” After further backlash, Wiley opened an investigation, but it was eventually closed with no real progress. Not enough thought was put into whether this technology should be developed, and its negative consequences were brushed off, exactly the outcomes Cech’s argument warns about. It is extremely important for engineers to consider the applications of their technology and determine whether those applications are ethical. As Cech asserts, companies must not brush off negative consequences as “collateral damage” in the name of progress.
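Under the hood, identification with such a data set reduces to a nearest-neighbor search over face “embeddings.” The minimal sketch below uses random vectors in place of a real face-encoder model, but it shows why the provenance of the gallery is the entire ethical question: once a face is in the gallery, matching against it is trivial.

```python
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 128  # a typical size for face-embedding vectors

# A "gallery": embeddings of many faces, often scraped without consent.
# Random unit vectors stand in for a real face-encoder's output.
gallery = rng.normal(size=(1000, EMBED_DIM))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

# A probe image of an unknown person, e.g. a noisy surveillance-camera
# re-capture of person #42 already present in the gallery.
probe = gallery[42] + 0.1 * rng.normal(size=EMBED_DIM)
probe /= np.linalg.norm(probe)

# Identification is one matrix multiply: cosine similarity to everyone.
similarities = gallery @ probe
print(int(np.argmax(similarities)))  # -> 42: the person is identified
```

The matching math is commodity code; it is the data, gathered with or without informed consent, that decides whether the system is a phone unlock or a tool for persecuting a minority.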

Handbooks like Citizen Engineer and researchers like Cech are slowly making an impact on the tech industry [9]. Twitter had been developing an algorithm to automatically crop images on its website, using a “saliency” model trained to predict where people look first in a photo. After testing and feedback, however, programmers quickly realized that the algorithm was biased: in some photos it cropped out people of color and kept white people in frame. Instead of trying to patch the problem or launching it anyway, Twitter scrapped the project [10]. The company concluded that human judgment is simply better at image cropping and stated that there does not always have to be an automated feature for everything. Twitter evaluated the ethical effects of its technology and decided that this problem did not have a good technological solution. That mentality is a small beacon of hope for social justice in technology.
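A minimal sketch of the saliency-cropping idea (not Twitter’s actual implementation; the saliency map here is invented) shows exactly where bias enters: whatever the model scores as less “salient” gets cut.

```python
import numpy as np

def crop_by_saliency(saliency, crop_h, crop_w):
    """Return the (row, col) of the crop window whose summed saliency
    is highest -- the basic idea behind saliency-based auto-cropping."""
    h, w = saliency.shape
    best, best_pos = -1.0, (0, 0)
    for r in range(h - crop_h + 1):
        for c in range(w - crop_w + 1):
            score = saliency[r:r + crop_h, c:c + crop_w].sum()
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Invented saliency map for a two-person photo: if the model assigns
# systematically lower scores to one face (the documented failure mode),
# the crop window drifts toward the other face.
saliency = np.zeros((6, 12))
saliency[2:4, 1:3] = 0.4   # face A: under-scored by a biased model
saliency[2:4, 9:11] = 0.9  # face B: scored higher

print(crop_by_saliency(saliency, 6, 6))  # -> (0, 5): face A is cropped out
```

Nothing in the cropping logic is biased on its own; the bias lives upstream in the saliency model, which is why Twitter concluded that the right fix was to drop the feature rather than patch the crop.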

Cech’s argument is strengthened by showing the effects of a technocratic mentality on societal progress. The Black Lives Matter and Me Too movements emphasize the need for collaboration with non-engineers on non-technological solutions and show how effective those solutions can be. The social media and facial recognition crises highlight the consequences of ignoring a technology’s applications for the sake of progress. Conversations about engineering ethics are more prevalent in society now than they were at the time of Cech’s critique. Though her argument has lost some of its potency over the years, it remains capable of compelling her audience. As these conversations continue to evolve, engineers must continue to adapt. When developing technologies, engineers need to broaden their interpretation of societal progress and keep social justice at the forefront of their minds.

By Natasha Singh, Viterbi School of Engineering, University of Southern California


About the Author

At the time of writing this paper, Natasha was a General Computer Science major and a senior transfer student at USC. Her interests include playing the ukulele, songwriting, collecting vinyl records, and playing Dungeons & Dragons with her friends. She was a part of the Society of Women Engineers, the Songwriter’s Forum, and Key Learning. She intends to pursue a Master’s degree in Computer Science after earning her undergraduate degree.

References

[1] E. Cech, “Great Problems of Grand Challenges: Problematizing Engineering’s Understandings of its Role in Society,” International Journal of Engineering, Social Justice, and Peace, vol. 1, no. 2, pp. 85–94, Nov. 2012.

[2] “The History of a Wonderful Thing We Call Insulin.” American Diabetes Association. [Online]. Available: https://www.diabetes.org/blog/history-wonderful-thing-we-call-insulin. [Accessed: Oct. 12, 2021].

[3] E. Magee, “Big pharma’s unconscionable insulin racket endangers people with diabetes,” Boston Globe, Feb. 2020. [Online]. Available: https://www.bostonglobe.com/2020/02/24/magazine/big-pharmas-unconscionable-insulin-racket-endangers-people-with-diabetes/. [Accessed: Oct. 12, 2021].

[4] “Introduction to the Grand Challenges for Engineering.” NAE Grand Challenges for Engineering. [Online]. Available: http://www.engineeringchallenges.org/challenges.aspx. [Accessed: Oct. 9, 2021].

[5] A. Funk, “How Domestic Spying Tools Undermine Racial Justice Protests,” Freedom House, June 2020. [Online]. Available: https://freedomhouse.org/article/how-domestic-spying-tools-undermine-racial-justice-protests. [Accessed: Oct. 12, 2021].

[6] S. Dean, “Blizzard president out in wake of a discrimination lawsuit and employee walkout,” Los Angeles Times, Aug. 2021. [Online]. Available: https://www.latimes.com/business/technology/story/2021-08-03/blizzard-president-departs-game-maker-labor-lawsuit. [Accessed: Oct. 13, 2021].

[7] J. Orlowski and L. Rhodes, “The Social Dilemma,” Netflix Official Site, Sep. 9, 2020. [Online]. Available: https://www.netflix.com/title/81254224. [Accessed: Oct. 11, 2021].

[8] R. Van Noorden, “The Ethical Questions that Haunt Facial Recognition Research,” Nature, vol. 587, pp. 354-358, Nov. 2020.

[9] D. Douglas, J. Boutelle, and G. Papadopoulos, “‘Citizen Engineer’ Defined,” Citizen Engineer: A Handbook for Socially Responsible Engineering, 1st ed. Upper Saddle River, NJ: Prentice Hall, 2009.

[10] R. Metz, “Twitter says its image-cropping algorithm was biased, so it’s ditching it,” CNN Business, May 2021. [Online]. Available: https://www.cnn.com/2021/05/19/tech/twitter-image-cropping-algorithm-bias/index.html. [Accessed: Oct. 13, 2021].

Links for Further Reading

https://medium.com/swlh/what-is-technological-determinism-heres-a-bite-sized-explanation-2ecd80ecfe42

https://issues.org/perspectives-the-true-grand-challenge-for-engineering-self-knowledge/   

https://www.reuters.com/technology/tackle-racism-ai-blm-co-founder-tells-tech-bosses-2021-11-03/