Ethics of UX Design in Social Media

Abstract

UX design is an increasingly important field in the digital world, as designers dictate the way users interact with digital products, shaping their experiences, emotions, and satisfaction. This article explores the ethical implications of social media UX design through consequentialist and rights-based approaches. It also identifies existing patterns of unethical design on social media. In order to maintain integrity and public trust, it is important to establish a universal code of ethics in UX design.

Introduction 

Social media has introduced a new paradigm in the way people communicate with each other. In some ways, phones have become extensions of their users, opening new frontiers in HCI (human-computer interaction). Given HCI's significant influence over human life, UX (user experience) designers hold a unique position of power.

Engineers are responsible for building the functionality behind social media applications. UX designers, on the other hand, decide how users will interact with that technology. Hence, UX designers directly shape a user’s intentions, choices, and actions, and they make ethical decisions as they create these platforms.

What constitutes ethical design? As HCI is a relatively new field, there is no universal standard in the UX design industry to dictate a code of ethics. Generally, ethical design creates experiences that are not only engaging and effective but also respectful, inclusive, and beneficial to users [1]. Standard principles for ethical design include the right to data privacy, inclusion through accessibility, avoidance of manipulative design patterns, and prioritization of user well-being [1]. 

Still, these principles are too broad to evaluate the actual ethicality of UX design. On social media platforms, nearly every design choice matters. From interaction design to the notification system, each detail affects users in some way. Thus, decisions cannot be made on generic principles alone — they must also take into account the broader impact on users. Viewed through the lenses of consequentialism and rights-based ethics, the necessity of a universal code of ethics in UX becomes apparent.

Ethical Design Violations in Social Media

A dark, or deceptive, pattern is one “that prompts users to take an action that benefits the company employing the pattern by deceiving, misdirecting, shaming, or obstructing the user’s ability to make another (less profitable) choice” [2]. Dark patterns are quite pervasive. A University of Zurich study published at CHI 2020 found that 95% of a sample of 240 Google Play Store applications contained dark patterns, with an average of seven dark patterns per application [3].

Dark patterns are often intentionally designed and violate users’ fundamental rights. Misdirection, forced continuity, and hidden costs are common examples [4]. Misdirection occurs when a design purposefully focuses a user’s attention on one element to distract from a more concerning one. For instance, a flawed privacy agreement might place heavy visual emphasis on the “Accept” button, making it far more tempting to click than to read the crucial yet dense information about how the user’s data will be used. Forced continuity, a practice in which users are automatically charged after a free trial period without clear consent or notice, is common across many digital platforms (for instance, subscription services like LinkedIn Premium). Hidden costs appear at the last step of the checkout process, where the user discovers unexpected charges just before payment. Each of these patterns uses illusion and misdirection to exploit the user for the benefit of the business.
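
To make misdirection concrete, consider the minimal sketch below of a hypothetical consent dialog (not taken from any real platform). The manipulation lives entirely in the visual hierarchy: the profitable choice is prominent, the alternative is styled to be overlooked, and the disclosure is collapsed by default.

```typescript
// A hypothetical misdirection-style consent dialog, sketched as a rendering
// function. The copy is neutral; the dark pattern is in the presentation.
function renderConsentDialog(): string {
  return `
    <dialog open>
      <h2>We value your privacy</h2>
      <!-- Dark pattern: the full data-usage disclosure is collapsed by default. -->
      <details>
        <summary>How we use your data</summary>
        <p>Your activity may be shared with our advertising partners.</p>
      </details>
      <!-- Dark pattern: heavy visual emphasis steers the user to one choice. -->
      <button style="background:#1a73e8;color:#fff;font-size:18px;padding:12px 48px;">
        Accept
      </button>
      <!-- Dark pattern: the less profitable option is styled to be overlooked. -->
      <a href="#manage" style="color:#999;font-size:11px;">Manage options</a>
    </dialog>`;
}

document.body.innerHTML = renderConsentDialog();
```

An ethical version would give both choices equal visual weight and surface the data-usage disclosure by default.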

According to the Nielsen Norman Group, a widely respected leader in the UX industry, dark patterns are effective because of two specific design phenomena: A/B testing and copycat designs. A/B testing is an evaluation method that compares two versions of a product to decide which performs better [5]. Because it optimizes heavily for conversions, it can inadvertently select for dark patterns. Copycat designs are also common in the design industry due to a principle called Jakob’s Law, which states that since users spend 80% of their time on other web experiences, they prefer websites that work the same way as the sites they already know. Copycat designs thus emerge to imitate “successful” products. This has helped standardize digital experiences, but it has also allowed dark patterns to proliferate across the web [2].
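
The conversion-driven mechanics of A/B testing are simple to sketch. The minimal TypeScript example below (with hypothetical variants and metrics) buckets users deterministically and declares the higher-converting variant the winner; note that nothing in the measurement distinguishes a variant that converts because it is clearer from one that converts because it deceives.

```typescript
// Minimal A/B test sketch with hypothetical variants and metrics. Users are
// bucketed deterministically so each user always sees the same variant, and
// the variant with the higher conversion rate "wins". The metric only sees
// conversions; it is blind to how they were obtained.
type Variant = "A" | "B";

function assignVariant(userId: string): Variant {
  // Simple deterministic hash so assignment is stable per user.
  let hash = 0;
  for (const ch of userId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return hash % 2 === 0 ? "A" : "B";
}

const stats = { A: { views: 0, conversions: 0 }, B: { views: 0, conversions: 0 } };

function recordView(userId: string): void {
  stats[assignVariant(userId)].views++;
}

function recordConversion(userId: string): void {
  stats[assignVariant(userId)].conversions++;
}

function winner(): Variant {
  const rate = (v: Variant) => stats[v].conversions / Math.max(1, stats[v].views);
  return rate("A") >= rate("B") ? "A" : "B";
}
```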

Persuasive design differs slightly from dark patterns: rather than deceiving the user, persuasive design influences a user’s attitude and behavior through its characteristics [6]. Stanford University professor B.J. Fogg describes the three factors of user behavior in the Fogg Behavior Model: motivation, ability, and a prompt [7]. By influencing these factors, persuasive design can create an environment that stimulates user behavior and points it in a particular direction, often without the user’s awareness. At the same time, persuasive design can be applied positively and even ethically, because it can drive improvement and better habits in people’s lives [4]. For example, the Duolingo language-learning app employs persuasive design features like personalized messages and gamification to motivate users to engage with its content consistently. Still, unethical uses of persuasive design are quite common. For example, language can be used to push social proof: “the idea that people copy the actions of others in an attempt to emulate behavior in certain situations” [7]. This includes phrases like “Most wanted” or “10 people have this in their cart.” Social proof language urges users to fill their carts, add more items, and ultimately spend their money, serving the business’s goals regardless of whether the claim is true [8].
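
Fogg’s model is often summarized as B = MAP: a behavior occurs when motivation, ability, and a prompt converge at the same moment. The sketch below is one hypothetical way to encode that relationship; the 0-to-1 scales and the “action line” threshold are illustrative assumptions, not part of Fogg’s published model.

```typescript
// A toy encoding of the Fogg Behavior Model (B = MAP): behavior occurs when a
// prompt arrives while motivation and ability together cross an "action line".
// The numeric scales and threshold are illustrative assumptions only.
interface BehaviorContext {
  motivation: number; // 0..1: how much the user wants the outcome
  ability: number;    // 0..1: how easy the design makes the action
  prompted: boolean;  // whether a cue (notification, badge, nudge) fired
}

function behaviorOccurs(ctx: BehaviorContext, actionLine = 0.25): boolean {
  // No prompt, no behavior, regardless of motivation or ability.
  if (!ctx.prompted) return false;
  // Motivation and ability trade off: an easy action needs little motivation.
  return ctx.motivation * ctx.ability >= actionLine;
}

// Persuasive design rarely raises motivation directly; it raises ability
// (one-tap actions, frictionless feeds) and fires more prompts.
console.log(behaviorOccurs({ motivation: 0.3, ability: 0.9, prompted: true }));  // true
console.log(behaviorOccurs({ motivation: 0.3, ability: 0.9, prompted: false })); // false
```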

“FOMO-centric language” is another type of persuasive design language. FOMO-centric behavior stems from a user’s “pervasive apprehension that others might be having rewarding experiences from which one is absent” [9]. This language takes advantage of that emotional response by framing actions as exclusive or scarce opportunities, manipulating the user into feeling that they must act quickly. The fear of missing out can drive reluctant users to engage more with a platform or purchase a product to avoid feeling excluded.
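
Both social-proof and FOMO-centric copy can be generated mechanically, detached from any underlying fact. The hypothetical sketch below illustrates the ethical problem: nothing in the code requires the displayed numbers to be true.

```typescript
// Hypothetical sketch of persuasive copy generated independently of the facts.
// Nothing here forces cartCount or stockLeft to reflect real data, which is
// precisely the ethical problem with this kind of language.
interface ProductSignals {
  cartCount?: number;     // "social proof": how many shoppers supposedly have it
  stockLeft?: number;     // FOMO: how scarce it supposedly is
  saleEndsInMin?: number; // FOMO: manufactured urgency
}

function persuasiveBadges(signals: ProductSignals): string[] {
  const badges: string[] = [];
  if (signals.cartCount !== undefined) {
    badges.push(`${signals.cartCount} people have this in their cart`);
  }
  if (signals.stockLeft !== undefined && signals.stockLeft < 5) {
    badges.push(`Only ${signals.stockLeft} left!`);
  }
  if (signals.saleEndsInMin !== undefined) {
    badges.push(`Sale ends in ${signals.saleEndsInMin} minutes`);
  }
  return badges;
}

console.log(persuasiveBadges({ cartCount: 10, stockLeft: 2, saleEndsInMin: 14 }));
```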

Dark patterns and persuasive design of this kind are ultimately unethical and manipulative tactics. They undermine user autonomy, encouraging action based on emotional coercion rather than reason; they exploit cognitive biases like urgency and social pressure; and they prioritize business goals, such as capturing more data or increasing user engagement, over user well-being and informed decision-making. This violates the designer’s ethical responsibility to create protected spaces where users can make autonomous and informed decisions.

The Rights Approach 

When determining the ethicality of design choices, it is important to consider the rights that should be guaranteed to users. Digital human rights are rooted in natural rights — rights people hold inherently by existing [10]. For instance, users have a right not to be harmed, as stated in Article 3 of the UN’s Universal Declaration of Human Rights: “everyone has the right to life, liberty and security of person” [11]. This includes the mental harm that social media design, like dark patterns and persuasive language, can cause. Other digital rights include access to information, equality, expression, and privacy protection [10].

The right to privacy protection allows users to understand and control how their data is collected and used: users must be informed of the collection and asked to consent without coercion [12]. This means clear communication, accessible privacy policies, and an interface that allows users to adjust their privacy settings easily. Additionally, the right to equality should play a crucial role in design. Ethical design must consider the diversity of a platform’s users and create experiences that accommodate their needs. This might mean making a platform accessible to screen readers, or ensuring that social media algorithms do not perpetuate bias and discrimination.
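
One way to express “consent without coercion” at the level of an interface’s data model is to treat every data use as an independent, revocable opt-in that defaults to off. The sketch below is a hypothetical illustration of that principle, not any platform’s actual API.

```typescript
// Hypothetical sketch: privacy settings modeled as independent, default-off,
// revocable opt-ins rather than a single bundled "Accept".
interface PrivacyConsent {
  readonly purpose: string; // plain-language description of the data use
  granted: boolean;         // defaults to false: no consent until given
  grantedAt?: Date;         // when consent was given, for auditability
}

function createDefaultConsents(): PrivacyConsent[] {
  // Every purpose starts un-granted; silence never counts as consent.
  return [
    { purpose: "Personalize my feed using my activity", granted: false },
    { purpose: "Share my data with advertising partners", granted: false },
    { purpose: "Use my location to suggest content", granted: false },
  ];
}

function setConsent(consents: PrivacyConsent[], purpose: string, granted: boolean): void {
  const entry = consents.find((c) => c.purpose === purpose);
  if (!entry) return;
  entry.granted = granted;
  entry.grantedAt = granted ? new Date() : undefined; // revocation clears the record
}
```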

Design patterns that infringe upon digital rights are inherently unethical. While such designs might achieve short-term business objectives, they do so at the cost of users’ fundamental protections, producing harms that outweigh any temporary benefits.

Consequentialism

The digital rights approach provides an abstract understanding of a user’s fundamental protections. Consequentialism, on the other hand, offers a practical approach that examines the real-world consequences of these platforms. At first glance, social media platforms seem to have made a substantially positive impact on society: they have contributed to unprecedented levels of global communication, the democratization of information, and community building. For instance, TikTok, a popular short-form video platform, has fundamentally transformed online interaction. Its recommendation algorithm allows users to discover content from creators and communities they have never followed, democratizing content discovery so that a user’s creativity and content quality matter more than their social connections.

However, a more thorough consequentialist examination reveals the problematic nature of certain social media designs. Facebook’s “Like” button, for instance, is controversial. On one hand, it provides immediate social validation and positive feedback to users. On the other, the “like” system quantifies social worth: Facebook’s own research showed that the feature increased user anxiety, with social comparison correlating with the number of likes received, leading the company to consider removing the signature feature altogether [13, 14]. This psychological impact extends beyond individual experiences, transforming how an entire generation of young social media users perceives self-worth. The feature has inadvertently created a feedback loop in which users’ emotional well-being grows increasingly dependent on these quantified forms of social recognition.

Presently, the attention economy is one of the most influential forces shaping UX design in social media, and one of the most harmful paradigms in consequentialist terms. The attention economy is a business model that monetizes users’ attention and engagement by maximizing the amount of time they spend on a platform, increasing ad revenue and data collection [15]. To capture that attention, design features such as infinite scroll and pull-to-refresh are now standard across most social media platforms, making content consumption frictionless. Pull-to-refresh, commonly compared to the mechanics of gambling, drives the same addictive behavior in social media: each content refresh delivers a variable reward, engaging a powerful neurocognitive reward system that functions much like a slot machine. Infinite scroll likewise undermines a user’s ability to make conscious decisions about their time and attention, and it often results in doom scrolling — “endlessly scrolling news or social media feed[s] and engaging with negative headlines” [16, 17]. Doom scrolling is detrimental to users’ mental health, causing anxiety, depression, irritability, and reduced productivity. Without design features that mitigate these harms, this platform design is unethical: it prioritizes business needs over consumer well-being.
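
Part of why infinite scroll spread so quickly is that it is mechanically trivial to build. The sketch below shows the standard IntersectionObserver approach (with a hypothetical loadMorePosts placeholder), along with a comment marking where a more ethical design could introduce a natural stopping point.

```typescript
// Standard infinite-scroll mechanic via IntersectionObserver: when a sentinel
// element near the bottom of the feed becomes visible, more content loads,
// so the feed never ends. loadMorePosts is a hypothetical placeholder.
declare function loadMorePosts(): Promise<HTMLElement[]>;

const feed = document.querySelector("#feed")!;
const sentinel = document.querySelector("#sentinel")!;
let postsShown = 0;

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  const posts = await loadMorePosts();
  posts.forEach((p) => feed.insertBefore(p, sentinel));
  postsShown += posts.length;

  // An ethical variant would add friction here: after some number of posts,
  // stop auto-loading and show a "You're all caught up" checkpoint that
  // requires a deliberate tap to continue, e.g.:
  // if (postsShown >= 50) observer.disconnect();
});

observer.observe(sentinel);
```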

Infinite scroll and pull-to-refresh continue to dominate nearly every major social media platform. These design features may appear innovative and engaging at first, but in the long run they may drive irreversible societal consequences. Consequentialism demands that designers weigh the full outcomes of their design choices, including mental health and socioeconomic impact, rather than simply implementing the features that drive the best performance or engagement.

Universal Code of Ethics

Unethical design patterns and their consequences warrant a universal code of ethics for UX designers. The industry currently lacks a standardized set of principles for ethical design, leaving individual designers to decide which principles to follow or ignore. The User Experience Professionals Association’s Code of Professional Conduct is a commonly referenced UX framework [18]. While it encompasses many principles of ethical design, it is not robust enough to serve as a universal code: it lacks specific guidelines for addressing emerging challenges in data privacy and algorithmic bias. The Association for Computing Machinery’s (ACM) Code of Ethics and Professional Conduct is another prominent code of ethics [19]. Although the ACM’s code was created for engineers and other computing professionals, its guidelines on integrity, privacy, and “avoiding harm” align closely with the general idea of ethical design. Large corporations that center their values on user experience also typically maintain company policies aligned with the foundations of ethical design; Google, for example, publishes corporate commitments to accessibility, digital well-being, and human rights [20].

Considering the current state of the UX industry, a universal code of ethics must be established to maintain public trust and professional integrity as the field continues to gain influence. The code should include three core components: digital human rights, design evaluation, and ethical design practices. Digital human rights build on classic human rights by adding protections for privacy, autonomy, and informed consent when users interact with digital systems and interfaces, preserving human agency and dignity. Design evaluation ensures that all designs are assessed by their impact and potential to cause harm, prioritizing well-being over profit. Finally, ethical design practices avoid dark patterns and manipulative interfaces, prevent addictive design mechanics, and promote accessibility and inclusivity in all design decisions.

To establish this code deeply in the industry, stakeholders such as academic institutions and professional organizations like the IEEE must come together to build upon these principles and define enforcement mechanisms. If these key players align, ethical design practices can become a foundational part of UX design from the ground up, ushering in a new era of design that protects users’ rights and well-being and encourages trustworthy design solutions in the long run.

Conclusion 

It is difficult to control the outcomes of human-social media interaction. However, that does not waive the responsibility of UX designers to make intentional, informed, and ethical design decisions. There is an ongoing conflict between business needs and ethics, in which businesses sacrifice the good of their users for their own profitability. Designers must act as mediators in this tension, ensuring that ethical considerations on behalf of user well-being are integrated into every decision. The UX industry must establish a firm universal code of ethics for the ultimate good of society.

By Anna Katherine Zhao, Viterbi School of Engineering, University of Southern California


About the Author: At the time of writing this paper, Anna Katherine Zhao was a senior at USC studying computer science. She’s an aspiring UX designer and loves exploring the intersection of art and technology. In her free time, she takes ballet lessons, reads literary fiction, and plays video games.

References 

[1] “Ethical Considerations in UX Design,” designlab.com, Feb. 08, 2024. https://designlab.com/blog/ethical-considerations-in-ux-design  

[2] M. Rosala, “Deceptive Patterns in UX: How to Recognize and Avoid Them,” Nielsen Norman Group, Dec. 01, 2023. https://www.nngroup.com/articles/deceptive-patterns/ 

[3] L. Di Geronimo, L. Braz, E. Fregnan, F. Palomba, and A. Bacchelli, “UI Dark Patterns and Where to Find Them,” Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Apr. 2020, doi: https://doi.org/10.1145/3313831.3376600

[4] C. M. Gray, Y. Kou, B. Battles, J. Hoggatt, and A. L. Toombs, “The Dark (Patterns) Side of UX Design,” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems – CHI ’18, pp. 1–14, Apr. 2018, doi: https://doi.org/10.1145/3173574.3174108

[5] A. Gallo, “A Refresher on A/B Testing,” Harvard Business Review, Jun. 28, 2017. https://hbr.org/2017/06/a-refresher-on-ab-testing 

[6] “What is Persuasive Design?,” The Interaction Design Foundation, Jun. 06, 2016. https://www.interaction-design.org/literature/topics/persuasive-design?srsltid=AfmBOorJjFVYuGxaI_T-2-Yvh52-mamObQQ6PNxXWlKvSHBdG5ulBXif 

[7] C. West, “Social Proof: How to Use Marketing Psychology to Boost Conversions,” Sprout Social, May 24, 2021. https://sproutsocial.com/insights/social-proof/ 

[8] A. K. S, “Ethical UX Design and User Trust: Why It Matters?,” Aufait UX, Apr. 23, 2024. https://www.aufaitux.com/blog/ethical-ux-design/ 

[9] I. Karagoel and D. Nathan-Roberts, “Dark Patterns: Social Media, Gaming, and E-Commerce,” Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 65, no. 1, pp. 752–756, Sep. 2021, doi: https://doi.org/10.1177/1071181321651317

[10] S. Roussos, “What about digital rights?,” Living Democracy, Nov. 08, 2022. https://www.living-democracy.com/what-about-digital-rights/ 

[11] United Nations, “Universal Declaration of Human Rights,” OHCHR, 1948. https://www.ohchr.org/en/human-rights/universal-declaration/translations/english 

[12] PwC, “Privacy UX: Designing Tomorrow’s Experience,” PwC, 2024. https://www.pwc.com/m1/en/media-centre/articles/privacy-ux-designing-tomorrows-experience.html 

[13] M. Isaac, “Facebook Wrestles With the Features It Used to Define Social Networking,” The New York Times, Oct. 25, 2021. https://www.nytimes.com/2021/10/25/technology/facebook-like-share-buttons.html 

[14] L. Scissors, M. Burke, and S. Wengrovitz, “What’s in a Like? Attitudes and behaviors around receiving Likes on Facebook,” Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing – CSCW ’16, 2016, doi: https://doi.org/10.1145/2818048.2820066

[15] C. Steinhorst, “Lost In The Scroll: The Hidden Impact Of The Attention Economy,” Forbes, Feb. 06, 2024. https://www.forbes.com/sites/curtsteinhorst/2024/02/06/lost-in-the-scroll-the-hidden-impact-of-the-attention-economy/ 

[16] UNC Health, “Doom Scrolling And Its Effect On Your Mental Health,” Caldwellmemorial.org, May 13, 2023. https://www.caldwellmemorial.org/wellness/wellbeing-with-caldwell/doom-scrolling-and-its-effect-on-your-mental-health/ 

[17] D. Yuhas, “Why Social Media Makes People Unhappy—And Simple Ways to Fix It,” Scientific American, Jun. 20, 2022. https://www.scientificamerican.com/article/why-social-media-makes-people-unhappy-and-simple-ways-to-fix-it/ 

[18] User Experience Professionals Association, “UXPA Code of Professional Conduct,” UXPA International, Mar. 25, 2013. https://uxpa.org/uxpa-code-of-professional-conduct/ 

[19] Association for Computing Machinery, “ACM Code of Ethics and Professional Conduct,” Association for Computing Machinery, 2018. https://www.acm.org/code-of-ethics 

[20] Google, “About Google,” Google. https://about.google/ 

Further Reading

Unconsciousness by Design: Addictive Technologies and the Escape from Freedom

Ethical User Interfaces: Exploring the Effects of Dark Patterns on Facebook

Analysis of User Motivation and User Interface Toward Screen Addiction

Ethical design in social media: Assessing the main performance measurements of user online behavior modification