Government Backdoor in Cloud Services: Ethics of Data Security

In February 2018, users of Apple’s cloud service “iCloud” in China received a notification stating that all of their data were being moved from Apple’s servers to a state-run data center, Guizhou-Cloud Big Data (GCBD). In the updated terms of service, Apple stated that both it and GCBD reserve the right to access users’ data when requested by the Chinese government [1]. This move was Apple’s response to China’s 2017 Cybersecurity Law, which requires companies operating in China to host all data within its territory [2]. While it is natural for a company to obey local regulations, granting the government direct access to all user data could compromise users’ security. People should have privacy because a private space is the basis of human freedom and autonomy. As professionals devoted to human welfare, software engineers should defend user privacy by protecting users’ personal data. When facing unavoidable government requirements that threaten user privacy, engineers should prioritize ethics and withdraw the service from that market. If the financial damage of withdrawing would threaten the company’s survival, engineers must operate the compromised cloud service transparently: inform consumers of the security risks in the product and let them decide for themselves.

Government agencies acquiring user information from software services is not a new phenomenon. In the first half of 2017 alone, Apple received over thirty thousand requests for customer information from law enforcement agencies around the world, according to its transparency report. In these cases, information from Apple’s servers is provided on a case-by-case basis, and the process is legally regulated by search warrants or court orders [3]. In the recent GCBD case, however, Apple was not responding to the usual individual request; by handing the encryption keys of their cloud servers to the Chinese government, they essentially built a “backdoor” into the personal data vault of the entire Chinese iCloud userbase. Chinese authorities can now arbitrarily peer into every aspect of a user’s personal life by looking at photos, contacts, or messages. In addition, since Chinese agencies have direct control over the encryption keys, they don’t need to go through any legal process to enter the servers, making it far easier for them to open the backdoor into users’ private data.
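The difference between case-by-case disclosure and blanket key custody can be made concrete with a toy model. This is not real cryptography, and every name in it is hypothetical; it only illustrates the structural point that whoever holds a server’s master encryption key can read every user’s data, with no per-request legal check in between.

```python
# Toy model (NOT real cryptography) of why key custody matters:
# holding the master key means holding a backdoor to the whole userbase.
from dataclasses import dataclass, field


@dataclass
class CloudServer:
    key: str                                  # master encryption key for all stored data
    store: dict = field(default_factory=dict)

    def upload(self, user: str, plaintext: str) -> None:
        # Stand-in for real encryption under the master key.
        self.store[user] = f"enc({self.key}:{plaintext})"

    def decrypt_all(self, key: str) -> dict:
        # Anyone presenting the master key reads EVERY user's data at once,
        # with no per-user warrant or court order in the loop.
        if key != self.key:
            raise PermissionError("wrong key")
        return {u: c.split(":", 1)[1].rstrip(")") for u, c in self.store.items()}


server = CloudServer(key="master-key")
server.upload("alice", "photos")
server.upload("bob", "messages")

# Handing the master key to a third party opens the entire vault:
assert server.decrypt_all("master-key") == {"alice": "photos", "bob": "messages"}
```

By contrast, in an end-to-end encrypted design the service never holds the key at all, so it has nothing to hand over when a blanket demand arrives.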

An all-purpose backdoor in cloud services that allows arbitrary access to user data is unethical because it compromises the privacy of all users. All people have the right to pursue privacy regardless of nationality or culture, because humans need a private space where they can avoid publicity and be left alone [4]. As lawyers Samuel Warren and Louis Brandeis argued in their influential 1890 article, “The Right to Privacy,” the right to be “let alone” is the necessary expansion of the right to life, one of the oldest moral principles concerning human well-being [5]. Being left alone in a private space allows people to freely make decisions about their actions, property, or beliefs, without intervention from others. Privacy is thus a necessary condition for people’s freedom and autonomy [4]. Even in cultures where human liberty isn’t emphasized, freedom in a confined personal space and autonomy over an individual’s own matters should be inherent rights of all human beings [6]. In online environments, users of a cloud service seek a secure and private space to store virtual items that represent their personality, free from control by an outside force. These online personal zones should thus grant people privacy the way private spaces do in real life. As professionals dedicated to creating software that improves human well-being, developers should respect the freedom and autonomy that users seek in a private space. Therefore, when creating a cloud service that stores personal data, it is their responsibility to maintain user privacy and eliminate security flaws in their systems, as emphasized in the Code of Ethics of the Association for Computing Machinery (ACM) [7]. As such, it is unethical for software engineers to build an all-purpose backdoor that allows a third party to freely access private user data.

When the key to the backdoor is in the hands of the government, the resulting breach of privacy is especially damaging to users’ everyday lives. When government authorities demand user information from software services, the common rationale is state security. But once engineers hand the access keys to the authorities, it is difficult to control what the authorities do with users’ data. For example, government agents handling the personal data could leak it to media outlets, either by accident or with malicious intent. Government data leaks through personnel have already occurred, as in 2017 when a U.S. government contractor shared a top-secret NSA report with a news outlet [8]. In the worst case, private information extracted through the government backdoor could be exploited by cybercriminals, who could use it to defraud or blackmail individuals, endangering their property or personal safety.

Besides being exploited in criminal activities, data from the backdoor could be used by government agencies to flag and suppress political dissidents. A precedent was set in 2005, when Yahoo provided communication records to Chinese officials that led to the arrest of journalists [9]. Although some countries might have more repressive laws on public speech than others, in private settings all people should be free to hold their own opinions and beliefs without being judged. All humans are born with a mind that produces feelings, thoughts, and decisions, so in their private space they have the right to make their own decisions without fear of being monitored by a governing force. Users’ personal space in a cloud service should be the online private zone that gives them such freedom and autonomy; these privileges are diminished if every citizen’s personal data is under the control of an authoritarian administration and if every individual can consequently be prosecuted for their private opinions. Engineering a government backdoor creates the possibility of these unethical treatments of citizens’ online data, contradicting engineers’ obligation to protect fundamental human rights [7]. Therefore, developers of cloud services should not hand the servers’ encryption keys to authorities upon request.

The feasibility of denying government demands varies across markets. In some scenarios, developers can stick to their ethical codes and refuse the government’s request. For example, in 2015 the FBI asked Apple to write custom software that could break into an iPhone that belonged to a shooter responsible for fourteen deaths in San Bernardino. Apple refused, stating that engineering a backdoor into the phone would compromise data privacy, especially in future cases once the precedent was set [10]. In this case, Apple’s engineers could outright deny the FBI’s unethical request because the emphasis on human rights in America’s public discourse allowed them to go public with the issue and gain traction for their stance. In some political environments, however, the most ethical way to release software products is not feasible. In the GCBD iCloud case, for example, Apple could not publicly object to the authorities’ demand because in China’s state-censored discourse, human rights and ethics are not as celebrated as in America. In markets like China, software companies’ ethical initiatives simply carry little weight against a centralized administration.

When engineers are forced to create features that contradict their ethical codes, such as an all-purpose backdoor in a cloud service, an ethical resolution is to stop offering the service in that particular market. Discontinuing a service might mean one fewer choice for consumers, but more services aren’t necessarily better if they harm consumers’ right to privacy. An example of a company taking this route is Google, which exited the Chinese search market in 2010 because it refused to censor search results according to government agencies’ requirements [11]. The consequence of losing a market is lost revenue and damage to the company’s finances, which is not always an easy consequence to face. In Google’s case, it could make this decision because the financial damage of leaving China didn’t pose a huge threat to the company’s future: it had a smaller share of the search market than local search giant Baidu [12], and it could still generate millions in advertising revenue from China after it stopped offering its search service [13]. But if a software service is so important to a company that discontinuing it would effectively destroy the company’s future, should it be canceled solely on ethical grounds? The question of whether a company should produce software with a government backdoor loses its meaning if the company no longer exists.

Unfortunately, there isn’t, and likely never will be, a definitive solution to the conflict between ethical considerations and financial interests. Decision-making in these situations is complex and must be considered case by case. For example, Google might decide that the honesty of its search engine is more important than the profit it generates in China, while Apple might decide that losing the Chinese market is too risky since most of its manufacturers are located in that country. Engineers need to weigh both sides of the issue carefully before making any judgment, because as the stakes of discontinuing a service rise, there comes a point where financial considerations outweigh ethical ones. For some companies, that critical point is where the entire future product roadmap must be reconsidered; for others, it is where canceling a product would result in bankruptcy. At that point, although choosing profit over privacy isn’t the most ethical choice, it is the more reasonable one. Still, it is important for engineers to understand the distinction between “reasonable” and “easy.” Before concluding that they should create a compromised product to preserve profit, they must consider every alternative, such as shifting to another market or modifying the service to sidestep unethical regulations. When evaluating the tradeoff between ethics and profit, engineers should always give more weight to ethics. They shouldn’t engineer services with unethical implications unless every other option for avoiding damage to privacy has proven infeasible.

If engineers do decide to create a cloud service with government surveillance, they still need to take responsibility for their compromised service and operate it transparently. In philosophy, the owner of a piece of private property has the authority to control that property [14], and one prerequisite of full control over an object is knowing its status, such as where it is or what could happen to it. In the context of cloud services, users’ personal data are their private property. So, when they entrust a service with their data, they have the right to know how and why their data are accessible via a backdoor. This approach also conforms to the ethical standards of engineering organizations like the ACM and IEEE, which require engineers to honestly disclose any potential conflicts of interest to affected parties [7] [15]. Therefore, when implementing the software, developers should inform consumers of the privacy risks by clearly displaying a message when users register for the service, instead of burying it in the long and often ignored terms of service, as iCloud does now. Moreover, even when developers must build a backdoor into their service, they can still fulfill their duty to defend user privacy in a “soft” way, by proactively educating users on how to protect their privacy from government surveillance. These guidelines can be integrated into the user interface: for instance, when users upload their data to the cloud, they could see a pop-up suggesting what types of data they should avoid uploading so that their security isn’t compromised.
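The disclosure-at-registration idea above can be sketched in a few lines. The notice wording and function names here are illustrative assumptions, not any vendor’s actual flow; the point is simply that account creation is refused until the risk is explicitly acknowledged, rather than acknowledgment being buried in the terms of service.

```python
# Sketch: surface the privacy risk at sign-up, not in the fine print.
# All names and wording are hypothetical.

RISK_NOTICE = (
    "Data stored in this service is held on servers whose encryption keys "
    "are accessible to government authorities. Do not upload data you would "
    "not want a third party to read."
)


def register(username: str, acknowledged_risk: bool) -> dict:
    """Create an account only after the user explicitly accepts the risk notice."""
    if not acknowledged_risk:
        # No silent defaults: without an explicit acknowledgment, no account.
        raise ValueError("registration requires explicit risk acknowledgment")
    return {"user": username, "risk_disclosed": True}


account = register("alice", acknowledged_risk=True)
assert account["risk_disclosed"]
```

The design choice is that the acknowledgment is a required, separate action, so a user cannot end up enrolled without ever having seen the notice.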

Ultimately, consumers have complete control over whether they use a compromised cloud service. Individuals have the right to dispose of their private property as they see fit [14], so they should have the choice to upload their private data to services that they deem appropriate, given that they are properly informed of the risks of using the service. Still, when distributing the software, engineers should make sure that no users are unwittingly using a service with government surveillance. For example, users need to actively sign up for the service to use it; it should not be bundled with another product or automatically activated. Also, users should always have the option to quit the service and delete their data from the cloud. Despite the unethical aspect of a government backdoor, this approach at least ensures that no user unwillingly gives up their privacy to the authorities. It is not a perfect solution, but in situations where creating a backdoor-enabled product is unavoidable, it is the necessary compromise that preserves both honesty to consumers and the survival of the company.
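The two safeguards described above, strict opt-in and the ability to quit and delete, can be sketched as a minimal model. The class and method names are hypothetical, a toy client rather than a real one.

```python
# Sketch of the two safeguards: the service does nothing until the user
# explicitly opts in, and the user can always withdraw and erase their data.

class CloudAccount:
    def __init__(self) -> None:
        self.enrolled = False          # never bundled or auto-activated
        self.data: list[str] = []

    def opt_in(self) -> None:
        # Only an explicit user action enables the service.
        self.enrolled = True

    def upload(self, item: str) -> None:
        if not self.enrolled:
            raise PermissionError("service not enabled; nothing is uploaded by default")
        self.data.append(item)

    def quit_and_delete(self) -> None:
        # The user can always leave and take their data out of the cloud.
        self.data.clear()
        self.enrolled = False


acct = CloudAccount()
try:
    acct.upload("photo")               # rejected: no silent enrollment
except PermissionError:
    pass
acct.opt_in()
acct.upload("photo")
acct.quit_and_delete()
assert acct.data == [] and not acct.enrolled
```

Together with the sign-up disclosure, this ensures that anyone whose data sits behind the backdoor got there by an informed, reversible choice.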

Software service providers will often encounter government policies that threaten privacy, such as demands to hand government agencies the keys to encrypted data. Since privacy is the foundation of inherent human rights like autonomy, engineers should not submit to such unethical government demands. They should first evaluate whether they can stop offering the product that violates their ethical standards. If stepping out of the market would cause a financial loss too big for the company to handle, they should offer the backdoor-enabled service with maximum transparency and let users choose what to do with their data. With advances in Internet technologies, ethical issues around data privacy will only grow in complexity. To resolve them, engineers need to carefully weigh all sides of the equation: regulatory policies, users’ benefits, and the company’s finances.

By Jinkun Liu, Viterbi School of Engineering, University of Southern California


References

[1] Apple Inc., “iCloud operated by GCBD – Terms and Conditions”. [Online]. Available: https://www.apple.com/legal/internet-services/icloud/en/gcbd-terms.html. [Accessed: Apr. 4, 2018].

[2] S. Liao, “Apple officially moves its Chinese iCloud operations and encryption keys to China,” The Verge, Feb. 28, 2018. [Online]. Available: https://www.theverge.com/2018/2/28/17055088/apple-chinese-icloud-accounts-government-privacy-speed. [Accessed: Apr. 4, 2018].

[3] Apple Inc., “Report on Government and Private Party Requests for Customer Information, January 1 – June 30, 2017”. [Online]. Available: https://images.apple.com/legal/privacy/transparency/requests-2017-H1-en.pdf. [Accessed: Apr. 4, 2018].

[4] J. J. Britz, “Technology as a threat to privacy: Ethical challenges and guidelines for the information professionals,” in Microcomputers for Information Management, vol. 13, no. 3-4, pp. 175-93, 1996.

[5] S. Warren and L. Brandeis, “The Right to Privacy,” in Harvard Law Rev., vol. 4, no. 5, pp. 193-220, Dec. 1890.

[6] J. Gumbis, V. Bacianskaite, and J. Randakeviciute, “Do Human Rights Guarantee Autonomy?,” in Cuadernos constitucionales de la Cátedra Fadrique Furió Ceriol, vol. 62, pp. 77-93, 2008.

[7] Association for Computing Machinery, “ACM Code of Ethics and Professional Conduct”. [Online]. Available: https://www.acm.org/about-acm/acm-code-of-ethics-and-professional-conduct. [Accessed: Apr. 4, 2018].

[8] O. Beavers, “Gov’t contractor charged with leaking classified info to media,” The Hill, Jun. 5, 2017. [Online]. Available: http://thehill.com/homenews/administration/336432-federal-government-contractor-charged-for-leaking-classified-material. [Accessed: Apr. 4, 2018].

[9] N. Kobie, “In following China’s iCloud law, has Apple betrayed itself?,” Wired, Mar. 3, 2018. [Online]. Available: http://www.wired.co.uk/article/apple-icloud-china-iphone-data-privacy. [Accessed: Apr. 4, 2018].

[10] A. Kharpal, “Apple vs FBI: All you need to know,” CNBC, Mar. 29, 2016. [Online]. Available: https://www.cnbc.com/2016/03/29/apple-vs-fbi-all-you-need-to-know.html. [Accessed: Apr. 4, 2018].

[11] K. Waddell, “Why Google Quit China—and Why It’s Heading Back,” The Atlantic, Jan. 29, 2016. [Online]. Available: https://www.theatlantic.com/technology/archive/2016/01/why-google-quit-china-and-why-its-heading-back/424482/. [Accessed: Apr. 4, 2018].

[12] R. Fannin, “Why Google Is Quitting China,” Forbes, Jan. 15, 2010. [Online]. Available: https://www.forbes.com/2010/01/15/baidu-china-search-intelligent-technology-google.html. [Accessed: Apr. 4, 2018].

[13] A. Doland, “Despite Google’s Many China Woes, Its Ad Revenue There Is Growing,” AdAge, Dec. 29, 2014. [Online]. Available: http://adage.com/article/digital/google-s-china-woes-ad-sales-growing/296424/. [Accessed: Apr. 4, 2018].

[14] Stanford Encyclopedia of Philosophy, “Property and Ownership,” Sep. 6, 2004. [Online]. Available: https://plato.stanford.edu/entries/property/. [Accessed: Apr. 4, 2018].

[15] IEEE, “IEEE Code of Ethics”. [Online]. Available: https://www.ieee.org/about/corporate/governance/p7-8.html. [Accessed: Apr. 4, 2018].