Information Privacy in the US, EU and China: Differences in Cultural Conceptions and Regulatory Stringency

In recent years, many governance and ethical frameworks have been established to address the rapid development and widespread deployment of AI. Much of this effort has been devoted to developing AI principles that outline key values and challenges; examples include the EU High-Level Expert Group on Artificial Intelligence’s Ethics Guidelines for Trustworthy AI (2019), the Beijing Academy of Artificial Intelligence’s Beijing AI Principles (2019) and the Future of Life Institute’s Asilomar AI Principles (2017). Further work has compared different sets of principles and identified overarching themes, with studies finding immense overlap between the principles outlined even across cultures (Floridi & Cowls, 2019; Zeng et al, 2019; Jobin et al, 2019). It is imperative to understand cross-cultural value differences because the impact of AI is global; the consideration of diverse perspectives is hence required to work towards AI that is beneficial for all (ÓhÉigeartaigh et al, 2020). However, there is limited existing literature addressing the different interpretations of said principles across different cultures and communities (Whittlestone et al, 2019a, 2019b).

In this essay, I attempt to fill this gap by comparing the EU’s, the US’s and China’s different approaches to information privacy. Information privacy was selected as the focus of this essay because it is a key recurring theme across various sets of principles (Zeng et al, 2018; Whittlestone et al, 2019a; Floridi et al, 2018). Moreover, it is an area where allegedly irreconcilable ethical differences are commonly assumed to exist between China and ‘the West’ due to fundamentally different philosophical worldviews (Tam, 2018; Minter, 2016); there is also a tendency to conflate the EU and the US despite their different approaches to privacy (Movius & Krup, 2009; Bignami, 2007; Kobrin, 2004). Existing literature has mainly focused on the Confucian influence on digital ethics more broadly (Kirk et al, 2020) and on comparisons in privacy regulation between the US and EU (Movius & Krup, 2009; Bignami, 2007; Kobrin, 2004); there is limited work comparing the respective privacy approaches of the three political entities.

The first section of this essay provides an overview of the different cultural conceptions of information privacy found in said political entities. Section two explores the implications of these conceptions by comparing the varying levels of stringency each entity adopts towards the private sector and the government respectively; reasons for said differences are also proposed. A caveat is that due to the limited scope of this essay, some claims and findings below are abstractions; for instance, the EU is not a monolithic entity but a culturally diverse patchwork of nations with nuanced differences in their respective approaches to privacy (Bennett, 1992).

Section 1: Three Different Conceptions of Privacy

This first section introduces the different conceptions of information privacy held by the US, the EU and China.

USA: Commodity

In the US, privacy is viewed as a commodity rather than a right (Movius & Krup, 2009). This is evidenced in the economic terms and ‘property rights framing’ used to discuss questions of privacy, including “who ‘owns’ the data collected in a commercial transaction and who has the right to the rents flowing from its exploitation” (Kobrin, 2004). Kobrin also highlights Senator Hollings’ framing of the utility of the Online Personal Privacy Act, which stressed privacy’s importance for “promoting consumer confidence and bolstering online commerce, and preventing consumer fears from stifling the Internet as a consumer medium” (U.S. Senate Committee on Commerce, 2002, para. 2). More broadly, the US takes a pragmatic approach to privacy. Benthall (2019a) argues that US privacy regulation is grounded in realpolitik, with corporate powers having a strong influence over governing institutions. For instance, private actors like Big Tech engage in legal entrepreneurship by reinterpreting entitlement relationships and actively reshaping legal institutions (Cohen, 2019). A notable example is Big Tech’s use of the First Amendment right to freedom of speech as a shield for information privacy violations (Kaminski & Skinner-Thompson, 2020; Volokh, 2000). Cohen (2019) also argues that private actors in the information age exploit neoliberal values by pitting ‘innovation’ and ‘privacy’ against one another to justify the failure to deliver on the latter.

The pragmatism of the US privacy system can also be observed in the Supreme Court’s focus on privacy harms rather than privacy as an inherent right. Citron and Solove (2021) highlight how harms act as a gatekeeper of privacy cases, with many such cases being dismissed due to a purported lack of cognizable harms experienced by the plaintiff. For example, Doe v. Chao set a precedent whereby the Supreme Court held that the “statutory damages provision under the federal Privacy Act of 1974” would only be available if the plaintiff experienced “actual harms”. Citron and Solove (2021) note that said cognizable harms encompass physical and economic harms, and reputational harms to some extent, but not ‘intangible’ harms such as emotional harms, as ruled in Federal Aviation Administration v. Cooper. This approach to privacy benefits corporate actors: the privacy harms experienced are often diffused across a large aggregate population and are largely ‘intangible’, making it difficult to hold such organisations accountable for privacy violations.

EU: Human Right

The EU considers privacy to be a human right, as seen in its reference in the Council of Europe’s Convention for the Protection of Human Rights and Fundamental Freedoms, and in the constitutions of multiple EU countries, including Spain and Germany (Bignami, 2007; Kobrin, 2004; Rustad & Koenig, 2018). The EU’s definition of privacy is strongly influenced by Kant (Benthall, 2019a), who stipulates that human freedom is grounded in autonomy, self-determination and dignity (Kant, 2008) and highlights that autonomous decisions should be made free from “physical force, coercive threats, deception, manipulation, and oppressive ideologies” (Hill, 2013, p. 29, para. 3). Kantian autonomy greatly influenced Germany and its legal system, which in turn greatly influenced the EU and its conception of human rights (Benthall, 2019a). The EU’s concern for privacy therefore stems from the fear that an erosion of privacy would lead to an erosion of self-determination and autonomy. As Floridi (2016, p. 1) highlights, this is observed in Article 88 of the 2016 General Data Protection Regulation, which stresses that the stipulated regulations “shall include suitable and specific measures to safeguard the data subject’s human dignity [Floridi’s italics], legitimate interests and fundamental rights” (Council of the European Union, 2016).

There are also historical reasons driving the EU’s concern for privacy as a human right. In Nazi Germany of the 1930s, census data, including information on nationality and religion, was used to identify and round up Jews. Several decades later, East Germany’s Stasi also grossly violated privacy by screening mail, wiretapping and bugging citizens, and amassing vast amounts of personal information via unofficial informants (Waxman, 2018). This collective trauma of privacy violations that facilitated immense harm has led the EU to develop the most comprehensive data protection initiatives in the world. An example is the 1995 EU Data Protection Directive, which protects the data privacy rights of European citizens and addresses both the processing and transferral of data. Its protections also extend beyond the EU: Article 25 only allows data to be transferred to countries deemed to have adequate protections in place (Birnhack, 2008). More recently, the EU established the General Data Protection Regulation (European Commission, 2016), arguably the most comprehensive and robust data protection system in the world (Goddard, 2017); it will be discussed in more detail in section 2.

China: Instrument Safeguarding National Security

Privacy in China is primarily wielded as an instrument safeguarding national security, social stability and population flourishing. This translates into some protection from corporate machinations, but not into checks on government power, as such checks would weaken surveillance and hence national security (Huai, 2020).

Privacy is often framed as a trade-off against national security (Whittlestone, 2019b) because it impedes intelligence gathering. This poses a prioritization issue for the Chinese government given the huge importance it places on national security, which reflects the Confucian value of harmony (Li, 2006; Wong, 2011), though it is also grounded in more practical matters. Since the Han dynasty, the legitimacy of the Chinese government has been grounded in the Mandate of Heaven [1], which holds that the ruler’s legitimacy from heaven is derived from their ability to foster social stability, harmony, prosperity and flourishing (Fan, 2011; Perry, 2008). Periods of unrest, whether social, economic or natural, are viewed as heavenly signs of an impending dynastic overthrow, and are used to undermine the existing government (Fan, 2011; Perry, 2008). It is therefore in the best interests of the CCP to ensure that such conditions of human flourishing are met, as legitimacy is derived from the performance of the rulers (Fan, 2011; Jiang, 2011). National security and prosperity are hence prioritized in China over privacy to prevent divisions of power, the resulting disorder, and the undermining of CCP legitimacy; indeed, a more cynical take is that the CCP is more concerned about maintaining power under the guise of ‘social stability’ and national security (Roberts et al, 2021).

Cultural attitudes reinforce this prioritization. The Late Qing Dynasty saw strong negative connotations attached to privacy, so much so that the Chinese term for ‘private’, si, also came to mean selfishness (Zarrow, 2002). This belief is reiterated by the seminal late-Qing and early-Republic intellectual Liang Qichao. For him, gong, which translates to ‘public’, indicates ‘public-mindedness’, and si selfishness. He highlights that having a private sphere is important for cultivating morality, but should not come at the expense of group welfare.
Zarrow outlines that for Liang and many pre-modern and modern Chinese thinkers: “Gong (public) represented individual rights but only within their proper bounds, while si essentially represented the same concept of rights but as practised in an aggrandised, monopolistic and predatory fashion” (2002).

Concrete examples of weak privacy protections against the State include Article 7 of China’s National Intelligence Law, which requires all citizens and institutions to help safeguard national security (Chinese National People’s Congress Network, 2017); this involves companies handing over data if requested by the CCP. Legal instruments aside, the CCP also wields softer tools of coercion (Feng, 2019). For instance, police cells have been instituted in major Chinese tech companies such as Baidu, Alibaba and Tencent, where employees may be asked to pass over sensitive information. Many top executives at Chinese firms, from Huawei’s Ren Zhengfei to Alibaba’s Jack Ma, also have close ties with the CCP.

China has recently established targeted privacy regulations that protect users against corporate interests and prevent data misuse by companies. This contrasts with the lack of privacy regulations constraining the State. The discrepancy stems from the view that privacy from the State impedes national security while privacy from the private sector protects it: privacy protections are perceived to prevent private-sector companies from gaining too much power and control over citizens (Pernot-Leplay, 2020). Moreover, consumer privacy protections can be wielded as a tool to earn consumer trust and encourage engagement within the digital economy (Hao, 2020). In recent years, a range of papers from various institutions have highlighted the importance of privacy in the discussion of AI ethics and governance (Beijing Academy of Artificial Intelligence, 2019; Ying, 2019; Zeng et al, 2018; Ding, 2018). For instance, the 8th principle of China’s New Generation AI Governance Expert Committee’s AI principles stresses the importance of privacy. Furthermore, said principles and discussions have gradually been translated into regulatory action, as reflected in the Chinese government’s ban of over 100 apps due to privacy-related issues and its requirement that dozens more alter practices relating to data collection and storage (Toh, 2020). China’s Cybersecurity Law, enacted in 2017, also requires companies to store their network data in China and allows the government to conduct security checks of their network operations upon request (Creemers et al, 2018).

Section 2: Privacy Approach: Stringency Level Comparison & Reasons for Divergence

Drawing from the various conceptions and implementations of privacy found across the three political entities, this second section compares the varying stringency levels of their respective approaches towards privacy from the private sector and the State, and proposes possible philosophical and practical reasons for said differences.

The Private Sector

Out of the three political entities, the EU takes the most stringent approach towards the private sector due to its conception of privacy as a human right (Bignami, 2007; Kobrin, 2004; Rustad & Koenig, 2018). This is reflected in its recent establishment of the GDPR, arguably the most comprehensive and robust data protection system in the world. The GDPR (European Commission, 2016) stipulates the many privacy rights held by users, such as the right to access personal data (Art. 15), the right to have said data deleted (Art. 17) or amended (Art. 16), and the right to revoke consent at any time (Art. 7). It also requires the data controller to clearly communicate the purpose of data collection to users (Art. 6(1)(a)), keeping them up to date and fully informed. Data controllers must minimize the personal data processed to only that which is necessary for a specific purpose (Art. 5(1)(c)) and are required to have information security measures in place (Art. 5(1)(f)). Financial penalties for data privacy violations have also been drastically increased to strengthen accountability (Burgess, 2020).

The EU’s focus on privacy as an intrinsic right contrasts with the US’s conception of privacy, which appears less ‘philosophical’ in nature and more pragmatic, focusing on potential “substantial harm[s]” that could be inflicted (Benthall, 2019b). In the US, privacy is not an inalienable right enshrined in the Constitution, nor is it mentioned in the Bill of Rights; the concept was first explicitly introduced by the American lawyers Samuel Warren and Louis Brandeis, who defined it as “the right to be let alone” (1890). The US has the weakest consumer privacy protections of the three entities due to its conception of privacy in economic terms and its prioritization of economic efficiency. To this day, the Supreme Court offers weak consumer privacy law; instead, there is an emphasis on ‘self-governance’, which could be attributed to the nation’s strong libertarian streak and desire not to stifle innovation. Commercial privacy regulation that is enacted also tends to be sector-specific and reactive in nature, forming a “patchwork” rather than a unitary system of laws (Fromholz, 2000). A key example is the Fair Credit Reporting Act of 1970, the first attempt at privacy regulation in the private sector; it only covers credit information and reports (Kagan, 2021). This bottom-up, market-driven approach to privacy governance starkly contrasts with the EU’s top-down approach, which relies on comprehensive, union-wide legislation.

These different approaches to private-sector privacy stem partly from the less consumer-driven nature of the EU economy compared to that of the US (Movius & Krup, 2009). For instance, inhabitants of EU countries hold between 0.8 and 3.9 credit cards on average, with the upper figure typically referring to those with EU-issued cards who live abroad (European Central Bank, 2018), whereas Americans hold an average of four credit cards (Stolba, 2020). Furthermore, the EU has an average savings rate of 10.8% against the US’s 5.2% (Leetmaa et al, 2009). Movius & Krup (2009) argue that the EU Directive is an exemplary instance of the EU prioritizing rights over economic efficiency and data access. The EU’s and the US’s differing attitudes towards the bilateral Safe Harbor Agreement, which allowed EU data to be transferred to the US in compliance with the EU Data Directive (Movius & Krup, 2009), encapsulate the contrasting priorities they hold with regards to privacy and economic efficiency: Kobrin (2004, p. 116) notes that at the time of the agreement, the US found it too expensive for business operations whilst the EU found it too lenient.

China ranks behind the EU but ahead of the US in the stringency of its consumer privacy regulations. Its comparatively stronger protections could be attributed to the perception of corporate overreach as a potential threat to national security (Pernot-Leplay, 2020) and the need to boost consumer trust to drive the economy (Hao, 2020). Recently, the 2018 Personal Information Security Specification (henceforth PISS) was introduced, stipulating novel requirements such as a ban on coercion in data collection agreements via service bundling (Shi et al, 2019). PISS was inspired by the EU’s GDPR and features a comprehensive system of rules, compared to the ‘patchwork’ of cybersecurity laws facing the US corporate sector (Movius & Krup, 2009). The specification also features requirements that are found in the EU but not in the US, such as limits on further processing and data minimization (Pernot-Leplay, 2020). However, the privacy regulations found in China are weaker than those in the EU. For example, PISS only serves as an encouragement of best practices rather than a mandatory legal document (Horsley, 2021). Moreover, many requirements are weaker than their GDPR counterparts (Roberts et al, 2021); for instance, data minimization only applies to collected personal information that has no bearing on the services provided, contrary to the strict adherence to the stated purpose required under the GDPR (Pernot-Leplay, 2020). This weakening could be due to China’s need for technological innovation, which would spur national economic development and consolidate China’s geopolitical power on the world stage. Indeed, the push for innovation by the Chinese government can be observed in China’s 13th Five-Year Plan on National Technological Innovation, which highlights its desire to catch up in various industries (Li, 2018).
This is especially notable in the field of AI technologies, as seen in developments such as the New Generation of Artificial Intelligence Development Plan, which stresses China’s goal to become the leading AI superpower by 2030 (Webster et al, 2017). Moreover, multiple scholars have also underlined that AI strategy and governance in the EU prioritizes the mitigation of potential harms whilst China prizes innovation above other considerations barring national security threats (Roberts et al, 2021; Duan, 2020; Zeng, 2020).

Although China currently has more stringent consumer privacy protections than the US, said laws are becoming more stringent in the latter (Williams, 2020). Following high-profile data scandals such as Cambridge Analytica, there have been strong calls from the general population for stricter privacy protections. This, coupled with a rising fear of innovation theft by China (Henriquez, 2021), has led to more attention being paid to consumer privacy, and privacy more generally, in the US in recent years. California is a notable example. In 2018, the California Consumer Privacy Act was established in response to the large-scale data breaches of recent years, the most prominent being the Cambridge Analytica scandal (Gavejian et al, 2020). This act gives consumers various rights, such as the right to know what data businesses are collecting about them and whether said data is being sold or disclosed (Bonta, 2018). However, one could note that California has always been ahead of the rest of the nation in privacy governance; for instance, in 1972, California voters amended the California Constitution to include the right of privacy among the “inalienable” rights of all people (Kelso, 1991). A further example is how two members of Congress representing Silicon Valley proposed a “consumer privacy bill” in 2019, which would establish “a federal digital privacy agency” to “protect Americans’ personal information and prohibit companies from using that data for discriminatory purposes in areas like jobs and credit” (Singer, 2020), reminiscent of the privacy agencies found in the EU (more details below). A final caveat is that although consumer privacy laws in China are stricter on paper than in the US, their practical stringency also depends on enforcement.

Government

Although both the US and EU focus on individual rights, the latter has much stronger protections against government intrusions of privacy than the former. This point warrants further exploration, as the primacy of individual rights over the government is enshrined in the founding principles of America. This is evidenced in the Declaration of Independence, which states:

“…that all men… endowed by their Creator with certain unalienable Rights… That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed…” (Congress, U.S., 1776, para. 2)

The US adopts the Lockean ideal in which the individual with their natural rights comes before the State, the purpose of the State being to ensure that said rights are safeguarded. Thus, given that privacy protections against the government are not perceived to impede economic innovation, one would expect a reversion to a rights-based approach with regards to privacy from the government. In fact, scholars have highlighted that the EU and US shared similar privacy regulations against the government during the Nixon era [2]. In a study comparing the data protection regimes and relevant national legislation of the US, Sweden, Germany and the UK, Bennett (1992) uncovered many similarities between the countries’ respective approaches, ranging from protections against the State using data as an instrument of oppression to an emphasis on safeguarding individual dignity. Indeed, during said era, the US introduced several measures to check government power and protect citizens from governmental privacy intrusions. These include the 1964 extension of the Fourth Amendment by the Supreme Court (Douse, 1972) and the US Privacy Act of 1974, which outlines guidelines that prevent data misuse by governments, albeit with no mention of corporations (The United States Department of Justice, 2020). Similarly, the EU introduced data protection mechanisms against the state at around the same time: the Convention on Personal Data Processing was opened for signature by the Council of Europe in 1981; it covers a wide range of topics, such as the necessity of legal safeguards when processing sensitive information and the right to know that information is being stored, and remains influential even today.

However, the privacy regimes of the two entities later diverged, with the EU adopting stricter restrictions. Bignami (2007) attributes this divergence to three reasons. Firstly, the US and the EU have different privacy enforcement mechanisms: the EU relies on independent privacy agencies, while the US relies on individual litigants. Bignami argues that the former is much more effective than the latter because it is difficult to sue the government successfully, given that violations of the Privacy Act must be demonstrated to be “intentional or willful” (p. 685). In contrast, the EU’s privacy agencies have the authority to investigate other government agencies for potential privacy violations; their effectiveness is compounded by their “expertise, historical memory, and bureaucratic dedication” (p. 685), along with their capacity to act as policymakers in keeping up to date with new privacy developments. A second reason is the rise of executive power in the US since the Reagan administration — think the “unitary executive”, the “presidential administration” and the “war against terror” (p. 686) — where presidents have reformed many areas of policy, including information privacy. This stands in stark contrast to the various checks on national executive power introduced by the EU. Bignami (2007) points to the example of how contending with existing EU data protection law would require dealing with multiple entities, including “other Member States, the Court of Justice, and the Working Party of Data Protection Commissioners; in the Council of Europe, the European Court of Human Rights; and at the national level, its judicial branch and its independent privacy agency” (p. 687). The third reason for this divergence in privacy stringency towards the State is historical trauma. Many human rights catastrophes in Europe were partly made possible by governmental data abuses, notably those of Nazi Germany and the Stasi mentioned in an earlier section.
Bignami (2007) claims that the US does not have a comparable lived experience, which leads to less stringent protections.

Moreover, privacy protections from the US government have further weakened following 9/11 due to the Patriot Act established in response: the act allows for greater government surveillance via methods such as wiretapping and access to business records upon request, in order to combat terrorism and protect national security (McCarthy, 2002), though some checks were later reintroduced via legislation such as the 2015 USA Freedom Act (Beaghley, 2015). This divergence in approaches is evidenced in the annulment of multiple US-EU privacy agreements due to the EU’s privacy concerns. For example, Safe Harbor was invalidated by the Court of Justice of the European Union (CJEU) in 2015 over EU concerns about US government surveillance, as US law enforcement can gain access to personal data transferred from the EU. Its replacement, Privacy Shield, also failed in 2020 for similar reasons (Kerry, 2021). That is not to say that the EU has foolproof protections against government surveillance; although there are checks in place, the use of broad terms like ‘national security’ in the GDPR (European Commission, 2016) as overriding exemptions may not completely curtail surveillance. However, the right to privacy in the EU is also based on the fundamental rights law framework, which highlights the need for proportionality (European Commission, 1992, art. 5). Broadly, this entails weighing various factors, such as the problem at stake and public welfare versus private interests (Bignami, 2007). As Kloza and Drechsler (2020) argue, the CJEU has already invoked proportionality in previous cases, such as “in determining if a disclosure of personal data was proportionate to the legitimate aim pursued (Case C-465/00, Österreichischer Rundfunk)” (para. 12).
“Proportionality” is also increasingly brought to bear on privacy regulation, as evidenced by its references in many articles of the GDPR, with Article 6, “Lawfulness of processing”, and Article 24, “Responsibilities of the Controller”, being examples (European Commission, 2016).

China has the weakest protections against privacy intrusions from the government. This can be observed in the above-mentioned Article 7 of China’s National Intelligence Law, which requires citizen cooperation with requests aiding national security, such as data handover (Chinese National People’s Congress Network, 2017). I have also touched previously on the soft power the government holds over the private sector through the police cells embedded in various firms and executives’ close association with the CCP (Feng, 2019). That is not to say that there are no initiatives protecting the privacy of the people from the government. China’s 2020 Draft Personal Information Protection Law contains regulations that also apply to State organs (Creemers et al, 2020). For example, Article 61 empowers users to file complaints about rights violations and misuse of information to the relevant State departments, while Article 65 allows individuals to “seek compensation” for any infringements of rights (Creemers et al, 2020). However, said attempts are mostly aspirational at present (Horsley, 2021). Furthermore, it is clear that any privacy initiatives can be largely overridden in favour of national intelligence. For instance, Article 35 of the law stipulates that, unlike private firms, State organs need not request consent if doing so would obstruct their duties, which likely covers any national security matter (Creemers et al, 2020).

China’s weak privacy protections against the government are due to the paternalistic role its government adopts in ensuring national security and collective welfare over individual rights. Contrary to the EU and the US, China does not have a governance system that prizes individual rights (Chan, 2002; Fan, 2011); the free choice and self-fulfilment of the individual is not a priority or an inalienable right in China, as illuminated in Article 51 of the Constitution of the People’s Republic of China, which stands in stark contrast to the US Constitution (Hurlock, 1993):

“When exercising their freedoms and rights, citizens of the People’s Republic of China shall not undermine the interests of the state, society or collectives, or infringe upon the lawful freedoms and rights of other citizens…” (Constitution of the People’s Republic of China, art. 51).

Furthermore, throughout most of Chinese history, a system of ‘meritocratic’ rule has been present, where the masses are ruled by an educated elite who ascend through the ranks primarily via examinations. Those in power are said to be exemplary not only in intellect but also in moral cultivation; they are called junzi, virtuous individuals, and set an example for the masses to follow. As a result, decisions are outsourced to this more educated and morally cultivated elite, who take on a paternalistic role in guiding the masses (Pohl, 2002). As Han (2013) describes, the role of the Chinese State is to adopt a stance of “paternalistic authoritarianism” (p. 117), which runs counter to the European value of autonomy. Thus, in order to play an effective paternalistic role in guiding the population and ensuring national security, comprehensive intelligence on both public and private affairs is deemed necessary. This is evidenced in China’s Social Credit System: in a 2014 plan, the Chinese government outlined a system that assigns each citizen a score based on observed financial, social and moral behaviour; the score then enables or prevents access to certain resources and privileges (Mitchell & Diamond, 2018). Despite its privacy violations and manipulative character, many Chinese citizens view the social credit system as a positive development because it helps build social trust and maintain societal harmony (Hawkins, 2017). Thus, the historically paternalistic role played by the Chinese government in ensuring national security could explain why China has the weakest privacy protections against the government of the three entities; national security is a clear priority over individual privacy. However, there is evidence that said priorities are changing amongst the population.
Recently, citizens have begun to voice concerns over surveillance technologies employed by the government, such as facial recognition; these calls have only grown stronger since the COVID-19 pandemic (Horsley, 2021). Chinese regulators have gone so far as to cite such privacy concerns as a potential threat to social stability (新華社, 2020).

Section 3: Conclusion

In conclusion, the US, the EU and China hold different conceptions of privacy that roughly correspond to privacy as a commodity, a human right and an instrument safeguarding national security respectively. These conceptions, along with other practical considerations, have led the three entities to adopt differing levels of privacy protection against the private sector and against the government. The stringency rankings can be summarised as:

● Private sector: EU > China > US

● Government: EU > US > China

However, these rankings obscure the distinct mechanisms and incentives behind each political entity’s decisions. For instance, the US has the weakest privacy protections against the private sector because it views privacy as a commodity and ultimately prioritizes innovation and economic efficiency in its consumer-driven economy (Movius & Krup, 2009). Although it prizes individual rights, various practical considerations have also weakened its privacy protections against the government (Bignami, 2007). The EU views privacy as a human right and perceives the government as a guarantor of that right (Kobrin, 2004; Rustad & Koenig, 2018). It is less affected by economic concerns as its economy is less consumer-driven (Movius & Krup, 2009), but it is shaped by historical trauma (Bignami, 2007). Finally, China uses privacy to safeguard national security and gain consumer trust (Pernot-Leplay, 2020; Hao, 2020), but not at the complete expense of the innovation needed to consolidate its economic and geopolitical power. The collective welfare of society also comes before any individual, with the government playing a paternalistic role in ensuring harmony and national security (Kirk et al, 2020; Han, 2013; Fan, 2011; Perry, 2008); weak protections against the government are hence observed.

Due to the limited scope of this essay, I offer only a high-level overview and comparison of the varying conceptions and approaches taken by the three political entities; future work should explore specific differences in cross-cultural regulatory practices in more detail. I was also unable to explore the ramifications these conceptions have for international cooperation, a key topic that needs to be addressed as it relates to practical issues such as data transfer. Other future directions involve digging deeper into other potentially important cross-cultural ethical and value differences.
Key points of tension to explore include differing values surrounding Cosmocentrism versus Humancentrism (Gal, 2019; 中國信息安全, 2019), fairness versus efficiency and attitudes towards technological progress (Whittlestone et al, 2019a). An awareness of similarities should also be cultivated by exploring common ground that can be taken as a springboard for discussion, such as a common concern for moral deskilling across Aristotelian and Confucian thought (Vallor, 2015; Wong, 2019).

[1] Though the roots of the idea originated back in the Zhou Dynasty, the Han emperor was the first to officially rule under such a mandate.

[2] Though not corporations (which the EU held a tough stance on early on).

References

Abrami, R. M., Kirby, W. C., & McFarlan, F. W. (2014). Why China can’t innovate. Harvard Business Review, 92(3), 107-111.
Agence France-Presse in Shanghai. (2019, August). Chinese deepfake app Zao sparks privacy row after going viral. The Guardian. Retrieved from https://www.theguardian.com/technology/2019/sep/02/chinese-face-swap-app-zao-triggers-privacy-fears-viral
Allison, G., & Schmidt, E. (2020, August). Is China Beating the U.S. to AI Supremacy? Belfer Center for Science and International Affairs. https://www.belfercenter.org/publication/china-beating-us-ai-supremacy
Arnold, Z. (2020). What Investment Trends Reveal about the Global AI Landscape. Brookings. https://www.brookings.edu/techstream/what-investment-trends-reveal-about-the-global-ai-landscape
Beaghley, S. (2015). The USA Freedom Act: The Definition of a Compromise. RAND Corporation. https://www.rand.org/blog/2015/05/the-usa-freedom-act-the-definition-of-a-compromise.html
Beijing Academy of Artificial Intelligence (BAAI). (2019). Beijing AI Principles.
Bennett, C. J. (1992). Regulating privacy: Data protection and public policy in Europe and the United States. Cornell University Press.
Benthall, S. (2019a). The diverging philosophical roots of U.S. and E.U. privacy regimes. Digifesto. https://digifesto.com/2019/11/14/the-diverging-philosophical-roots-of-u-s-and-e-u-privacy-regimes/
Benthall, S. (2019b). Autonomy as link between privacy and cybersecurity. Digifesto. https://digifesto.com/2019/11/21/autonomy-as-link-between-privacy-and-cybersecurity/
Bignami, F. (2007). European Versus American Liberty: A Comparative Privacy Analysis of Antiterrorism Data Mining. BCL Rev., 48, 609.
Birnhack, M. D. (2008). The EU Data Protection Directive: An engine of a global regime. Computer Law & Security Review, 24(6), 508-520.
Bonta, R. (2018). California Consumer Privacy Act (CCPA). State of California Department of Justice. https://oag.ca.gov/privacy/ccpa
Burgess, M. (2020, March). What is GDPR? The summary guide to GDPR compliance in the UK. Wired. https://www.wired.co.uk/article/what-is-gdpr-uk-eu-legislation-compliance-summary-fines-2018
Brandeis, L., & Warren, S. (1890). The right to privacy. Harvard Law Review, 4(5), 193-220.
Brindley, E. (2010). Individualism in Early China: Human Agency and the Self in Thought and Politics. University of Hawaii Press.
Chan, J. (2002). Moral Autonomy, Civil Liberties, and Confucianism. Philosophy East and West, 52(3), 281-310.
Chinese National People’s Congress Network. (2017, June 27). National Intelligence Law of the People’s Republic. Retrieved from https://cs.brown.edu/courses/csci1800/sources/2017_PRC_NationalIntelligenceLaw.pdf
Citron, D. K., & Solove, D. J. (2021). Privacy Harms. Available at SSRN.
Cohen, J. E. (2019). Between Truth and Power: The Legal Constructions of Informational Capitalism. Oxford University Press.
Congress, U. S. (1776). Declaration of Independence. Retrieved from https://www.archives.gov/founding-docs/declaration-transcript
Council of the European Union. (2016). Position of the Council at First Reading with a View to the Adoption of a Regulation of the European Parliament and of the Council on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), ST 5419 2016 INIT - 2012/011 (OLP).
Creemers, R., Triolo, P., & Webster, G. (2018). Translation: Cybersecurity Law of the People’s Republic of China (Effective June 1, 2017). New America.
Creemers, R., Shi, M., Dudley, L., & Webster, G. (2020, October 21). China’s Draft ‘Personal Information Protection Law’ (Full Translation). New America. https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinas-draft-personal-information-protection-law-full-translation/
Ding, J. (2018). Deciphering China’s AI dream. Future of Humanity Institute Technical Report.
Douse, S. C. (1972). The concept of privacy and the Fourth Amendment. U. Mich. JL Reform, 6, 154.
Duan, W. (2020). Build a robust and agile artificial intelligence ethics and governance framework [构建稳健敏捷的人工智能伦理与治理框架]. Science Research [科普研究], 15(03), 11-15+108-109.
European Central Bank. (2018, September). Fifth report on card fraud. Retrieved from https://www.ecb.europa.eu/pub/cardfraud/html/ecb.cardfraudreport201809.en.html
European Commission. (1992). Treaty on European Union. Luxembourg: Office for Official Publications of the European Communities. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A12016M005
European Commission. (2014). A Rights-Based Approach, Encompassing all Human Rights for EU Development Cooperation. European Instrument for Democracy and Human Rights. Retrieved from https://ec.europa.eu/international-partnerships/system/files/online-170621-eidhr-rba-toolbox-en-a5-lc_en.pdf
European Commission. (2016). Regulation EU 2016/679 of the European Parliament and of the Council of 27 April 2016. Official Journal of the European Union. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A02016R0679-20160504&qid=1532348683434 (accessed 4 June 2021).
Fan, R. (2011). The Renaissance of Confucianism in Contemporary China (Vol. 20). Springer Science & Business Media. p. 142.
Feng, A. (2019). We Can’t Tell if Chinese Firms Work for the Party. Foreign Policy. https://foreignpolicy.com/2019/02/07/we-cant-tell-if-chinese-firms-work-for-the-party/
Floridi, L., & Cowls, J. (2019). A unified framework of five principles for AI in society. Harvard Data Science Review, 1(1).
Floridi, L., Cowls, J., Beltrametti, M., Chatila, R., Chazerand, P., Dignum, V., … & Vayena, E. (2018). AI4People—an ethical framework for a good AI society: opportunities, risks, principles, and recommendations. Minds and Machines, 28(4), 689-707.
Floridi, L. (2016). On human dignity as a foundation for the right to privacy. Philosophy & Technology, 29(4), 307-312.
Fromholz, J. M. (2000). The European Union data privacy directive. Berk. Tech. LJ, 15, 461.
Future of Life Institute (FLI). (2017). Asilomar AI Principles. Retrieved from https://futureoflife.org/ai-principles/
Gal, D. (2019). Perspectives and approaches in AI ethics: East Asia. In Oxford Handbook of Ethics of Artificial Intelligence. Oxford University Press.
Gavejian, J. C., Lazzarotti, J. J., & Atrakchi, M. (2020, January 6). The Case that Sparked the CCPA Gets an FTC Final Order. Jackson Lewis. Retrieved from https://www.workplaceprivacyreport.com/2020/01/articles/uncategorized/the-case-that-sparked-the-ccpa-gets-an-ftc-final-order/
Glaeser, A. A., & Sacerdote, B. (2001). Why Doesn’t the US Have a European Style Welfare State? Brookings Papers on Economic Activity, Fall.
Goddard, M. (2017). The EU General Data Protection Regulation (GDPR): European regulation that has a global impact. International Journal of Market Research, 59(6), 703-705.
Gormley, K. (1992). One hundred years of privacy. Wis. L. Rev., 1335.
Han, P. C. (2013). Confucian leadership and the rising Chinese economy: Implications for developing global leadership. Chinese Economy, 46(2), 107-127.
Hao, K. (2020, August 19). Inside China’s unexpected quest to protect data privacy. MIT Technology Review. Retrieved from https://www.technologyreview.com/2020/08/19/1006441/china-data-privacy-hong-yanqing-gdpr/
Hawkins, A. (2017, May 24). Chinese Citizens Want the Government to Rank Them. Foreign Policy. https://foreignpolicy.com/2017/05/24/chinese-citizens-want-the-government-to-rank-them/
Henriquez, M. (2021, February 2). China has stolen the personal data of 80% of American adults. Security Magazine. https://www.securitymagazine.com/articles/94493-china-has-stolen-the-personal-data-of-80-of-american-adults
High-Level Expert Group on Artificial Intelligence. (2019). Ethics guidelines for trustworthy AI. Retrieved from https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai
Hill, T. (2013). Kantian Autonomy and Contemporary Ideas of Autonomy. In O. Sensen (Ed.), Kant on Moral Autonomy (pp. 15-31). Cambridge: Cambridge University Press.
Horsley, J. (2021). How will China’s privacy law apply to the Chinese state? Brookings. Retrieved from https://www.brookings.edu/articles/how-will-chinas-privacy-law-apply-to-the-chinese-state/
Yao-Huai, L. (2020). Privacy and data privacy issues in contemporary China. In The Ethics of Information Technologies (pp. 189-197). Routledge.
Hurlock, M. H. (1993). Social harmony and individual rights in China.
Jiang, Y. L. (2011). The Mandate of Heaven and The Great Ming Code. Seattle: University of Washington Press.
Jobin, A., Ienca, M., & Vayena, E. (2019). The global landscape of AI ethics guidelines. Nature Machine Intelligence, 1(9), 389-399.
Kagan, J. (2021). Fair Credit Reporting Act (FCRA). Investopedia. Retrieved from https://www.investopedia.com/terms/f/fair-credit-reporting-act-fcra.asp
Kaminski, M. E., & Skinner-Thompson, S. (2020, March 9). Free Speech Isn’t a Free Pass for Privacy Violations. Slate. Retrieved from https://slate.com/technology/2020/03/free-speech-privacy-clearview-ai-maine-isps.html
Kant, I. (2008). Groundwork for the Metaphysics of Morals. Yale University Press.
Kelso, J. C. (1991). California’s Constitutional Right to Privacy. Pepp. L. Rev., 19, 327.
Kersbergen, K. V. (2018). The Welfare State in Europe. OpenMind BBVA. Retrieved from https://www.bbvaopenmind.com/en/articles/the-welfare-state-in-europe/
Kerry, C. (2021, January 11). The oracle at Luxembourg: The EU Court of Justice judges the world on surveillance and privacy. Brookings. Retrieved from https://www.brookings.edu/research/the-oracle-at-luxembourg-the-eu-court-of-justice-judges-the-world-on-surveillance-and-privacy/
Kirk, H. R., Lee, K., & Micallef, C. (2020). The Nuances of Confucianism in Technology Policy: an Inquiry into the Interaction Between Cultural and Political Systems in Chinese Digital Ethics. International Journal of Politics, Culture, and Society, 1-24.
Kloza, D., & Dreschsler, L. (2020, December 9). Proportionality has come to the GDPR. European Law Blog. Retrieved from https://europeanlawblog.eu/2020/12/09/proportionality-has-come-to-the-gdpr/
Kobrin, S. J. (2004). Safe harbours are hard to find: the trans-Atlantic data privacy dispute, territorial jurisdiction and global governance. Review of International Studies, 111-131.
Kuhnle, S., & Sander, A. (2010). The emergence of the western welfare state. In The Oxford Handbook of the Welfare State.
Leetmaa, P., Rennie, H., & Thiry, B. (2009). Household saving rate higher in the EU than in the USA despite lower income. Eurostat Statistics in Focus, 29, 1-11.
Li, C. (2006). The Confucian ideal of harmony. Philosophy East and West, 583-603.
Li, Y. (2018, August 3). Understanding China’s Technological Rise. The Diplomat. Retrieved from https://thediplomat.com/2018/08/understanding-chinas-technological-rise/
Lin, Y. (1975). The Evolution of the Pre-Confucian Meaning of Jen and the Confucian Concept of Moral Autonomy. Monumenta Serica, 31, 172-183.
McCarthy, M. T. (2002). Recent Developments: USA Patriot Act. Harvard Journal on Legislation, 39, 435-453.
Minter, A. (2016, May 17). Why China Doesn’t Care About Privacy. Bloomberg Opinion. Retrieved from https://www.bloomberg.com/opinion/articles/2016-05-17/why-china-doesn-t-care-about-privacy
Mitchell, A., & Diamond, L. (2018). China’s surveillance state should scare everyone. The Atlantic. Retrieved from https://www.theatlantic.com/international/archive/2018/02/china-surveillance/552203/
Movius, L. B., & Krup, N. (2009). US and EU privacy policy: Comparison of regulatory approaches. International Journal of Communication, 3, 19.
ÓhÉigeartaigh, S. S., Whittlestone, J., Liu, Y., Zeng, Y., & Liu, Z. (2020). Overcoming barriers to cross-cultural cooperation in AI ethics and governance. Philosophy & Technology, 33(4), 571-593.
Pernot-Leplay, E. (2020, May). China’s Approach on Data Privacy Law: A Third Way Between the U.S. and the E.U.? Penn State Journal of Law & International Affairs, 8(1), 49.
Perry, E. J. (2008). Chinese conceptions of ‘rights’: from Mencius to Mao—and now. Perspectives on Politics, 6(1), 37-50. https://doi.org/10.1017/S1537592708080055
Pohl, K. H. (2002). Chinese and Western values: reflections on a cross-cultural dialogue on a universal ethics. In Komparative Ethik: Das gute Leben zwischen den Kulturen (pp. 213-232). München: Chora.
Reidenberg, J. R. (2014). The data surveillance state in the United States and Europe. Wake Forest L. Rev., 49, 583.
Roberts, H., Cowls, J., Hine, E., Morley, J., Taddeo, M., Wang, V., & Floridi, L. (2021). China’s artificial intelligence strategy: lessons from the European Union’s ‘ethics-first’ approach. Available at SSRN 3811034.
Rogoff, M. A. (1997). A Comparison of Constitutionalism in France and the United States. Me. L. Rev., 49, 21.
Rustad, M. L., & Koenig, T. H. (2018). Towards a Global Data Privacy Standard. Florida Law Review, 71, 18-16.
Sacks, S., & Laskai, L. (2019, February). China’s Privacy Conundrum. Slate. Retrieved from https://slate.com/technology/2019/02/china-consumer-data-protection-privacy-surveillance.html
Schabas, W. A. (2015). The European Convention on Human Rights: a commentary. Oxford University Press.
Shi, M. L., Sacks, S., Chen, Q. H., & Webster, G. (2019). Translation: China’s personal information security specification. New America.
Shun, K. L. (2004). Conception of the Person in Early Confucian Thought. In K. Shun & D. Wong (Eds.), Confucian Ethics: A Comparative Study of Self, Autonomy, and Community (pp. 183-199). Cambridge: Cambridge University Press.
Singer, N. (2020). The government protects our food and cars. Why not our data? The New York Times. Retrieved from https://www.nytimes.com/2019/11/02/sunday-review/data-protection-privacy.html
Stolba, S. L. (2020, November 30). Credit Card Debt in 2020: Balances Drop for the First Time in Eight Years. Experian. Retrieved from https://www.experian.com/blogs/ask-experian/state-of-credit-cards/
Tam, L. (2018). Why privacy is an alien concept in Chinese culture. South China Morning Post. Retrieved from https://www.scmp.com/news/hong-kong/article/2139946/why-privacy-alien-concept-chinese-culture
The State Council of the People’s Republic of China. (1982). Constitution of the People’s Republic of China. Retrieved from http://english.www.gov.cn/archive/lawsregulations/201911/20/content_WS5ed8856ec6d0b3f0e9499913.html
The United States Department of Justice. (2020). Overview of the Privacy Act of 1974 (2020 Edition). Retrieved from https://www.justice.gov/opcl/overview-privacy-act-1974-2020-edition
Tillman, H. C. (1994). Ch’en Liang on public interest and the law (Vol. 12). University of Hawaii Press.
Toh, M. (2020, December). Tripadvisor’s app, and more than 100 others, have just been blocked in China. CNN Business. Retrieved from https://edition.cnn.com/2020/12/08/tech/tripadvisor-china-apps-intl-hnk/index.html
U.S. Senate Committee on Commerce, Science, and Transportation. (2002). Statement by Senator Ernest F. Hollings. Retrieved from https://www.congress.gov/congressional-report/107th-congress/senate-report/240
Vallor, S. (2015). Moral deskilling and upskilling in a new machine age: Reflections on the ambiguous future of character. Philosophy & Technology, 28(1), 107-124.
Volokh, E. (2000). Freedom of speech and information privacy: The troubling implications of a right to stop people from speaking about you. Stanford Law Review, 1049-1124.
Waxman, O. (2018, May). The GDPR Is Just the Latest Example of Europe’s Caution on Privacy Rights. That Outlook Has a Disturbing History. Time. Retrieved from https://time.com/5290043/nazi-history-eu-data-privacy-gdpr/
Webster, G., Creemers, R., Triolo, P., & Kania, E. (2017). Full Translation: China’s ‘New Generation Artificial Intelligence Development Plan’. New America.
Whittlestone, J., Nyrup, R., Alexandrova, A., Dihal, K., & Cave, S. (2019a). Ethical and societal implications of algorithms, data, and artificial intelligence: a roadmap for research. London: Nuffield Foundation.
Whittlestone, J., Nyrup, R., Alexandrova, A., & Cave, S. (2019b). The role and limits of principles in AI ethics: towards a focus on tensions. In Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, 195-200.
Williams, R. D. (2020, December 1). To enhance data security, federal privacy legislation is just a start. Brookings. Retrieved from https://www.brookings.edu/techstream/to-enhance-data-security-federal-privacy-legislation-is-just-a-start/
Wong, D. (2011). Confucian Political Philosophy. In The Oxford Handbook of the History of Political Philosophy.
Wong, P. H. (2019). Rituals and Machines: A Confucian Response to Technology-Driven Moral Deskilling. Philosophies, 4(4), 59.
Zarrow, P. (2002). The origins of modern Chinese concepts of privacy: Notes on social structure and moral discourse. In Chinese Concepts of Privacy, 21-46.
Zeng, J. (2020). Artificial intelligence and China’s authoritarian governance. International Affairs, 96(6), 1441-1459. https://doi.org/10.1093/ia/iiaa172
Zeng, Y., Lu, E., & Huangfu, C. (2018). Linking artificial intelligence principles. arXiv preprint arXiv:1812.04814.
Zhang, D., Mishra, S., Brynjolfsson, E., Etchemendy, J., Ganguli, D., Grosz, B., Lyons, T., Manyika, J., Niebles, J. C., Sellitto, M., Shoham, Y., Clark, J., & Perrault, R. (2021). The AI Index 2021 Annual Report. arXiv preprint arXiv:2103.06312.
中國信息安全 [China Information Security]. (2019). 發布｜國家新一代人工智能治理專業委員會：《新一代人工智能治理原則——發展負責任的人工智能》(附全文) [Release: National New Generation AI Governance Expert Committee: ‘Governance Principles for the New Generation of Artificial Intelligence — Developing Responsible Artificial Intelligence’ (full text)]. Weixin. https://mp.weixin.qq.com/s/JWRehPFXJJz_mu80hlO2kQ
新華社 [Xinhua News Agency]. (2020, December 24). 中共中央辦公廳國務院辦公廳印發《關於做好2021年元旦春節期間有關工作的通知》 [The General Offices of the CPC Central Committee and the State Council issue the ‘Notice on Work during the 2021 New Year and Spring Festival Period’]. Central People’s Government of the People’s Republic of China. Retrieved from http://www.gov.cn/zhengce/2020-12/24/content_5573063.htm