Comparing Internet Privacy in the European Union and the United States
Most people regard the right to privacy as a fundamental right. But how do we define "privacy"? According to Baase, there are three aspects to privacy: freedom from intrusion, control of information about one's self, and freedom from surveillance.1 Certainly, we cannot expect complete privacy in all of these aspects at all times. However, technological advances are making it increasingly difficult for individuals to determine when they can and cannot expect privacy, and what degree of privacy they can expect.
For example, at one time a personal conversation taking place far from prying ears would have afforded the participants a very high expectation of privacy. This was no longer the case once directional microphones were developed. Similarly, satellites in orbit high overhead can take pictures of places that had previously been considered private (and do so with astounding resolution). GPS-compatible cellular phones can be used to pinpoint the location of the person carrying them.
Computers are another technological advancement that has threatened the privacy of an individual's personal information. In 1977, it was announced that computer matching, which compares previously unrelated files, would be used to reduce welfare abuse.2 Computer matching is now commonplace both within the government and in the private sector. While this type of matching would have been possible without the use of computers (by hand-matching hard-copy files, for example), computers have made it practical, relatively easy, and inexpensive.
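The idea of computer matching can be sketched in a few lines of code: two record sets collected for unrelated purposes are joined on a shared identifier. All names, identifiers, and fields below are hypothetical illustrations, not real data or any agency's actual procedure.

```python
# Two files gathered for unrelated purposes, keyed by a shared
# identifier (here, a hypothetical Social Security number).
payroll = [
    {"ssn": "111-22-3333", "name": "A. Smith", "employer": "Acme Corp"},
    {"ssn": "444-55-6666", "name": "B. Jones", "employer": "Widget Inc"},
]

welfare_rolls = [
    {"ssn": "111-22-3333", "benefit": "food assistance"},
    {"ssn": "777-88-9999", "benefit": "housing assistance"},
]

def match_records(a, b, key):
    """Return the records that appear in both lists, joined on `key`."""
    index = {rec[key]: rec for rec in a}
    return [{**index[rec[key]], **rec} for rec in b if rec[key] in index]

# One hypothetical person appears in both files and is flagged.
flagged = match_records(payroll, welfare_rolls, "ssn")
```

What once required clerks cross-checking paper files by hand is here a dictionary lookup over each record, which is why computers made the practice cheap enough to become routine.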
With the Internet, organizations can transfer data from one point in the world to another easily and almost instantly, further facilitating the practice of computer matching. In addition, with the rise in popularity of the World Wide Web, the Internet has become not only an information exchange medium, but also an information collection medium.
Consider browsing the World Wide Web. Simply by visiting a web page, one has already told the owner of that web site quite a bit of information about one's self. Web browsers routinely send web servers information as part of the hypertext transport protocol (http) request. This information can include things such as the date and time of the visit, the web browser's IP address, the type of web browser and operating system, and the URL of the web page previously visited. In addition, web servers can send "cookies," small files containing identifying information, back to the web browser. In this way, a web server can now uniquely identify repeat visitors to a web site.
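The information leakage described above can be made concrete with a simplified sketch: parsing the headers a browser sends in an HTTP request, and issuing a Set-Cookie header to recognize repeat visitors. The request text and visitor ID below are hypothetical, and a real server would also learn the client's IP address from the connection itself.

```python
# A hypothetical HTTP request, as a browser might send it.
raw_request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "User-Agent: Mozilla/5.0 (Windows NT 10.0) Firefox/115.0\r\n"
    "Referer: http://www.example.com/previous-page.html\r\n"
    "\r\n"
)

def parse_headers(request):
    """Extract the header name/value pairs from a raw HTTP request."""
    headers = {}
    for line in request.split("\r\n")[1:]:
        if not line:          # blank line ends the header section
            break
        name, _, value = line.partition(": ")
        headers[name] = value
    return headers

def respond(headers, next_id=1000):
    """If the browser sent no cookie, assign it a new visitor ID."""
    if "Cookie" in headers:
        return None           # repeat visitor, already identified
    return f"Set-Cookie: visitor_id={next_id}"

headers = parse_headers(raw_request)
# The server now knows the browser type and the previously visited URL,
# and a first-time visitor is handed an identifying cookie.
cookie = respond(headers)
```

On the visitor's next request, the browser returns the cookie automatically, which is how a site links separate visits to one person without the person doing anything at all.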
These are examples of "invisible information gathering": the collection of personal information without a person's knowledge.3 This is of particular concern because it undermines the very expectation of privacy; a person may not even be aware that personal information about him or her is being gathered.
Another way in which computers and the Internet have encroached upon personal privacy is through the secondary use of data. This is the use of data for a purpose other than the one for which it was originally supplied.4 For example, a user may trust online company XYZ and supply it with the personal information necessary to complete a transaction. However, unknown to the user, company XYZ may routinely sell the personal information it gathers to other online services, including some that the user would not have trusted with this information in the first place. While online services that share and/or sell personal information to third parties should (and often do) declare this in their posted privacy policies, few users actually take the time to read these policies or realize their implications.
The European Union's (EU) approach to personal information privacy on the Internet
The European Union was concerned enough about the privacy of personal information on the Internet to create a directive governing Internet privacy. This directive, adopted on October 24, 1995, is officially called "Directive 95/46/EC of the European Parliament and of the Council on the Protection of Individuals With Regard to the Processing of Personal Data and on the Free Movement of Such Data of 24 October 1995." From this point on, I will simply refer to this as the "EU Directive."
The objective of the EU Directive is clearly stated in its first article: to "protect the fundamental rights and freedoms of natural persons, and in particular their right to privacy with respect to the processing of personal data."5 The EU Directive seeks to accomplish this goal by setting forth a number of rules and principles, several of which are described in the paragraphs that follow.
With respect to data quality, the data itself must be gathered lawfully, and must be collected for a specific, legitimate purpose. The data cannot later be processed in a way that is inconsistent with that original purpose (i.e., no secondary use of data). Furthermore, the data collected should be the minimum necessary to fulfill the stated purpose. Finally, reasonable steps must be taken to ensure that the data is accurate and up-to-date. Data that is inaccurate, obsolete, or is no longer needed is to be deleted.6
The EU Directive also includes criteria for determining if data processing is fulfilling a legitimate purpose. In general, data can only be processed if the data subject (i.e., the person whom the data is about) has given their unambiguous consent, or if the processing is necessary to fulfill a contract that the data subject has entered (or is going to enter). In addition, data can be processed if it is legally necessary to protect the vital interests of the data subject, is necessary for a task being performed in the public interest, or is being done by official authorities in the performance of their tasks.7
Certain personal information falls into what the EU Directive calls "special categories of data." This includes information about the data subject's racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and health or sex life.8 These special categories of data are subject to even stricter rules, and cannot be processed unless certain specific and restrictive conditions apply (e.g., a non-profit religious organization can process information about the religious beliefs of its members).
Data subjects are afforded certain rights under the EU Directive. A subject must be told who is collecting the personal information, their purpose for collecting the information, whether providing the information is voluntary or mandatory, and (if mandatory) what the consequences are for not providing the requested information. The data subject also has the right to see the information that has been gathered about him or her, the right to have incorrect data corrected or erased, and the right to know if the information may be given to a third party.9
From a global point of view, the most significant rule in the EU Directive is the one that prohibits the transfer of personal information to a third country (i.e., a non-EU member nation) unless that country "ensures an adequate level of protection."10 This means that companies doing business in the EU and in other countries cannot simply transfer personal information between facilities. This will be discussed further in the section on global implications.
The United States' approach to personal information privacy on the Internet
The United States has taken a much different approach with regard to the processing of personal data. There is no single law governing the protection of personal information. Instead, there is a "patchwork of federal, state, constitutional, statutory, and case law."11 So, while there are specific laws concerning medical information, video rentals, and driver license information, there is no underlying framework covering all types of personal information.12 These isolated examples of legislation are typically responses to issues that have caught public attention, resulting in protection for some personal information whose sensitivity is questionable, while more highly sensitive personal information has little or no legal protection.13
The position of the United States is that self-regulation, backed up by the Federal Trade Commission, is sufficient and the most effective way to achieve better protection for personal information.14 This position is due in part to the results of a 1996 government task force that concluded legislation was not appropriate, that the private sector should take the lead, and that "competition and consumer choice will shape the marketplace."15
Furthermore, the United States is opposed to giving data subjects access to their personal information, stating that this would be too expensive for U.S. companies, industry, and the U.S. government.16
The global implications
As stated previously, the rule in the EU Directive prohibiting the transfer of personal information to third countries that do not have an adequate level of protection is the most significant in terms of its global implications. This can cause serious problems for companies that do business within the EU, but process the personal information of EU citizens in a third country. In the worst case, the EU could deny a U.S. company the right to conduct business in Europe, if the transfer of personal information is involved.
Fortunately, this situation has largely been avoided. In July 2000, an understanding was reached and the EU approved the "safe harbor" framework. Under the framework, organizations wishing to enter the safe harbor certify and publicly declare that they will comply with seven principles, which are considered "adequate" privacy protection under the EU Directive.17 The U.S. Department of Commerce maintains a list of those organizations that have entered the safe harbor on their web site.
While the safe harbor principles themselves are not laws in the U.S. (as they are in the EU, as a result of the EU Directive), the U.S. does have federal and state laws governing deceptive and unfair practices. So if an organization enters the safe harbor but then does not comply with its principles (after publicly stating that it would), it can be prosecuted under these deceptive and unfair practices statutes. Thus the same ends are achieved using different means. One significant distinction is that entrance into the safe harbor is voluntary for U.S. organizations, while compliance with the EU Directive is mandatory for EU organizations.
Even after passage of the safe harbor agreement, some U.S. companies complained that the privacy requirements "make it time-consuming, expensive, and burdensome for a U.S. company to store business data on European citizens." These companies were looking for the EU to make changes to the EU Directive: putting less emphasis on individuals' privacy and easing the free flow of information, possibly through self-regulation instead of the existing legislation.18
However, in spite of these complaints, the EU Directive has remained in force, and the list of organizations that have entered the safe harbor has grown (as of April 29, 2004, 492 organizations are listed). In addition, independent third-party programs called "Seal Programs" have sprung up. These Seal Programs give companies subscribing to certain privacy policies a seal of approval, which the company can then display on its web site.19 Two of the more familiar Seal Programs are TRUSTe and BBBOnLine, both of which offer variants of their seal that indicate compliance with the safe harbor agreement, and are thus adequate under the EU Directive.20, 21 Consequently, despite opposition by some U.S. corporations, the EU Directive seems to be growing in acceptance, albeit reluctantly in some cases.
An Ethical viewpoint
In the preceding sections, we have discussed the different approaches the EU and the U.S. have taken in protecting the privacy of personal information on the Internet. We have also discussed the global implications of these different approaches in a world where the Internet makes the transfer of personal information a relatively fast and inexpensive proposition. But how do we determine which approach is "right"? To do this, and to take a moral position on this topic, it is necessary to evaluate the approaches taken by the EU and the U.S. against three ethical theories: deontological, utilitarian, and natural rights.
The Deontological perspective
The deontological perspective emphasizes absolute rules that can be applied universally to everyone.22 To me, this viewpoint seems to match the EU's position on personal privacy. The EU Directive defines the rules, and they apply to all EU member states. Furthermore, I feel the EU Directive has a sense of "universalism" to it in that it prohibits the transfer of personal information to third countries whose protection levels are inadequate. While this could stem from a desire to isolate European citizens and businesses from the rest of the world, the more likely conclusion is that the EU Directive is aimed at compelling other countries to adopt similar laws that protect the privacy and security of personal information. If this were the case, it would certainly be in keeping with deontological theory. As more and more countries seek to transfer personal information to and from EU countries, EU Directive-like laws or safe harbors (judged to be adequate by the EU) would spread throughout the world until they did indeed become universal.
On the other hand, the U.S. approach appears to be the antithesis of the deontological theory. By not having a single underlying framework of laws governing the privacy of personal information on the Internet, individual data collectors are free to establish their own privacy policies and regulate themselves. While most reputable companies seeking to do a continuing business will most likely have adequate policies and data protection, the environment is open to those who would use personal information inappropriately. Granted, the U.S. does have some laws governing the processing of personal data, but these are few and far between, and are much more the exception than the rule. Thus the presence of these laws is not inconsistent with my assessment that the U.S. does not have a deontological outlook.
The Consequentialist/Utilitarianism perspective
The main principle of utilitarianism is to increase overall happiness, or "utility." An action is right if it increases overall utility, wrong if it decreases it. There are two variants of utilitarianism: act-utilitarianism and rule-utilitarianism. In act-utilitarianism each individual action is considered and judged by its impact. In rule-utilitarianism, the principle of utility is not applied to individual actions, but to more general ethical rules.23
One major objection to (and identifying characteristic of) act-utilitarianism is that it does not respect individual rights.24 By this statement alone, one can argue that the EU Directive is clearly not act-utilitarian, since the EU Directive puts a great deal of emphasis on the rights of individuals. However, there are some parts of the EU Directive that prefer public interests to individual rights. For example, the EU Directive's criteria for legitimate data processing include "processing when it is necessary for the performance of a task carried out in the public interest."25
At this point, the observation can be made that laws do not always fall into a single ethical perspective; they may contain aspects of two or more. However in the case of the EU Directive, there is clearly a preference for individual rights, and thus I feel confident in my assessment that the EU Directive is primarily deontological - not utilitarian - in nature.
The approach of the U.S. seems more consistent with a utilitarian viewpoint. According to Roger Clarke, the U.S. government has "heeded to demands of business, and resisted calls from the public for effective privacy laws. The stance has been based on the presumptions that economic efficiency is the greater good..."26 The key phrase here is "greater good", which strongly indicates an underlying utilitarian philosophy. This is reinforced by the U.S. government's desire for electronic commerce to develop fully, and for the Internet to remain a non-regulatory medium.27
Again though, we see a mix of ethical perspectives. Even though the U.S. clearly takes a utilitarian stance on the privacy of personal information on the Internet, there are "pockets" of legislation that do protect personal information in a very deontological way (again, we note the legal protection afforded video rental records, for example).
The Natural Rights perspective
From a natural rights perspective, ethical behavior is behavior that respects a set of fundamental rights of others (such as life, liberty, and property). The natural rights approach lets people use their own best judgment, make their own decisions, and act freely, without interference by others. Interactions between people are generally ethical if they are voluntary and freely made, and no coercion or deception is involved.28 In this situation, the natural right under discussion is a person's right to privacy.
Although the EU acknowledges an individual's right to privacy as a fundamental right,29 and the EU Directive is aimed at protecting this right, this is where any similarity with the natural rights approach ends. Since the natural rights approach encourages free action without interference, it appears obvious that the EU Directive does not take a natural rights approach. The EU Directive does interfere with interactions between people by imposing rules and regulations on them. The fact that the EU Directive is attempting to do "what is best for the people involved, or for humanity in general"30 is irrelevant; it is still viewed as interference and coercion from a natural rights perspective.
The U.S. approach to the privacy of personal information is much closer to a natural rights approach. Self-regulation lets online services that are processing personal data establish their own rules, and data subjects are free to interact with online services as they wish. The underlying assumption in the natural rights approach is that when a data subject voluntarily interacts with an online service, there is no deception or coercion taking place. Unfortunately, this is not always the case (this will be discussed further in my viewpoint on this matter).
Once more, we see a combination of ethical viewpoints. The U.S. does take a natural rights standpoint: data subjects and online services will trust each other, thus preserving privacy. Those online services that do not respect the data subject's privacy will be undeserving of trust, and consumers will simply choose other, more reputable, online services to do business with. Thus, the online marketplace will evolve through consumer choice. The utilitarian perspective is demonstrated by the "greater good" this approach serves, without specifically addressing individual rights.
My viewpoint on the matter
Like the European Union and the United States, I view personal privacy (and the privacy of one's personal information - on the Internet and elsewhere) to be a fundamental right. However, I believe the natural rights perspective - upon which the U.S. approach is partially based - is fundamentally flawed where the Internet is concerned, and the correct moral approach is for the U.S. to adopt a set of comprehensive laws governing the privacy of personal information on the Internet, similar to those in the EU Directive. My belief is based on the following arguments:
In a natural rights scenario, two people choose to interact because they trust each other to respect the other's natural rights. This trust can be based on many things, including past experiences, recommendations from others, personal impressions, and reputations. Each person is free to use their own best judgment to determine whom to trust.
In a "traditional" setting, where the two people interacting can "see" each other (or where a person goes to a "brick and mortar" business), I believe there is a greater incentive for the parties involved to respect each other's rights. This is because the consequences for not respecting someone's rights cannot be dismissed lightly. Poor past experiences, bad reputations, and negative reviews from customers will all factor into peoples' future decisions on whom they trust. A person or business that is not trusted to respect the rights of others will soon find that they can no longer interact with others.
This distrust frequently follows a person or business, even after they move or change names. For example, after the 1996 crash of a ValuJet flight into the Everglades, the airline changed its name to AirTran Airways in an attempt to distance itself from the memory of the crash. Over a year after the crash, the airline was still trying to return to profitability, and the company chairman stated "We assume that we're going to be the most scrutinized airline forever."31
However, on the Internet it is relatively easy for a person or company to "change identities" without leaving a trail of any sort to connect them to a previously distrusted entity. Consider the ease with which a person can get a new email address, or a new login name on eBay, or create a new profile or account with an online service. Also consider the ease and relatively low cost of registering a domain name, setting up a web site, and starting a new online business.
Thus, the very anonymity many people praise as one of the virtues of the Internet is also its fundamental flaw in trying to apply the natural rights approach. The decision whether or not to interact with an online service is based in part on past experiences and reputation. But if a previously distrusted person or online service can simply reappear in a new guise (without any connection to their former identity), then past experiences and reputations are meaningless. Every time one interacts with a new person or service, one is at risk. When the fundamental right in question is the privacy of one's personal information, then it is one's privacy that is in jeopardy.
I also believe that a form of "mutually assured destruction" is part of the natural rights perspective. That is to say, we might not necessarily know or trust the individual we're interacting with, but we believe he'll do the right thing because if he doesn't he knows we can do something bad to him in return.
In terms of data subjects and online services, I believe that an honest data subject is at a much greater risk from a deceptive online service than an honest online service is from a deceptive data subject. What does an individual stand to lose if the online service they interact with is dishonest? The privacy of their personal information. What does an online service stand to lose if the data subject they interact with is dishonest? Perhaps some skewed data in their data collection efforts. Certainly, the data subject has more at stake, so there really isn't the deterrent of "mutually assured destruction" to keep online services honest.
Since our trust is in part based on past experiences, and since the U.S. approach also involves no overall framework or legislation for the protection of personal information on the Internet, a major concern of mine is that the reputable companies that adequately protect personal information and privacy will lull unsuspecting users into a false sense of security.
I also feel that the different approaches of the U.S. and the EU may put non-EU citizens at greater risk. As an analogy, consider the Canadian government's regulation of prescription drug prices. Since prices are regulated in Canada, drug manufacturers must sell prescription drugs at higher prices in other countries, such as the U.S. (drug prices in Canada can be from 20% to 80% cheaper than prices in the U.S.32). It is entirely possible that online services that cannot process the personal information of EU citizens will intensify the processing of personal information of non-EU citizens, thus putting their privacy more at risk.
For these reasons, I feel the U.S. approach, which takes a natural rights and utilitarian viewpoint, is fundamentally flawed and morally wrong, and that the deontological approach of the European Union is the more appropriate for the Internet. The laws, and the penalties for breaking those laws, provide the motivation for online services to remain honest and protect and respect the privacy of data subjects. Since the privacy laws are consistent and applied universally (within the EU, at least), data subjects can make valid assumptions about the level of protection their personal information is afforded, and no group of data subjects is at greater risk than any other.
While the tactics of the EU Directive's rule concerning the transfer of information to a third country might seem heavy-handed, it may yet achieve universal acceptance for the Directive. As more and more U.S. companies seek to do business in Europe, they will have to certify under the safe harbor framework, thus effectively putting them in compliance with the EU Directive. Once they make this certification, any violation of the safe harbor principles will have the consequence of putting them in violation of federal and/or state unfair and deceptive practices laws.
No doubt, there will still be online services - both in the U.S. and in other non-EU countries - that do not subscribe to the safe harbor principles, and data subjects will have to be educated on this matter so they can make informed decisions about which services they interact with. Thus, while the U.S. may argue that its approach has worked, and that consumer choice and self-regulation have shaped the online marketplace, it has still done a disservice to its citizens by making data subjects primarily responsible for the protection of their own private information, and by keeping them exposed to misuse of their personal information longer than necessary while the marketplace in the U.S. catches up to that in the EU.
1: Baase, Sara. A Gift of Fire (second edition). New Jersey: Prentice Hall, 2003, p. 36.
2: Kusserow, Richard and Shattuck, John. "Computer Matching: Should it be Banned?" Communications of the ACM. June 1984, 27:6, p. 537.
3: Baase, p. 38.
4: Baase, p. 40.
5: Directive 95/46/EC of the European Parliament and of the Council on the Protection of Individuals With Regard to the Processing of Personal Data and on the Free Movement of Such Data of 24 October 1995 (hereafter referred to as the "EU Directive").
6: EU Directive, article 6.
7: EU Directive, article 7.
8: EU Directive, article 8.
9: EU Directive, articles 10 - 12.
10: EU Directive, article 25.
11: Klosek, Jacqueline. Data Privacy in the Information Age. Connecticut: Quorum Books, 2000, p. 130.
12: Baase, p. 85.
13: Clarke, Roger. "Internet privacy concerns confirm the case for intervention," Communications of the ACM. June 1999, 42:2, pp. 60-67.
14: Klosek, p. 20.
15: Klosek, p. 131.
16: CNET: U.S., EU Privacy talks hit stalemate. CNet News, May 20, 1999. Available at http://news.com.com/2100-1023-226153.html?legacy=cnet (visited on April 27, 2004).
17: Safe Harbor Overview. U.S. Department of Commerce Export Portal, February 3, 2003. Available at http://www.export.gov/safeharbor/sh_overview.html (visited on April 28, 2004).
18: U.S. tech protests EU privacy laws. ZDNet, September 30, 2002. Available at http://zdnet.com.com/2100-1106-960134.html (visited on April 20, 2004).
19: Klosek, p. 182.
20: EU Safe Harbor Program. TRUSTe, March 22, 2004. Available at http://www.truste.org/programs/pub_harbor.html (visited on April 29, 2004).
21: European Union/US Safe Harbor Compliance. BBBOnLine, 2003. Available at http://www.bbbonline.org/privacy/eu.asp (visited on April 29, 2004).
22: Baase, p. 405.
23: Baase, p. 406.
24: Baase, p. 406.
25: EU Directive, article 7.
26: Clarke, pp. 60-67.
27: Klosek, p. 131.
28: Baase, p. 407.
29: EU Directive, article 1.
30: Baase, p. 407.
31: The ValuJet Crash: Its lasting legacy -- The Airline. CNN Interactive, 1997. Available at http://www.cnn.com/US/9705/valujet/the.airline/ (visited on April 28, 2004).
32: GlaxoSmithKline stops selling drugs to Canadian pharmacies. CNN.com/Health, January 21, 2003. Available at http://www.cnn.com/2003/HEALTH/01/21/canadian.drug.sales/index.html (visited on April 28, 2004).