Personal data

Personal data, also known as personal information or personally identifiable information (PII),[1][2][3] is any information related to an identifiable person.

The abbreviation PII is widely accepted in the United States, but the phrase it abbreviates has four common variants based on personal or personally, and identifiable or identifying. Not all are equivalent, and for legal purposes the effective definitions vary depending on the jurisdiction and the purposes for which the term is being used.[note 1] Under European and other data protection regimes, which centre primarily on the General Data Protection Regulation (GDPR), the term "personal data" is significantly broader, and determines the scope of the regulatory regime.[4]

National Institute of Standards and Technology Special Publication 800-122[5] defines personally identifiable information as "any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual's identity, such as name, social security number, date and place of birth, mother's maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information." For instance, a user's IP address is not classed as PII on its own, but is classified as linked PII.[6]

Personal data is defined under the GDPR as "any information which [is] related to an identified or identifiable natural person".[7][5] The IP address of an Internet subscriber may be classed as personal data.[8]
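
Because identifiability depends on context, services that log IP addresses sometimes reduce their precision before storage. The following is a minimal, hypothetical sketch of one such approach (zeroing the host portion of the address); it is offered only as an illustration of treating IP addresses as potentially personal data, not as a statement of what the GDPR requires.

    import ipaddress

    def truncate_ip(ip_string: str) -> str:
        """Zero out the host portion of an address before it is stored or logged."""
        ip = ipaddress.ip_address(ip_string)
        # Keep a /24 prefix for IPv4 and a /48 prefix for IPv6 (illustrative choices).
        prefix = 24 if ip.version == 4 else 48
        network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
        return str(network.network_address)

    print(truncate_ip("203.0.113.42"))  # -> "203.0.113.0"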

The concept of PII has become prevalent as information technology and the Internet have made it easier to collect PII, leading to a profitable market in collecting and reselling it. PII can also be exploited by criminals to stalk or steal a person's identity, or to aid in planning criminal acts. In response to these threats, many website privacy policies specifically address the gathering of PII,[9] and legislatures such as the European Parliament have enacted legislation such as the General Data Protection Regulation (GDPR) to limit the distribution and accessibility of PII.[10]

Considerable confusion arises around whether PII means information which is identifiable (that is, can be associated with a person) or identifying (that is, associated uniquely with a person, such that the PII identifies them). In prescriptive data privacy regimes such as HIPAA, PII items have been specifically defined. In broader data protection regimes such as the GDPR, personal data is defined in a non-prescriptive, principles-based way. Information that might not count as PII under HIPAA can be personal data for the purposes of the GDPR. For this reason, the term "PII" is typically deprecated internationally.

Definitions

The U.S. government used the term "personally identifiable" in 2007 in a memorandum from the Executive Office of the President, Office of Management and Budget (OMB),[11] and that usage now appears in US standards such as the NIST Guide to Protecting the Confidentiality of Personally Identifiable Information (SP 800-122).[12] The OMB memorandum defines PII as follows:

Information which can be used to distinguish or trace an individual's identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother’s maiden name, etc.

A term similar to PII, "personal data", is defined in EU Directive 95/46/EC for the purposes of that directive:[13]

Article 2a: 'personal data' shall mean any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;

Under the EU rules, there has been a more specific notion that the data subject can potentially be identified through additional processing of other attributes (quasi- or pseudo-identifiers). In the GDPR, personal data is defined as:

Any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person[14]

A simple example of this distinction: the color name "red" by itself is not personal data, but that same value stored as part of a person's record as their "favorite color" is personal data; it is the connection to the person that makes it personal data, not (as in PII) the value itself.
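
A minimal sketch of the same idea (hypothetical field names): the bare value is not personal data, but the identical value linked to an identified person within a record is.

    # Hypothetical record; the field names are illustrative only.
    bare_value = "red"                # a color name by itself: not personal data

    customer_record = {
        "name": "Jane Doe",           # identifies a natural person
        "favorite_color": "red",      # the same value, now relating to Jane: personal data
    }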

Another term similar to PII, "personal information", is defined in a section of the California data breach notification law, SB1386:[15]

(e) For purposes of this section, "personal information" means an individual's first name or first initial and last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted: (1) Social security number. (2) Driver's license number or California Identification Card number. (3) Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual's financial account. (f) For purposes of this section, "personal information" does not include publicly available information that is lawfully made available to the general public from federal, state, or local government records.

The concept of information combination given in the SB1386 definition is key to correctly distinguishing PII, as defined by OMB, from "personal information", as defined by SB1386. Information, such as a name, that lacks context cannot be said to be SB1386 "personal information", but it must be said to be PII as defined by OMB. For example, the name John Smith has no meaning in the current context and is therefore not SB1386 "personal information", but it is PII. A Social Security Number (SSN) without a name or some other associated identity or context information is not SB1386 "personal information", but it is PII. For example, the SSN 078-05-1120 by itself is PII, but it is not SB1386 "personal information". However the combination of a valid name with the correct SSN is SB1386 "personal information".[15]
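
The combination rule can be made concrete with a short sketch (hypothetical field names and a simplified reading of the statutory text quoted above): a record counts as SB1386 "personal information" only when an unencrypted name appears together with at least one of the listed data elements.

    # Hypothetical simplification of the SB1386 combination rule; not the statute itself.
    def is_sb1386_personal_information(record: dict) -> bool:
        if record.get("encrypted"):   # the definition covers unencrypted data
            return False
        has_name = bool(
            (record.get("first_name") or record.get("first_initial"))
            and record.get("last_name")
        )
        data_elements = [
            record.get("ssn"),
            record.get("drivers_license_number"),
            # an account number counts only together with a code permitting access
            record.get("account_number") and record.get("security_code"),
        ]
        return has_name and any(data_elements)

    name_only = {"first_name": "John", "last_name": "Smith"}
    ssn_only = {"ssn": "078-05-1120"}
    combined = {**name_only, **ssn_only}

    # A bare name or a bare SSN is PII under OMB's definition, but not SB1386
    # "personal information"; the combination of the two is.
    print(is_sb1386_personal_information(name_only))  # False
    print(is_sb1386_personal_information(ssn_only))   # False
    print(is_sb1386_personal_information(combined))   # True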

The combination of a name with a context may also be considered PII; for example, if a person's name is on a list of patients for an HIV clinic. However, it is not necessary for the name to be combined with a context in order for it to be PII. The reason for this distinction is that bits of information such as names, although they may not be sufficient by themselves to make an identification, may later be combined with other information to identify persons and expose them to harm.

According to the OMB, it is not always the case that PII is "sensitive", and context may be taken into account in deciding whether certain PII is or is not sensitive.[11]

When a person wishes to remain anonymous, descriptions of them will often employ several of the above, such as "a 34-year-old white male who works at Target". Note that information can still be private, in the sense that a person may not wish for it to become publicly known, without being personally identifiable. Moreover, sometimes multiple pieces of information, none sufficient by itself to uniquely identify an individual, may uniquely identify a person when combined; this is one reason that multiple pieces of evidence are usually presented at criminal trials. It has been shown that, in 1990, 87% of the population of the United States could be uniquely identified by gender, ZIP code, and full date of birth.[16]
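
The combination effect can be illustrated with a short sketch over synthetic records (hypothetical field names): no single attribute identifies anyone, yet counting how many records share a given (gender, ZIP code, date of birth) triple shows how such combinations can single individuals out, which is the effect behind the 87% figure above.

    from collections import Counter

    # Synthetic, illustrative records: each field on its own is shared by many people.
    records = [
        {"gender": "F", "zip": "02139", "birth_date": "1965-07-01"},
        {"gender": "M", "zip": "02139", "birth_date": "1990-03-15"},
        {"gender": "M", "zip": "02139", "birth_date": "1990-03-15"},
        {"gender": "F", "zip": "90210", "birth_date": "1980-12-24"},
    ]

    def quasi_identifier(record):
        return (record["gender"], record["zip"], record["birth_date"])

    counts = Counter(quasi_identifier(r) for r in records)
    unique = [r for r in records if counts[quasi_identifier(r)] == 1]
    print(f"{len(unique)} of {len(records)} records are unique on the combination")
    # -> "2 of 4 records are unique on the combination"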

In hacker and Internet slang, the practice of finding and releasing such information is called "doxing".[17][18] It is sometimes used to deter collaboration with law enforcement.[19] On occasion, the doxing can trigger an arrest, particularly if law enforcement agencies suspect that the "doxed" individual may panic and disappear.[20]

Laws and standards

Australia

In Australia, the Privacy Act 1988 deals with the protection of individual privacy, using the OECD Privacy Principles from the 1980s to set up a broad, principles-based regulatory model (unlike in the US, where coverage is generally not based on broad principles but on specific technologies, business practices or data items). Section 6 has the relevant definition.[21] The critical detail is that the definition of 'personal information' also applies to where the individual can be indirectly identified:

"personal information" means information or an opinion about an identified individual, or an individual who is reasonably identifiable whether the information or opinion is true or not; and whether the information or opinion is recorded in a material form or not. [emphasis added]

It appears that this definition is significantly broader than the Californian example given above, and thus that Australian privacy law may cover a broader category of data and information than in some US law.

In particular, online behavioral advertising businesses based in the US that surreptitiously collect information from people in other countries in the form of cookies, bugs, trackers and the like may find that their preferred rubric of 'we don't collect personal information', used to avoid the implications of building a psychographic profile of a particular person, does not hold under a broader definition like that in the Australian Privacy Act.

The term "PII" is not used in Australian privacy law.

Canada

  • The Privacy Act governs federal government agencies
  • Ontario Freedom of Information and Protection of Privacy Act and similar Provincial legislation governs Provincial Government agencies
  • Personal Information Protection and Electronic Documents Act governs private corporations, unless there is equivalent Provincial legislation
  • Ontario Personal Health Information Protection Act and other similar Provincial legislation governs health information

European Union

European data protection law does not utilize the concept of personally identifiable information; its scope is instead determined by the broader, non-synonymous concept of "personal data".

  • Article 8 of the European Convention on Human Rights
  • The General Data Protection Regulation, adopted in April 2016 and effective from 25 May 2018
    • supersedes the Data Protection Directive – 95/46/EC
  • Directive 2002/58/EC (the E-Privacy Directive)
  • Directive 2006/24/EC Article 5 (The Data Retention Directive)

Further examples can be found on the EU privacy website.[22]

United Kingdom

  • The UK (Data Protection Act 2018)[23]
  • The UK Data Protection Act 1998 – superseded by the UK Data Protection Act 2018
  • General Data Protection Regulation (Europe, 2016)
  • Article 8 of the European Convention on Human Rights
  • The UK Regulation of Investigatory Powers Act 2000
  • Employers' Data Protection Code of Practice
  • Model Contracts for Data Exports
  • The Privacy and Electronic Communications (EC Directive) Regulations 2003
  • The UK Interception of Communications (Lawful Business Practice) Regulations 2000
  • The UK Anti-Terrorism, Crime and Security Act 2001

New Zealand

The twelve Information Privacy Principles of the Privacy Act 1993 apply.

Switzerland

The Federal Act on Data Protection of 19 June 1992 (in force since 1993) protects privacy by prohibiting virtually any processing of personal data which is not expressly authorized by the data subjects.[24] The protection is subject to the authority of the Federal Data Protection and Information Commissioner.[24]

Additionally, any person may request in writing that a company (managing data files) correct or delete any personal data concerning them.[25] The company must respond within thirty days.[25]

United States

The Privacy Act of 1974 (Pub.L. 93–579, 88 Stat. 1896, enacted 31 December 1974, 5 U.S.C. § 552a), a United States federal law, establishes a Code of Fair Information Practice that governs the collection, maintenance, use, and dissemination of personally identifiable information about individuals that is maintained in systems of records by federal agencies.[26]

One of the primary focuses of the Health Insurance Portability and Accountability Act (HIPAA) is to protect a patient's Protected Health Information (PHI), which is similar to PII. The U.S. Senate proposed the Privacy Act of 2005, which attempted to strictly limit the display, purchase, or sale of PII without the person's consent. Similarly, the (proposed) Anti-Phishing Act of 2005 attempted to prevent the acquisition of PII through phishing.

U.S. lawmakers have paid special attention to the social security number because it can be easily used to commit identity theft. The (proposed) Social Security Number Protection Act of 2005 and (proposed) Identity Theft Prevention Act of 2005 each sought to limit the distribution of an individual's social security number.

Additional U.S.-specific categories of personally identifiable information[27] include, but are not limited to, I-94 records, Medicaid ID numbers, and Internal Revenue Service (IRS) documentation. The existence of categories of personally identifiable information specific to the U.S. highlights national data security concerns[28] and the role of personally identifiable information in U.S. federal data management systems.

State laws and significant court rulings

  • California
    • The California state constitution declares privacy an inalienable right in Article 1, Section 1.
    • California Online Privacy Protection Act (OPPA) of 2003
    • SB 1386 requires organizations to notify individuals when PII (in combination with one or more additional, specific data elements) is known or believed to be acquired by an unauthorized person.
    • In 2011, the California Supreme Court ruled that a person's ZIP code is PII.[29]
  • Nevada
    • Nevada Revised Statutes 603A-Security of Personal Information[30]
  • Massachusetts
    • 201 CMR 17.00: Standards for The Protection of Personal Information of Residents of the Commonwealth[31]
    • In 2013, the Massachusetts Supreme Judicial Court ruled that ZIP codes are PII.[32]

Federal law

NIST definition

The National Institute of Standards and Technology (NIST) is a physical sciences laboratory, and a non-regulatory agency of the United States Department of Commerce. Its mission is to promote innovation and industrial competitiveness.

Data often used for the express purpose of distinguishing individual identity, such as full name, social security number, date and place of birth, mother's maiden name, and biometric records, clearly classify as personally identifiable information under the definition used by NIST.[12]

Attributes that are less often used to distinguish individual identity, because they are traits shared by many people, such as age, gender, race, or ZIP code, are nonetheless potentially PII, because they may be combined with other personal information to identify an individual.

Forensics

In forensics, particularly the identification and prosecution of criminals, personally identifiable information is critical in establishing evidence in criminal procedure. Criminals may go to great trouble to avoid leaving any PII, such as by:

  • Wearing masks, sunglasses, or clothing to obscure or completely hide distinguishing features, such as eye, skin, and hair colour, facial features, and personal marks such as tattoos, birthmarks, moles and scars.
  • Wearing gloves to conceal fingerprints, which themselves are PII. However, gloves can also leave prints that are just as unique as human fingerprints. After collecting glove prints, law enforcement can then match them to gloves that they have collected as evidence.[34] In many jurisdictions the act of wearing gloves itself while committing a crime can be prosecuted as an inchoate offense.[35]
  • Avoiding writing anything in their own handwriting.[36]
  • Masking their internet presence with methods such as using a proxy server to appear to be connecting from an IP address unassociated with oneself.

Personal safety

Personal data is a key component of online identity and can be exploited by malicious individuals. For instance, data can be altered and used to create fake documents, hijack mailboxes and phone calls, or harass people, as happened in the data breach at EE Limited.[37]

Another key case is financial identity theft,[38] which usually involves bank account and credit card information being stolen and then used or sold.[39]

Personal data can also be used to create fake online identities, including fake accounts and profiles (sometimes referred to as identity cloning[40] or identity fraud) impersonating celebrities in order to gather data from other users more easily.[41] Private individuals can also be impersonated, usually for personal motives (more widely known as sockpuppetry).

The most critical information, such as passwords, dates of birth, ID documents or Social Insurance Numbers, can be used to log in to different websites (see password reuse and account verification) to gather more information and access more content.

Also, several agencies ask for discretion on subjects related to their work, for the safety of their employees. For this reason, the United States Department of Defense (DoD) has strict policies controlling the release of personally identifiable information of DoD personnel.[42] Many intelligence agencies have similar policies, sometimes to the point where employees do not disclose to their friends that they work for the agency.

Similar identity protection concerns exist for witness protection programs, women's shelters, and victims of domestic violence and other threats.[43]

Trade of personal data

During the second half of the 20th century, the digital revolution introduced "privacy economics", or the trade of personal data. The value of data can change over time and over different contexts. Disclosing data can reverse information asymmetry, though the costs of doing so can be unclear. In relation to companies, consumers often have "imperfect information regarding when their data is collected, with what purposes, and with what consequences."[44]

Writing in 2015, Alessandro Acquisti, Curtis Taylor and Liad Wagman identified three "waves" in the trade of personal data:

  1. In the 1970s, the Chicago School of economics claimed that the protection of privacy could have a negative impact on the market, because it could lead to incorrect and non-optimal decisions. Other researchers, such as Andrew F. Daughety and Jennifer F. Reinganum, suggested that the opposite was true, and that an absence of privacy would likewise lead to such decisions.[45]
  2. In the mid-1990s, Varian revisited the Chicago School approach and added a new externality, stating that consumers would not always have perfect information on how their own data would be used.[46] Kenneth C. Laudon developed a model in which individuals own their data and can sell it as a product. He believed that such a system should not be regulated, in order to create a free market.[47]
  3. In the 2000s, researchers worked on price discrimination (Taylor, 2004[48]), two-sided markets (Cornière, 2011[49]) and marketing strategies (Anderson and de Palma, 2012[50]). The theories became complex, and showed that the impact of privacy on the economy highly depended on the context.

See also

  • Anonymity
  • Bundesdatenschutzgesetz
  • De-identification
  • General Data Protection Regulation
  • Non-personal Data
  • Personal identifier
  • Personal identity
  • Personal Information Agent
  • Protected health information
  • Privacy
  • Privacy law
  • Privacy laws of the United States
  • Pseudonymity
  • Obfuscation
  • Self-sovereign identity
  • Surveillance

Notes

  1. In other countries with privacy protection laws derived from the OECD privacy principles, the term used is more often "personal information", which may be somewhat broader: in Australia's Privacy Act 1988 (Cth) "personal information" also includes information from which the person's identity is "reasonably ascertainable", potentially covering some information not covered by PII.

References

  1. "Management of Data Breaches Involving Sensitive Personal Information (SPI)". VA.gov. Washington, DC: Department of Veterans Affairs. 6 January 2012. Archived from the original on 26 May 2015. Retrieved 25 May 2015.
  2. Stevens, Gina (10 April 2012). "Data Security Breach Notification Laws" (PDF). fas.org. Retrieved 8 June 2017.
  3. Greene, Sari Stern (2014). Security Program and Policies: Principles and Practices. Indianapolis, IN, US: Pearson IT Certification. p. 349. ISBN 978-0-7897-5167-6.
  4. Schwartz, Paul M; Solove, Daniel (2014). "Reconciling Personal Information in the United States and European Union". California Law Review. 102 (4). doi:10.15779/Z38Z814.
  5. "NIST Special Publication 800-122" (PDF). nist.gov. This article incorporates public domain material from the National Institute of Standards and Technology.
  6. Section 3.3.3 "Identifiability"
  7. "Personal Data". General Data Protection Regulation (GDPR). Retrieved 23 October 2020.
  8. "European Court of Justice rules IP addresses are personal data". The Irish Times. 19 October 2016. Retrieved 10 March 2019.
  9. Nokhbeh, Razieh (2017). "A study of web privacy policies across industries". Journal of Information Privacy & Security. 13: 169–185.
  10. "Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)". European Data Consilium. 11 June 2015. Retrieved 3 April 2019.
  11. M-07-16 SUBJECT:Safeguarding Against and Responding to the Breach of Personally Identifiable Information Archived 8 February 2020 at the Wayback Machine FROM: Clay Johnson III, Deputy Director for Management (2007/05/22)
  12. "Guide to Protecting the Confidentiality of Personally Identifiable Information (PII)" (PDF). NIST. Special Publication 800-122.
  13. "Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data". Eur-lex.europa.eu. Retrieved 20 August 2013.
  14. "What is personal data?". TrueVault.
  15. "Text of California Senate Bill SB 1386 ref paragraph SEC. 2 1798.29.(e)". California.
  16. "Comments of Latanya Sweeney, PhD on "Standards of Privacy of Individually Identifiable Health Information"". Carnegie Mellon University. Archived from the original on 28 March 2009.
  17. James Wray and Ulf Stabe (19 December 2011). "The FBI's warning about doxing was too little too late". Thetechherald.com. Archived from the original on 31 October 2012. Retrieved 23 October 2012.
  18. "Anonymous's Operation Hiroshima: Inside the Doxing Coup the Media Ignored (VIDEO)". Ibtimes.com. 1 January 2012. Retrieved 23 October 2012.
  19. "Did LulzSec Trick Police into Arresting the Wrong Guy? - Technology". The Atlantic Wire. 28 July 2011. Archived from the original on 29 October 2013. Retrieved 23 October 2012.
  20. Bright, Peter (7 March 2012). "Doxed: how Sabu was outed by former Anons long before his arrest". Ars Technica. Retrieved 23 October 2012.
  21. "Privacy Act 1988". Retrieved 15 May 2019.
  22. "Data protection". European Commission – European Commission. 11 April 2017.
  23. Data Protection Act 2018 Published by legislation.gov.uk, retrieved 14 August 2018
  24. Federal Act on Data Protection of 19 June 1992 (status as of 1 January 2014), Federal Chancellery of Switzerland (page visited on 18 September 2016).
  25. (in French) Cesla Amarelle, Droit suisse, Éditions Loisirs et pédagogie, 2008.
  26. "Privacy Act of 1974". www.justice.gov. 16 June 2014. Retrieved 6 December 2020.
  27. Rana, R.; Zaeem, R. N.; Barber, K. S. (October 2018). "US-Centric vs. International Personally Identifiable Information: A Comparison Using the UT CID Identity Ecosystem". 2018 International Carnahan Conference on Security Technology (ICCST): 1–5. doi:10.1109/CCST.2018.8585479. ISBN 978-1-5386-7931-9. S2CID 56719139.
  28. "HIGH-RISK SERIES Urgent Actions Are Needed to Address Cybersecurity Challenges Facing the Nation" (PDF). United States Government Accountability Office. September 2018. Retrieved 16 November 2020.
  29. "California Supreme Court Holds that Zip Code is Personal Identification Information – Bullivant Houser Bailey Business Matters eAlert". LexisNexis.
  30. "CHAPTER 603A - SECURITY AND PRIVACY OF PERSONAL INFORMATION".
  31. "201 CMR 17.00: Standards for The Protection of Personal Information of Residents of the Commonwealth" (PDF). Commonwealth of Massachusetts.
  32. Tyler v. Michaels Stores, Inc., 984 N.E.2d 737, 739 (2013)
  33. "Anonymity and PII". cookieresearch.com. Archived from the original on 17 June 2011. Retrieved 6 May 2015.
  34. Sawer, Patrick (13 December 2008). "Police use glove prints to catch criminals". Telegraph. Archived from the original on 11 January 2022. Retrieved 20 August 2013.
  35. James W.H. McCord and Sandra L. McCord, Criminal Law and Procedure for the paralegal: a systems approach, supra, p. 127.
  36. John J. Harris, Disguised Handwriting, 43 J. Crim. L. Criminology & Police Sci. 685 (1952-1953)
  37. "EE failures show how data breaches damages lives". GDPR.report.
  38. Miller, Michael (2008). Is It Safe? Protecting Your Computer, Your Business, and Yourself Online. p. 4. ISBN 9780132713900.
  39. "Card data of 20,000 Pakistani bank users sold on dark web: report". Dunya News.
  40. Miller, Michael (2008). Is It Safe? Protecting Your Computer, Your Business, and Yourself Online. p. 6. ISBN 9780132713900.
  41. Krombholz, Katharina; Dieter Merkl; Edgar Weippl (26 July 2012). "Fake Identities in Social Media: A Case Study on the Sustainability of the Facebook Business Model". Journal of Service Science Research. 4 (2): 175–212. doi:10.1007/s12927-012-0008-z. S2CID 6082130.
  42. "MEMORANDUM FOR DOD FOIA OFFICES" (PDF). United States Department of Defense. Archived from the original (PDF) on 6 August 2020. Retrieved 1 April 2019.
  43. "Protection of victims of sexual violence: Lessons learned" (PDF). 2019.
  44. Acquisti, Alessandro; Curtis Taylor; Liad Wagman (2015). The Economics of Privacy (PDF).
  45. Daughety, A.; J. Reinganum (2010). "Public goods, social pressure, and the choice between privacy and publicity". American Economic Journal: Microeconomics. 2 (2): 191–221. CiteSeerX 10.1.1.544.9031. doi:10.1257/mic.2.2.191.
  46. Varian, H. R. (1997). Economic aspects of personal privacy. In Privacy and Self-regulation in the Information Age.
  47. Laudon, K. (1997). Extensions to the theory of markets and privacy: Mechanics of pricing information (PDF).
  48. Taylor, C. R. (2004). "Consumer privacy and the market for customer information". The RAND Journal of Economics. 35 (4): 631–650. doi:10.2307/1593765. hdl:10161/2627. JSTOR 1593765.
  49. Cornière, A. D. (2011). "Search advertising". American Economic Journal: Microeconomics. 8 (3): 156–188. doi:10.1257/mic.20130138.
  50. Anderson, S.; A. de Palma (2012). "Competition for attention in the information (overload) age". The RAND Journal of Economics. 43: 1–25. doi:10.1111/j.1756-2171.2011.00155.x. S2CID 11606956.