Understanding PETs In Nigeria: The Difference Between Anonymisation and Pseudonymisation


By David Christopher Giwa 

INTRODUCTION

The major thrust of this article is the differentiation of anonymisation from pseudonymisation, both of which are types of Privacy-Enhancing Technologies (PETs). The use of these and other PETs has become increasingly important, especially in light of the growing volume and sensitivity of data being shared online and the need for individuals to have control over their personal information. Countries the world over, including Nigeria, have been enjoined to use anonymisation, pseudonymisation, and other appropriate forms of PETs to maximise the use of data and minimise the risk of data privacy breaches. In fact, the evolution of PETs over the years has shifted the practice of data protection from being exclusively in the hands of service providers to being also in the hands of data subjects. That is to say, emerging PETs encourage and foster participation, as data subjects are now given the opportunity to determine the use to which their personal data is put, including the extent of that use. Against this background, an enumeration of the differences between anonymisation and pseudonymisation will foster an understanding of the benefits of their use and the challenges currently associated with it.

DEFINITION OF PETs

PETs is an acronym for Privacy-Enhancing Technologies. These are technologies developed to ensure the digital security and confidentiality of private data as it is collected, stored and put to use by service providers. The term refers to digital solutions that permit the collection, processing, analysis and sharing of information without compromising the confidentiality and privacy of personal data. Thus, by engaging PETs, the usual business is done with personal data without prejudice to the privacy of data subjects. These technologies are aimed at addressing the growing concerns surrounding the collection and use of personal data by online services, governments and other organisations. Examples of PETs include encryption, anonymisation, pseudonymisation, secure communication protocols, access control mechanisms, and privacy-preserving data mining algorithms.

It is noticeable that the Nigeria Data Protection Regulation (NDPR) lacks explicit provisions for PETs. However, the NDPR Implementation Framework 2020 (hereafter the NDPR IF) makes reference to privacy by design under paragraphs 2.6 and 3.2(v) of the NDPR. More specifically, Annexure A to the NDPR IF provides an audit template for NDPR compliance. The template is a guideline for data controllers and administrators to show evidence of compliance, and responses must be supported with documentary evidence. In particular, under paragraph 2.6, one of these responses answers the question: “Do you adopt data pseudonymisation, anonymisation and encryption methods to reduce exposure of personal data?”

DIFFERENCES BETWEEN ANONYMISATION AND PSEUDONYMISATION

The following are some of their dissimilarities:

  1. ABSOLUTE DE-IDENTIFICATION: Anonymisation absolutely de-identifies, while pseudonymisation only partially de-identifies. Anonymisation has to do with eliminating (in theory, at least) all means of linking a particular datum or set of data to a natural person. It is therefore the process of removing identifying elements from data so as to ensure that re-identification of the data subject is absolutely prevented. The idea is that an attempt to link anonymised data back to an individual by combining it with additional data sets would fail, as the data would no longer be linkable to that individual.
  2. POSSIBLE RE-IDENTIFICATION: Flowing from the first difference above, pseudonymisation offers only partial de-identification, making re-identification possible. This is a risk which does not feature in anonymisation. Note that the European Union (EU) General Data Protection Regulation (GDPR), in Article 4(5), defines pseudonymisation as the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information. It further provides that such additional information must be kept separately and be subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person. This highlights the possibility that additional data may be acquired by a recipient of pseudonymised data. It is submitted that, the provisions of the EU GDPR notwithstanding, re-identification remains a feasible possibility, as all the ways in which additional information may be sourced by a recipient of pseudonymised data may not be foreseeable and guarded against by the supplier of such data.

On the other hand, anonymised data cannot be linked to a data subject even by the use of additional information.
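The distinction drawn above can be sketched in code. The following is a minimal illustrative sketch (the record, field names and helper functions are hypothetical): a pseudonymised record is linked to its subject only through a separately kept key table, the “additional information” contemplated by Article 4(5) GDPR, while an anonymised record retains no link at all.

```python
import secrets

# Hypothetical customer record (illustrative data only).
record = {"name": "Ada Obi", "city": "Lagos", "purchase": 15000}

# The token-to-identity mapping must be kept apart from the
# pseudonymised data set, under its own safeguards.
key_table = {}

def pseudonymise(rec):
    """Replace the direct identifier with a random token, keeping the link separately."""
    token = secrets.token_hex(8)
    key_table[token] = rec["name"]  # separately held "additional information"
    return {**rec, "name": token}

def anonymise(rec):
    """Remove the identifier entirely; no mapping is retained, so no one can reverse it."""
    return {k: v for k, v in rec.items() if k != "name"}

pseudo = pseudonymise(record)
anon = anonymise(record)

# Re-identification is possible only through the separately kept key table.
assert key_table[pseudo["name"]] == "Ada Obi"
assert "name" not in anon
```

A recipient holding only `pseudo` (without `key_table`) is in the position the GDPR describes: the data cannot be attributed to the subject without the additional information.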

  3. USER CONTROL IN ANONYMISATION AND PSEUDONYMISATION: Flowing from the foregoing, it follows that even the party responsible for the anonymisation of personal data cannot reverse the anonymised data to an identifiable form (i.e. one that can be linked to a data subject). This may pose a challenge if relevant data forms part of the information removed in anonymising the data set. Anonymisation is irreversible.

On the other hand, the party responsible for pseudonymised data can reverse it. This helps where relevant aspects of the data are pseudonymised in a way that renders the data unusable unless reversed to its original form.

  4. APPLICATION OF LAWS TO DATA PROTECTED BY ANONYMISATION AND PSEUDONYMISATION: While the GDPR clearly applies to pseudonymised data, it does not apply to anonymised data, as stated in Recital 26 of its preamble. The NDPR does not restrict its application to pseudonymised data alone but extends to all forms of PETs, as implicitly provided in its paragraph 2.6. Therefore, in Nigeria anonymised data is also covered. In fact, the NDPR IF requires the adoption of data pseudonymisation, anonymisation and encryption methods to reduce exposure of personal data, in line with the import of paragraph 2.6 of the NDPR.
  5. USE OF CRYPTOGRAPHIC ENCRYPTION: Pseudonymisation may make use of cryptographic hash functions, which map sensitive personal data to hash values keyed with “secret keys” accessible only to the party responsible for the pseudonymisation. This will most likely prevent unauthorised access to the pseudonymised data, as an unauthorised third party would lack the keys needed to access the original personal data, while the key holders can reverse the pseudonymised data to the original data. This is not the case with anonymised data, which is irreversible and cannot be restored to its original form.
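A minimal sketch of such a keyed-hash scheme, using Python's standard `hmac` module (the key and identifier shown are hypothetical). One caveat worth noting: a keyed hash is itself one-way, so in practice the key holder “re-links” a token either by keeping a separate mapping table or, as sketched here, by recomputing the token for a known identifier; a third party without the key can do neither.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would be held securely by
# the party responsible for the pseudonymisation, not hard-coded.
SECRET_KEY = b"example-key-hold-securely"

def pseudonym(identifier: str) -> str:
    """Map a sensitive identifier to a keyed hash value (HMAC-SHA-256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonym("ada.obi@example.com")

# The key holder can confirm a link by recomputing the HMAC for a
# known identifier; without SECRET_KEY the token cannot be recomputed.
assert pseudonym("ada.obi@example.com") == token
assert pseudonym("other@example.com") != token
```

The same identifier always maps to the same token under the same key, which also lets the key holder join pseudonymised records across data sets without exposing the underlying identifier.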
  6. REVERSIBILITY: PETs making use of pseudonymity guarantee a person the use of a false identity or pseudonym online. This prevents third parties from linking data back to a real-world identity. In this case the false identity is reversible. However, this does not apply to anonymised data, which is irreversibly altered or modified.

However, the use of either anonymisation or pseudonymisation is guaranteed by the NDPR IF; thus organisations and other data controllers can make use of either, depending on their particular need for reversibility or irreversibility of the data, with the informed consent of the data subjects.

  7. LACK OF PROTECTION FOR AGGRIEVED PARTIES WHERE PROTECTION PROVES INEFFECTIVE: The de-identification of personal data by anonymisation notwithstanding, there remains the possibility of re-identification by hardened tech-criminals. The fact that the GDPR does not apply to anonymised data may leave a person whose personal data is linked back to him or her from its anonymised form without protection. It is therefore recommended that the approach used by Nigeria in paragraph 2.6 of the NDPR be adopted by the EU to address this possibility. Pseudonymised data, by contrast, enjoys the protection of these regulations.
  8. PRESERVATION OF STATISTICAL OR ANALYTICAL VALUE OF DATA: Anonymisation has types such as suppression and generalisation. Teska opines that suppression, also known as data masking, is an extreme form of anonymisation which substitutes data with pre-specified fixed text (or a black tape). It is very simple to implement and very effective in removing sensitive data. The flip side of the coin, however, is that any statistical or analytical value of the data is lost in the masking process. This underscores a disadvantage of anonymised data: the relevance of the data, or relevant data itself, may be lost through anonymisation.

Similarly, generalisation has been described by Teska as a method in which isolated or individual values of fields are substituted with a broader category. For example, the value 27 of the field Age may be replaced by “≤ 30”, the value 46 by “30 < Age ≤ 60”, and so on. Here as well, sensitive data is permanently modified in such a manner that it cannot be attributed back to the original data subject.

The statistical or analytical value of data, by contrast, is preserved under pseudonymisation.
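The two anonymisation techniques Teska describes can be illustrated with a short sketch (the field names and age bands are illustrative, following the Age example above):

```python
def suppress(value: str) -> str:
    """Suppression (data masking): replace the value with pre-specified fixed text."""
    return "****"

def generalise_age(age: int) -> str:
    """Generalisation: replace an exact age with a broader category."""
    if age <= 30:
        return "<= 30"
    if age <= 60:
        return "30 < Age <= 60"
    return "> 60"

row = {"name": "Ada Obi", "age": 27}
masked = {"name": suppress(row["name"]), "age": generalise_age(row["age"])}
# The name is gone entirely; the age survives only as a band.
```

Note how suppression destroys all analytical value of the masked field, while generalisation trades precision for privacy: aggregate statistics over age bands remain possible, but the exact value 27 can never be recovered.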

CONCLUSION

From the above analysis, there are clear advantages to the use of the anonymisation and pseudonymisation forms of PETs in fostering enhanced protection of personal and private data whilst maximising data usage. Both forms have been identified as de-identifiers of personal data. It remains an uncontroverted fact that organisations, government institutions, online services, health care systems and the like will continue to run information technology systems in which personal data is stored, whether customers' emails, names, addresses, bank accounts or even credit card information. The information contained in the databases of these entities may belong personally to clients and customers, employees, partners, suppliers and others. In order to avoid leakage and/or unauthorised use of such sensitive data, the NDPR IF, for instance, requires under paragraph 2.6 of the NDPR that anonymisation, pseudonymisation or encryption be used. However, there is a need for expert guidance from seasoned legal practitioners in this regard, so as to avoid untoward outcomes from ineffective deployment of these PETs. Benefits are maximised when the appropriate PET is used for any given circumstance or case.


David Christopher Giwa LL.B (Hons) is a graduate of law from the Lagos State University
