Volume 3 (2019-20)
Each volume of Journal of Data Protection & Privacy consists of four 100-page issues. The articles published in Volume 3 are listed below.
Volume 3 Number 4
-
Editorial: Is the principle of the ‘Rule of Law’ under attack from those who should protect it?
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Research papers:
Communicating with technology, computers and artificial intelligence: Are human rights and privacy being neglected?
Patricia Higham, Independent Consultant, Counsellor
The paper explores some of the problems of communicating with information technology (IT) and artificial intelligence (AI), and the dilemma encountered by professional practitioners and the general public. Negative aspects of the new technology, including breaches of privacy, are too often not taken into consideration. Technology can become an oppressive force that targets poor people. Voluntary regulation by privately owned internet providers fails to protect human rights. AI has produced algorithms that contain unintended prejudicial biases. An uncritical acceptance of IT and AI may lead to more inequality in our society. Impact assessments for introducing new technology are needed. Professional practitioners and professional bodies need to recognise not only the benefits of the new technology but also its drawbacks. Our society resists overt control and censorship but needs to find a consensus about how to protect public freedom.
Keywords: communication, professional practice, information technology, artificial intelligence, ethics, human rights, bias, risks -
Digitalisation, legal activism and the generational divide
Jacob Kornbeck, European Commission, Youth Unit
Whenever new developments (whether of an online or an offline kind) risk unsettling people’s lives as they know them, they can resort to different responses, including protest, advocacy and activism. One kind of activism, which may be chosen in response to regulatory and social change triggered by digitalisation, could be called legal activism, in that it focuses on reasserting extant rights, insisting on effective enforcement, and/or on adapting the legal framework so as to afford a higher level of protection and/or more effective enforcement, in an effort to ensure more direct access to the enjoyment of rights related to digital interaction. But how does legal activism relate to the (imagined or real) generational divide? This paper aims to provide some answers using selected vignettes on young people as legal activists (‘Examples of Digital Activists’). The vignettes will be discussed by drawing on various available theories, models and insights from social science (‘Discussion’). The question set (‘The Generational Divide: Representation versus Reality’) will be assessed in light of insights gained from discussing, contextualising, fact-checking and theorising the vignettes exercise (‘Conclusion’), but to allow for this, the question set will need to be further refined (‘Looking for Political Agency’ and ‘Digitalisation and the Generational Divide’). The focus will be on millennials, although zoomers may, of course, be affected too.
Keywords: generations, millennials, digital divide, privacy, activism, litigation, European Union -
Artificial intelligence in a privacy-concerned world: Automated decision-making and the GDPR
Senna Mougdir, Supervision Officer, Dutch Data Protection Authority
When we think of artificial intelligence (AI), most of us think of robots and machines that have yet to be invented. AI is, however, all around us and has been for a while now. AI makes our everyday lives easier but has downsides, such as massive data processing (Big Data). The General Data Protection Regulation (GDPR), which came into force on 25th May, 2018, regulates the protection of the personal data of individuals within the European Union member states. While innovation is important to our society, it is also important that organisations using AI technologies and Big Data comply with the GDPR to ensure that privacy and data are protected. This paper focuses on the relationship between AI, automated decision-making using machine learning, and the GDPR. It describes the risks of automated decision-making in the context of data protection and the rights of data subjects. Finally, it gives recommendations that could be useful for organisations using automated decision-making.
Keywords: GDPR, artificial intelligence, automated decision-making, Big Data, privacy -
Data protection officer: Tasks and responsibilities of a key role for the innovation of the relationship between data and data subjects’ rights
Donato La Muscatella, Attorney and CIPP/E Professional, Avv. Donato La Muscatella, Italy
This paper discusses the European legal framework, the guidelines, the Italian context and the essential issues, including whether the data protection officer (DPO) is a strategic or an operational role. The paper also asks whether the DPO should be an internal function or an external professional, what expert knowledge of data protection law and practices means, how much time is sufficient for a DPO to fulfil their tasks, and what it means to be easily accessible from each establishment.
Keywords: data protection officer, general data protection regulation, guidelines, DPO independence, DPO skills, DPO tasks, DPO responsibilities, DPO resources, data protection by design, data protection audit, data protection awareness -
Data breach in the travel sector and strategies for risk mitigation
Belinda Enoma, Global Privacy and Cybersecurity Consultant, istartandfinish.com
The airline industry relies heavily on personal data for transactions. This paper discusses the British Airways data breach of 2018, how the attack unfolded and the problems that led to it. It provides examples of other airline breach incidents through the years and how they were handled, and shows the different types of risk that airlines face today and should address. Personal data has become increasingly attractive to hackers, and this paper highlights privacy law compliance and the importance of protecting and securing data in the industry, including vulnerabilities and levels of acceptable risk in third-party transactions. It reviews various publications and resources about the cyberattack and describes risk mitigation strategies that airlines should implement to significantly reduce cyber risk. As threat actors adapt their methods of cyberattack, proper steps should be taken to mitigate these risks.
Keywords: cybersecurity, data protection, airline data breach, governance, awareness training, risk mitigation, third-party risk and privacy -
Data privacy progress, enforcement and Brexit
Dubhe Sarmiento Félix, Privacy Lawyer, CIPP/E and CIPM, Mexico and Steve Wright, CEO and Partner, Privacy Culture Limited
This paper discusses how, in the second year of the General Data Protection Regulation (GDPR), there has been outstanding progress in public awareness of data privacy, an increase in corporate accountability and awareness at all levels of organisations, a cultural shift in businesses regarding the use of data, and international influence of the Regulation. There has also been progress in GDPR enforcement, as Data Protection Authorities (DPAs) have been more active, imposing warnings, administrative fines and processing limitations; enforcement has, however, been inconsistent, with decisions varying in rigour and quality across the European Union (EU). This is due, among other reasons, to conflicting interpretations of the Regulation, a lack of cooperation between DPAs and questions over the efficiency of the one-stop shop mechanism. The paper discusses how this lack of clarity on enforcement rules and trends has left businesses in a vulnerable position, as have Brexit and the Schrems II judgment. First, individuals might have to exercise their rights in multiple jurisdictions, and businesses might face double fines for the same violations. Secondly, regarding data transfers between the EU and the UK, if an adequacy decision is not reached, businesses will have to rely on other safeguard mechanisms for data transfer, such as the standard contractual clauses (SCCs). Reliance on SCCs, however, is now questionable and uncertain in light of the Schrems II decision. The paper analyses how the use of SCCs will imply additional burdens for data controllers and how their valid implementation will depend on the approaches that DPAs and other EU authorities take.
Keywords: privacy, data protection, GDPR, enforcement, Brexit, data protection authorities, one-stop shop mechanism, standard contractual clauses, Schrems II -
Developments in China’s data protection and privacy regulatory regime
Lilly Langford, Bachelor of Business (Accounting)/Bachelor of Laws (Hons) student, Queensland University of Technology
On 1st June, 2017, the People’s Republic of China (PRC) Cybersecurity Law (CSL) came into force and became the first comprehensive privacy and security regulation for cyberspace and data protection. The CSL represents the gradual formation of China’s new legal framework for cyber security and data protection, despite the regime being split across various legislation and industry-specific regulations (including their draft versions). In light of the rapidly evolving regulatory environment, the broad and sometimes vaguely expressed obligations imposed by the CSL have added further complexity to the system and attracted significant attention and criticism from international companies due to the law’s unclear application, lack of protection authority, harsh penalties, risk of stolen intellectual property, and strict compliance obligations. While new judicial developments seek to gradually provide clarification of the law, as seen in the outbreak of the COVID-19 pandemic, continuous and thorough review of cybersecurity measures will place foreign entities, ‘network operators’, and ‘critical information infrastructure operators’ in a position to effectively manage unprecedented, paradigm-shifting compliance obligations and risks. This paper outlines China’s data protection and privacy regulatory framework, provides a thorough analysis of important provisions of the CSL, and comments on recent legislative changes. Further, this paper will engage in an in-depth discussion of compliance procedures for best practice, compare the CSL with the European General Data Protection Regulation (GDPR), and analyse the impact of the coronavirus disease 2019 (COVID-19) pandemic on PRC data privacy legislation.
Keywords: data protection, cybersecurity, compliance, PRC Cybersecurity Law, COVID-19 -
Case study: The vicarious liability of employers for employees in the context of UK data protection law - WM Morrison Supermarkets plc v Various Claimants, 1st April, 2020
Victoria Hordern, Partner and Head of Data Privacy UK, Bates Wells
This paper discusses the judgment of the UK Supreme Court in an appeal brought by the supermarket retailer WM Morrison (Morrisons) against a claim brought by former and existing employees of Morrisons (the Claimants) due to the actions of a rogue Morrisons employee. In the High Court, the judge (Mr Justice Langstaff) had determined that, although Morrisons was not primarily liable for the actions of the rogue employee, it was vicariously liable. Morrisons had appealed to the Court of Appeal, which dismissed the appeal on 22nd October, 2018. Morrisons then appealed to the Supreme Court.
Keywords: data protection, privacy, vicarious liability, employees, GDPR, employers -
Opinion pieces:
The Court of Justice of the European Union ruling in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems. Judgment in Case C-311/18
Nick Baskett, Director, UKGDPR
On 16th July, 2020, the Court of Justice of the European Union (CJEU) delivered its judgment in the case brought by Maximillian Schrems against Facebook Ireland. The court found that the EU–US Privacy Shield is invalid, while the Standard Contractual Clauses (SCCs) remain a lawful way to transfer data. This finding is a blow to the US administration and to the thousands of businesses that relied on Privacy Shield. Many of those companies will remember that the previous framework for data transfers between the regions, Safe Harbour, was found invalid in 2015. Now, EU–US data transfers have limited options. The CJEU held that the SCCs are valid only if each processing activity involving an EU to US data transfer is assessed individually, risks are appropriately mitigated and the rights of data subjects can be guaranteed. This puts an additional compliance burden on smaller businesses. This opinion piece discusses how difficult it is to argue with the court’s decision. The requirement now is to fix the underlying issues surrounding intelligence gathering to make it more transparent, build trust and establish a framework that balances an individual’s privacy rights with the needs of intelligence agencies.
Keywords: Schrems II, CJEU, Facebook, EU-US Privacy Shield -
How to survive the 21st century: Three existential threats to humanity
Yuval Noah Harari, Professor, Department of History, Hebrew University of Jerusalem
This paper discusses how the 21st century presents three existential threats to humanity: social and economic upheaval, unprecedented inequality due to the artificial intelligence revolution and the rise of digital dictatorships. The world is at risk of technology dividing it into wealthy elites and exploited ‘data colonies’. This paper makes the case for better international cooperation in order to tackle global challenges.
Keywords: existential threats, social and economic upheavals, AI, technology risks, digital dictatorships, globalisation, cooperation
Volume 3 Number 3
Special issue: Privacy Enhancing Technologies
-
Editorial: Privacy Enhancing Technologies
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Foreword
Dr Khaled El Emam, CHEO Research Institute -
Ethics in Artificial Intelligence: A disjoint between knowing and acting
Henry Chang, Adjunct Associate Professor, The University of Hong Kong
While artificial intelligence (AI) is changing our daily lives, its impact on and risks to privacy remain at the top of the agenda in its development. As AI frequently relies on massive amounts of seemingly irrelevant data to discover insights that are sometimes unexpected, the technology is at odds with the traditional data protection principles of data minimisation and transparency. Aware of the challenge, data protection authorities (DPAs) are looking for a paradigm shift in how personal data privacy may be respected in the AI era, and many of them have started to advocate the ethical use of AI. There is evidence, however, to suggest that there is a disjoint between knowing the importance of ethics and acting ethically. This paper describes the current efforts by DPAs to promote ethics in AI, discusses ethics as a concept in flux, suggests reasons for the disjoint between ethical ‘knowing’ and ‘acting’, and proposes how education, the promotion of desired behaviours and a cross-functional approach may help bridge the gap.
Keywords: data protection, privacy, AI, ethics, behaviour, ethical decision-making -
De-identification as public policy
Gilad L. Rosner, Founder, IoT Privacy Forum
Canada’s data privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), does not require or incentivise de-identification of personal data for purposes of sharing or research. This regulatory lacuna puts Canadian national law at a disadvantage in contrast with the privacy regimes of other countries, such as the United Kingdom, Australia and the United States, all of which have regulatory language requiring or incentivising de-identification by custodians of personal data. This paper is based on a report commissioned by the Office of the Privacy Commissioner of Canada in service of eventual reform of PIPEDA to include de-identification. The paper addresses terminology, definitions, key debates and policy in other jurisdictions. It recommends legal reform, specific regulatory actions, and investigation of emerging policy strategies and lists remaining open questions for the development of a national Canadian de-identification policy. Chief among these recommendations is a reorientation from a regulatory focus on ‘outputs’ (‘Is the dataset rendered anonymous?’) to a focus on ‘process’ (‘Has the data custodian taken proper steps to reduce identification and privacy risks?’). In part, this is based on a rejection of the possibility of ‘irreversible anonymisation’. Relatedly, the paper argues for requiring a risk management approach to de-identification and for the discouragement of the ‘release-and-forget’ model of data disclosure, which relies only on data transformations while ignoring technical, physical, administrative and contractual controls.
Keywords: de-identification, PIPEDA, Canadian law, data protection, anonymisation, risk management -
How privacy-enhancing technologies are transforming privacy by design and default: Perspectives for today and tomorrow
Joseph Srouji, Srouji Avocats and Thibault Mechler, Panthéon-Assas
This paper explains how privacy-enhancing technologies (PETs) fit into the overall European Union privacy legal framework, providing numerous examples of different types of PETs and their applicability. Given the close interaction between technology and legal aspects, this paper seeks to provide a broad legal and technological perspective for understanding how businesses can leverage PETs. The paper begins with a refresher of the legal framework of privacy by design and by default, with special attention provided to the sanctioning regime. The authors then provide a review of how PETs fit into privacy by design and by default, with a sampling of the more interesting tools that allow organisations to address issues, such as data subject consent and control over personal data, not to mention transparency and data minimisation. The paper concludes with a word of prudence, noting some of the pitfalls to avoid.
Keywords: privacy by design, privacy by default, privacy-enhancing technology, PET, data protection, GDPR, General Data Protection Regulation -
Applications of privacy-enhancing technology to data sharing at a global pharmaceutical company
Stephen Bamford, Head of Clinical Data Standards & Transparency, The Janssen Pharmaceutical Companies of Johnson & Johnson
Janssen has been at the forefront of the recent pharmaceutical industry trend towards more transparency and sharing of clinical trials data, committing early on to make its data available for both internal and external innovation. Janssen is also committed to protecting patient privacy and giving individuals a voice on how their data is used and disclosed. This paper outlines Janssen’s data-sharing initiatives and describes how it is using leading-edge privacy-enhancing technologies (PETs) to mitigate privacy risks and find the right balance between innovation and privacy.
Keywords: privacy-enhancing technologies, data sharing, transparency, pharmaceutical industry, open data, clinical trial data -
Does de-identification require consent under the GDPR and English common law?
Khaled El Emam, Senior Scientist, CHEO Research Institute, Mike Hintze, Partner, Hintze Law and Ruth Boardman, Partner, Bird & Bird
Data de-identification has many benefits in the context of the General Data Protection Regulation (GDPR). One of the recurring questions is whether consent is required to anonymise or de-identify data. In this paper, the authors make the case that no consent is required for anonymisation or other forms of de-identification under the GDPR, although additional conditions have to be met where special category data is anonymised. Further, under the English equitable duty of confidentiality, consent is generally not required if the de-identification is performed by, or on behalf of, the direct care team; it is arguable, though less clear, that de-identification can also be performed by others outside the direct care team. The alternative would be special authorisation under section 251 of the National Health Service (NHS) Act.
Keywords: privacy-enhancing technologies, de-identification, anonymisation, consent, secondary purposes, data sharing, General Data Protection Regulation compliance, English common law -
Reasoning about unstructured data de-identification
Patricia Thaine, Co-Founder & CEO and Gerald Penn, Co-Founder & Chief Science Officer, Private AI
We frame the problem of de-identifying unstructured text within the greater landscape of privacy-enhancing technologies. We then cover what sort of background knowledge can be gained from only stylistic information about a written document and how we can use research on authorship attribution and author profiling to improve our understanding about the sorts of inferences that can be made from an otherwise de-identified text. Finally, we provide a risk score for determining the likelihood that a message will be attributed to a particular author within a dataset using only author profiling tools.
Keywords: anonymisation, de-identification, authorship attribution, author profiling, unstructured data, risk -
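The abstract above mentions a risk score for the likelihood that a message is attributed to a particular author. The paper's actual method is not reproduced here; the following is a minimal illustrative sketch, assuming a deliberately naive stylometric profile (average word length plus a small set of function-word frequencies, all choices hypothetical) and softmax-normalised similarity as the score:

```python
# Illustrative sketch only (not the paper's method): a naive
# authorship-attribution risk score. Each author is profiled by simple
# stylometric features; the risk that a message is attributed to an
# author is the softmax-normalised similarity of profiles.
import math
from collections import Counter

FUNCTION_WORDS = ["the", "and", "of", "to", "a", "in", "that", "is", "it", "for"]

def profile(text: str) -> list[float]:
    """Average word length plus relative frequencies of function words."""
    words = text.lower().split()
    if not words:
        return [0.0] * (len(FUNCTION_WORDS) + 1)
    counts = Counter(words)
    avg_len = sum(len(w) for w in words) / len(words)
    return [avg_len] + [counts[w] / len(words) for w in FUNCTION_WORDS]

def attribution_risk(message: str, corpora: dict[str, str]) -> dict[str, float]:
    """Softmax over negative Euclidean distances between stylometric profiles."""
    m = profile(message)
    scores = {}
    for author, text in corpora.items():
        p = profile(text)
        scores[author] = -math.sqrt(sum((a - b) ** 2 for a, b in zip(m, p)))
    z = sum(math.exp(s) for s in scores.values())
    return {a: math.exp(s) / z for a, s in scores.items()}
```

A real author-profiling tool would use far richer features (character n-grams, syntax, topic-independent markers), but the shape of the output, a probability per candidate author, is the same.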
Blockchain and the GDPR: Coexisting in contradiction?
John Timmons, Associate and Tim Hickman, Partner, White & Case
The recent adoption of the General Data Protection Regulation (GDPR) has fundamentally altered the legal landscape in the European Union and beyond with respect to data protection. Organisations that process personal data must ensure their data-processing practices are compliant with the requirements of the GDPR, irrespective of the technology used. The use of new technologies to process personal data can lead to additional complexities from a compliance perspective, particularly where the technology has intrinsic features that appear to be at odds with certain fundamental requirements of data protection law. This is an issue that applies to the use of blockchain technology as key features of the technology do not, at first glance, appear to be consistent with the requirements of the GDPR. While it is accurate to state that the GDPR has created some challenges regarding the adoption of blockchain technology to process personal data, these challenges are not necessarily insurmountable. This paper discusses the most pertinent challenges to the adoption of blockchain technology from a data protection compliance perspective.
Keywords: General Data Protection Regulation (GDPR), blockchain, cryptocurrency, Bitcoin, data protection, Data Protection Act, privacy, data privacy, cybersecurity, EU law, EU Data Protection Law, UK Data Protection Law, data protection law, right to be forgotten -
Data loss prevention as a privacy-enhancing technology
William Stallings, Independent consultant
Data loss prevention (DLP) is a mature technology deployed by many enterprises to support information security requirements. Key characteristics of DLP also make it a powerful privacy-enhancing technology that can satisfy a wide range of information-privacy requirements. In essence, DLP is a set of integrated technologies that detect sensitive data and prevent unauthorised use, release or delivery to specific destinations or recipients, as well as their storage at prohibited locations. DLP works in real time to identify personally identifiable information and react to privacy risks based on data content and the dynamic context of the information environment. Thus, DLP provides technical enforcement of terms and conditions, or policies more generally, to prevent privacy leaks. This paper provides an overview of the main features and elements of DLP.
Keywords: content analysis, context analysis, data at rest, data in motion, data in use, data leakage prevention, data loss prevention -
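The abstract above describes DLP as integrated technologies that detect sensitive data in content and block unauthorised release. As a hedged illustration of the content-analysis core only (the pattern set and policy below are hypothetical, not drawn from the paper or any specific product), a minimal sketch might look like:

```python
# Illustrative sketch only: the content-analysis step of a DLP filter.
# Regular expressions flag likely personal data in data in motion
# (here, e-mail addresses and 16-digit card-like numbers), and a
# simple policy blocks delivery when any pattern matches.
import re

PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan(content: str) -> list[str]:
    """Return the names of all PII patterns found in the content."""
    return [name for name, pat in PII_PATTERNS.items() if pat.search(content)]

def enforce(content: str) -> str:
    """Minimal policy: block delivery if any sensitive pattern matches."""
    hits = scan(content)
    return "BLOCK: " + ", ".join(hits) if hits else "ALLOW"
```

Production DLP adds the context analysis the abstract mentions (who is sending, to where, under which policy) on top of content matching like this.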
Engineering risk-based anonymisation solutions for complex data environments
Luk Arbuckle, Chief Methodologist and Muhammad Oneeb Rehman Mian, Senior Data Scientist, Privacy Analytics
Technological advancements have dramatically increased the ability to collect, store and process vast quantities of data. The general applicability and precision of analytical tools in artificial intelligence and machine learning have driven organisations to leverage these advances to process personal data in new and innovative ways. As stewards of personal data, organisations need to keep that data safe and ensure processing is legal and appropriate. Having more data, however, has also led to increased interest in processing personal data for purposes other than those for which they were originally collected, known as secondary purposes. The reuse of personal data introduces important regulatory challenges, increasing the need to disassociate data used for secondary purposes from personal data, be it to safeguard the data, support a legitimate interest, or anonymise the data. Whereas some academics have focused on specific issues preventing more widespread adoption of this privacy-enhancing technology, others have reframed the discussion around anonymisation as risk management. Combining technology-enabled processes with measures of identifiability provides an opportunity to meet complex business needs while ensuring best practice is adopted in reusing sensitive data. This paper examines these many considerations and demonstrates how risk-based anonymisation can and should be detailed, evidence based and objectively supported through measures of identifiability. The engineering of privacy solutions, through the application of risk-based anonymisation, is also briefly explored for complex use cases involving data lakes and hub-and-spoke data collection, to provide the reader with a deeper understanding of real-world risk-based anonymisation in practice.
Keywords: anonymisation, de-identification, pseudonymisation, privacy engineering, secondary uses -
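The abstract above argues that risk-based anonymisation should be objectively supported through measures of identifiability. One widely used such measure, offered here only as an illustrative sketch (the paper's own methodology is not reproduced), is k-anonymity: group records by their quasi-identifiers, take the smallest group size k, and treat 1/k as an upper bound on any single record's re-identification risk.

```python
# Illustrative sketch only: k-anonymity as a measure of identifiability.
# Records are grouped by their quasi-identifier values; k is the size of
# the smallest equivalence class, and 1/k bounds re-identification risk.
from collections import Counter

def k_anonymity(records: list[dict], quasi_identifiers: list[str]) -> int:
    """Return k: the size of the smallest group of records sharing
    identical values on all quasi-identifiers."""
    classes = Counter(
        tuple(rec[qi] for qi in quasi_identifiers) for rec in records
    )
    return min(classes.values())

def max_reidentification_risk(records: list[dict],
                              quasi_identifiers: list[str]) -> float:
    """Worst-case probability of singling out one record."""
    return 1.0 / k_anonymity(records, quasi_identifiers)
```

A risk-based approach of the kind the abstract describes would compare such a measure against a threshold chosen for the data environment, rather than aiming at irreversible anonymisation.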
Viewpoint: Implementing privacy-enhancing technologies in the time of a pandemic
Khaled El Emam, Senior Scientist, CHEO Research Institute
This paper provides a personal perspective on the implementation of privacy-enhancing technologies (PETs) based on almost two decades of work in the field. As we are currently in the midst of a global pandemic, this context will modify our views on PETs and shed light on some key factors shaping the use of privacy technology. Ongoing and expected challenges that may inhibit the wide deployment of PETs at this critical time will also be highlighted. The pandemic has illuminated many of the reasons why access to health data is crucial from a public health perspective. Access needs to be, however, provided in a responsible way, even during a crisis, making PETs all the more important as a means by which to facilitate data access while helping to manage the associated privacy risks.
Keywords: privacy, data sharing, privacy-enhancing technologies, pandemic, risk management
Volume 3 Number 2
-
Editorial: Look back, leap forwards
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
Papers: The big data race
Alex Brown, Partner, Simmons & Simmons Information, Communications & Technology Group -
How to specify the content of a cookie consent request
Georg Philip Krog, Cofounder and Chief of Legal Counsel, Signatu -
Mapping the supervisory authorities’ activities: Pragmatic problem-solvers or new practice creators?
Anna Aurora Wennäkoski, doctoral student in law, University of Helsinki -
Dubai International Financial Centre’s updated data protection law, Part 1: Developing a modern, global law in a UAE financial free zone
Lori Baker, Vice President, Legal Affairs & Director of Data Protection, Dubai International Financial Centre Authority and Julie Beeton, General Counsel and Assistant Professor, Canadian University Dubai -
Ad tech in a data privacy world
Ezgi Pilavci, Privacy Lawyer and Steve Wright, CEO and Partner, PrivacyCulture -
Artificial intelligence, facial recognition technology and data privacy
Steve Wilkinson, subject matter industry expert -
Protection and control of personal identifiable information: The PoSeID-on approach
Rui Casaleiro et al. -
Book review: Decoding AI in Financial Services: Business Implications for Boards and Professionals
Reviewed by Ardi Kolah
Volume 3 Number 1
-
Editorial: In memoriam - a true visionary: Giovanni Buttarelli (1957–2019)
Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy -
The implications of a ‘no-deal’ Brexit for data protection in the United Kingdom
Oliver Butler, Research Fellow, Bonavero Institute of Human Rights; Fellow, Wadham College, University of Oxford -
Accountability and human rights in the age of tech
Dan Shefet, French lawyer; author, ‘Online Radicalization’ UNESCO report, Individual Specialist, Expert, Council of Europe on the Internet Ombudsman; President of AAID -
Personal data protection in blockchain
Steve Wright, DPO, CEO, Privacy Culture and Ezgi Pilavci, privacy lawyer, Privacy Culture -
Adhering to GDPR codes of conduct: A possible option for SMEs to GDPR certification
Eric Lachaud, CRM Senior IT consultant; PhD candidate, TILT -
Learning to walk a tightrope: Challenges DPOs face in the day-to-day exercise of their responsibilities
Barbara Eggl, DPO, ECB -
Overview of the data protection regime in Uganda
Kenneth Muhangi, Partner, KTA Advocates -
Consent for data processing under the General Data Protection Regulation: Could ‘dynamic consent’ be a useful tool for researchers?
Megan Prictor, Academic Lawyer and Research Fellow, HeLEX programme, Melbourne Law School, Harriet J. A. Teare, Deputy Director, HeLEX, University of Oxford, Jessica Bell, Research Fellow, HeLEX, University of Oxford, Mark Taylor, Associate Professor in health law and regulation, Melbourne Law School; Deputy Director, HeLEX, Melbourne Law School, and Jane Kaye, Director, HeLEX, University of Oxford -
Book review: Data Protection Law in the EU: Roles, Responsibilities and Liability
Reviewed by Dr Jacob Kornbeck