Volume 3 (2019-20)

Each volume of Journal of Data Protection & Privacy consists of four 100-page issues. The articles published in Volume 3 are listed below. 

Volume 3 Number 3

Special issue: Privacy Enhancing Technologies

  • Editorial: Privacy Enhancing Technologies
    Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy
  • Foreword
    Dr Khaled El-Emam, CHEO Research Institute
  • Ethics in Artificial Intelligence: A disjoint between knowing and acting
    Henry Chang, Adjunct Associate Professor, The University of Hong Kong

    While artificial intelligence (AI) is changing our daily lives, its impacts and risks to privacy remain among the top concerns in its development. As AI frequently relies on massive volumes of seemingly irrelevant data to discover insights that are sometimes unexpected, the technology is at odds with the traditional data protection principles of data minimisation and transparency. Aware of the challenge, data protection authorities (DPAs) are looking for a paradigm shift in how personal data privacy may be respected in the AI era, and many have started to advocate the ethical use of AI. There is evidence, however, to suggest that there is a disjoint between knowing the importance of ethics and acting ethically. This paper describes the current efforts by DPAs to promote ethics in AI, discusses the shifting concept of ethics, suggests reasons behind the disjoint between ethical ‘knowing’ and ‘acting’, and proposes how education, promotion of desired behaviours and a cross-functional approach may help bridge the gap.
    Keywords: data protection, privacy, AI, ethics, behaviour, ethical decision-making

  • De-identification as public policy
    Gilad L. Rosner, Founder, IoT Privacy Forum

    Canada’s data privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), does not require or incentivise de-identification of personal data for purposes of sharing or research. This regulatory lacuna puts Canadian national law at a disadvantage in contrast with the privacy regimes of other countries, such as the United Kingdom, Australia and the United States, all of which have regulatory language requiring or incentivising de-identification by custodians of personal data. This paper is based on a report commissioned by the Office of the Privacy Commissioner of Canada in service of eventual reform of PIPEDA to include de-identification. The paper addresses terminology, definitions, key debates and policy in other jurisdictions. It recommends legal reform, specific regulatory actions, and investigation of emerging policy strategies, and lists remaining open questions for the development of a national Canadian de-identification policy. Chief among these recommendations is a reorientation from a regulatory focus on ‘outputs’ (‘Is the dataset rendered anonymous?’) to a focus on ‘process’ (‘Has the data custodian taken proper steps to reduce identification and privacy risks?’). In part, this is based on a rejection of the possibility of ‘irreversible anonymisation’. Relatedly, the paper argues for requiring a risk management approach to de-identification and for the discouragement of the ‘release-and-forget’ model of data disclosure, which relies only on data transformations while ignoring technical, physical, administrative and contractual controls.
    Keywords: de-identification, PIPEDA, Canadian law, data protection, anonymisation, risk management

  • How privacy-enhancing technologies are transforming privacy by design and default: Perspectives for today and tomorrow
    Joseph Srouji, Srouji Avocats and Thibault Mechler, Panthéon-Assas

    This paper explains how privacy-enhancing technologies (PETs) fit into the overall European Union privacy legal framework, providing numerous examples of different types of PETs and their applicability. Given the close interaction between technology and legal aspects, this paper seeks to provide a broad legal and technological perspective for understanding how businesses can leverage PETs. The paper begins with a refresher on the legal framework of privacy by design and by default, with special attention to the sanctioning regime. The authors then review how PETs fit into privacy by design and by default, with a sampling of the more interesting tools that allow organisations to address issues such as data subject consent and control over personal data, as well as transparency and data minimisation. The paper concludes with a word of prudence, noting some of the pitfalls to avoid.
    Keywords: privacy by design, privacy by default, privacy-enhancing technology, PET, data protection, GDPR, General Data Protection Regulation

  • Applications of privacy-enhancing technology to data sharing at a global pharmaceutical company
    Stephen Bamford, Head of Clinical Data Standards & Transparency, The Janssen Pharmaceutical Companies of Johnson & Johnson

    Janssen has been at the forefront of the recent pharmaceutical industry trend towards more transparency and sharing of clinical trials data, committing early on to make its data available for both internal and external innovation. Janssen is also committed to protecting patient privacy and giving individuals a voice on how their data is used and disclosed. This paper outlines Janssen’s data-sharing initiatives and describes how it is using leading-edge privacy-enhancing technologies (PETs) to mitigate privacy risks and find the right balance between innovation and privacy.
    Keywords: privacy-enhancing technologies, data sharing, transparency, pharmaceutical industry, open data, clinical trial data

  • Does de-identification require consent under the GDPR and English common law?
    Khaled El Emam, Senior Scientist, CHEO Research Institute, Mike Hintze, Partner, Hintze Law and Ruth Boardman, Partner, Bird & Bird

    Data de-identification has many benefits in the context of the General Data Protection Regulation (GDPR). One of the recurring questions is whether consent is required to anonymise or de-identify data. In this paper, the authors make the case that no consent is required for anonymisation or other forms of de-identification under the GDPR, although additional conditions have to be met where special category data is anonymised. Further, under the English equitable duty of confidentiality, consent is generally not required if the de-identification is performed by, or on behalf of, the direct care team; it is arguable that de-identification can also be performed by others outside the direct care team, but this is less clear. The alternative would be special authorisation under section 251 of the National Health Service (NHS) Act.
    Keywords: privacy-enhancing technologies, de-identification, anonymisation, consent, secondary purposes, data sharing, General Data Protection Regulation compliance, English common law

  • Reasoning about unstructured data de-identification
    Patricia Thaine, Co-Founder & CEO and Gerald Penn, Co-Founder & Chief Science Officer, Private AI

    We frame the problem of de-identifying unstructured text within the greater landscape of privacy-enhancing technologies. We then cover what sort of background knowledge can be gained from only stylistic information about a written document and how we can use research on authorship attribution and author profiling to improve our understanding about the sorts of inferences that can be made from an otherwise de-identified text. Finally, we provide a risk score for determining the likelihood that a message will be attributed to a particular author within a dataset using only author profiling tools.
    Keywords: anonymisation, de-identification, authorship attribution, author profiling, unstructured data, risk

  • Blockchain and the GDPR: Coexisting in contradiction?
    John Timmons, Associate and Tim Hickman, Partner, White & Case

    The recent adoption of the General Data Protection Regulation (GDPR) has fundamentally altered the legal landscape in the European Union and beyond with respect to data protection. Organisations that process personal data must ensure their data-processing practices are compliant with the requirements of the GDPR, irrespective of the technology used. The use of new technologies to process personal data can lead to additional complexities from a compliance perspective, particularly where the technology has intrinsic features that appear to be at odds with certain fundamental requirements of data protection law. This is an issue that applies to the use of blockchain technology as key features of the technology do not, at first glance, appear to be consistent with the requirements of the GDPR. While it is accurate to state that the GDPR has created some challenges regarding the adoption of blockchain technology to process personal data, these challenges are not necessarily insurmountable. This paper discusses the most pertinent challenges to the adoption of blockchain technology from a data protection compliance perspective.
    Keywords: General Data Protection Regulation (GDPR), blockchain, cryptocurrency, Bitcoin, data protection, Data Protection Act, privacy, data privacy, cybersecurity, EU law, EU Data Protection Law, UK Data Protection Law, data protection law, right to be forgotten

  • Data loss prevention as a privacy-enhancing technology
    William Stallings, Independent consultant

    Data loss prevention (DLP) is a mature technology deployed by many enterprises to support information security requirements. Key characteristics of DLP also make it a powerful privacy-enhancing technology that can satisfy a wide range of information-privacy requirements. In essence, DLP is a set of integrated technologies that detect sensitive data and prevent unauthorised use, release or delivery to specific destinations or recipients, as well as their storage at prohibited locations. DLP works in real time to identify personally identifiable information and react to privacy risks based on data content and the dynamic context of the information environment. Thus, DLP provides technical enforcement of terms and conditions, or policies more generally, to prevent privacy leaks. This paper provides an overview of the main features and elements of DLP.
    Keywords: content analysis, context analysis, data at rest, data in motion, data in use, data leakage prevention, data loss prevention

  • Engineering risk-based anonymisation solutions for complex data environments
    Luk Arbuckle, Chief Methodologist and Muhammad Oneeb Rehman Mian, Senior Data Scientist, Privacy Analytics

    Technological advancements have dramatically increased the ability to collect, store and process vast quantities of data. The general applicability and precision of analytical tools in artificial intelligence and machine learning have driven organisations to leverage these advances to process personal data in new and innovative ways. As stewards of personal data, organisations need to keep that data safe and ensure processing is legal and appropriate. Having more data, however, has also led to an increased interest in processing personal data for purposes other than those for which it was originally collected, known as secondary purposes. The reuse of personal data introduces important regulatory challenges, increasing the need to disassociate data used for secondary purposes from personal data, be it to safeguard the data, support a legitimate interest, or anonymise the data. Whereas some academics have focused on specific issues preventing more widespread adoption of this privacy-enhancing technology, others have reframed the discussion around anonymisation as risk management. Combining technology-enabled processes with measures of identifiability provides an opportunity to meet complex business needs while ensuring best practice is adopted in reusing sensitive data. This paper examines these many considerations and demonstrates how risk-based anonymisation can and should be detailed, evidence based and objectively supported through measures of identifiability. The engineering of privacy solutions, through the application of risk-based anonymisation, is also briefly explored for complex use cases involving data lakes and hub-and-spoke data collection, to provide the reader with a deeper understanding of real-world risk-based anonymisation in practice.
    Keywords: anonymisation, de-identification, pseudonymisation, privacy engineering, secondary uses

  • Viewpoint: Implementing privacy-enhancing technologies in the time of a pandemic
    Khaled El Emam, Senior Scientist, CHEO Research Institute

    This paper provides a personal perspective on the implementation of privacy-enhancing technologies (PETs) based on almost two decades of work in the field. The current global pandemic will modify our views on PETs and shed light on some key factors shaping the use of privacy technology. Ongoing and expected challenges that may inhibit the wide deployment of PETs at this critical time are also highlighted. The pandemic has illuminated many of the reasons why access to health data is crucial from a public health perspective. Access needs, however, to be provided in a responsible way, even during a crisis, making PETs all the more important as a means by which to facilitate data access while helping to manage the associated privacy risks.
    Keywords: privacy, data sharing, privacy-enhancing technologies, pandemic, risk management

Volume 3 Number 2

  • Editorial: Look back, leap forwards
    Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy
  • Papers: The big data race
    Alex Brown, Partner, Simmons & Simmons Information, Communications & Technology Group
  • How to specify the content of a cookie consent request
    Georg Philip Krog, Cofounder and Chief of Legal Counsel, Signatu
  • Mapping the supervisory authorities’ activities: Pragmatic problem-solvers or new practice creators?
    Anna Aurora Wennäkoski, doctoral student in law, University of Helsinki
  • Dubai International Financial Centre’s updated data protection law, Part 1: Developing a modern, global law in a UAE financial free zone
    Lori Baker, Vice President, Legal Affairs & Director of Data Protection, Dubai International Financial Centre Authority and Julie Beeton, General Counsel and Assistant Professor, Canadian University Dubai
  • Ad tech in a data privacy world
    Ezgi Pilavci, Privacy Lawyer and Steve Wright, CEO and Partner, PrivacyCulture
  • Artificial intelligence, facial recognition technology and data privacy
    Steve Wilkinson, industry subject-matter expert
  • Protection and control of personal identifiable information: The PoSeID-on approach
    Rui Casaleiro et al.
  • Book review: Decoding AI in Financial Services: Business Implications for Boards and Professionals
    Reviewed by Ardi Kolah

Volume 3 Number 1

  • Editorial: In memoriam - a true visionary: Giovanni Buttarelli (1957–2019)
    Ardi Kolah, Founding Editor-in-Chief, Journal of Data Protection & Privacy
  • The implications of a ‘no-deal’ Brexit for data protection in the United Kingdom
    Oliver Butler, Research Fellow, Bonavero Institute of Human Rights; Fellow, Wadham College, University of Oxford
  • Accountability and human rights in the age of tech
    Dan Shefet, French lawyer; author of the UNESCO report ‘Online Radicalization’; Individual Specialist Expert to the Council of Europe on the Internet Ombudsman; President of AAID
  • Personal data protection in blockchain
    Steve Wright, DPO, CEO, Privacy Culture and Ezgi Pilavci, privacy lawyer, Privacy Culture
  • Adhering to GDPR codes of conduct: A possible option for SMEs to GDPR certification
    Eric Lachaud, CRM Senior IT consultant; PhD candidate, TILT
  • Learning to walk a tightrope: Challenges DPOs face in the day-to-day exercise of their responsibilities
    Barbara Eggl, DPO, ECB
  • Overview of the data protection regime in Uganda
    Kenneth Muhangi, Partner, KTA Advocates
  • Consent for data processing under the General Data Protection Regulation: Could ‘dynamic consent’ be a useful tool for researchers?
    Megan Prictor, Academic Lawyer and Research Fellow, HeLEX programme, Melbourne Law School, Harriet J. A. Teare, Deputy Director, HeLEX, University of Oxford, Jessica Bell, Research Fellow, HeLEX, University of Oxford, Mark Taylor, Associate Professor in health law and regulation, Melbourne Law School; Deputy Director, HeLEX, Melbourne Law School, and Jane Kaye, Director, HeLEX, University of Oxford
  • Book review: Data Protection Law in the EU: Roles, Responsibilities and Liability
    Reviewed by Dr Jacob Kornbeck