UN Special Rapporteur’s consultation on Big Data-Open Data
Joanna Hayward and Michael Harrison
15 August 2018 at 10:05

We recently participated in a two-day consultation on Big Data-Open Data at the University of New South Wales (UNSW) in Sydney. The session was convened by Professor Joseph Cannataci, the United Nations Special Rapporteur on the Right to Privacy.

A multidisciplinary group met to discuss the Special Rapporteur’s initial report on Big Data and Open Data, co-authored by Australian academics Professor David Watt and Dr Vanessa Teague.

The consultation led to a fascinating discussion about the challenges of Big Data and Open Data and the implications for automated decision-making by algorithms. These are developments we are following closely, and which Privacy Commissioner John Edwards covered in his keynote speech at the Privacy Forum in Wellington this year.

One of the central themes of the meeting was the very difficult challenge of effectively de-identifying datasets before publicly releasing them. Dr Teague discussed her work identifying re-identification vulnerabilities in the 2016 online release of medical billing records, a release the Department of Health was found to have made in breach of the Australian Privacy Act.

The investigation by the Australian Information Commissioner and Privacy Commissioner found that the risk of re-identifying medical providers whose information was in the dataset was not sufficiently low. It also found that the department’s processes for assessing the risks associated with publication were inadequate.
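By way of a purely illustrative, hypothetical sketch (not drawn from the investigation itself), a common first step in assessing re-identification risk is a k-anonymity check: counting how many records share each combination of quasi-identifiers such as year of birth, gender and region. The column names and data below are invented for illustration.

import pandas as pd

# Invented example records: three quasi-identifier columns and one sensitive attribute
records = pd.DataFrame({
    "year_of_birth": [1960, 1960, 1975, 1975, 1982],
    "gender":        ["F", "F", "M", "M", "F"],
    "region":        ["NSW", "NSW", "VIC", "VIC", "TAS"],
    "billing_item":  ["A10", "B22", "A10", "C35", "B22"],
})

quasi_identifiers = ["year_of_birth", "gender", "region"]

# k-anonymity is the size of the smallest group of records that share
# the same combination of quasi-identifier values
group_sizes = records.groupby(quasi_identifiers).size()
k = int(group_sizes.min())

print(f"k-anonymity over {quasi_identifiers}: k = {k}")
if k < 2:
    print("At least one record is unique on these attributes and is at high risk of re-identification.")

A result of k = 1 means at least one person is unique on those attributes and could potentially be singled out by anyone holding a small amount of background information about them; this is the kind of residual risk the investigation found had not been reduced to a sufficiently low level.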

Best and worst of Big Data

John Edwards chaired a panel of Australian academics from sociology and indigenous research, law and computer science on the topic of The Best and Worst of Big Data.

Professor Maggie Walter (sociology at the University of Tasmania) found that Big Data-Open Data is not necessarily consistent with indigenous data sovereignty – the right of indigenous people and nations to govern the collection, ownership and application of data about indigenous people and communities. The data sovereignty framework she discussed is built on The First Nations Principles of Ownership, Control, Access and Possession (OCAP).

Responding to the well-known “Five Safes Framework” developed to protect statistical datasets (safe people, safe projects, safe settings, safe data and safe output), Professor Walter posed the question “Safe for whom?” and juxtaposed the “5Ds” of collecting indigenous data (disparity, deprivation, disadvantage, dysfunction and difference). As Aboriginal Australians are among the most surveilled through data collection, there is a perceived lack of privacy for these communities and a concern that Big Data-Open Data can magnify the alienation of, and harm to, Aboriginal communities.

Professor Toby Walsh (computer science at UNSW) described three cardinal sins in the use of predictive analytics in child welfare interventions. First, child abuse is inherently difficult to measure, and previous referrals to authorities are a poor proxy measure.

Second, relying on referrals focuses attention exclusively on people already known to authorities and fails to look for other risks to children. Third, the temptation is to include all available statistics in predictive risk models; as about 25 percent of these are measures of poverty, the result is that the models measure the parenting of the poor, rather than poor parenting.

Professor Fleur Johns (international law at UNSW) reported on her collaborative research on the use of data in humanitarian contexts. Her main point was that privacy issues cannot be addressed without taking inequality into account. While data collection and surveillance in humanitarian contexts have the potential to improve access and services, many affected individuals may be largely absent from official records and visible only through Big Data. Privacy is therefore not a level playing field, and the implications of this form of data representation for individuals in vulnerable situations need to be considered.

The panel’s comments generated a number of questions and considerable discussion among participants. As panel moderator, John Edwards observed that framing the issue narrowly as privacy may not adequately convey the fundamental nature of the range of rights that can be affected by Big Data, and that terms like autonomy, self-determination, control and governance are more useful for ensuring a fuller discussion.

Critical elements of best practice

John Edwards was also on a panel discussing the Critical Elements of Best Practice for Big Data-Open Data. The other panellists were Professor Tahu Kukutai from the University of Waikato, a founder of the Māori Data Sovereignty initiative, along with UNSW data protection academic Professor Graham Greenleaf and UNSW competition law academic Dr Katharine Kemp, whose research includes abuse of market dominance.

Mr Edwards discussed the work our office is undertaking on predictive analytics, and how this led to establishing six Principles for the Safe and Effective Use of Data and Analytics with Statistics New Zealand. These principles are intended to help agencies with data analytics activities, such as Big Data-Open Data.

He also discussed the 2017 report on the Ministry of Social Development’s policy of collecting individual client-level data from NGOs. A perverse outcome of the now-abandoned policy was that it created a disincentive for individuals to come forward for assistance, with the unintended result that those individuals would no longer be represented in the data.

The largely Australian audience was interested in Statistics New Zealand’s Integrated Data Infrastructure (IDI), which makes datasets available for research under safe conditions, providing a controlled alternative to the open release of datasets and the associated problem of ensuring adequate de-identification.

The Special Rapporteur is due to report his findings to the United Nations later this year.

Image credit: Digital by Kai Stachowiak (via PublicDomainPictures.net)
