
A design blueprint for privacy

Riki Jamieson-Smyth
19 December 2016


I was recently lucky enough to attend the Asian Privacy Scholars Network 5th International Conference, hosted by the Business School at the University of Auckland.

The inspiring line-up of privacy thinkers from around the world included the Honourable Michael Kirby, Prof Kiyoshi Murata of Japan’s Meiji University, and Prof Dr Sarah Hosell of the University of Applied Sciences in Cologne. You can find out more about the speakers and topics here. Their presentations will also be published in due course.

Snapchat and sexting

One outstanding privacy commentator was Prof Woodrow Hartzog of Samford University, Alabama. Prof Hartzog is the Starnes Professor of Law at Cumberland School of Law and an Affiliate Scholar at the Center for Internet and Society at Stanford Law School. He spoke about his upcoming book, Privacy's Blueprint: The Battle to Control the Design of New Technology.

Prof Hartzog began his presentation with the example of Snapchat, a smartphone application that by design invites users to send sensitive information: its picture messages disappear within seconds of the recipient opening them. When Prof Hartzog asked what the purpose of such an app might be, there were delighted calls of "sexting!" from the mostly middle-aged scholarly audience.

Third-party operators soon appeared in Snapchat's wake, offering ways for users to capture the images before they disappeared. Inevitably, this led to the data breach known as ‘The Snappening’. But shouldn’t Snapchat have been prepared for this eventuality?

Hacks and data breaches

Recently, there have been many other hacks and data breaches in the news media - Ashley Madison, the Australian Census site, Hacking Team, Yahoo to name a few - and yet we see agencies applying sticking-plaster solutions and some governments even acting to criminalise ‘white hat’ (or ethical) hackers who work to expose vulnerabilities safely and alert the relevant agency.

What’s the answer? Prof Hartzog makes three broad points:

  1. Design matters for privacy;
  2. Privacy law should take design more seriously; and
  3. A design agenda should have its roots in consumer protection and surveillance law. 

Making Privacy by Design meaningful

There are huge gaps in privacy law concerning the design of new technology, and Privacy by Design (PbD) has a long way to go before it reaches the universal acceptance it deserves, according to Hartzog. Furthermore, we need to make sure PbD is a meaningful concept and not just a slogan.

Prof Hartzog says privacy's three basic rules are:

  1. Give individuals some control over their own data;
  2. Don't tell lies; and
  3. Don't cause any harm.

But what do these three aspirational points mean in the real world? How can people control what they don't understand? How can you understand what you are consenting to with a single click as you eagerly wait to use your new app? And how realistic is it to go back and check the 50 apps you already have on your phone?

And while designers might not deliberately tell lies, what about obscuring the important details within the usual "accept all" requirement before downloading a new app?

And finally, how do we define harm? In New Zealand, we have a definition in our Privacy Act and some guidance from the Human Rights Review Tribunal, particularly following this precedent-setting case, and others like this one. But harm can be difficult to attribute to a single cause when your personal information is leaking from numerous sources.

Prof Hartzog says the big problem is the overwhelming incentive to design technology which maximises the collection, use, and disclosure of personal information. The value of personal information encourages a “collect first, ask questions later mentality” which marginalises the virtue of being transparent.

While there are some good examples of privacy-protective design, many new digital products and services are not good enough and erode our privacy rights.

In short, the design of new information technologies is failing us.

Three values for design

The three values of Prof Hartzog’s blueprint for designing for privacy are trust, obscurity and autonomy. These three values are intertwined. Autonomy is furthered as a design value when privacy law nurtures technologies that protect our ability to trust and maintain obscurity. Trust and obscurity are complementary values. Trust protects our information within relationships. Obscurity protects us when there is no one to trust.

He also says designers need to design to standards so their products are not deceptive, abusive or dangerous. Lawmakers and the courts need the right tools to discourage deceptive, abusive or dangerous design. These tools vary in strength from soft to moderate to robust. Robust responses should be used to confront the most serious privacy design problems. Lawmakers should seek balance and fit when choosing the appropriate legal response. Their toolbox should include privacy enhancing technologies, education, investigations and enforcement, fines and penalties and international collaboration. If you have others, we welcome your suggestions.

In conclusion, Woodrow Hartzog is a bit of a privacy hero with some really cool ideas. You can follow him on Twitter at @hartzog. When his book is published in 2017, I will be reading it.

Image credit: Red and white bullseye design by Peter Kratochvil


Comments

  • I look forward to reading more from Prof Hartzog. One of the challenges for organisations is finding individuals who can provide tangible examples and actions of applying Privacy by Design (PbD) concepts. There are plenty of articles and books about the importance of PbD and of course the regulatory stick of the GDPR requires PbD concepts are applied, however there are few privacy engineer roles in existence. A role that we should surely see grow in the future.

Posted by Jacqueline Peace, 11/01/2017 10:46am


    The aim of the Office of Privacy Commissioner’s blog is to provide a space for people to interact with the content posted. We reserve the right to moderate all comments. We will not publish any content that is abusive, defamatory or is obviously commercial. We ask for your email address so that we can contact you if necessary to clarify your comment. Please be respectful of authors and others leaving comments.
