On 3-4 December, we hosted the 50th Asia-Pacific Privacy Authorities (APPA) Forum in Wellington. We welcomed representatives from fifteen privacy authorities from the Asia-Pacific – as well as guests from around the world – to discuss global trends, share experiences, and work towards greater cooperation.
This post covers some of the highlights from the forum. The forum communiqué includes a summary of the two days and a list of organisations represented at the forum.
Read the 50th APPA Forum Communiqué (external link)
Markus Heyder of the Centre for Information Policy Leadership moderated a panel of APPA regulators and guests from global companies on regulatory approaches to the challenges of artificial intelligence (AI). The panellists included representatives from Hong Kong's Office of the Privacy Commissioner for Personal Data (PCPD), Microsoft, Facebook, and Singapore's Personal Data Protection Commission (PDPC).
All the panellists agreed that it’s vital for people to be able to trust how agencies use AI. Without trust, people won’t provide agencies with their data and the mutually beneficial relationship breaks down.
Mr Tse presented the PCPD’s “Ethical Accountability Framework for Hong Kong, China,” the findings of a project which aims to help foster a culture of fair and ethical data processing. The project called for agencies to incorporate ethical values into their data processing policies and be transparent about processes. The project also recommended practical tools to help agencies do this.
Read the “Ethical Accountability Framework for Hong Kong, China” (external link)
Mr Brick emphasised that Microsoft was committed to designing AI to "earn trust". To that end, Mr Brick said that Microsoft was developing its AI with reliability, security, inclusiveness, transparency, and accountability in mind.
Mr Brick also recognised that AI processing was vulnerable to bias – he said that Microsoft’s plan to counter this is to employ a diverse pool of AI talent and develop analytical techniques to detect and eliminate bias, as well as using people to review AI decisions.
Ms Micas called AI an essential enabler for Facebook. She described how Facebook uses AI to help keep the platform safe, identifying 86% of violent content for removal before users report it. AI has also helped Facebook identify spam, fake accounts, and posts that may need translation. Ms Micas stressed that Facebook was committed to managing and countering the risks of AI through ethical processes.
Like Hong Kong, Singapore is working to support AI development and adoption through governance and ethics, and Mr Yeong outlined the PDPC's strategy for doing so.
Mr Yeong also detailed PDPC’s efforts to strike a balance between regulation and innovation. This includes proposed amendments to Singapore’s Personal Data Protection Act that will create a notification and opt-out approach for circumstances where there’s no foreseeable adverse impact on individuals, and an exception to consent where there’s a need to protect legitimate interests that will benefit the public.
Mr Yeong stressed that these will be subject to accountability measures, and the proposed amendments will also make re-identification of anonymised data a criminal offence.
The right to be de-linked from personal information that’s available to the public, or the “right to be forgotten,” has been a contentious privacy topic over the last few years. APPA members presented on and discussed how elements of the right to be forgotten worked in their jurisdictions.
Mark Eichorn, Assistant Director at the United States Federal Trade Commission, noted that California's recently passed Consumer Privacy Act gives consumers the right to request deletion of their personal information. He added that a general right to be forgotten needs to be balanced with protections under the First Amendment, which guarantees Americans the right to freedom of speech.
Guest speaker Kylie Jackson-Cox, PhD candidate at Auckland University, gave a presentation about whether public facts may, through the passage of time, gain privacy protection in law.
Ms Jackson-Cox discussed some of the factors that need to be weighed, such as a reasonable expectation of privacy, public interest, time, rehabilitation, and freedom of expression.
She also spoke about some legal cases that have reached different conclusions about the privatisation of public facts. In 1931, a woman in California successfully sued the producer of a film that depicted her trial and acquittal for murder and used her real name. The court held that using her name in conjunction with incidents from her life was a breach of privacy.
Read Melvin v Reid 297 P 91 (Cal App 1931) (external link)
More recently, a 2018 case in the United Kingdom led to the High Court ordering Google to delist several results on its search engine that linked a businessman to his prosecutions for hacking and phone tampering.
Read NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB) (external link)
At the other end of the spectrum, the Supreme Court of Idaho found that a newspaper publishing a statement by a third party containing allegations that the plaintiff engaged in homosexual activity did not breach the plaintiff’s privacy because the information was on the public record.
Read Uranga v. Federated Publications, Inc., 67 P.3d 29 (Idaho 2003) (external link)
We were lucky to have representatives from the African Network of Data Protection Authorities (RAPDP) in attendance. Marguerite Ouedraogo (Chairwoman of RAPDP) and Lahoussine Aniss (Permanent Secretariat of RAPDP) spoke to APPA members about privacy and data protection in Africa.
Ms Ouedraogo and Mr Aniss said that privacy and data protection are a growing concern in Africa – countries are recognising privacy as a fundamental human right, as well as its importance to economic development, security, and digital sovereignty.
According to the United Nations Conference on Trade and Development (UNCTAD), 23 African countries have enacted privacy laws, and another seven countries are drafting legislation. Numerous countries also participate in Pan-African, regional and international frameworks that include data protection measures.
UNCTAD – Data protection and privacy legislation worldwide (external link)
The RAPDP’s role is to promote privacy and data protection in Africa and develop cooperation and synergy between African data protection authorities (DPAs). The RAPDP and DPAs are part of the growing privacy and data protection ecosystem in Africa, which also includes NGOs, business representatives and academics.
Ms Ouedraogo and Mr Aniss described the challenges that this ecosystem is facing. The need for privacy regulation is increasing along with the spread of internet use across the continent. National identity schemes, compulsory SIM card registrations, and surveillance legislation are affecting people's privacy in some African nations. At the same time, there is a sense of "privacy myopia": governments, corporations, and individuals all underestimate the importance of privacy and data protection. DPAs are under-resourced, and there is a common perception that privacy is a barrier for agencies rather than an opportunity.
Advancing technology and a lack of privacy awareness were familiar challenges to APPA members. We welcomed the opportunity to hear about the exciting developments in Africa and look forward to further cooperation in the future.