American Journal of Science, Engineering and Technology

| Peer-Reviewed |

Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy

Received: 01 November 2019    Accepted: 27 November 2019    Published: 05 December 2019

Abstract

Artificial Intelligence (AI) is moving so rapidly that policy makers, regulators, governments and the legal profession are struggling to keep up. Yet AI is not new; it has been in use for more than two decades. The challenges that AI, personal data and cyber security law pose to current legal frameworks are nothing short of immense. These areas of law are, in part, at odds with each other, and are doing very different things. This paper explores some of the challenges emerging in Australia, Europe and Singapore. The challenge of the interrelationship between personal data and AI arguably begins with who manufactured the AI and, secondly, who owns it. A further challenge is defining AI. Most people understand what AI is and how it is beginning to affect the economy and our daily lives, but there is no clear legal definition of AI, because AI is so nebulous. This burgeoning area of law will challenge society, privacy and economic experts, regulators and technology innovators, as their interests continue to collide. Furthermore, the collection of personal data by AI challenges the notion of where responsibility lies: AI may collect, use and disclose personal data at different points along the technology chain. The paper highlights how current data protection laws, rather than promoting AI projects, largely inhibit their development. It identifies some of the tensions between data protection law and AI, and argues that an urgent and detailed understanding of the opportunities and the legal and ethical issues associated with data protection and AI is needed. Doing so will ensure an ongoing balance between the economic and social issues attached to the two areas of the law.

DOI 10.11648/j.ajset.20190404.11
Published in American Journal of Science, Engineering and Technology (Volume 4, Issue 4, December 2019)
Page(s) 55-65
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2019. Published by Science Publishing Group

Keywords

Artificial Intelligence, Data Protection, Australia, European Union, Singapore

References
[1] Robert Walters, Leon Trakman, Bruno Zeller, Data Protection Law: A Comparative Analysis of Asia-Pacific and European Approaches, Springer (2019).
[2] Japan Times, Full text of the G20 Osaka leaders' declaration, https://www.japantimes.co.jp/news/2019/06/29/national/full-text-g20-osaka-leaders-declaration/#.XTu6qWVeK8U
[3] Ibid. To further promote innovation in the digital economy, we support the sharing of good practices on effective policy and regulatory approaches and frameworks that are innovative as well as agile, flexible, and adapted to the digital era, including through the use of regulatory sandboxes. The responsible development and use of Artificial Intelligence (AI) can be a driving force to help advance the SDGs and to realize a sustainable and inclusive society. To foster public trust and confidence in AI technologies and fully realize their potential, we commit to a human-centered approach. We affirm the importance of protection of intellectual property. Along with the rapid expansion of emerging technologies including the Internet of Things (IoT), the value of an ongoing discussion on security in the digital economy is growing. We, as G20 members, affirm the need to further work on these urgent challenges.
[4] MIT Technology Review, When an AI finally kills someone, who will be responsible? Legal scholars are furiously debating which laws should apply to AI crime, https://www.technologyreview.com/s/610459/when-an-ai-finally-kills-someone-who-will-be-responsible/
[5] J. K.C. Kingston, Artificial Intelligence and Legal Liability, https://arxiv.org/pdf/1802.07782.pdf.
[6] Stanford University, One Hundred Year Study on Artificial Intelligence (AI100), (2016) https://ai100.stanford.edu/sites/g/files/sbiybj9861/f/ai_100_report_0901fnlc_single.pdf
[7] The National Science and Technology Council, The National Artificial Intelligence Research and Development Strategic Plan (2016), https://www.nitrd.gov/PUBS/national_ai_rd_strategic_plan.pdf.
[8] World Economic Forum, Artificial Intelligence Collides with Patent Law, White Paper, 2018, http://www3.weforum.org/docs/WEF_48540_WP_End_of_Innovation_Protecting_Patent_Law.pdf.
[9] Sinta Dewi Rosadi, Robert Walters, Bambang Pratama, Siti Yuniarti, Personal Data and Smart Appliances Used in the Home (forthcoming)
[10] Phillip Jackson, Introduction To Artificial Intelligence 1, Dover Publ’n, Inc., 2d ed. (1974), pp. 192-338.
[11] World Intellectual Property Organization Technology Trends, Artificial Intelligence, https://www.wipo.int/edocs/pubdocs/en/wipo_pub_1055.pdf.
[12] Greenberg, A., An AI That Reads Privacy Policies So That You Don’t Have To, wired.com (2018), available at https://www.wired.com/story/polisis-ai-reads-privacy-policies-so-you-dont-have-to/.
[13] Rob Sumroy, Natalie Donovan, AI and Data Protection: Balancing Tensions, Slaughter and May, https://www.slaughterandmay.com/media/2537572/ai-and-data-protection-balancing-tensions.pdf
[14] Council of Europe, Artificial Intelligence and Data Protection: Challenges and Possible Remedies https://rm.coe.int/artificial-intelligence-and-data-protection-challenges-and-possible-re/168091f8a6 Accurate testing of the training phase before the deployment of AI algorithms on a large scale could reveal hidden bias. Moreover, hidden bias may also involve machine-generated bias which is different from human bias. In the AI context, the assessment of potential bias can also become controversial, given the multiple variables involved and the classification of people into groups which do not necessarily correspond to the traditional discriminatory categories. Questions regarding machine bias cannot be deflected by the argument that human decisions are fallible, and that AI is a way to reduce human error.
[15] Australian Human Rights Commission, Human Rights and Technology Issues Paper, July 2018, https://tech.humanrights.gov.au/sites/default/files/2018-7/Human%20Rights%20and%20Technology%20Issues%20Paper%20FINAL.pdf
[16] Australian Human Rights Commission and World Economic Forum, Artificial Intelligence: governance and leadership, White paper, January 2019, https://www.humanrights.gov.au/sites/default/files/document/publication/AHRC%20WEF%20White%20Paper%20online%20version%20FINAL.pdf. They note that the potential impact of AI, including on other human rights, goes beyond privacy. For example, AI and related technologies could: bring radical changes in how we work, with predicted large-scale job creation and destruction and new ways of working; transform decision-making that affects citizens’ basic rights and interests; increase our environmental impact; become so important in how we live that accessibility of that technology becomes an even more important human rights issue; and have a profound impact on our democratic institutions and processes.
[17] Human Rights and Technology, Decision making and Artificial Intelligence, https://tech.humanrights.gov.au/our-work
[18] Personal Data Protection Commission Singapore, Discussion Paper on Artificial Intelligence (AI) and Personal Data – Fostering Responsible Development and Adoption of AI, https://www.pdpc.gov.sg/-/media/Files/PDPC/PDF-Files/Resource-for-Organisation/AI/Discussion-Paper-on-AI-and-PD---050618.pdf.
[19] Organisation for Economic Co-operation and Development (OECD), Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data (‘OECD Guidelines’). The OECD member countries are: Australia, Austria, Belgium, Canada, Chile, the Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea, Luxembourg, Mexico, the Netherlands, New Zealand, Norway, Poland, Portugal, the Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, the United Kingdom and the United States of America. The Commission of the European Communities takes part in the work of the OECD. http://www.oecd.org/sti/ieconomy/49710223.pdf, accessed 15 June 2018. A notable absentee from this list is Singapore.
[20] Regulation (EU) 2016/679 of the European Parliament and of the Council on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Official Journal of the European Union L 119/1, Article 4.
[21] Privacy Act 1988, section 6.
[22] Ibid. Additional categories of personal information include: membership of a professional or trade association; membership of a trade union; sexual orientation or practices; criminal record; health information about an individual; genetic information (that is not otherwise health information); biometric information that is to be used for the purpose of automated biometric verification or biometric identification; or biometric templates.
[23] Australian Privacy Principle 3: the collection is reasonably necessary for an APP entity’s functions or activities, or a listed exception applies.
[24] Personal Data Protection Act 2012, section 2.
[25] Personal Data Protection Act 2012. Organisation for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data 2013, http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.
[26] ePrivacy Regulation, European Data Protection Supervisor, Opinion on the Proposal for a Regulation on Privacy and Electronic Communications (ePrivacy Regulation), Article 10 and Recital 23.
[27] Ibid, Article 7. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the data subject shall be informed thereof. Notably, the word ‘shall’ provides a flexible approach to whether the data subject is informed or otherwise. When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract. Giller v Procopets (2008) 24 VR 1.
[28] Office of the Australian Information Commissioner, Australian Government: Key Concepts, https://www.oaic.gov.au/agencies-and-organisations/app-guidelines/chapter-b-key-concepts, accessed 12 November 2018.
[29] Privacy Act 1988.
[30] Ibid. Direct marketing, APP 7.15: The ‘reasonably expect’ test is an objective test that has regard to what a reasonable person, who is properly informed, would expect in the circumstances. This is a question of fact in each individual case. It is the responsibility of the organization to be able to justify its conduct. APP 7.16: Factors that may be important in deciding whether an individual has a reasonable expectation that their personal information will be used or disclosed for the purpose of direct marketing include where: the individual has consented to the use or disclosure of their personal information for that purpose (see discussion in paragraph 7.23 below and Chapter B (Key concepts) for further information about the elements of consent); the organization has notified the individual that one of the purposes for which it collects the personal information is direct marketing under APP 5.1 (see Chapter 5 (APP 5)); and the organization made the individual aware that they could request not to receive direct marketing communications from the organization, and the individual does not make such a request (see paragraph 7.21).
[31] Australian Privacy Principles, 7.2, 7.3, 7.4. Express consent is given explicitly, either orally or in writing. This could be a handwritten signature, oral statement, or use of an electronic or voice signature. Generally, it cannot be assumed a person has provided consent on the basis they did not object in the first place to allow their data to be processed or transferred to a third party. Furthermore, it will be difficult for an APP entity to establish that an individual’s silence can be taken as consent.
[32] Rogers v Whitaker (1992) 175 CLR 479, 490.
[33] Office of the Australian Information Commissioner, Australian Government, https://www.oaic.gov.au/agencies-and-organisations/app-guidelines/chapter-b-key-concepts.
[34] Australian Privacy Principle 3.
[35] Office of the Australian Information Commissioner, Australian Government: Key Concepts, https://www.oaic.gov.au/agencies-and-organisations/app-guidelines/chapter-b-key-concepts.
[36] Personal Data Protection Act 2012, Division 1.
[37] Wong YongQuan, B., ‘Data privacy law in Singapore: the Personal Data Protection Act 2012’, International Data Privacy Law, Vol. 7, No. 4 (2017).
[38] Personal Data Protection Commission, Advisory Guidelines on Key Concepts in the Personal Data Protection Act at para 12.5.
[39] Personal Data Protection Act 2012, Division 1, sections 13-17.
[40] Personal Data Protection Commission, Advisory Guidelines on Key Concepts in the Personal Data Protection Act, (2017).
[41] Personal Data Protection Act 2012, sections 15 and 17, in accordance with the Second Schedule (collection), Third Schedule (use) and Fourth Schedule (disclosure).
[42] Yip, M., ‘Personal Data Protection Act 2012: Understanding the consent obligation’, Singapore Management University (2017).
[43] Personal Data Protection Act 2012, section 13.
[44] Personal Data Protection Act 2012, section 20, Notification Obligation.
[45] Yip, M., ‘Personal Data Protection Act 2012: Understanding the consent obligation’, Singapore Management University (2017). The PDPA acknowledges that certain forms of socially, morally or legally acceptable uses of personal data do not require the individual’s consent.
[46] Personal Data Protection Act 2012, section 21.
[47] Personal Data Protection Act 2012, section 21 (2). The Third, Fourth and Fifth Schedules provide a list of exemptions: use of data without consent, disclosure of data without consent, and exemption from access. Section 21 (3) provides circumstances in which an organization ‘must not’ provide personal data or other information. A provision such as this is important for, and applies to, the protection of physical or mental health, or where disclosure would reveal the identity of an individual who has provided personal data about another individual. Therefore, no data or information is to be released where doing so would be contrary to the national interest.
[48] Data Protection Impact Assessment (DPIA) and prior consultation of the Supervisory Authority, Articles 35 and 36 of GDPR. In addition, “where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk”, the data controller shall consult the relevant Data Protection Supervisory Authority under Article 36 of GDPR.
[49] Personal Data Protection Commissioner, Guide to Data Protection Impact Assessments (2017). Data protection risks are best addressed when the system or process is (i) new and in the process of being designed, or (ii) in the process of undergoing major changes. Introducing changes to address data protection risks after the design of a process or system has been finalised or implemented will likely lead to increased cost and effort. Some examples of when to conduct a DPIA include: creating a new system that involves the handling of personal data (e.g. a new website that collects personal data); creating a new process, including manual processes, that involves the handling of personal data (e.g. a receptionist collecting personal data from visitors); changing the way that existing systems or processes handle personal data (e.g. a redesign of the customer registration process); and changes to the organisational structure that affect the department handling personal data (e.g. mergers and acquisitions, restructuring).
[50] Office of the Australian Information Commissioner, Guide to undertaking privacy impact assessments, https://www.oaic.gov.au/agencies-and-organisations/guides/guide-to-undertaking-privacy-impact-assessments.
Author Information
  • Lecturer, Victoria Law School, Victoria University, Melbourne, Australia

  • Associate, Asian Law Centre, Faculty of Law, University of Melbourne, Melbourne, Australia

Cite This Article
  • APA Style

    Robert Walters, Matthew Coghlan. (2019). Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy. American Journal of Science, Engineering and Technology, 4(4), 55-65. https://doi.org/10.11648/j.ajset.20190404.11


    ACS Style

    Robert Walters; Matthew Coghlan. Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy. Am. J. Sci. Eng. Technol. 2019, 4(4), 55-65. doi: 10.11648/j.ajset.20190404.11


    AMA Style

    Robert Walters, Matthew Coghlan. Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy. Am J Sci Eng Technol. 2019;4(4):55-65. doi: 10.11648/j.ajset.20190404.11


  • @article{10.11648/j.ajset.20190404.11,
      author = {Robert Walters and Matthew Coghlan},
      title = {Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy},
      journal = {American Journal of Science, Engineering and Technology},
      volume = {4},
      number = {4},
      pages = {55-65},
      doi = {10.11648/j.ajset.20190404.11},
      url = {https://doi.org/10.11648/j.ajset.20190404.11},
      eprint = {https://download.sciencepg.com/pdf/10.11648.j.ajset.20190404.11},
     year = {2019}
    }
    


  • TY  - JOUR
    T1  - Data Protection and Artificial Intelligence Law: Europe Australia Singapore - An Actual or Perceived Dichotomy
    AU  - Robert Walters
    AU  - Matthew Coghlan
    Y1  - 2019/12/05
    PY  - 2019
    N1  - https://doi.org/10.11648/j.ajset.20190404.11
    DO  - 10.11648/j.ajset.20190404.11
    T2  - American Journal of Science, Engineering and Technology
    JF  - American Journal of Science, Engineering and Technology
    JO  - American Journal of Science, Engineering and Technology
    SP  - 55
    EP  - 65
    PB  - Science Publishing Group
    SN  - 2578-8353
    UR  - https://doi.org/10.11648/j.ajset.20190404.11
    VL  - 4
    IS  - 4
    ER  - 

