Singapore: PDPC releases advisory guidelines on the use of personal data in AI recommendation and decision systems

In brief

The Personal Data Protection Commission (PDPC) has issued the finalized Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems ("Guidelines"). These Guidelines provide guidance on the use of personal data during three stages of AI system implementation: development, deployment (business-to-consumer) and procurement (business-to-business). In particular, the Guidelines clarify and elaborate on the application of the Consent Obligation and Notification Obligation, and their exceptions, under the Personal Data Protection Act (PDPA) to the use of personal data in AI systems.


Contents

Key takeaways

  • When developing AI systems, organizations should consider whether the business improvement and research exception may be relied upon instead of seeking consent. In addition, appropriate data protection measures must be put in place.
  • When deploying AI systems in consumer-facing products and services, such as in recommendation and decision systems, meaningful consent must be obtained. Alternatively, organizations can consider whether the legitimate interests exception may be relied upon.
  • Data intermediaries that develop or deploy bespoke AI systems for organizations controlling personal data should assist these organizations in meeting their obligations under the PDPA, such as by data mapping, data labeling and maintaining a data provenance record; these practices appear to go beyond the current best practices for data intermediaries.
  • Such data intermediaries are encouraged to assist organizations in meeting their notification, consent and accountability obligations, although they are not themselves subject to those obligations.

In more detail

Background

In July 2023, the PDPC carried out a public consultation on the proposed advisory guidelines on the use of personal data in AI systems (see our newsletter from July 2023), which closed on 31 August 2023.

On 1 March 2024, the PDPC issued a closing note to the public consultation to address issues raised and simultaneously issued the Guidelines.

The closing note summarizes the key matters raised by respondents and the PDPC's responses to these matters. Among other things, the PDPC stated the following:

  • The Guidelines will assist organizations in applying exceptions to the consent obligation under the PDPA, such as the business improvement and research exceptions. Examples may be included, but are not intended to be exhaustive.
  • The PDPA will not override requirements in sectoral regulations, such as any that may apply to the financial sector.
  • Following requests for clarification, the PDPC has included guidance on the application of the legitimate interests exception.
  • The section on best practices for service providers is intended to apply to system integrators and bespoke solution providers rather than software-as-a-service companies.
  • Guidance on allocation of liability in relation to AI systems outside data protection is beyond the scope of the PDPA and would not be proper to include in the Guidelines.

Scope

The Guidelines cover situations involving the design or deployment of AI systems that interact with personal data. The Guidelines aim to (i) clarify how the PDPA applies when organizations use personal data to develop and train AI systems; and (ii) provide baseline guidance and best practices for organizations on how to be transparent about whether and how their AI systems use personal data to make recommendations, predictions or decisions.

The Guidelines cover the use of personal data in three main stages of AI system implementation, as follows:

  • Development, testing and monitoring of AI systems
  • Deployment: collecting and using personal data in AI systems (business-to-consumer)
  • Procurement: provision of bespoke AI systems (business-to-business)

Use of personal data in AI system development, testing and monitoring

Business improvement and research exceptions

The business improvement exception and research exception may be relied upon, instead of seeking consent, where appropriate. In general, the business improvement exception is relevant where the organization has developed a product, or has an existing product that it is enhancing; it also covers intragroup and intracompany data sharing. The research exception is relevant when the organization is conducting commercial research to advance science and engineering, without a product development roadmap.

The Guidelines provide several factors for organizations to consider in determining whether to rely on the exceptions. Some pertinent factors include (but are not limited to) the following:

  • For the business improvement exception:
    • Whether using personal data for this purpose contributes toward improving the effectiveness or quality of the AI systems
    • Common industry practices or standards on how to develop, test and monitor such AI systems or ML models
  • For the research exception:
    • Whether the use of personal data helps develop more effective methods to improve the quality or performance of the AI system
    • Whether the use contributes to developing industry practices or standards for the development and deployment of AI systems or ML models

The usual restrictions and conditions for relying on the business improvement and research exceptions will apply: (i) for the business improvement exception, the organization must ensure that the business improvement purposes cannot reasonably be achieved without using the personal data in an individually identifiable form; and (ii) for the research exception, there must be a clear public benefit to using the personal data for the research purpose.

Data protection considerations

The Guidelines remind organizations that appropriate technical, process or legal controls for data protection should be included during the AI system development process. The practice of data minimization will also reduce unnecessary risks. In deciding the kinds of controls to be implemented, companies can consider (i) the types of risks that the personal data would be subject to and (ii) the sensitivity and volume of the personal data used.

Best practices may include pseudonymization or de-identification where appropriate. Where this is not possible, organizations are encouraged to conduct a data protection impact assessment. Even anonymized data (which falls outside the scope of the PDPA) carries a risk of re-identification, so appropriate controls should still be implemented.

Use of personal data in AI system deployment

The Guidelines address the situation where organizations deploy AI systems that collect and use personal data to provide new functionalities or enhance product features, such as to provide recommendations to users. The consent obligation and notification obligation under the PDPA will apply.

To obtain users' informed and meaningful consent, notifications should provide sufficient information without being overly technical. The information provided can also be "layered," such that the most relevant information is presented prominently, while further details are provided elsewhere.

Alternatively, the legitimate interests exception may apply; this generally refers to any lawful interests of an organization. Several specific purposes are also defined to constitute "legitimate interests" under the PDPA; in particular, the Guidelines provide that the detection or prevention of illegal activity is such a purpose. Where this exception applies, personal data may be processed without consent.

Finally, the Guidelines also encourage organizations to ensure that they have sufficiently discharged the accountability obligation. The use of AI systems should be made known to users, and the level of detail provided should be proportionate to the risks of each use case. To build consumer confidence, measures taken to safeguard personal data and ensure the fairness of recommendations could also be preemptively disclosed to users, instead of only upon request.

Use of personal data in AI system procurement

The final section of the Guidelines addresses service providers, such as systems integrators, that are engaged by organizations for professional services for the development and deployment of bespoke or fully customizable AI systems. It does not apply to organizations that develop their own AI systems in-house or those that use off-the-shelf solutions.

Such service providers could constitute data intermediaries under the PDPA, and it is good practice for them to adopt measures such as data mapping, data labeling and maintaining a data provenance record. While these measures will support data intermediaries in assessing whether there has been unauthorized access to or modification of training data sets, they appear to go beyond the current best practices of data intermediaries.

In addition, although they are not subject to these obligations themselves, service providers are also encouraged to support their clients in meeting their notification, consent and accountability obligations. This is because these clients may rely on the technical expertise of the service providers to meet their obligations under the PDPA. This recommendation appears to be beyond the obligations currently imposed on data intermediaries pursuant to the PDPA.

* * * * *

For further information and to discuss what this might mean for you, please get in touch with your usual Baker McKenzie contact.


© 2024 Baker & McKenzie.Wong & Leow. All rights reserved. Baker & McKenzie.Wong & Leow is incorporated with limited liability and is a member firm of Baker & McKenzie International, a global law firm with member law firms around the world. In accordance with the common terminology used in professional service organizations, reference to a "principal" means a person who is a partner, or equivalent, in such a law firm. Similarly, reference to an "office" means an office of any such law firm. This may qualify as "Attorney Advertising" requiring notice in some jurisdictions. Prior results do not guarantee a similar outcome.

Contact Information
Andy Leck
Principal
Singapore/Yangon
andy.leck@bakermckenzie.com
Ken Chia
Principal
Singapore
ken.chia@bakermckenzie.com
