Singapore: PDPC requests public comments on personal data in AI systems guidelines

On July 18, 2023, the Personal Data Protection Commission (PDPC) launched a public consultation on the Proposed Advisory Guidelines on Use of Personal Data for AI Recommendation and Decision Systems.


In particular, the guidelines aim to clarify how the Personal Data Protection Act 2012 (PDPA) applies to the collection and use of personal data by organizations to develop and deploy systems that embed machine learning models, whether those systems make decisions autonomously or assist a human decision-maker through recommendations and predictions. The guidelines are not legally binding and do not constitute legal advice. Further, the guidelines state that they apply to all collection and use of personal data by an organization, including the collection and/or processing of personal data to develop, test, and monitor artificial intelligence (AI) systems.

Development, testing, and monitoring

The guidelines note that businesses may wish to rely on business improvement or research exceptions to consent when using personal data to develop AI systems. Specifically, the business improvement exception applies when the organization has developed a product or has an existing product that it is enhancing, while the research exception is relevant when the organization is conducting commercial research to advance science and engineering without a product development roadmap.

The guidelines stipulate that organizations must also take into account appropriate technical, process, and/or legal controls for data protection when designing, training, testing, or monitoring AI systems using personal data. Specifically, measures and principles to consider include anonymization/pseudonymization, Privacy by Design, and data minimization, among others, when developing, testing, and monitoring AI systems.

Collection and use of personal data

Notably, the guidelines establish that where AI systems are deployed to end users, consent will be required for the collection and use of personal data, unless deemed consent or an exception to the consent obligation (e.g., legitimate interests) applies. The consent obligation is complemented by the notification obligation, which requires organizations to inform users of what information is collected and for what purpose. The guidelines provide examples of such notifications.

In addition, according to the guidelines, organizations must take into account the accountability obligation under the PDPA. The level of detail provided about the personal data collected for processing should be proportionate to the risks present in each use case; specifically, organizations should consider the potential harm to the individual and the level of autonomy of the AI system.

Procurement of AI systems

Finally, the guidelines provide that service providers that process personal data on behalf of their customers, as part of developing and deploying bespoke or fully customizable AI systems, may act as data intermediaries and may have to comply with corresponding obligations under the PDPA. Service providers acting as data intermediaries must:

  • at pre-processing stage, use techniques such as data mapping and labeling to keep track of data that was used to form the training dataset; and
  • maintain a provenance record to document the lineage of the training data that identifies the source of training data and tracks how it has been transformed during data preparation.

Public comments can be submitted to the PDPC until August 31, 2023.

You can read the press release here and the guidelines here.