In brief
The Cyber Security Agency of Singapore (CSA) has just released Guidelines on Securing AI Systems ("Guidelines") and a Companion Guide on Securing AI Systems ("Companion Guide").
The Guidelines advocate for a "secure by design" and "secure by default" approach, addressing both existing cybersecurity threats and emerging risks, such as adversarial machine learning. The aim is to provide system owners with principles for raising awareness and implementing security controls throughout the AI lifecycle.
The Companion Guide is an open-collaboration resource, and while not mandatory, it offers guidance on useful measures and controls informed by industry best practices, academic insights and resources such as the MITRE ATLAS database and OWASP Top 10 for Machine Learning and Generative AI.
CSA is currently seeking feedback on both the Guidelines and the Companion Guide. Interested organizations have until 11:59 pm on 15 September 2024 to submit comments to Aisecurity@csa.gov.sg.
Guidelines on Securing AI Systems
The Guidelines address the security of AI systems throughout their lifecycle, focusing on cybersecurity risks rather than AI safety, fairness, transparency or the misuse of AI in cyberattacks. Organizations are encouraged to:
- Raise awareness and conduct risk assessments at the planning and design phase
- Secure supply chains, choose appropriate models, track and protect AI assets and secure development environments at the development phase
- Secure deployment infrastructure, establish incident management procedures and release responsibly at the deployment phase
- Monitor inputs and outputs, manage updates securely, and establish vulnerability disclosure processes at the operations and maintenance stage
- Dispose of data and models properly at the end-of-life phase
Companion Guide on Securing AI Systems
The Companion Guide is a more detailed document that supports system owners in implementing the Guidelines, setting out practical measures and controls they may consider when adopting AI systems. For example, the Companion Guide explains how organizations should:
- Begin with risk assessments
- Identify relevant measures/controls in checklists for each phase of the AI lifecycle, spanning planning and design, development, deployment, operations and maintenance, and end-of-life
The Companion Guide also provides detailed walkthroughs and implementation examples showing how controls may be applied to AI systems.
Key takeaways
The Guidelines and Companion Guide are welcome developments that underscore the CSA's commitment to a collaborative and proactive approach to securing AI systems. As Singapore continues to position itself at the forefront of technological innovation, resources such as the Guidelines and the Companion Guide will play an important role in building trust by helping to ensure that Singapore's AI systems remain robust and resilient against vulnerabilities.
* * * * *
© 2024 Baker & McKenzie.Wong & Leow. All rights reserved. Baker & McKenzie.Wong & Leow is incorporated with limited liability and is a member firm of Baker & McKenzie International, a global law firm with member law firms around the world. In accordance with the common terminology used in professional service organizations, reference to a "principal" means a person who is a partner, or equivalent, in such a law firm. Similarly, reference to an "office" means an office of any such law firm. This may qualify as "Attorney Advertising" requiring notice in some jurisdictions. Prior results do not guarantee a similar outcome.