
Development of Standards, Auditing, and Regulatory Compliance Evaluation
- Development of standards and of audit, monitoring, and certification methodologies for systems that use or apply artificial intelligence and other emerging technologies, including in peace and security contexts.
- Auditing and regulatory compliance evaluation services in areas such as privacy, data protection, cybersecurity, intellectual property, and user rights, covering safety in the design, training, and use of AI systems across sectors, including peace and security.
- Design and implementation of internal and external auditing programs to identify, evaluate, and mitigate legal risks in the use of AI and other technologies across their entire life cycles, involving all actors at each phase.
- Preparation of compliance reports and recommendations for adjusting the development and use processes of emerging technologies so that they align with current local and international regulations.
- Provision of training, best practices, and response capacity to various actors to ensure compliance with the requirements and standards for the responsible use and protection of AI systems and emerging technologies.