The OECD.AI Policy Navigator

Our policy navigator is a living repository of policy initiatives from more than 80 jurisdictions and organisations. Use the filters to browse initiatives and find what you are looking for.

Artificial Intelligence Safety Institute (AISI)


Added by:   National contact point
Added on:   09 Jul 2025
Updated by:   OECD analyst
Updated on:   09 Jul 2025

The Korea AI Safety Institute (Korea AISI) is an organization dedicated to evaluating the risks associated with AI models or systems and researching technologies for preventing and mitigating those risks. It represents the Republic of Korea in collaborating with AI safety research institutes and related organizations worldwide.

Name in original language

인공지능안전연구소

Initiative overview

The Korea AISI, as a leading organization in AI safety policy, carries out the following responsibilities:

1. AI Safety Policies

  • Research on AI Safety Policies
  • AI Safety Consulting Service
  • AI Technology Impact Assessment
  • Establishment of AI Safety Information System
  • National Security Risk Response

2. AI Safety Assessment

  • AI Risk Definition and Classification
  • AI Safety Framework Development
  • AI Safety Evaluation and Infrastructure Establishment
  • AI Safety Testing Verification and Validation
  • Comprehensive AI Safety Evaluation

3. AI Safety Research

  • Fundamental AI Safety Technologies
  • Research on Future Risks and Preemptive Responses

4. External Collaboration on AI Safety

  • National Hub for AI Safety
  • Global Collaboration Gateway

Name of responsible organisation (in English)

Electronics and Telecommunications Research Institute (ETRI)

About the policy initiative


Organisation:

  • Electronics and Telecommunications Research Institute (ETRI)

Category:

  • National – AI governance bodies or mechanisms

Initiative type:

  • Advisory bodies, offices or processes

Participating organisations:


Participating countries:


Status:

  • Active

Start year:

  • 2024

Target sectors:


Other relevant URLs: