
The FDA's generative AI tool, Elsa, designed to expedite drug approval processes, has been found to hallucinate: it fabricates non-existent studies and misinterprets real research. Employees report that Elsa's unreliable outputs require extensive human verification, raising concerns about risks to drug safety if its outputs are trusted without oversight. [AI generated]