large language model

1 Fact

AI hallucinations are instances in which a generative AI system or large language model produces false, misleading, or inaccurate information and presents it as factual.

1 Related Topic
AI hallucination
Term used to describe reliability failures of generative AI and large language models.