Note
Proves via diagonalization that LLMs cannot eliminate hallucination for all computable functions; external symbolic reasoning required.
Citation Key
xu2024hallucination