• BombOmOm@lemmy.world · 2 days ago

    A hallucination is something that disagrees with your active inputs (eyes, ears, etc.). AIs don’t have those active inputs; all they have is the human equivalent of memories. Everything they draw up is a hallucination, literally all of it. It’s simply coincidence when a hallucination matches reality.

    Is it really surprising that a thing which can only create hallucinations is often wrong? Or that it will keep being wrong on a regular basis in the future?

    • mindbleach@sh.itjust.works · 1 day ago

      My guy, Microsoft Encarta 97 doesn’t have senses either, and its recollection of the capital of Austria is neither coincidence nor hallucination.