• finitebanjo@piefed.world
    20 days ago

    Unfortunately, an LLM lies to you about 1 time in 5 to 1 time in 10 (80% to 90% accuracy), with a hard limit proven in OpenAI and DeepMind research papers: even with infinite power and resources it would never approach human language accuracy. Add to that the fact that the model is trained on human inputs, which are themselves flawed, so its error rate compounds with an average person's rate of being wrong.
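    To make that compounding point concrete, here's a rough sketch with made-up numbers; it assumes the model's own error rate and the error rate of its human sources are independent, which is an assumption for illustration, not something from the papers mentioned above:

        # Illustrative sketch only: accuracy numbers are assumed, and treating
        # the two error sources as independent is a simplifying assumption.
        model_accuracy = 0.85     # midpoint of the "80% to 90%" figure
        training_accuracy = 0.90  # hypothetical rate at which its human sources are right

        # If the model both has to have learned the right thing and has to repeat it
        # correctly, the accuracies multiply.
        combined = model_accuracy * training_accuracy
        print(f"combined accuracy ~ {combined:.2f}")  # ~0.77, wrong roughly 1 in 4 times

    Even with generous numbers, the product ends up lower than either rate on its own.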

    In other words, you’re better off browsing forums and asking people, or finding books on the subject, because the AI is full of shit. You’ll end up as one of those idiot sloppers everybody makes fun of: you won’t know jack shit, and you’ll be confidently incorrect.

        • Cabbage_Pout61@lemmy.world
          20 days ago

          How would I search for something when I don’t know what it’s called? As I explained, the AI is just there to tell me “hey, this thing X exists”, and after that I go look for it on my own.

          Why am I a moron? Isn’t it the same as asking another person and then doing the heavy lifting yourself?

          ^(edit: typo)