• Kyrgizion@lemmy.world · 4 days ago

    I suppose this can be mitigated by installing a local LLM that doesn’t phone home. But there’s still the risk of getting outright bad advice, since so many LLMs just tell their users they’re always right, or twist the facts to fit that view.

    I’ve been guilty of this as well; I’ve used ChatGPT as a “therapist” before. It actually gives decently helpful advice compared to what’s available from a Google search. But I’m fully aware of the risks “down the road,” so to speak.
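    For the local route, here’s a minimal sketch using llama-cpp-python, which runs entirely offline. The model filename is just a placeholder; point it at whatever GGUF file you’ve downloaded. You can also lean on the system prompt to push back against the sycophancy problem, though that’s no guarantee:

    ```python
    from llama_cpp import Llama

    # Load a locally downloaded GGUF model; nothing leaves the machine.
    # The path below is hypothetical -- substitute your own model file.
    llm = Llama(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

    reply = llm.create_chat_completion(
        messages=[
            # Try to counteract the "you're always right" tendency up front.
            {"role": "system", "content": "Be direct. Push back when the user is wrong."},
            {"role": "user", "content": "I think everyone at work is out to get me."},
        ],
    )
    print(reply["choices"][0]["message"]["content"])
    ```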

    • TrueStoryBob@lemmy.world · 3 days ago

      > so many LLMs just tell their users they’re always right

      This is the problem: they apparently can’t be objective as a matter of course.