• SpaceNoodle@lemmy.world

    Sure, but we’re still not even close to useful AI agents. We need ones that are correct, not just designed to sound correct.

    • 0x01@lemmy.ml

      For programming we are well into useful territory, but definitely not for other walks of life

      • SpaceNoodle@lemmy.world

        Depends on the type of programming. The output is still inconsistent and a little sloppy in the real-life examples I’ve seen, and my personal attempts at getting anything useful have been fruitless, but I’m not doing boilerplate junk.

        • naeap@sopuli.xyz

          Yeah, I tried to be lazy and have it generate a regex and config file for a custom log format in lnav (a multi-log-file viewer, at least for Linux; don’t know about other platforms).

          And it pretty much ran in circles with its mistakes. Every time I corrected one thing, it would reintroduce a mistake from earlier.

          In the end I did it myself, and could have spared myself nearly an hour of trying this stupid vibe-coding trend…
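
          For reference, a minimal sketch of the kind of named-group regex an lnav custom format needs, prototyped here in Python (the log layout and field names are hypothetical, and the regex syntax in lnav’s own JSON format files may differ slightly):

          ```python
          import re

          # Hypothetical custom log line: "2024-05-01 12:34:56 [WARN] retrying connection"
          # An lnav format definition wants a regex with named captures along these lines.
          LOG_PATTERN = re.compile(
              r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
              r"\[(?P<level>[A-Z]+)\] "
              r"(?P<body>.*)$"
          )

          samples = [
              "2024-05-01 12:34:56 [INFO] service started",
              "2024-05-01 12:35:02 [WARN] retrying connection",
          ]

          # Sanity-check the pattern against sample lines before wiring it into lnav.
          for line in samples:
              match = LOG_PATTERN.match(line)
              print(match.groupdict() if match else f"no match: {line}")
          ```

          Getting the regex right in isolation first tends to be less painful than debugging it inside the JSON format file.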

        • 0x01@lemmy.ml

          I’ve used it extensively, and yes, it can produce inconsistent output, but so can humans. Useful doesn’t imply perfect, imo.

          • SpaceNoodle@lemmy.world

            I’ve found that humans don’t vary their style so wildly from module to module. Humans also make human mistakes, which are of a different nature than LLM hallucinations.