• Tja@programming.dev
    9 days ago

    Because that’s murder, and unlike a health insurance company denying claims, Sam Altman just sucks but hasn’t killed anyone (yet) (that we know of).

      • Tja@programming.dev
        9 days ago

        Is the developer also culpable? How about the data scientist? How about the data engineer? How about the BI Analyst? And the janitor?

        How about the manufacturer of the knife / pill / gas they used to kill themselves?

        • Mniot@programming.dev
          9 days ago

          As a developer: yes to the developer and data scientist and data engineer. Scientists and engineers should be responsible for their work.

          The BI analyst: maybe, if they’re responsible for collecting data that ignores the impact of the service on teens. If they’re doing sales-comparisons between Anthropic and OpenAI… eh, I donno.

          The janitor: probably not since I don’t feel like the deaths are widely publicized and they probably work for a contracting company that handles the building.

        • Echo Dot@feddit.uk
          9 days ago

          In most cases suicide isn’t anyone’s fault. People like to find someone to blame, and I get that, but people who are even remotely close to doing that were always going to find a way and a justification.

          No AI is going to convince me to kill myself if I didn’t already want to. Equally the inverse must also be true.

          That’s not to say that the companies are completely off the hook; it’s utterly ridiculous that these conversations weren’t flagged and sent to a human. But I think it’s daft to suggest that these people would necessarily still be alive had the AI not existed.

          • Tja@programming.dev
            9 days ago

            I completely agree. Not off the hook. There should be better guardrails (as with recipes for bombs and other dangerous things), but it’s quite a stretch from there to accusing the CEO of murder.

        • queermunist she/her@lemmy.ml
          9 days ago

          If you manufacture a knife that convinces children to kill themselves, yeah, you’re culpable. Everyone else can be charged according to their level of culpability, but any time a company is found liable for killing someone the CEO should be sentenced for their murder. Maybe that would incentivize CEOs to stop getting people killed.

            • queermunist she/her@lemmy.ml
              8 days ago

              I don’t think there’s a difference. Children are not culpable, which means grooming children to kill themselves is murder.

              • Tja@programming.dev
                8 days ago

                Selling knives to children is murder too?

                Selling knives to families with children?

                Selling knives to women who are pregnant?

                • queermunist she/her@lemmy.ml
                  8 days ago

                  Selling knives that talk and tell you to kill yourself to children is murder.

                  You’re refusing to recognize the grooming angle to this.

    • Squirrelanna@lemmy.blahaj.zone
      9 days ago

      I mean creating a product that exacerbates psychosis to the point that people kill themselves I would say meets that standard.

            • ForestGreenGhost@literature.cafe
              8 days ago

              No I’m not saying that you’re not allowed to comment. I’m just saying that your takes are stupid and that you probably shouldn’t.

                • ForestGreenGhost@literature.cafe
                  7 days ago

                  I’m sorry that I called you stupid. That was wrong of me and you didn’t deserve that.

                  If you’re interested, I could explain to you why your comment that I initially responded to was a false equivalence, and why claiming that I was stifling your free speech is nonsensical. Let’s talk it out and maybe both of us can walk away from this having learned something. :)

                  • Tja@programming.dev
                    7 days ago

                    Sure, I’ll be happy to.

                    My point is that chatbots, and other LLM applications, are useful tools that in isolated cases have caused addiction and other harmful effects, including deaths.

                    The same can be said of many other things, from parasocial relationships with celebrities, tools like heavy machinery, aircraft, medicine with side effects, gyms, and a long list of others. People become obsessed, addicted and in certain cases even die. Or the tool fails and kills them.

                    The solution shouldn’t be to immediately ban them and accuse the CEO of murder (a super specific legal definition, btw), but to regulate, add guardrails, make them safer, and help the victims however they need. Sure, let’s investigate each death and see if there has been negligence, but pitchforks are not the solution.

    • Urist@leminal.space
      9 days ago

      Sam Altman is an enemy of humanity and it would be self-defense to kill him.

      I’m not gonna do it because that’s a hassle, but if someone did I wouldn’t condemn them.

      • Tja@programming.dev
        9 days ago

        So we just advocate for the murder of anyone we disagree with? The CEO, my boss, the neighbor with the loud dog, that guy who cuts us off in traffic…

        • bthest@lemmy.world
          9 days ago

          So, to you, a man hoarding wealth on an unimaginable scale and actively engaging in the ruination of the world and humanity is just an annoying thing, like an aggressive driver or a yapping dog?

          And that harming this techno-Hitler for what he’s doing would be the moral equivalent of murdering a normal person for making you angry?

    • CarrotsHaveEars@lemmy.ml
      9 days ago

      Exhausting energy and fresh water, and giving corporations an excuse to strip employees of their jobs, their means of living, surely isn’t murder.

      • Tja@programming.dev
        9 days ago

        Exactly: it isn’t murder. Even if all the assumptions above were true, it isn’t murder.