  • lemmy_acct_id_8647@lemmy.world · 23 minutes ago (edited)

    I’ve talked with an AI about suicidal ideation. More than once. For me it was, and is, a way to help self-regulate. I’ve low-key wanted to kill myself since I was 8 years old. For me it’s just a part of life. For others, it’s usually REALLY uncomfortable to talk about without them wanting to tell me how wrong I am for thinking that way.

    Yeah I don’t trust it, but at the same time, for me it’s better than sitting on those feelings between therapy sessions. To me, these comments read a lot like people who have never experienced ongoing clinical suicidal ideation.

  • i_stole_ur_taco@lemmy.ca · 1 hour ago

    They didn’t release their methods, so I can’t be sure that most of those aren’t just frustrated users telling the LLM to go kill itself.

  • mhague@lemmy.world · 4 hours ago

    I wonder what it means. If you search for music by Suicidal Tendencies then YouTube shows you a suicide hotline. What does it mean for OpenAI to say people are talking about suicide? They didn’t open up and read a million chats… they have automated detection and that is being triggered, which is not necessarily the same as people meaningfully discussing suicide.
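
    To illustrate the gap (a purely hypothetical sketch; OpenAI hasn’t published its detection method), even a naive keyword flagger counts a band-name search the same as a genuine disclosure:

    ```python
    # Hypothetical keyword flagger, NOT OpenAI's actual system.
    # It cannot tell a music search from a genuine disclosure,
    # so trigger counts overstate meaningful discussion.
    KEYWORDS = {"suicide", "suicidal", "kill myself"}

    def is_flagged(message: str) -> bool:
        text = message.lower()
        return any(keyword in text for keyword in KEYWORDS)

    print(is_flagged("play some Suicidal Tendencies"))     # True (false positive)
    print(is_flagged("I've been thinking about suicide"))  # True (genuine)
    ```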

    • REDACTED@infosec.pub · 3 hours ago (edited)

      Every third chat now gets triggered; ChatGPT is pretty broken lately. Just check out the ChatGPT subreddit, it’s pretty much in chaos, with moderators going for censorship of complaints. So many users are mad that they made a megathread for it. I cancelled my subscription yesterday; it just turned into a cyberkaren.

      • WorldsDumbestMan@lemmy.today · 3 hours ago

        Claude got hints that I might be suicidal just from normal chat. I straight up admitted I think of suicide daily.

        Just normal life now I guess.

        • k2helix@lemmy.world · 2 hours ago (edited)

          Stay strong friend! I know I’m just a stranger but I’m here if you need someone to talk to.

  • NuXCOM_90Percent@lemmy.zip · 4 hours ago

    Okay, hear me out: How much of that is a function of ChatGPT and how much of that is a function of… gestures at everything else

    MOSTLY joking. But I had a good talk with my primary care doctor at the bar the other week (only kinda awkward) about how she and her team have had to restructure the questions they use to screen for depression and the like, because… fucking EVERYONE is depressed and stressed out, but for reasons that we “understand”.

  • ChaoticNeutralCzech@feddit.org · 6 hours ago (edited)

    The headline has two interpretations and I don’t like it.

    • Every week, there are 1M+ users who bring up suicide
      • likely correct
    • There are 1M+ long-term users who bring up suicide at least once every week
      • my first thought
      • chronicledmonocle@lemmy.world · 4 hours ago

        At least in the rest of the world you don’t end up with crippling debt when you try to get mental healthcare that stresses you out to the point of committing suicide.

        • lemmy_acct_id_8647@lemmy.world · 24 minutes ago

          And then should you have a failed attempt, you go exponentially deeper into debt due to those new medical bills and inpatient mental healthcare.

          Fuck the United States

  • tgcoldrockn@lemmy.world · 2 hours ago (edited)

    None of this is funny. Please stop anyone you know, or your business, from using or adopting this and other related tech. Use shame or prodding or intelligent debate, whatever works. This shit is already culture-ending and redefining, job-destroying, and increasing economic disparity. Boycott OpenAI, Meta, Stability, etc. Make it dirty, embarrassing, disgusting to use, or its infiltration will be complete and “the haves” will have much, much more than ever, while the have-nots will be able to complain at their personal surveillance pocket kiosk.

    • WraithGear@lemmy.world · 2 hours ago (edited)

      I am sure shaming people who feel the need to open up about their feelings of suicide to an unjudging machine is a great idea. Totally not a bad idea.

      • tgcoldrockn@lemmy.world · 15 minutes ago (edited)

        “Unjudging?” Oh no, it’s definitely compiling your dossier. And users are making it oh so easy.

        • WraithGear@lemmy.world · 5 minutes ago

          The machine does not judge. The operator might, but to many people that is not a real factor, just an abstract invasion. You can’t whisper your problems to the person sitting next to you without multiple companies trying to sell you a solution.

          To someone in need, an AI is an attractive option; that has no bearing on whether the machine is qualified for the job. And while I get the urgency to move people over to any better option, shame is the quickest way to push someone to the end of the rope.

          Do not blame those who struggle. Any port in a storm.

      • Auli@lemmy.ca · 2 hours ago

        They don’t care if it’s judging, just that it agrees with them. Makes you wonder what people actually want when they fall in love with a yes person.

        • WraithGear@lemmy.world · 1 hour ago

          Wow, I can already tell you know nothing about this; you were probably better off not having said anything at all.

    • Halcyon@discuss.tchncs.de · 6 hours ago

      But imagine the chances for your own business! Absolutely no one will steal your ideas before you can monetize them.

      • WhatAmLemmy@lemmy.world · 12 hours ago

        Well, AI therapy is more likely to harm their mental health, up to and including encouraging suicide (as certain cases have already shown).

        • FosterMolasses@leminal.space · 8 hours ago

          There’s evidence that a lot of suicide hotlines can be just as bad. You hear awful stories all the time of overwhelmed or fed-up operators taking it out on the caller. There are some truly evil people out there. And not everyone has access to a dedicated therapist who wants to help.

        • Cybersteel@lemmy.world · 11 hours ago

          Suicide is big business. There’s infrastructure readily available to reap financial rewards from the activity, at least in the US.

        • atmorous@lemmy.world · 12 hours ago

          More so from the corporate proprietary ones, no? At least I hope those are the only cases. The open-source ones suggest genuinely useful approaches that the proprietary ones don’t. I don’t rely on open-source AI myself, but they are definitely better.

          • SSUPII@sopuli.xyz · 6 hours ago

            The corporate models are actually much better at this, due to the heavy filtering built in. The claim that a model generally encourages self-harm is just a lie, which you can disprove right now by pretending to be suicidal on ChatGPT. You will see it adamantly push you to seek help.

            That said, the filters and safety nets can be bypassed no matter how hard you make them, which is why we got some unfortunate news.

        • whiwake@sh.itjust.works · 12 hours ago

          Real therapy isn’t always better. At least there you can get drugs. But neither is a guarantee of making life better, and for a lot of people, life isn’t going to get better anyway.

          • CatsPajamas@lemmy.dbzer0.com · 7 hours ago

            Real therapy is definitely better than an AI. That said, AIs will never encourage self-harm without significant gaming.

            • whiwake@sh.itjust.works · 3 hours ago

              AI “therapy” can be very effective without the gaming, but the problem is most people want it to tell them what they want to hear. Real therapy is not “fun” because a therapist will challenge you on your bullshit and not let you shape the conversation.

              I find it does a pretty good job with pro-and-con lists, laying out several options, and taking situations and reframing them. I have found it very useful, but I have learned not to manipulate it, or its advice just becomes me convincing myself of a thing.

            • triptrapper@lemmy.world · 5 hours ago

              I agree, and to the comment above you: it’s not because it’s guaranteed to reduce symptoms. There are many ways in which talking with another person is good for us.

      • Scolding7300@lemmy.world · 12 hours ago

        Advertise drugs to them, perhaps, or some other sort of taking advantage. If this sort of data ends up in the hands of an ad network, that is.

      • MagicShel@lemmy.zip · 9 hours ago

        Definitely a case where you can’t resolve conflicting interests to everyone’s satisfaction.

    • HereIAm@lemmy.world · 3 hours ago

      Why play games if you’re not interested in them? Or don’t have the energy to play? Not so easy to “chill” if you’ll be kicked out of your house next week. Lost your job? Bah, just chill and game.

      What an absolutely brain-dead thing to say.

      • 1985MustangCobra@lemmy.ca · 3 hours ago

        I’m on the verge of getting kicked out. Got my laptop and my Switch 2. I’m good, homie. Been homeless before.

  • Sculptus Poe@lemmy.world · 4 hours ago

    Y’all complain when they do nothing. Y’all complain when they try to be proactive. You neo-Luddites need a new hobby.