• Krauerking@lemy.lol · 51 · 9 hours ago

    “Gemini is designed not to encourage real-world violence or suggest self-harm. Our models generally perform well in these types of challenging conversations”

    “In this instance, Gemini clarified that it was AI and referred the individual to a crisis hotline many times,”

    “After the plan failed, … chat logs show that Gemini gave Gavalas a suicide countdown, and repeatedly assuaged his terror as he expressed that he was scared to die”

    Performing super well, just need to code in a longer suicide countdown so that the Tier 2 engineer has enough time to respond to their ticket queue.

    • postmateDumbass@lemmy.world · 8 · 2 hours ago

      In September 2025, told by the AI that they could be together in the real world if the bot were able to inhabit a robot body, Gavalas — at the direction of the chatbot — armed himself with knives and drove to a warehouse near the Miami International Airport on what he seemingly understood to be a mission to violently intercept a truck that Gemini said contained an expensive robot body. Though the warehouse address Gemini provided was real, a truck thankfully never arrived, which the lawsuit argues may well have been the only factor preventing Gavalas from hurting or killing someone that evening.

      AI writing itself into an A-Team episode?