• rafoix@lemmy.zip · 4 days ago

    Why not just have the AI say please and thank you at every possible opportunity on a loudspeaker?

    • ThePantser@sh.itjust.works · 4 days ago

      If an AI can scam old people out of their retirement, I don't understand how the drive-through attendant isn't replaced with an AI yet. I know, easy to trick and all that, but that's the one job most people hate at fast food. Add like 5 speakers so people can place 5 orders at once, and have the person go work making food instead of taking orders.

        • eatCasserole@lemmy.worldM · 4 days ago

          I saw a story recently where a guy spent some time with a customer service chatbot, and ended up convincing it to give him 80% off, and then ordered like $6000 of stuff.

          LLMs just don't produce reliable/predictable output; it's much easier for a user to push them off the rails.

          • Kairos@lemmy.today · 3 days ago

            Aren't there also tons of studies and math that show they can't differentiate between instructions (e.g. from the company) and data (e.g. that guy's messages)?

              • eatCasserole@lemmy.worldM · 3 days ago

              Yes, I believe that is the case.

              Of course, in any other application, keeping instructions and data separate is very important. An SQL injection attack, for example, is when you're able to sneak instructions in where data is supposed to go, and then you can delete the entire database if you want. But with LLMs, that distinction doesn't exist.
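              A minimal sketch of that separation, using Python's built-in sqlite3 with a toy table and made-up input (illustrative names, not from the thread): SQL gives you parameterized queries to keep data from being read as instructions, and LLMs have no equivalent mechanism.

```python
import sqlite3

# Toy in-memory database for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"  # attacker-controlled "data"

# Unsafe: string concatenation lets the data smuggle in an instruction,
# so the WHERE clause becomes: name = 'alice' OR '1'='1' (always true).
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the whole input strictly as data.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # matches every row
print(safe)    # matches nothing, as it should
```

              The `?` placeholder is the fix SQL offers; with an LLM, system prompt and user message end up in the same token stream, which is why the chatbot in the story above could be talked into the discount.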

      • greenskye@lemmy.zip · 4 days ago

        McDonald’s briefly had an AI run their drive thru. Apparently it got a lot of complaints, but honestly it massively improved my local McDonald’s order accuracy and speed. It was significantly better than the extremely shitty employees they normally have working the line.

        • Rivalarrival@lemmy.today · 4 days ago

          Taco Bell had it, too. I have never actually completed a purchase with an AI.

          Years ago, I adopted a personal policy of driving off as soon as a restaurant attempted to upsell. The Taco Bell AI always attempted to upsell me. 100% reliable on that offensive behavior. But what really and truly pissed me off was that even if I told it “No” or remained silent to its query, it always added the item to my order.

          I’m happy to tank their KPIs as “reward” for their AI bullshit.

    • scops@reddthat.com · 4 days ago

      Taco Bell has already automated drive-through orders at a number of locations. The staff still have to listen to the conversation to make sure the AI agent doesn't go off the rails. I bet they've got some fun stories.

      • brbposting@sh.itjust.works · 4 days ago

        Walking past a Taco Bell, it seems someone competent implemented the system; it understands people about as well as the best speech software I'm aware of.

    • quick_snail@feddit.nl · 4 days ago

      Because AI is more likely to say “fuck you” and then rant about how the Holocaust wasn’t real.

      • rafoix@lemmy.zip · 4 days ago

        I might even submit myself to eat the disgusting slop Burger King sells just to hear that every now and then.