I don’t disagree, but this is an issue of when and where it’s appropriate to use an LLM to interact with customers, and when it isn’t. If you present an LLM to the public, it will get manipulated by people who set out to make it do something it shouldn’t.
This also happens with human employees, but it’s generally harder to pull off, so it’s less common. This sort of behaviour is called social engineering, and it’s used by fraudsters and scammers to get people to do what they want, typically handing over their bank details. The principle is the same: you’re manipulating someone (or something, in this case) into doing something they/it shouldn’t.
Just because we don’t like the fact that the business owner deployed an LLM in a manner they probably shouldn’t have doesn’t mean the customer isn’t in the wrong, or that they didn’t void whatever contract they had through their own actions. Whether it’s a human or an LLM on the other end of the chat doesn’t actually make any difference.
We’ll see what the court says, because my intuition is that an LLM making an offer is categorically different from a human doing it. No one was scammed; this would be more like your own poorly designed website offering an 80% discount to anyone who clicks the right combination of settings.