• Bubbaonthebeach@lemmy.ca

    To everyone I’ve talked to about AI, I’ve suggested a test: take a subject you know you’re an expert in, then ask the AI questions you already know the answers to and see what percentage it gets right, if any. Often people find that plausible-sounding answers are produced, but if you know the subject, you can tell it isn’t quite fact. A recovery time for an injury might be listed as 3 weeks when the average is 6–8, or similar. Someone who did not already know the correct information could be harmed by the AI’s “guessed” response. AI can have uses, but anything it generates needs to be heavily scrutinized before you pass it on. If you are good at something, using AI usually means wasting time double-checking it.

    • laranis@lemmy.zip

      Happy cake day, and this absolutely. I figured out its game the first time I asked it for a spec on an automotive project I was working on. I asked it the torque specs for some head bolts and it gave me the wrong answer. Not just the wrong number, but the wrong procedure altogether. Many modern engines use torque-to-yield specs, meaning you torque the bolt to a number and then add additional rotation to permanently stretch the bolt and lock it in. This car was absolutely not that, and when I explained the error back to it, IT DID IT AGAIN. It sounded very plausible, but someone following those directions would likely have ruined the engine.

      So, yeah, test it and see how dumb it really is.

    • NABDad@lemmy.world

      I had a very simple script. All it does is trigger an action on a monthly schedule.
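
      For context, something like this minimal sketch, assuming Python and hypothetical names (the actual script isn’t shown here):

      ```python
      from datetime import date

      def should_run(today: date, run_day: int = 1) -> bool:
          """Return True only on the configured day of the month."""
          return today.day == run_day

      def trigger_action() -> str:
          # Placeholder for the real monthly action (hypothetical).
          return "action triggered"

      if should_run(date.today()):
          print(trigger_action())
      ```

      The whole "logic" is one day-of-month comparison, which is what makes a reviewer flagging it as flawed so odd.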

      I passed the script to Copilot to review.

      It caught some typos. It also said the logic of the script was flawed and it wouldn’t work as intended.

      I didn’t need it to check the logic of the script. I knew the logic was sound because it was a port of a script I was already using. I asked because I was curious about what it would say.

      After restating the prompt several times, I was able to get it to confirm that the logic was not flawed, but the process did not inspire any confidence in Copilot’s abilities.