• lemmy_outta_here@lemmy.world · 1 day ago

    Good summary.

    I also find that LLMs are mostly good at the parts of my job that I enjoy, or that are too important to entrust to something that is right 80% of the time.

    Tech boosters suggest using LLMs to draft emails. My job involves explaining mathematical ideas to (well educated) people who have less math training than I have. That part of the job is fun and challenging - why would I outsource a task that I like doing and am good at? “Everyday” emails are not important enough to stress about - I can dash them off as fast as I can write a prompt.

    The first time I used a computer instead of a typewriter to compose something, I knew that the world had changed. All that AI has changed is that I now doubt every article and picture I see.

    • RedstoneValley@sh.itjust.works · 15 hours ago

      It’s similar for me as a software developer. I sometimes use LLMs to get alternative concepts or implementations for the task at hand, or simply to refresh my memory about something with an example. This works well because I’m already well aware of what I am trying to accomplish, so my prompts are precise enough to get a decent result. However, generating directly usable code that meets my expectations just with prompts is really hard. There is so much fine-tuning necessary that I’m faster just doing it myself.

      I don’t see the technology itself as evil; there are some good uses if you know about the capabilities and limitations. What’s evil is big corporations selling this technology to people who are not prepared to understand or handle those limitations.

      But personally I’m using it less and less these days. I don’t want to take part in the massive environmental damage AI is causing, and the thought of contributing to this mess makes me feel icky every time I think of using it.