Ok, but then explain why I would care about a technology that’s 10 times less efficient than an existing, 25-year-old technology
I’m not really here to tell you why you should care - you’re free to care about whatever you want to care about. But to explain why other people might care, it’s because it can do things a Google search can’t do. Google search can’t copy-edit your CV or cover letter. Google search can’t synthesise a bunch of different Stack Overflow answers and fit them to the exact scenario you’re talking about. LLMs can and do.
And those are two examples where the cost of an error is low: if your CV comes out with made-up shit in it, you can just read through it and check (but you may not have the ability to rewrite it better). If the code example doesn’t work, you’re going to run it and check anyway. (It may have a subtle bug, but so can Stack Overflow answers, and that never stopped people from using them.)
If you don’t have the ability to write it better, what would make you think someone would have the ability to recognize and fix the errors in their CV?
What does an error in a CV look like, to you?
Could be anything. The point is, if I don’t have the skill to write my own CV well, then I also don’t have the skill to determine whether an AI-generated CV is written well.
So I don’t think that is true. It’s possible to recognise that a book is well written even if you can’t write that well.
I think the problems from LLM use in that area are more about hallucination: if it inserts a false job or something, which is easily checked. OTOH, if it just edits it and it looks no better to your eyes, you’re probably ok to go with your initial version.
I think you can certainly enjoy a book or think it’s subjectively a good piece of art without being a skilled writer, of course. But you wouldn’t be a very good EDITOR without understanding anything about writing. Which I think is a more accurate picture of what we’re describing doing here.
Enjoying something or forming an opinion on it as a piece of art is a different activity and skill set from knowing whether it’s in a fit state to be published, and, if it’s not, being able to recognize and fix the errors.