I initially subscribed to ChatGPT because I got a job as the only devops guy at an organization despite having very limited devops experience, and ChatGPT essentially served as my mentor. I justified keeping it for a long time because it helped my productivity: bugs that I had no idea where to start on could be worked through with a few hours (or days) of back-and-forth.
As I climbed the learning curve, ChatGPT became proportionally less helpful, but I kept it because it’s kind of useful for rubber-duck debugging. I did find Copilot to be pretty handy for writing docstrings (especially for keeping formatting conventions consistent), but the actual code completions were more annoying than helpful.
When all was said and done, I cancelled my ChatGPT and Copilot subscriptions because I’m taking on a mortgage tomorrow and I literally just can’t afford them. I have Ollama running on my homelab server, but I only have enough VRAM for a 7B-param model, and it kind of sucks ass, but whatever. At the end of the day, I like using my brain.
UPDATE (because I just thought of it after posting): I do think that “AI-as-a-mentor” is a good use case for AI. It really helped me cut my teeth on the basics of Linux. I often find that it’s easier to learn when you have a working example of code or config that you can dissect than to bash your head against the wall just trying to figure out how to get something to run at all in the first place.
For my birthday challenge this year, I’m learning how to read and write Devanagari as a surprise to my Indian grandparents. I asked my local Qwen model to generate some worksheets for me to practice with, and it totally flopped: it gave away all the answers. I do think ChatGPT would have done better, but maybe I could have gotten sufficient results with a better GPU.
Sorry but thinking of a shitty chatbot as your “mentor” is absolute brainrot.
I know AI is an emotionally charged topic, but I think your frustration is misdirected. I find that the best way to learn tech stuff is with hands-on experience, and to that end, it works pretty well to try something, ask why it didn’t work like I expected it to, and get instantaneous feedback. Or to start with a working example and pick it apart so I can learn the syntax. I’m not saying it’s a replacement for reading official documentation or figuring things out for yourself, but it makes it a lot easier to get started.
Fundamentally, I’m a humanist. I believe that we should use technology in a way that augments our brain instead of circumventing it. I don’t let AI write code for me, but I don’t really see the harm in having it present information in a digestible format.
I’ve always been bored by lectures and tutorials because they’re not good at meeting me at my level of experience. I don’t think anyone would dispute that having a tutor/mentor who gives you individual attention and meets you where you’re at helps you climb the learning curve way faster. And when you’re in a situation where you don’t have a human mentor, AI can be pretty useful.
I worked at an organization where there were no senior software people, and my supervisor told me, “hey, you created this dashboard – now deploy it”. My only relevant experience was having hosted a Minecraft server on Windows 10. After a few months of iterating with ChatGPT, I knew the basics of containerization and how to deploy an app on a RHEL server. Three years later, I’m doing it at a tech consulting firm, and I’m the guy everyone goes to for help writing containerfiles and compose files. They promoted me from data scientist (I have an MS in data science) to solutions architect, all because I used AI to learn the basics of Linux devops, and then got a shit ton of practice by self-hosting.
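Since I keep saying “working example you can dissect”: this is roughly the shape of the first compose file I’d hand someone starting out. It’s a generic sketch, not from any real project of mine – the service name, port, and env var are all placeholders – but it’s enough to pick apart and ask questions about:

```yaml
# compose.yaml – minimal single-service sketch (names/ports are placeholders)
services:
  dashboard:
    build:
      context: .             # build from the Containerfile/Dockerfile in this directory
    ports:
      - "8080:8080"          # host:container – expose the app on port 8080
    restart: unless-stopped  # survive reboots, like a real service should
    environment:
      - APP_ENV=production   # example of passing config into the container
```

Spin it up with `docker compose up -d` (or `podman-compose up -d` on RHEL-flavored boxes), break something on purpose, and ask why it broke. That’s the whole learning loop.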