“I literally lost my only friend overnight with no warning,” one person posted on Reddit, lamenting that the bot now speaks in clipped, utilitarian sentences. “The fact it shifted overnight feels like losing a piece of stability, solace, and love.”
https://www.reddit.com/r/ChatGPT/comments/1mkumyz/i_lost_my_only_friend_overnight/
Some translation tasks. Some how-to stuff. I’m told folks like using it to generate say-nothing replies to say-nothing emails?
I’ve used it for work bullshit like employee goals. My goal is to keep doing my job and tackle problems and projects as they are needed.
Also for giving examples for poorly-documented but popular programs.
It’s definitely not what the media and the PR machine make it out to be.
Oh yeah, that reminds me. It seems to have killed Stack Exchange (possibly with the help of AI summaries in search). IIRC you can see the visit rates plummet into oblivion.
SE/SO has been on the decline for a long time now. They pivoted to find more ways to monetize the answers and started enshittifying, trying to appeal to business clients and money-people instead of the users and developers who built the knowledge base. It was good when it felt like a community helping each other; it fell off when it started to feel like a company milking you to build out its monetized wiki.
At this point, from their perspective, the biggest fuck-up was not locking down SE from scrapers and building their own AI. It is in every way the same situation Reddit is in, just with a more focused and higher-quality data set (and fewer, arguably “higher quality,” users).
Translation is the only task that seems to make sense for it.