There are many lesser evils. Use open-source/open-weight AI like Kimi, GLM, DeepSeek, Mistral, OLMo, Arcee, MiniMax, Qwen, EXAONE, NVIDIA, Sarvam…
If you don’t have the hardware to run them locally, you can pay for API access. If you find the company problematic for whatever reason, you can switch to the same model served by a third party (possible because the model weights are publicly released).
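That portability works because most hosts expose the same OpenAI-compatible `/chat/completions` HTTP API, so switching providers is just changing a base URL and model name. A minimal sketch using only the Python standard library (the URL and model name below are made-up placeholders, not real endpoints):

```python
import json
from urllib import request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> request.Request:
    """Build a POST request for any OpenAI-compatible /chat/completions endpoint.

    Swapping providers for the same open-weight model only means changing
    base_url (and possibly the provider's name for the model).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Hypothetical provider swap: same model weights, different host.
req = build_chat_request("https://provider-a.example/v1", "MY_KEY",
                         "some-open-weight-model", "Explain DNS in detail.")
# To actually send it: request.urlopen(req)
```

Self-hosting tools like llama.cpp and vLLM serve this same API shape, so the client code doesn’t change when you move from a paid host to local hardware either.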
For me, I just use it to get verbose answers to questions.
I use open-weight LLMs over search engines when I can, because Google/Bing/Yandex are complete proprietary black boxes run by corporations of questionable morality.
Other than wanting a verbose answer to a question, what is it for?
Or you could just not use LLMs. Fuck AI.