I think people are overly critical - it is alright for some things, and it has gotten things right for me before, but generally I have to spend so much time double-checking it that it isn't worth the time. If it gets a detail wrong 10-15% of the time, then I have to check it every time.
Literally anything that isn't trained on blogspam or notorious for making up shit. You're basically using a magic 8-ball to "learn"; it just repeats what you say back at you. It's useless for research.
I use ChatGPT to learn all kinds of stuff. I'd say it's replaced 50% of my searches. Not that it's always right, but neither is all the blogspam.
I do find it useful for admin tasks though.
ChatGPT is trained on all the blogspam.
jfc please have some dignity, you know you can do better than that
This restaurant has a terrible food safety rating so I just eat off their floor.
Say how and I’ll do it. Kagi+ChatGPT is getting me the quickest answers.
If a human “expert” was a known liar and fantasist who never provided sources or footnotes - would you listen to them? And if you did - why?