I mentioned to someone that I ask ChatGPT things all the time and they were like, “Don’t you know it doesn’t actually know facts? It just spews bullshit that sounds plausible.”
The joyous thing is that’s exactly why I’m using it: to generate plausible-sounding nonsense for Dungeons and Dragons. That has been one of my biggest use cases. Name generation is fantastic through it. “List 10 suggestions for epic-sounding names for a tavern built into a cliffside in a deep elven rainforest” and then workshopping it from there.
As a programmer, I also make pretty consistent use of GitHub Copilot… because half of programming is boilerplate that LLMs are really good at generating. It’s super useful for describing what kind of statically defined array I want without having to type out the whole thing myself. Or, and I think this is my favorite use, any time I need to translate from one data format to another, just describing my input and my desired output gets me a great starting point that I can refine.
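To give a concrete (and purely hypothetical) sketch of that last point: if my input were a CSV and I wanted JSON, the kind of starting point I mean looks roughly like this. The file names and columns below are invented for illustration, not something the model actually produced.

```python
import csv
import json

# Hypothetical sketch: convert a CSV file into a JSON array of objects.
# File names and fields are made up purely for illustration.
def csv_to_json(csv_path: str, json_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))  # one dict per row, keyed by the header
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)

if __name__ == "__main__":
    csv_to_json("patrons.csv", "patrons.json")
```

The point isn’t that this is hard to write by hand; it’s that describing the input and the desired output gets me 90% of the way there, and I just refine the rest.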
But asking them for facts? Nah lol
I’m also a programmer. I’ve found it’s pretty useless except for code that is very repetitive (test cases) or for documentation… but even there it’s a coin flip as to whether I’ll have to go in and correct it.
And there’s no indication that it’ll ever be better than that tbh. No matter what articles on MSN say.
Agreed 100% on all points. It’s an incredible tool, but just not for factual information.