

If you believe what those things pop out wholesale, without double-checking whether they're feeding you fever dreams, you're an absolute fool.
I don’t think I’ve seen a single statement come out of an LLM that didn’t have some element of daydreamy nonsense in it. Even small amounts of false information can do a lot of damage.
I can honestly see a use case for this. But without backing it up with some form of technical understanding, I think you’re just asking for trouble.