Good point, hallucinations only add to the fake news and synthetic content problems.
I’ll counter with this: how do you know the stuff you look up online is legit? Should we go back to encyclopedias? Who writes those?
Edit: in case anyone isn’t aware, GPT can “hallucinate,” i.e. generate plausible-sounding but made-up information. Sampling settings like temperature and top_p can make this more or less likely, but tuning them doesn’t eliminate it. Wasn’t saying anyone’s opinion was a hallucination, of course.
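For anyone curious what those two knobs actually do, here’s a rough Python sketch (function names and the example logits are mine, not from any real API) of how temperature and top_p reshape a model’s next-token distribution before a token is sampled:

```python
import math

def apply_temperature(logits, temperature):
    # Divide logits by temperature, then softmax.
    # T < 1 sharpens the distribution; T > 1 flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(probs, p):
    # Nucleus sampling: keep the smallest set of tokens whose
    # cumulative probability reaches p, then renormalize.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= p:
            break
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

logits = [2.0, 1.0, 0.5, -1.0]          # made-up token scores
sharp = apply_temperature(logits, 0.5)  # low T: more confident
flat = apply_temperature(logits, 2.0)   # high T: more spread out
print(max(sharp) > max(flat))           # low T concentrates mass
print(top_p_filter(sharp, 0.9))         # low-probability tail dropped
```

The point being: these settings only change how the model picks among the tokens it already thinks are likely. If the model’s “likely” tokens are wrong, no temperature setting fixes that.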