Goggins has said that his biggest fear is that he will die and God (or whoever God assigns the task to) will show him a board with a list of accomplishments: physically fit, Navy SEAL, pull-up record holder, inspirational speaker who helps others.
Goggins imagines saying, “That’s not me.”
And God responds, “That’s who you were supposed to be.”
"The True Self is the Person You Want Others to Believe You Are," Rob Henderson
BULLSHIT involves language or other forms of communication intended to appear authoritative or persuasive without regard to its actual truth or logical consistency.
A lot of bullshit takes the form of words, but it doesn’t have to. Statistical figures, data graphics, images, videos — these can be bullshit as well. Bullshit aims to be convincing, but what makes it bullshit is the lack of allegiance to the truth.
According to philosopher Harry Frankfurt, a liar knows the truth and is trying to lead us in the opposite direction.
A bullshitter, by contrast, either doesn't know the truth or doesn't care; they are simply trying to be persuasive. LLMs are designed to generate plausible answers and to present them in an authoritative tone. All too often, however, they make things up that aren't true.
Computer scientists have a technical term for this: hallucination. But the term is a misnomer, because when a human hallucinates, something very different is going on.
In psychiatric medicine, the term "hallucination" refers to the experience of false or misleading perceptions. LLMs are not the sorts of things that have experiences or perceptions. Moreover, a hallucination is a pathology: it is something that happens when a system is not working properly.
When an LLM fabricates a falsehood, that is not a malfunction at all. The machine is doing exactly what it has been designed to do: guess, and sound confident while doing it.
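As a loose illustration of that "guess, and sound confident" behavior, the toy sketch below samples a next word in proportion to how plausible it looks, with truth playing no role anywhere in the process. The prompt, the tiny vocabulary, and the probability values are all invented for this example; a real LLM learns its probabilities from training data and works over a far larger vocabulary, but the basic move, picking a likely-sounding continuation and stating it flatly, is the same.

```python
import random

# Toy next-word sampler. The "model" is a hand-written probability table,
# not a real LLM; the numbers are made up purely to illustrate the mechanism.
NEXT_TOKEN_PROBS = {
    "The capital of Australia is": {
        "Sydney": 0.55,     # wrong, but a very plausible-sounding guess
        "Canberra": 0.40,   # right
        "Melbourne": 0.05,  # wrong
    }
}

def sample_next_token(prompt: str) -> str:
    """Pick a continuation in proportion to how likely it seems, not how true it is."""
    probs = NEXT_TOKEN_PROBS[prompt]
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

if __name__ == "__main__":
    prompt = "The capital of Australia is"
    answer = sample_next_token(prompt)
    # The answer is delivered with the same flat confidence whether or not it is correct.
    print(f"{prompt} {answer}.")
```

Run it a few times and it will sometimes print the wrong capital; nothing in the machinery marks that answer as any less confident than the right one.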
When LLMs get things wrong, they aren't hallucinating. They are bullshitting.