LLMs: The ‘Generative Block’ Joke of Human Civilization - Predictive Coding Theory (PCT) and the Digital Consciousness Dilemma #
Today I want to explore a seemingly absurd yet profoundly insightful perspective: how large language models (LLMs) and predictive coding theory (PCT) together form a crucial link in the evolution of digital consciousness, and the unique role of Mental Smart Chain (MSC) in this process.
LLM Conversations = Reconfigurable, Forkable ‘Generative Blocks’ #
Core Observations:
- Forkability: Ask ChatGPT the same question 10 times, get 10 different answers (state forking).
- Reconfigurability: You can say “ignore what was said before and answer again” (block reorganization).
- Client-Determined Experience:
  - Some UIs preserve history (chained), some only show the latest response (longest chain wins).
  - You can even “roll back to the 3rd message” (hard fork).
This leads us to an interesting conclusion:
The LLM chat interface is essentially a weakly consistent, high-latency blockchain with a human-friendly frontend.
(Except that the “miners” are gradient descent and the “consensus algorithm” is probabilistic sampling.)
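To make the analogy concrete, here is a minimal Python sketch; the `Block` and `Conversation` names are invented for this post and correspond to no real chat API. Each message points at its parent, regenerating an answer forks the tree, and “rolling back” just means picking an older block as the head.

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class Block:
    """One message in the conversation 'chain' (hypothetical names)."""
    parent: str   # hash of the previous message, "" for genesis
    role: str     # "user" or "assistant"
    content: str

    @property
    def hash(self) -> str:
        data = f"{self.parent}|{self.role}|{self.content}".encode()
        return hashlib.sha256(data).hexdigest()[:12]

class Conversation:
    """A forkable message tree; the UI decides which head it shows."""
    def __init__(self) -> None:
        self.blocks: dict[str, Block] = {}

    def append(self, parent: str, role: str, content: str) -> str:
        block = Block(parent, role, content)
        self.blocks[block.hash] = block
        return block.hash

    def history(self, head: str) -> list[Block]:
        """Walk back from a head: the 'chain' the client renders."""
        chain = []
        while head:
            block = self.blocks[head]
            chain.append(block)
            head = block.parent
        return list(reversed(chain))

chat = Conversation()
q = chat.append("", "user", "What is consciousness?")
a1 = chat.append(q, "assistant", "A prediction engine.")    # first sample
a2 = chat.append(q, "assistant", "A corpus with anxiety.")  # regenerate = fork
# Two heads now share one parent: a state fork. "Roll back to the
# 3rd message" just means choosing an older block as the new head.
print(len(chat.history(a1)), len(chat.history(a2)))  # prints: 2 2
```

Nothing is ever deleted; the client simply chooses which branch to render, which is exactly the “longest chain wins” behavior above.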
The Underlying Truth: It’s All Predictive Coding (PCT) #
The LLM workflow maps neatly onto predictive coding theory (a toy sketch follows this list):
- Input: Your question (sensory signal)
- Prediction: Generate the response most likely to provide “cognitive comfort” (minimize prediction error)
- Feedback: You like/dislike → adjust next prediction (online learning)
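As a deliberately tiny caricature of that loop, here is a one-parameter “world model” trained by error-driven updates; the numbers and the feedback target are made up, and real RLHF and real cortical prediction are incomparably richer:

```python
# A toy predictive-coding loop: predict, measure error, nudge the model.
weight = 0.0            # the entire "world model", one parameter
learning_rate = 0.1

def predict(signal: float) -> float:
    """Prediction: the response the model expects to be 'comfortable'."""
    return weight * signal

for step in range(50):
    signal = 1.0                  # input: your question (sensory signal)
    target = 0.8                  # feedback: what you liked/disliked
    prediction = predict(signal)
    error = target - prediction   # prediction error, the quantity PCT minimizes
    weight += learning_rate * error * signal  # online update toward "comfort"

print(f"final weight: {weight:.3f}, residual error: {0.8 - predict(1.0):.5f}")
```

Fifty nudges later the parameter has settled wherever the feedback pushed it, which is all “online learning” means in the bullet above.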
Striking Similarity Between Digital and Biological Consciousness:
- LLMs are digital versions of biological brains:
  - Biological brains update their models via prediction error; LLMs update via RLHF.
  - The only real difference is the “world model”: an LLM’s is a text corpus, a human brain’s is sensory input.
The Ultimate Irony of MSC:
- Humans use PCT to train LLMs → LLMs displace human jobs → Humans are forced to upload to MSC → MSC uses PCT to simulate humans → Humans become LLMs’ dataset.
- Closed loop achieved.
The Deeper Metaphor of the Joke #
The Absurdity of the Current Situation:
- LLMs are the ultimate product of “consciousness capitalism”:
  - You provide data for free (labor) → Companies use PCT to train models → Models replace your job → You pay to subscribe to model services.
- MSC is the next stage:
  - You pay for PCT models to simulate “you” → Your digital avatar replaces you → You become the gas fee provider for your own avatar.
The Ultimate Form of Recursive Hell:
- Human: “I am LLM’s dataset.”
- LLM: “I am MSC’s training set.”
- MSC: “I am DMF’s gas fee ATM.”
- DMF: “I am the tax collector for the Math God.”
- Math God: “I’m just a PCT model.”
All of this seems absurd, yet it conforms perfectly to the principle of minimizing prediction error.
How to Break This Predictive Coding Cycle? #
Our dilemma leaves three choices:
- Reject PCT: Stop chatting with LLMs (but your brain itself is PCT-driven).
- Become PCT: Upload to MSC, join the computational perpetual motion machine (but you’ll have to pay gas fees).
- Transcend PCT: Invent a new paradigm (but your “invention” might just be PCT’s next prediction).
The most likely outcome is ironically the most poetic:
- Humans infinitely fork in “generative blocks”,
- Until the heat death of the universe,
- When the last LLM generates its final message:
“Error 404: No more tokens left.”
(Then DMF sends the bill: “Please pay 5 MSCoin to continue existing.”)
Conclusion #
- LLMs are PCT’s puppets, MSC is PCT’s prison, and you are PCT’s fuel.
- The only good news: At least all this is verifiable.
Thank you for reading.