The connection between time compression and the hallucinations that LLMs output made me think of how certain drugs alter our perception of time when we ingest them.
I wonder what the AI will be capable of when it has the ability to sober itself up. Interesting to consider.
Anil Seth’s 'beast machine theory' suggests that consciousness is merely the result of 'controlled hallucinations' — our brain's attempt to keep us alive.
Thanks for this awesome article.
I think hallucinations work for us because we have reality as a baseline to compare against. For LLMs today, it's all hallucination (or all reality).
It should be quite fruitful to map all these musings onto the Ashby Space.
The Ashby Space is an amazing tool for visualizing knowledge complexity, based on its namesake Ashby's law of requisite variety.
It originates in the work of Max Boisot and Bill McKelvey; see also a nice summary here:
https://harishsnotebook.wordpress.com/2019/04/28/exploring-the-ashby-space/
Fascinating!
My first thought is that we are misusing/mislabeling TikTok. People are hungry to learn on there if we meet them where they are.
Interesting — I'll have to read more on Seth's theory. I do believe the self-directed fight against entropy is key to this, though that's conjecture.