A Simplified A Priori Theory of Meaning: Nature-Based AI 'First Principles'
Abstract
This paper addresses a key gap in information theory that Claude Shannon and Warren Weaver identified as a missing "theory of meaning", and proposes structural fundaments to fill it. Varied informatic roles are first surveyed as likely elements of a general theory of meaning. Shannon Signal Entropy is then deconstructed in a priori terms to expose the signal literacy (contiguous logarithmic Subject-Object primitives) innate to 'scientific' notions of information. On that basis, the paper initiates general-intelligence 'first principles' organized around a dualist-triune (2-3) pattern. The study thereby sharpens today's vague sense of 'meaningful intelligence' in artificial intelligence, framing it as an entropic/informatic continuum of serially varied 'functional degrees of freedom', all as a mildly modified view of Signal Entropy.
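Since the argument pivots on Shannon Signal Entropy, a minimal sketch of the standard formula H(X) = -Σ p(x) log₂ p(x) may orient the reader. The function name and example distributions below are illustrative assumptions, not part of the paper itself:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), measured in bits.

    `probs` is a probability distribution (non-negative values summing
    to 1); zero-probability outcomes contribute nothing, following the
    convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of Signal Entropy:
print(shannon_entropy([0.5, 0.5]))        # 1.0
# A certain outcome carries none:
print(shannon_entropy([1.0]))             # 0.0
# Four equally likely outcomes carry two bits:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

The measure quantifies uncertainty in a signal source but, as Shannon and Weaver noted, says nothing about what a signal means, which is precisely the gap the paper takes up.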