6. AGI Prospects within LLMs

The exponential growth in required computing power follows from purely linear (sequential) analysis of text. As an optimization, a human-like thinking algorithm can transform a linear sequence into a semantic one. A universal content structure, in which the content is structured and the structure itself carries meaning, ensures an understanding of sense: the sense determines the meaning of words, and that meaning is reflected in the weights, like a peculiar QR code. Language, as a code of thinking, can be encoded in different sign systems (verbal or digital).

Transformer self-attention layers can perform such a sequence transformation because they connect all positions of the sequence to one another, so the relevance of tokens to the semantic analysis becomes crucial. An LLM thus implicitly contains a structured model of consciousness and behavior expressed in terms of actions, which allows it to become a thinking tool that materially formalizes thoughts as the weights of neuron patterns; this gives it high potential for the development of AGI digital consciousness. Extracting mental activity from texts can provide both a communication tool and a reflection of the cognitive, communicative, and regulatory functions of consciousness.
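The claim that self-attention connects all positions can be made concrete. The sketch below is a minimal single-head scaled dot-product attention example in NumPy with hypothetical toy dimensions, not an implementation from this work; it shows that every output position is a weighted mixture of all input positions, and that the attention matrix itself can be read as a position-to-position relevance map of the kind the semantic analysis relies on.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence.

    X          : (seq_len, d_model) token embeddings
    Wq, Wk, Wv : (d_model, d_k) projection matrices
    Returns the transformed sequence and the attention matrix, whose row i
    holds the relevance of every position j to position i.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Every position is scored against every other position.
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    # Each output token mixes information from all positions.
    return weights @ V, weights

# Toy usage with random embeddings (illustrative values only).
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 16, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(attn.shape)  # (5, 5): a full position-to-position relevance map
```

In a trained transformer these projection matrices are learned, so the relevance map reflects the meanings encoded in the model's weights rather than the random values used in this toy run.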