Transformers aren't neural networks in any brain-like sense, which is part of the problem. They are hard to adapt to online learning: not necessarily impossible, but hard. Similarly, deep, multi-step thought is very hard for them. A lot of the things brains handle with a fairly uniform soup of circuitry need much more explicit structure in transformers ... there is no emergence, just tinkering in a very fragile design space.
AGI is just the more politically correct term for Human Level AI (HLAI). The ML tinkerers hate HLAI as a term, though, so AGI it is.