The Limitations of Current AI: Architecture, Embodiment, and Education
This post summarises a podcast discussing the limitations of current AI from multiple perspectives.

I. Three Fundamental Dilemmas Facing AI (Architecture Level)

Professor Liu Jia argues that current transformer-based large models have three fundamental structural flaws compared to the human brain:

1. Insufficient neuron complexity. During evolution, the brain took two paths: increasing the number of neurons and increasing their complexity. Today's AI neurons are extremely simple: they sum their inputs, pass the result through an activation function, and are done. Biological neurons, by contrast, are four-dimensional structures (three spatial dimensions plus time) with their own internal dynamics; a single biological neuron has computing power roughly equivalent to a 5–8 layer deep neural network. Transformers have no time dimension and no partial differential equations; they are fundamentally a "2D" system. (A minimal sketch contrasting the two kinds of neuron follows below.)

...
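To make the contrast concrete, here is a minimal Python sketch (not from the podcast; the function names, weights, and parameters such as tau and v_thresh are illustrative assumptions). The first function is the "extremely simple" point neuron of today's networks, a weighted sum plus an activation; the second is a toy leaky integrate-and-fire model that at least has internal state evolving in time.

```python
import numpy as np

def artificial_neuron(x: np.ndarray, w: np.ndarray, b: float) -> float:
    # The point neuron used in today's networks: a weighted sum of
    # inputs followed by an activation function. No spatial structure,
    # no internal state, no time dimension.
    z = float(np.dot(w, x)) + b   # sum the (weighted) inputs
    return max(0.0, z)            # ReLU activation; done

def leaky_integrate_and_fire(currents, dt=1e-3, tau=0.02, v_thresh=1.0):
    # Toy leaky integrate-and-fire model (illustrative parameters): the
    # membrane potential v is a state variable evolving in time via
    # dv/dt = (-v + I(t)) / tau, emitting a spike when it crosses a
    # threshold. Even this crude model has the temporal dynamics the
    # point neuron above entirely lacks.
    v, spike_times = 0.0, []
    for step, i_t in enumerate(currents):
        v += dt * (-v + i_t) / tau        # one Euler step of the membrane ODE
        if v >= v_thresh:                 # threshold crossing -> spike
            spike_times.append(step * dt)
            v = 0.0                       # reset after the spike
    return spike_times

# Illustrative run with arbitrary numbers
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.8, 0.1, -0.3])
print(artificial_neuron(x, w, b=0.05))        # one scalar, no notion of time
print(leaky_integrate_and_fire([1.5] * 100))  # spike times over 100 ms
```

Even this single-state-variable toy is governed by a differential equation in time, which is the dimension the professor argues transformers lack; real biological neurons add three spatial dimensions of dendritic structure on top of that.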