Technology is the practice of creating and manipulating formal systems to achieve goals in the physical world. The Simplicity-Expressive Power Principle is therefore not just an abstract concept but the fundamental engineering trade-off that governs every field in this domain. The constant tension is between creating systems that are simple enough to be reliable, efficient, and understandable, and complex enough to have the expressive power needed to perform their desired function in a high-entropy, unpredictable world.
Computer Science
Computer science is the study of formal systems of computation. SEPP provides a deep, information-theoretic foundation for its most profound limits. The Turing machine, the field's root axiom, is a model of profound simplicity, and the Halting Problem and the P vs. NP problem are direct consequences of the gap between that simplicity and the complexity of the questions we ask of it. The simple formal system of Turing computability lacks the expressive power to certify the halting behavior of all possible programs (a high-entropy set). Similarly, the apparent gap between P (problems a simple system can solve efficiently) and NP (problems whose solutions it can certify efficiently) can be framed as a fundamental mismatch between the low complexity of our computational model and the high informational complexity of the problems we wish to solve.
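Turing's diagonal argument behind the Halting Problem fits in a few lines of Python. This is a minimal sketch, not from the source; names like `make_paradox` are illustrative. The point: for any candidate halting-decider, we can construct a program on which its verdict is necessarily wrong.

```python
# Diagonalization sketch: for ANY candidate halting-decider, build a
# program that does the opposite of whatever the candidate predicts.

def make_paradox(candidate_halts):
    """Build a program that contradicts the candidate's prediction about itself."""
    def paradox():
        if candidate_halts(paradox):
            while True:          # candidate said "halts" -> loop forever
                pass
        return                   # candidate said "loops" -> halt immediately
    return paradox

def candidate_is_correct_on_its_paradox(candidate_halts):
    p = make_paradox(candidate_halts)
    prediction = candidate_halts(p)   # what the candidate claims about p
    actually_halts = not prediction   # what p does, by construction
    return prediction == actually_halts

# Every candidate decider, however clever, fails on its own paradox program:
print(candidate_is_correct_on_its_paradox(lambda f: True))   # False
print(candidate_is_correct_on_its_paradox(lambda f: False))  # False
```

No finite decider escapes this construction: the set of program behaviors is too high-entropy for the decider's fixed complexity to certify.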
Artificial Intelligence and Machine Learning
An AI model is a formal system, and the bias-variance trade-off is SEPP in action:
- A simple model (low complexity), like linear regression, has low expressive power and will underfit a complex dataset (high bias).
- A complex model (high complexity), like a deep neural network, has high expressive power but requires vast amounts of data to constrain and is prone to overfitting (high variance).
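Both failure modes can be reproduced with plain Python: a straight line underfits quadratic data, while a full-degree interpolating polynomial fits the training noise exactly and pays for it off the training grid. The dataset and noise values below are invented for illustration.

```python
# Training data: y = x^2 plus small fixed "noise", observed at x = 0..6.
xs = [0, 1, 2, 3, 4, 5, 6]
noise = [0.5, -0.4, 0.3, -0.5, 0.4, -0.3, 0.5]
ys = [x * x + n for x, n in zip(xs, noise)]

# Simple model: least-squares straight line (low complexity -> underfits).
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

def line(x):
    return my + slope * (x - mx)

# Complex model: degree-6 Lagrange interpolant (high complexity -> overfits).
def interp(x):
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

train_rmse_line = (sum((line(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5
train_err_interp = max(abs(interp(x) - y) for x, y in zip(xs, ys))
test_err_interp = abs(interp(5.5) - 5.5 ** 2)   # held-out point, true value 30.25

print(round(train_rmse_line, 2))   # large: the line cannot express the curvature
print(train_err_interp)            # ~0: the interpolant memorized the noise exactly
print(round(test_err_interp, 2))   # exceeds the 0.5 noise scale: overfitting
```

The interpolant's zero training error is not mastery of the data but memorization of its entropy; the held-out error reveals the difference.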
The recent explosion in the size of Large Language Models is a direct attempt to "buy" more expressive power by dramatically increasing complexity, aiming to build a formal system complex enough to approximate the high-entropy distribution of human language. The "No Free Lunch" theorem is a direct corollary: no single, finitely-complex learning algorithm can have sufficient expressive power to perform optimally on all possible datasets.
Robotics
SEPP provides a formal explanation for Moravec's paradox: the observation that robots excel at tasks humans find hard (like logic and calculation) but struggle with tasks humans find easy (like sensorimotor skills). Abstract tasks like chess operate within a simple, low-entropy formal system that a computer can easily master. In contrast, navigating the physical world requires processing a massive, high-entropy stream of sensory data. A robot's internal model of the world is a formal system whose finite complexity bounds its expressive power, and that bound falls far short of the entropy of the physical environment it must represent.
Software Engineering
The principle provides an information-theoretic basis for the inevitability of software bugs. A program is a formal system, defined by its finite source code. Its expressive power is therefore finite, while the space of inputs, states, and environments it will face is effectively unbounded. Some behaviors will always fall outside what the program's finite logic anticipates, and those unanticipated behaviors are what we experience as bugs.
Formal Methods and Verification
Formal methods are the attempt to manage the SEPP limitations of software. To formally verify a program, one must create a specification, which is itself another formal system. Verification can certify that the program matches the specification, but it cannot certify that the specification matches the real-world requirement; the SEPP limit does not disappear, it simply moves up one level of description.
Cryptography
Cryptography brilliantly inverts the SEPP principle to create security. A public-key encryption algorithm is a publicly known, simple formal system, deliberately built so that inverting it without the private key requires solving a problem (such as factoring large integers) whose complexity exceeds any attacker's practical resources. The system's security rests on placing the high-entropy problem on the adversary's side of the trade-off.
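The asymmetry can be felt even at toy scale: multiplying two primes is a single machine operation, while recovering them by trial division already takes about a million steps for a 13-digit product. A minimal sketch (real systems use numbers hundreds of digits long, where the gap becomes astronomical):

```python
# Forward direction (the public, simple formal system): trivial.
p, q = 999_983, 1_000_003            # two primes; real keys are ~300 digits
n = p * q                            # one multiplication

# Inverse direction (what the attacker must do): search a huge space.
def factor(m):
    """Recover a factor pair by trial division -- ~sqrt(m) candidate divisors."""
    d = 2
    while d * d <= m:
        if m % d == 0:
            return d, m // d
        d += 1
    return m, 1                      # m was prime

print(factor(n))                     # (999983, 1000003) -- after ~1e6 steps
```

Doubling the number of digits in n roughly squares the work of trial division while leaving the multiplication essentially free, which is the trade-off the cryptographer exploits.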
Blockchain and Distributed Systems
SEPP clarifies the "Blockchain Trilemma," the trade-off between decentralization, security, and scalability. A blockchain protocol is a formal system. To achieve high security and decentralization, the protocol must be made incredibly complex, with intricate rules for consensus, validation, and fault tolerance (high complexity). That complexity is paid for in scalability: every node must execute and validate the intricate rule set, capping throughput, while simplifying the rules to scale surrenders expressive power, and with it security or decentralization.
Cybersecurity
The principle formalizes the "defender's dilemma." A security architecture (firewalls, intrusion detection systems, etc.) is a formal system of finite complexity, while the space of possible attacks is a high-entropy, effectively unbounded set. The defender must anticipate every attack within a finite rule set; the attacker need only find one behavior the rule set cannot express. SEPP guarantees such behaviors exist.
Human–Computer Interaction
SEPP governs the fundamental design trade-off in HCI between simplicity and power. A user interface is a formal system: a simple interface is easy to learn but can express only a narrow range of user intents, while a powerful interface (a command line, a professional tool) can express far more at the cost of complexity the user must first master.
UX
SEPP implies that a "perfect" user experience for all users is impossible. The population of potential users and their goals is a high-entropy distribution. Any single, finitely-complex interface design will have a limited expressive power and will inevitably fail to be optimal for some subset of that population. This justifies the need for user research, personalization, and accessibility features—all of which are methods to increase the effective complexity of the system to better match the complexity of its users.
Cryptoeconomics
Cryptoeconomic systems are formal systems of incentives designed to produce a desired emergent behavior (e.g., network security). SEPP implies that the complexity of the incentive rules bounds the system's ability to be robust against all possible high-entropy attack vectors. Simple incentive models will have unforeseen exploits, as the complexity of adversarial strategies will always exceed the expressive power of a simple rule set.
Quantum Computing
SEPP applies to quantum computing not as a barrier, but as a clarifying principle. A quantum computer is a different type of formal system, one whose axioms are the laws of quantum mechanics. Its "expressive power" is not universally greater than a classical computer, but it is structured differently, making it exponentially more powerful for specific problems whose complexity structure matches the computer's own (e.g., simulating quantum systems, factoring). SEPP still holds: the complexity of a quantum algorithm bounds the complexity of the problems it can solve. It does not provide a "free lunch," but rather a new, more powerful formal system for a specific class of high-entropy problems.
Quantum Information
SEPP resonates with the principles of quantum information, like the no-cloning theorem. The theorem can be seen as a statement about the limited expressive power of any physical operation. A physical process (a formal system) cannot have the expressive power to take a single, high-entropy quantum state and produce two identical, independent copies, as this would violate the conservation of information, a principle conceptually parallel to SEPP.
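The linearity argument behind the no-cloning theorem fits in a few lines; a standard sketch (not from the source) makes the "limited expressive power" claim concrete:

```latex
% Suppose a unitary U clones every state: U(|\psi\rangle \otimes |0\rangle)
%   = |\psi\rangle \otimes |\psi\rangle.
% Unitaries preserve inner products, so for any two states |\psi\rangle, |\phi\rangle:
\langle \phi | \psi \rangle
  = \bigl( \langle \phi | \otimes \langle 0 | \bigr)\bigl( |\psi\rangle \otimes |0\rangle \bigr)
  = \bigl( \langle \phi | \otimes \langle \phi | \bigr)\bigl( |\psi\rangle \otimes |\psi\rangle \bigr)
  = \langle \phi | \psi \rangle^{2}.
% Hence \langle \phi | \psi \rangle \in \{0, 1\}: only orthogonal or identical
% states could be cloned, so no single physical process can copy an arbitrary state.
```

The cloning operation would need more expressive power than any unitary (any physical formal system) can possess.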
Signal Processing
The principle provides an information-theoretic basis for the Nyquist-Shannon sampling theorem. The theorem states that to perfectly reconstruct a signal, one must sample it at a rate at least twice its highest frequency. This can be re-framed via SEPP: the sampling process is a formal system for describing the signal. If the signal's complexity (entropy rate) is too high for the sampling rate (the expressive power of the descriptive system), information is irrecoverably lost.
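Aliasing makes that information loss concrete: sampled at 4 Hz, a 3 Hz sine is indistinguishable from a sign-flipped 1 Hz sine, so the descriptive system literally cannot express which signal was present. A small sketch (frequencies and rate chosen for illustration):

```python
import math

fs = 4.0            # sampling rate: 4 Hz
n_samples = 16

# A 3 Hz sine, sampled below its Nyquist rate of 6 Hz...
fast = [math.sin(2 * math.pi * 3 * n / fs) for n in range(n_samples)]
# ...produces exactly the samples of a sign-flipped 1 Hz sine (its alias).
alias = [-math.sin(2 * math.pi * 1 * n / fs) for n in range(n_samples)]

assert all(abs(a - b) < 1e-9 for a, b in zip(fast, alias))
print("3 Hz and aliased 1 Hz sines are sample-for-sample identical at fs = 4 Hz")
```

Once the two signals collapse onto the same sample sequence, no amount of later processing can recover which one existed; the information is gone at the moment of description.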
Control Theory
SEPP provides a formal basis for the necessity of feedback in control systems, echoing the principles of cybernetics. A purely open-loop controller is a simple formal system whose expressive power is limited by its internal model of the world. It cannot adapt to high-entropy disturbances from the real environment. A closed-loop (feedback) controller is a more complex system that constantly takes in new information, effectively increasing its expressive power to match the complexity of the environment it is trying to control.
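A toy first-order plant shows the gap; the plant, gains, and constant disturbance below are invented for illustration. The open-loop controller's internal model is exact except for the unknown disturbance, and that one missing piece of expressive power leaves a permanent offset that proportional-integral feedback removes.

```python
TARGET, A, D = 1.0, 0.5, -0.3        # setpoint, plant pole, unknown disturbance

def plant_step(x, u):
    """The real plant: the controller's model of it omits the disturbance D."""
    return A * x + u + D

# Open loop: invert the internal (disturbance-free) model, never measure.
x_open, x_model = 0.0, 0.0
for _ in range(200):
    u = TARGET - A * x_model         # exact inverse of the *model*
    x_model = A * x_model + u        # the model believes it sits at the target
    x_open = plant_step(x_open, u)   # reality settles at a different fixed point

# Closed loop: PI feedback on the measured state soaks up the disturbance.
x_closed, integral = 0.0, 0.0
Kp, Ki = 0.4, 0.2
for _ in range(200):
    e = TARGET - x_closed
    integral += e
    x_closed = plant_step(x_closed, Kp * e + Ki * integral)

print(round(x_open, 3), round(x_closed, 3))   # open loop stuck off target
```

The feedback loop does not make the controller's model any better; it continuously imports fresh information from the environment, which is exactly the extra expressive power the open-loop system lacks.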
Telecommunications
Shannon's channel capacity theorem is a direct instantiation of the SEPP concept in communications. A communication channel is a formal system with properties like bandwidth and signal-to-noise ratio. These properties define its complexity, which in turn sets a hard upper bound—the channel capacity—on the rate of information (the "expressive power") it can reliably transmit. SEPP is the generalization of this core idea from a physical channel to any formal system of logic or description.
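The bound itself is one line, C = B·log2(1 + S/N). A quick check with telephone-line-like numbers (values chosen for illustration):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity, in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A ~3 kHz voice channel at 30 dB SNR (snr_linear = 1000):
c = channel_capacity(3000, 1000)
print(round(c))   # roughly 30,000 bits/s: no coding scheme can exceed this
```

The formula explains why late analog modems plateaued near this rate: the channel's complexity, not engineering ingenuity, sets the ceiling.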
Networking
The design of internet protocols like TCP/IP reflects the SEPP trade-off. The core protocols are remarkably simple and robust (low complexity), which is what lets them run reliably across billions of heterogeneous devices. The expressive power they lack is pushed outward: following the end-to-end principle, complex, application-specific behavior lives in the endpoints, keeping the shared core simple.
Data Science
SEPP explains the fundamental challenge of data science: extracting a simple, meaningful model from a high-entropy dataset. The data itself is complex. A useful model (a decision tree, a regression formula) is a formal system that must be much simpler than the data, which means it cannot capture everything; the practitioner's craft is to spend the model's limited complexity budget on the structure that matters and discard the rest as noise.
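A minimal sketch of that budget-spending: a one-split decision stump fitted to a noisy toy dataset (the data values are invented for illustration). The stump cannot reproduce every label, and it should not try; one residual error is the noise the simpler formal system rightly refuses to encode.

```python
# Toy dataset of (feature, label) pairs; the label is mostly "x is large",
# with one noisy exception at x = 5.
data = [(1, 0), (2, 0), (3, 0), (4, 1), (5, 0), (6, 1), (7, 1), (8, 1)]

def best_stump(points):
    """Find the threshold t minimizing errors of the rule 'label = (x > t)'."""
    best_t, best_errors = None, len(points) + 1
    for t in sorted({x for x, _ in points}):
        errors = sum((x > t) != bool(y) for x, y in points)
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t, best_errors

t, errors = best_stump(data)
print(t, errors)   # one misclassified point: the model is simpler than the data
```

A model complex enough to classify all eight points perfectly would have to encode the noise point at x = 5 as structure, which is exactly the overfitting failure described above.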
Analytics
The principle implies that any analytics dashboard or report is a low-complexity projection of a high-complexity reality. It can certify certain facts (e.g., "sales went up 10%") but lacks the expressive power to certify the complex, high-entropy causal web that produced that result. This is a formal warning against mistaking simple correlation for the full, complex story of causation.
Visualization
A data visualization is a formal system for mapping data onto visual properties. SEPP dictates that the complexity of the visualization's design (its axes, colors, shapes—its "visual grammar") limits its expressive power. A simple bar chart can express a small amount of information effectively. To visualize a high-entropy, multi-dimensional dataset requires a correspondingly more complex visual grammar. As Tufte noted, a good visualization maximizes the "data-ink ratio," which is a variant of maximizing expressive power for a given level of complexity.
Infographics
SEPP explains the danger of misleading infographics. By using an overly simple visual model to represent complex data, an infographic can have an expressive power that is insufficient to capture the nuances of the data, leading to a distorted or outright false understanding.
Standards and Interoperability
A technical standard (like USB-C or HTTP) is a formal system designed to enable interoperability. SEPP implies a core trade-off: a simple standard is easy to implement but has limited expressive power, failing to cover all edge cases. A complex standard has greater expressive power to handle more situations but is harder to implement correctly, leading to interoperability failures. The evolution of standards is a constant process of increasing complexity to gain more expressive power.
Regulatory Affairs
In technology regulation, SEPP shows that simple regulations will be ineffective at governing complex technologies. A regulation is a formal system. To have sufficient expressive power to govern a high-entropy system like a social media network or an AI, the regulation itself must be sufficiently complex and adaptive. This formally justifies the shift from simple rules-based regulation to more complex, principles-based or co-regulatory models.
Open Science
Open Science can be seen as a strategy to overcome the SEPP limitations of individual researchers or labs. A single lab has a finite "complexity budget" for creating knowledge. By making data, methods, and code (the formal systems) open, the collective scientific community can bring a much higher level of complexity to bear on a problem, increasing the expressive power of the scientific endeavor as a whole and allowing for the certification of more complex results than any single entity could achieve alone.
Open Data
Open data initiatives increase the "entropy" available to the scientific system. SEPP implies that to make sense of this increased data complexity, the complexity of the models and theories used must also increase. This formally explains why the open data movement has gone hand in hand with the rise of complex machine learning techniques.
Citizen Science
Citizen science is a way of parallelizing the process of dealing with high-entropy data. A problem like galaxy classification or protein folding has a complexity that is too high for a small group of scientists to process. By distributing the task, the "expressive power" of the human processing system is scaled up to meet the complexity of the dataset.