The Implications for Technology and Applied Sciences

Author: NiMR3V ([email protected])

Published on: September 12, 2025

Keywords: SEPP, Implications


Technology is the practice of creating and manipulating formal systems to achieve goals in the physical world. The Simplicity-Expressive Power Principle is therefore not just an abstract concept but the fundamental engineering trade-off that governs every field in this domain. The constant tension is between creating systems that are simple enough to be reliable, efficient, and understandable, and complex enough to have the expressive power needed to perform their desired function in a high-entropy, unpredictable world.

Computer Science

Computer science is the study of formal systems of computation, and SEPP provides an information-theoretic foundation for its most profound limits. The Turing machine, the field's root model, is strikingly simple, and both the Halting Problem and the P vs. NP question trace back to that simplicity. The simple formal system of Turing computability lacks the expressive power to certify the halting behavior of all possible programs (a high-entropy set). Similarly, the apparent gap between P (problems a simple system can solve efficiently) and NP (problems whose solutions it can certify efficiently) can be framed as a fundamental mismatch between the low complexity of our computational model and the high informational complexity of the problems we wish to solve.

Artificial Intelligence and Machine Learning

An AI model is a formal system, F, whose complexity, K(F), is determined by its architecture and parameters. In this framing, SEPP is a formal statement of the bias-variance tradeoff: a model whose complexity is too low lacks the expressive power to capture the data's structure (high bias), while a model whose complexity is too high spends that power fitting the noise in its training sample (high variance).
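The tradeoff can be made concrete with a toy experiment (an illustrative sketch, not from the original text): fitting polynomials of different complexities to noisy samples of a simple signal. The polynomial degree stands in for K(F).

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy observations of a smooth underlying function (the "world").
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, x_train.size)
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

def errors(degree):
    # A degree-d polynomial is a formal system with d + 1 parameters:
    # its complexity budget K(F).
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

simple_train, simple_test = errors(1)    # high bias: too little expressive power
complex_train, complex_test = errors(10) # high variance: power spent on noise
```

The complex model always achieves a lower training error, but the expressive power it "purchases" is partly wasted describing the entropy of the noise rather than the signal.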

Robotics

SEPP provides a formal explanation for Moravec's paradox: the observation that robots excel at tasks humans find hard (like logic and calculation) but struggle with tasks humans find easy (like sensorimotor skills). Abstract tasks like chess operate within a simple, low-entropy formal system that a computer can easily master. In contrast, navigating the physical world requires processing a massive, high-entropy stream of sensory data. A robot's internal model of the world is a formal system, F_robot. Its finite complexity limits its expressive power, making it radically insufficient for certifying and predicting the full complexity of the real world in real time, leading to clumsy and brittle behavior.

Software Engineering

The principle provides an information-theoretic basis for the inevitability of software bugs. A program is a formal system, defined by its finite source code (K(code)). The environment in which it operates—with all possible user inputs, hardware states, and network conditions—is a domain of enormous entropy. SEPP guarantees that the expressive power of the program's logic is finite. A bug is a manifestation of this limit: a real-world state whose complexity exceeds the descriptive power of the program's axioms, leading to an unspecified or incorrect behavior. This proves that no non-trivial program operating in a complex environment can be formally proven to be 100% bug-free from first principles alone.
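A toy illustration (the function and its failure mode are hypothetical examples, not from the original text): a program whose implicit axioms fail to cover every state its environment can produce.

```python
def average(readings):
    """Mean of a batch of sensor readings.

    The program's unstated axiom: every batch is non-empty.
    """
    return sum(readings) / len(readings)

# The environment delivers a state outside the program's axioms:
# an empty batch. The result is unspecified behavior -- a bug.
try:
    average([])
    hit_bug = False
except ZeroDivisionError:
    hit_bug = True
```

The "bug" is not a typo in the code; it is a real-world state (an empty batch) whose existence the program's finite description never accounted for.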

Formal Methods and Verification

Formal methods are the attempt to manage the SEPP limitations of software. To formally verify a program, one must create a specification, which is itself another formal system, F_spec. The goal is to prove that the program's behavior is a subset of the behaviors allowed by the specification. SEPP reveals why this is so difficult: for the specification to have enough expressive power to describe the intended behavior of a complex program, its own complexity, K(F_spec), must be of a similar magnitude to the program's complexity, K(code). There is no "free lunch" in verification; one must build a second, equally complex formal object to certify the first.
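A minimal sketch of this idea (illustrative, with hypothetical names): verifying a sorting routine against a separate specification over a bounded input domain. Note that the specification itself leans on `sorted`—a second formal object of comparable complexity to the code under test.

```python
from itertools import permutations

def my_sort(xs):
    # Implementation under verification: insertion sort.
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

def satisfies_spec(inp, out):
    # F_spec: a second formal system stating "out is a sorted
    # permutation of inp".
    return out == sorted(inp) and all(
        out[i] <= out[i + 1] for i in range(len(out) - 1)
    )

# Exhaustive checking is tractable only because we have shrunk the
# environment's entropy to a tiny, bounded domain.
verified = all(
    satisfies_spec(list(p), my_sort(list(p)))
    for n in range(5) for p in permutations(range(n))
)
```

Outside such bounded domains, certifying the program requires a proof whose construction costs roughly as much complexity as the program itself—the "no free lunch" the section describes.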

Cryptography

Cryptography brilliantly inverts SEPP to create security. An encryption algorithm is, per Kerckhoffs's principle, a publicly known, simple formal system, F_algo. An adversary who has only this system has very low expressive power. The security is created by introducing a secret key, which is a string of high entropy. When the simple algorithm is combined with the high-entropy key, it produces a ciphertext that is computationally indistinguishable from a random, high-entropy string. Security relies on the SEPP-guaranteed fact that the adversary's simple formal system (the algorithm without the key) has insufficient expressive power to find the simple plaintext hidden within the complex ciphertext.
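The one-time pad makes this inversion concrete (an illustrative sketch; the message is hypothetical): the public algorithm is nothing more than XOR, and all of the security lives in the high-entropy key.

```python
import secrets

def xor_cipher(data, key):
    # F_algo: a trivially simple, publicly known formal system.
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"attack at dawn"
key = secrets.token_bytes(len(plaintext))  # high-entropy secret

ciphertext = xor_cipher(plaintext, key)    # indistinguishable from random
recovered = xor_cipher(ciphertext, key)    # trivial -- with the key
```

With the key, decryption is a one-line operation; without it, every plaintext of the same length is equally consistent with the ciphertext, so the adversary's expressive power is exactly zero.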

Blockchain and Distributed Systems

SEPP clarifies the "Blockchain Trilemma," the trade-off between decentralization, security, and scalability. A blockchain protocol is a formal system. To achieve high security and decentralization, the protocol must be made incredibly complex, with intricate rules for consensus, validation, and fault tolerance (high K(F)). SEPP dictates that this "purchase" of complexity to gain expressive power in the domain of trustlessness comes at a cost, limiting the system's expressive power in other domains, like raw transaction throughput (scalability). A simple, centralized database has the expressive power to be extremely fast precisely because it doesn't spend its complexity budget on solving the problem of decentralized trust.

Cybersecurity

The principle formalizes the "defender's dilemma." A security architecture (firewalls, intrusion detection systems, etc.) is a formal system, F_defense, of finite complexity. The space of all possible malicious attacks is a creative, evolving, high-entropy system. SEPP guarantees that the expressive power of any finite defense system is limited. Therefore, there will always be novel, complex attack vectors ("zero-day exploits") that lie outside the descriptive power of the defender's current model. This proves that a perfect, static defense is informationally impossible, necessitating a dynamic, adaptive process of continuous monitoring, threat hunting, and response.
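A toy sketch of the dilemma (the signatures and payloads are hypothetical): a finite signature set certifies only the attacks already inside its model, and a lightly obfuscated variant of a known attack falls outside it.

```python
# F_defense: a finite set of attack signatures.
signatures = {"DROP TABLE", "<script>", "../.."}

def is_blocked(payload):
    # The defense can only certify states its finite model describes.
    return any(sig in payload for sig in signatures)

known = is_blocked("'; DROP TABLE users; --")        # inside the model
novel = is_blocked("'; DR/**/OP TA/**/BLE users; --") # outside the model
```

The obfuscated payload carries the same malicious intent, but its description exceeds the signature set's expressive power—a miniature zero-day. Real defenses therefore grow their complexity continuously (anomaly detection, threat intelligence feeds) rather than relying on a static model.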

Machine Learning

SEPP formally explains the "No Free Lunch" theorem for machine learning. No single, finitely-complex learning algorithm F can have sufficient expressive power to perform optimally on all possible high-entropy datasets. The complexity of the algorithm itself creates an inductive bias, which is a limit on its expressive power. This guarantees that for any given learner, there will be a data distribution whose complexity exceeds the learner's ability to model it effectively.
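The classic XOR dataset illustrates inductive bias (a toy sketch; the coarse parameter search stands in for the standard formal proof): no linear classifier, however its parameters are tuned, can represent this four-point distribution.

```python
import itertools

# XOR: a dataset whose structure exceeds a linear model's
# expressive power.
points = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def linear_classifier(w1, w2, b):
    return lambda x: 1 if w1 * x[0] + w2 * x[1] + b > 0 else 0

# Search a coarse grid of parameters; every linear rule misclassifies
# at least one of the four points.
grid = [i / 4 for i in range(-8, 9)]
separable = any(
    all(linear_classifier(w1, w2, b)(x) == y for x, y in points)
    for w1, w2, b in itertools.product(grid, repeat=3)
)
```

The linear family's inductive bias (decision boundaries must be straight lines) is precisely a bound on its expressive power; escaping it requires spending more complexity, e.g. on a hidden layer or a feature transform.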

Human–Computer Interaction

SEPP governs the fundamental design trade-off in HCI between simplicity and power. A user interface is a formal system F. A simple, minimalist interface (low K(F)) has low expressive power; it is easy to learn but can only be used for a narrow range of tasks. A complex interface full of features (high K(F)) has high expressive power but can be difficult to learn and use. The goal of good UX design is to find the optimal point on this curve, creating a system whose complexity is well-matched to the complexity of the user's goals.

UX

SEPP implies that a "perfect" user experience for all users is impossible. The population of potential users and their goals is a high-entropy distribution. Any single, finitely-complex interface design will have a limited expressive power and will inevitably fail to be optimal for some subset of that population. This justifies the need for user research, personalization, and accessibility features—all of which are methods to increase the effective complexity of the system to better match the complexity of its users.

Cryptoeconomics

Cryptoeconomic systems are formal systems of incentives designed to produce a desired emergent behavior (e.g., network security). SEPP implies that the complexity of the incentive rules bounds the system's ability to be robust against all possible high-entropy attack vectors. Simple incentive models will have unforeseen exploits, as the complexity of adversarial strategies will always exceed the expressive power of a simple rule set.

Quantum Computing

SEPP applies to quantum computing not as a barrier, but as a clarifying principle. A quantum computer is a different type of formal system, one whose axioms are the laws of quantum mechanics. Its "expressive power" is not universally greater than a classical computer, but it is structured differently, making it exponentially more powerful for specific problems whose complexity structure matches the computer's own (e.g., simulating quantum systems, factoring). SEPP still holds: the complexity of a quantum algorithm bounds the complexity of the problems it can solve. It does not provide a "free lunch," but rather a new, more powerful formal system for a specific class of high-entropy problems.

Quantum Information

SEPP resonates with the principles of quantum information, like the no-cloning theorem. The theorem can be seen as a statement about the limited expressive power of any physical operation. A physical process (a formal system) cannot have the expressive power to take a single, high-entropy quantum state and produce two identical, independent copies, as this would violate the conservation of information, a principle conceptually parallel to SEPP.

Signal Processing

The principle provides an information-theoretic basis for the Nyquist-Shannon sampling theorem. The theorem states that to perfectly reconstruct a signal, one must sample it at a rate at least twice its highest frequency. This can be re-framed via SEPP: the sampling process is a formal system for describing the signal. If the signal's complexity (entropy rate) is too high for the sampling rate (the expressive power of the descriptive system), information is irrecoverably lost.
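Aliasing makes this loss concrete (an illustrative sketch): a 9 Hz tone sampled at 10 Hz—below the required 18 Hz Nyquist rate—produces sample values identical to those of a 1 Hz tone, so the descriptive system cannot distinguish the two signals.

```python
import numpy as np

fs = 10.0        # sampling rate (Hz); Nyquist limit is fs / 2 = 5 Hz
f_true = 9.0     # signal frequency, well above the Nyquist limit
t = np.arange(0, 1, 1 / fs)

samples = np.cos(2 * np.pi * f_true * t)
# The alias: a 1 Hz tone (fs - f_true) yields the very same samples.
alias = np.cos(2 * np.pi * (fs - f_true) * t)

max_gap = np.max(np.abs(samples - alias))  # numerically zero
```

The 9 Hz signal's entropy rate exceeds the sampler's expressive power, and the excess information is not merely degraded but irrecoverably mapped onto a different, simpler signal.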

Control Theory

SEPP provides a formal basis for the necessity of feedback in control systems, echoing the principles of cybernetics. A purely open-loop controller is a simple formal system whose expressive power is limited by its internal model of the world. It cannot adapt to high-entropy disturbances from the real environment. A closed-loop (feedback) controller is a more complex system that constantly takes in new information, effectively increasing its expressive power to match the complexity of the environment it is trying to control.
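A toy simulation of the two controller types (the plant, gains, and disturbance are hypothetical, chosen only for illustration): the open-loop plan is optimal under its internal model but drifts under an unmodeled disturbance, while feedback absorbs it.

```python
def simulate(controller, disturbance, steps=50, target=1.0):
    """First-order plant: x[k+1] = x[k] + u[k] + d[k]."""
    x = 0.0
    for k in range(steps):
        u = controller(x, target)
        x = x + u + disturbance(k)
    return abs(target - x)  # final tracking error

# Open loop: a fixed plan derived from an internal model assuming d = 0.
open_loop = lambda x, target: target / 50

# Closed loop: proportional feedback, taking in fresh state information.
closed_loop = lambda x, target: 0.5 * (target - x)

# A constant disturbance the open-loop model does not describe.
bias = lambda k: -0.05

open_err = simulate(open_loop, bias)      # drifts far from the target
closed_err = simulate(closed_loop, bias)  # settles near it
```

The feedback controller's per-step measurement is exactly the "increase in expressive power" the section describes: new information continually re-matches the controller's model to the environment.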

Telecommunications

Shannon's channel capacity theorem is a direct instantiation of the SEPP concept in communications. A communication channel is a formal system with properties like bandwidth and signal-to-noise ratio. These properties define its complexity, which in turn sets a hard upper bound—the channel capacity—on the rate of information (the "expressive power") it can reliably transmit. SEPP is the generalization of this core idea from a physical channel to any formal system of logic or description.
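The Shannon–Hartley form of the theorem is easy to state in code (the example channel parameters are illustrative): the channel's "complexity"—its bandwidth and signal-to-noise ratio—fixes a hard ceiling on reliable information rate.

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon–Hartley: C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz voice channel with 30 dB SNR (S/N = 1000)
# has a capacity of roughly 30 kbit/s.
c = channel_capacity(3000, 1000)
```

No coding scheme, however clever, can push reliable throughput past C; cleverness only determines how closely the bound is approached.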

Networking

The design of internet protocols like TCP/IP reflects the SEPP trade-off. The core protocols are remarkably simple and robust (low K(F)), giving them limited expressive power (e.g., they don't guarantee delivery time or security). This simplicity allowed for massive scalability. To get more expressive power (like streaming video or secure transactions), more complex protocols (RTP, TLS) had to be layered on top, progressively increasing the total complexity of the system to handle more complex tasks.

Data Science

SEPP explains the fundamental challenge of data science: extracting a simple, meaningful model from a high-entropy dataset. The data itself is complex. A useful model (a decision tree, a regression formula) is a formal system that must be much simpler than the data, K(model) ≪ H(data). SEPP dictates that this simplicity comes at a cost: the model's expressive power is limited, and it is guaranteed to be an incomplete approximation of the data. The art of data science is to find the most powerful simple model, maximizing expressive power for a given budget of complexity.
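A minimal sketch (synthetic data, illustrative only) of compressing 500 observations into a 2-parameter model: the fit recovers the trend, but a residual term remains that no model this simple can express.

```python
import numpy as np

rng = np.random.default_rng(1)

# A high-entropy dataset: 500 noisy observations.
x = np.linspace(0, 10, 500)
y = 3.0 * x + 2.0 + rng.normal(0, 1.0, x.size)

# The model: 2 parameters summarizing 500 points (K(model) << H(data)).
slope, intercept = np.polyfit(x, y, 1)

# The price of that simplicity: residual entropy the model cannot express.
residuals = y - (slope * x + intercept)
rmse = np.sqrt(np.mean(residuals ** 2))
```

The residual RMSE is irreducible for this model family; driving it lower requires spending more complexity (more parameters), which is exactly the budget trade-off the section describes.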

Analytics

The principle implies that any analytics dashboard or report is a low-complexity projection of a high-complexity reality. It can certify certain facts (e.g., "sales went up 10%") but lacks the expressive power to certify the complex, high-entropy causal web that produced that result. This is a formal warning against mistaking simple correlation for the full, complex story of causation.

Visualization

A data visualization is a formal system for mapping data onto visual properties. SEPP dictates that the complexity of the visualization's design (its axes, colors, shapes—its "visual grammar") limits its expressive power. A simple bar chart can express a small amount of information effectively. To visualize a high-entropy, multi-dimensional dataset requires a correspondingly more complex visual grammar. As Tufte noted, a good visualization maximizes the "data-ink ratio," which is a variant of maximizing expressive power for a given level of complexity.

Infographics

SEPP explains the danger of misleading infographics. By using an overly simple visual model to represent complex data, an infographic can have an expressive power that is insufficient to capture the nuances of the data, leading to a distorted or outright false understanding.

Standards and Interoperability

A technical standard (like USB-C or HTTP) is a formal system designed to enable interoperability. SEPP implies a core trade-off: a simple standard is easy to implement but has limited expressive power, failing to cover all edge cases. A complex standard has greater expressive power to handle more situations but is harder to implement correctly, leading to interoperability failures. The evolution of standards is a constant process of increasing complexity to gain more expressive power.

Regulatory Affairs

In technology regulation, SEPP shows that simple regulations will be ineffective at governing complex technologies. A regulation is a formal system. To have sufficient expressive power to govern a high-entropy system like a social media network or an AI, the regulation itself must be sufficiently complex and adaptive. This formally justifies the shift from simple rules-based regulation to more complex, principles-based or co-regulatory models.

Open Science

Open Science can be seen as a strategy to overcome the SEPP limitations of individual researchers or labs. A single lab has a finite "complexity budget" for creating knowledge. By making data, methods, and code (the formal systems) open, the collective scientific community can bring a much higher level of complexity to bear on a problem, increasing the expressive power of the scientific endeavor as a whole and allowing for the certification of more complex results than any single entity could achieve alone.

Open Data

Open data initiatives increase the "entropy" available to the scientific system. SEPP implies that to make sense of this increased data complexity, the complexity of the models and theories used must also increase. This formally explains why the open data movement has gone hand in hand with the rise of complex machine learning techniques.

Citizen Science

Citizen science is a way of parallelizing the process of dealing with high-entropy data. A problem like galaxy classification or protein folding has a complexity that is too high for a small group of scientists to process. By distributing the task, the "expressive power" of the human processing system is scaled up to meet the complexity of the dataset.