To simulate a system of N states or particles at full fidelity, the simulator needs resources that scale at least linearly with N (or, for quantum systems, exponentially in N; see the scaling sketch after the list below). This creates a hierarchy problem:
- Level 0 (base reality): has X computational resources
- Level 1 (first simulation): to run a world as rich as Level 0 at full fidelity, it needs roughly X resources, but it exists within Level 0 and can only be allotted some fraction of X
- Level 2 (a simulation inside the simulation): faces the same shortfall again, this time out of Level 1's already-reduced share
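For concreteness, here is a minimal Python sketch of the exponential quantum case; the 16-bytes-per-amplitude figure and the particular qubit counts are illustrative assumptions, not claims from the argument above. A full-fidelity classical simulation of an N-qubit state must store 2^N complex amplitudes:

```python
# Illustrative sketch: memory needed to hold a full N-qubit state vector,
# assuming one complex128 amplitude (16 bytes) per basis state.

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required to store all 2**n_qubits complex amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50, 80):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n:3d} qubits -> {gib:.3e} GiB")

# 10 qubits fit in kilobytes; around 50 qubits the state vector already
# outgrows the memory of today's largest supercomputers; a few hundred
# qubits exceed any classical substrate the host universe could offer.
```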
The logical trap: each simulation layer must have strictly fewer resources than the layer above it (since it is contained within it), yet to reproduce a world as rich as that layer at full fidelity it would need at least as many. For high-fidelity simulation these two requirements cannot both hold.
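A toy numeric sketch of this trap, under two explicit assumptions not stated above: each layer can devote only a fixed fraction of its compute to hosting the next layer, and a child world costs some multiple (its fidelity) of its parent's cost. Full fidelity (fidelity = 1.0) fails at the very first layer, while a sufficiently lossy child can nest many layers deep:

```python
# Toy model of nested simulations. All numbers are illustrative assumptions.

def max_nested_layers(base_compute: float,
                      fraction: float,
                      fidelity: float,
                      min_world_cost: float = 1e6) -> int:
    """Depth of nesting reached before a layer can no longer afford its child.

    fraction:       share of a layer's compute it can spare for the next layer.
    fidelity:       child's cost as a multiple of its parent's cost (1.0 = full fidelity).
    min_world_cost: assumed floor below which a simulated world is no longer viable.
    """
    parent_cost = base_compute
    depth = 0
    while True:
        child_cost = parent_cost * fidelity   # what running the child world costs
        budget = parent_cost * fraction       # what the parent can actually spare
        if child_cost > budget or child_cost < min_world_cost:
            return depth
        parent_cost = child_cost              # the child becomes the next host
        depth += 1

print(max_nested_layers(1e30, fraction=0.5, fidelity=1.0))   # 0: full fidelity never nests
print(max_nested_layers(1e30, fraction=0.5, fidelity=0.25))  # dozens of lossy layers fit
```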
This means one of the following:
* we're in base reality - there is no way to create a full-fidelity simulation without at least as much computational power as the simulated universe itself requires, so no simulation could be hosting us
* simulations must be extremely "lossy" - using shortcuts, approximations, and rendering only what is observed, much as video games render only what the player can see (a sketch of this observe-on-demand idea follows the list). But then we face the question of why unobserved quantum experiments still produce consistent results, and why the simulator renders distant galaxies we will never visit
* the simulation uses physics we don't understand - perhaps base reality operates on completely different principles that are vastly more computationally efficient. But that is unfalsifiable speculation.
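A small illustrative sketch of the observe-on-demand shortcut mentioned above; the LazyWorld class, its seed, and the hash-based "physics" are invented purely for illustration. Regions are rendered only when first observed, and a deterministic seed keeps repeated or delayed observations consistent, which is the kind of bookkeeping a lossy simulator would need:

```python
# Illustrative only: a world that computes region states lazily, on first
# observation, and caches them so later observations stay consistent.

import hashlib

class LazyWorld:
    def __init__(self, seed: str):
        self.seed = seed
        self._rendered: dict[tuple[int, int], int] = {}  # cache of observed regions

    def observe(self, x: int, y: int) -> int:
        """Return the state of region (x, y), rendering it only on first access."""
        key = (x, y)
        if key not in self._rendered:
            # Deterministic "physics": derive the region's state from the seed,
            # so the same region always yields the same answer, whenever observed.
            digest = hashlib.sha256(f"{self.seed}:{x}:{y}".encode()).hexdigest()
            self._rendered[key] = int(digest, 16) % 1000
        return self._rendered[key]

world = LazyWorld(seed="base-reality-42")
print(world.observe(3, 7))    # rendered on demand
print(world.observe(3, 7))    # cached: same answer, no extra cost
print(len(world._rendered))   # only one region was ever computed
```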
This is sometimes called the "substrate problem": you cannot create something more complex than yourself using only your own resources.