Trim is building a foundation model for physics. We found that nearly all well-maintained transformer architectures are optimized for language modeling, so we're releasing our own open-source transformer package optimized for physics. On a standard Navier-Stokes dataset, it achieves a 90% reduction in memory usage and a 3.5x speedup.