Decoding 4K at 90 fps is easy for commercial decoders running on a consumer GPU. Dedicated hardware solutions are still out there, but they're no longer the only way to do it.
The choice of JPEG 2000 is unexpected. Most of the neat features of JPEG 2000 are useless for cinema. Being able to construct a low-res version from a truncated file isn't useful. Nor is the ability to divide the image into tiles and decompress different tiles at different resolutions. (That's used in JPEG 2000 medical and military imagery, where you want to zoom in on the interesting part and see it in lossless mode.) You can have more components than just RGB or RGBA, which the multispectral-imagery and prepress people like. Maybe the advantage is that you can have more than 8 bits of color depth.
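For what it's worth, that "low-res from the same codestream" property is easy to poke at from Python. A minimal sketch, assuming Pillow built with OpenJPEG support and a hypothetical frame.jp2; setting the reduce attribute before load() asks the decoder to discard the highest resolution levels instead of decoding the full frame:

    from PIL import Image

    # Hypothetical input file; requires Pillow compiled with OpenJPEG.
    im = Image.open("frame.jp2")

    # Discard the two highest resolution levels: the decoder reconstructs
    # an image at roughly 1/4 the original width and height from the same
    # codestream, without decoding the full-resolution data.
    im.reduce = 2
    im.load()

    print(im.size)  # roughly (orig_width // 4, orig_height // 4)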
What format would you have expected?
> Maybe the advantage is that you can have more than 8 bits of color depth.
Yes, that and the (at the time) near-state-of-the-art compression efficiency. I remember reading a technical document where the engineers designing the standard argued for 12 bits per component based on experiments and studies they had conducted.