With real space (x, y, z), all axes share the same units, so we can omit the redundant units from each feature when describing the distance in feature space.
But distance is just a metric, and often the space or paths through it are curvilinear.
By taxicab (Manhattan) distance, it's 3 cats, 4 dogs, and 5 glasses of water away.
Python (3.8+) now has math.dist() for Euclidean distance, for example.
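A minimal sketch of both metrics on the mixed-unit example above, using stdlib only (the feature tuples are made up for illustration):

```python
import math

# Hypothetical feature-space points: (cats, dogs, glasses of water)
a = (0, 0, 0)
b = (3, 4, 5)

# Euclidean distance via math.dist() (Python 3.8+)
euclidean = math.dist(a, b)                      # sqrt(3**2 + 4**2 + 5**2) ≈ 7.07

# Taxicab (Manhattan) distance: sum of absolute per-axis differences
taxicab = sum(abs(x - y) for x, y in zip(a, b))  # 3 + 4 + 5 = 12

print(euclidean, taxicab)
```

Note that both metrics happily add cats to dogs to water; neither checks that the axes share units.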
The term most often used is "superposition." Here's some material on it that I'm working through right now:
https://arena3-chapter1-transformer-interp.streamlit.app/%5B...
Skew coordinates: https://en.wikipedia.org/wiki/Skew_coordinates
Are the features described with high-dimensional spaces really all 90°, geometrically orthogonal?
How does the distance metric vary with feature order?
Do algorithmic outputs diverge or converge given variance in the sequence order of the orthogonal axes? That is, does it matter which order the dimensions are stated in: is the output sensitive to feature order, or does it converge regardless?
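For the standard metrics at least, the answer is that distance is permutation-invariant: applying the same reordering of axes to both points leaves the distance unchanged, since the sum over axes is commutative. A quick check with made-up vectors:

```python
import math
import random

a = [3.0, 1.0, 4.0, 1.0, 5.0]
b = [9.0, 2.0, 6.0, 5.0, 3.0]

d_original = math.dist(a, b)

# Apply the SAME random permutation to the axes of both vectors.
order = list(range(len(a)))
random.shuffle(order)
a_perm = [a[i] for i in order]
b_perm = [b[i] for i in order]

d_permuted = math.dist(a_perm, b_perm)
print(math.isclose(d_original, d_permuted))  # True: axis order doesn't matter
```

This holds for any metric built from a per-axis sum (Euclidean, taxicab, Minkowski); it says nothing about whether a learning algorithm consuming those features is equally order-insensitive.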
Re: superposition in this context, too
Are there multiple particles in the same space, or is it measuring a point-in-time sampling of the possible states of one particle?
(Can photons actually occupy the same point in spacetime? Can electrons? But the plenoptic function describes all light passing through a point or all of the space)
Are expectation values good estimators of wave function outputs from discrete quantum circuits and real quantum systems?
How do we describe the products of the histogram PDFs?
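A minimal sketch, assuming "products of the histogram PDFs" means the bin-wise product of two normalized histograms over the same bins (as when combining independent likelihoods), renormalized to sum to 1; the bin values here are invented:

```python
# Two histogram PDFs over the same four bins, each summing to 1 (made-up values)
p = [0.1, 0.2, 0.4, 0.3]
q = [0.3, 0.3, 0.2, 0.2]

# Bin-wise product, then renormalize so the result is again a PDF
prod = [pi * qi for pi, qi in zip(p, q)]
total = sum(prod)
combined = [v / total for v in prod]

print(combined)
```

Under the independence assumption this is the usual way to fuse two sources of evidence; if the features are not independent, the product over-counts shared information.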
If the features are not statistically independent, I don't think it's likely that they're truly orthogonal, though that might not affect the utility of a distance metric that assumes they are all orthogonal.