The real world is somewhere in between. It must involve quantum mechanics (in a way I don't really understand), as maximum bandwidth/minimum wavelength bump up against limits such as the Planck length and virtual particles in a vacuum.
An interesting anecdote from Lanczos[1] claims that Michelson (of interferometer fame) observed Gibbs ringing when he tried to reconstruct a square wave on what amounted to a steampunk Fourier analyzer [2]. He reportedly blamed the hardware for lacking the necessary precision.
1: https://math.univ-lyon1.fr/wikis/rouge/lib/exe/fetch.php?med...
2: https://engineerguy.com/fourier/pdfs/albert-michelsons-harmo...
For example, one viewpoint is that "Gibbs ringing" is always present whenever the bandwidth is limited; it's just that in the non-aliased case the sampling points happen to coincide with the zero-crossings of the ringing.
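A quick numerical sketch of the underlying phenomenon (all names here are mine, not from any library): the partial Fourier sums of a square wave overshoot near the jump by roughly 9% of the jump size, and adding more terms doesn't shrink the overshoot, it only squeezes it closer to the discontinuity.

```python
import math

def square_partial_sum(x, n_terms):
    """Partial Fourier sum of a +/-1 square wave:
    (4/pi) * sum over odd k of sin(k*x)/k."""
    return (4 / math.pi) * sum(math.sin(k * x) / k
                               for k in range(1, 2 * n_terms, 2))

# Scan a fine grid just to the right of the jump at x = 0.
# The peak stays near (2/pi)*Si(pi) ~ 1.179 regardless of n_terms.
for n in (50, 500):
    xs = [i * (math.pi / n) / 200 for i in range(1, 400)]
    peak = max(square_partial_sum(x, n) for x in xs)
    print(n, round(peak, 4))  # -> both near 1.179
```

Ten times more terms, same ~9% overshoot; only its location moves.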
I find that my brain explodes each time I pick up the Fourier Transform, and it takes a few days of exposure to simultaneously get all the subtle details back into my head.
No amount of precision, no number of coefficients, no degree of lowpass filtering can get around the fact that sin(x)/x has infinite support: it decays only like 1/x and is never identically zero. So if you don't have an infinitely-long (or seamlessly repeating) input signal, you must apply something besides a rectangular window to it or you will get Gibbs ringing.
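To illustrate the windowing point, here is a sketch (my own naming, plain stdlib Python) comparing a rectangular truncation of the square wave's Fourier series against the same series with Lanczos sigma factors applied, which is one classic way of tapering the coefficients instead of cutting them off abruptly:

```python
import math

def partial_sum(x, n_terms, taper=False):
    """Partial Fourier sum of a +/-1 square wave; with taper=True,
    each coefficient is multiplied by a Lanczos sigma factor
    sinc(k / (k_max + 1)) instead of being kept at full strength."""
    total = 0.0
    k_max = 2 * n_terms - 1
    for k in range(1, k_max + 1, 2):
        c = math.sin(k * x) / k
        if taper:
            u = math.pi * k / (k_max + 1)
            c *= math.sin(u) / u  # sigma factor: gently rolls off high harmonics
        total += c
    return (4 / math.pi) * total

xs = [i * 0.0005 for i in range(1, 2000)]
rect  = max(partial_sum(x, 100) for x in xs)
taper = max(partial_sum(x, 100, taper=True) for x in xs)
print(round(rect, 3), round(taper, 3))  # tapered peak is far closer to 1
```

The rectangular cutoff overshoots by about 9%; the tapered version trades that ringing for a slightly slower transition at the jump, which is the usual windowing bargain.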
There is always more than one way to look at these phenomena, of course. But I don't think the case can be made that bandlimiting has anything to do with Gibbs.