And, with your own drawing: https://gofigure.impara.ai
Turns out... they are not! You can do the same thing using a different set of functions, like Legendre polynomials or wavelets.
The real world is somewhere in between. It must involve quantum mechanics (in a way I don't really understand), as maximum bandwidth/minimum wavelength bump up against limits such as the Planck length and virtual particles in a vacuum.
Yup, any set of orthogonal functions! The special thing about sines is that they form an exceptionally easy-to-understand orthogonal basis, with a bunch of other nice properties to boot.
Just as you can make any vector `<x,y,z>` in R^3 by adding together a linear combination of `<1,0,0>`, `<0,1,0>`, `<0,0,1>`, it turns out you can also do it using the DFT basis vectors `<exp(j2pi*0*0/3), exp(j2pi*0*1/3), exp(j2pi*0*2/3)>`, `<exp(j2pi*1*0/3), exp(j2pi*1*1/3), exp(j2pi*1*2/3)>`, and `<exp(j2pi*2*0/3), exp(j2pi*2*1/3), exp(j2pi*2*2/3)>`.
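A minimal NumPy sketch of exactly that (N = 3 matches the example above; the test vector is arbitrary):

```python
import numpy as np

N = 3
n = np.arange(N)
# DFT basis vectors: b_k[n] = exp(j*2*pi*k*n/N)
basis = np.array([np.exp(2j * np.pi * k * n / N) for k in range(N)])

# Orthogonality: the Gram matrix is N times the identity
print(np.round(basis @ basis.conj().T, 10))

# Decompose an arbitrary vector: the coefficients are inner products / N
v = np.array([1.0, 2.0, 3.0])
coeffs = (basis.conj() @ v) / N
print(np.allclose(coeffs @ basis, v))  # True: the combination rebuilds v
```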
You can actually do it with a lot of different bases. You just need them to be linearly independent (and to have enough of them to span the space).
For the continuous case, it isn't all that different from how you can use a linear combination of polynomials 1,x,x^2,x^3,... to approximate functions (like Taylor series).
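For instance, here's that idea as a minimal sketch in plain Python (the function, evaluation point, and term count are arbitrary choices):

```python
import math

# Approximate sin(x) with a linear combination of 1, x, x^2, x^3, ...
# (its Taylor series: x - x^3/3! + x^5/5! - ...)
def sin_taylor(x, terms=8):
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(terms))

print(sin_taylor(1.3), math.sin(1.3))  # agree to many decimal places
```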
Which speaks to your point: you're absolutely correct that you can use many different sets of functions for the decomposition. Linear algebra just says that you might as well use the most convenient one!
Essentially it's just projection in infinite-dimensional vector spaces.
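Concretely, a Fourier coefficient is exactly such a projection: an inner product of the function with a basis function. A minimal NumPy sketch, using a square wave as the example function and a Riemann sum for the integral:

```python
import numpy as np

# A Fourier sine coefficient is the inner product <f, sin(nt)>,
# normalized by ||sin(nt)||^2 = pi over one period.
t = np.linspace(0, 2 * np.pi, 200000, endpoint=False)
dt = t[1] - t[0]
f = np.sign(np.sin(t))  # square wave

for n in range(1, 6):
    b_n = np.sum(f * np.sin(n * t)) * dt / np.pi
    print(n, round(b_n, 4))  # 4/(n*pi) for odd n, 0 for even n
```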
For someone reading this with only a calculus background, an example of this is that you get back a sine (times a constant) if you differentiate it twice, i.e. d^2/dt^2 sin(nt) = -n^2 sin(nt). Put technically, sines/cosines are eigenfunctions of the second derivative operator. This turns out to be really convenient for a lot of physical problems (e.g. wave/diffusion equations).
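If you want to see it numerically, here's a quick central-difference check (the values of n, t, and h are arbitrary):

```python
import math

# Central-difference check that sin(nt) is an eigenfunction of d^2/dt^2
# with eigenvalue -n^2
n, t, h = 3.0, 0.7, 1e-4
d2 = (math.sin(n * (t + h)) - 2 * math.sin(n * t)
      + math.sin(n * (t - h))) / h**2
print(d2, -n**2 * math.sin(n * t))  # approximately equal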
An interesting anecdote from Lanczos[1] claims that Michelson (of interferometer fame) observed Gibbs ringing when he tried to reconstruct a square wave on what amounted to a steampunk Fourier analyzer [2]. He reportedly blamed the hardware for lacking the necessary precision.
1: https://math.univ-lyon1.fr/wikis/rouge/lib/exe/fetch.php?med...
2: https://engineerguy.com/fourier/pdfs/albert-michelsons-harmo...
For example, one viewpoint is that "Gibbs ringing" is always present whenever the bandwidth is limited; it's just that in the "non-aliased" case the sampling points have been chosen to coincide with the zero-crossings of the Gibbs ringing.
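That's easy to demonstrate with Fourier (zero-padding) interpolation of a sampled square wave; a minimal NumPy sketch, with the frame size and upsampling factor chosen arbitrarily:

```python
import numpy as np

# Sample a square wave, then upsample by zero-padding its DFT
# (ideal bandlimited interpolation between the samples).
N, up = 32, 8
x = np.where(np.arange(N) < N // 2, 1.0, -1.0)

X = np.fft.fft(x)
Xpad = np.zeros(N * up, dtype=complex)
Xpad[:N // 2] = X[:N // 2]        # positive frequencies
Xpad[-(N // 2):] = X[-(N // 2):]  # negative frequencies (Nyquist bin is 0 here)
y = np.fft.ifft(Xpad).real * up

print(np.allclose(y[::up], x))  # True: the ringing is zero at the samples
print(y.max() > 1.0)            # True: Gibbs overshoot between the samples
```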
I find that my brain explodes each time I pick up the Fourier Transform, and it takes a few days of exposure to simultaneously get all the subtle details back into my head.
No amount of precision, no number of coefficients, no degree of lowpass filtering can get around the fact that sin(x)/x never decays all the way to zero. So if you don't have an infinitely-long (or seamlessly repeating) input signal, you must apply something besides a rectangular window to it or you will get Gibbs ringing.
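The same slow sin(x)/x tail shows up as spectral leakage in the frequency domain. A minimal NumPy sketch comparing the implicit rectangular window against a Hann window (the frequency and the probe bin are arbitrary):

```python
import numpy as np

# A sinusoid with 10.5 cycles per frame: it doesn't repeat seamlessly,
# so the rectangular window's sinc sidelobes leak across the spectrum.
N = 1024
t = np.arange(N)
x = np.sin(2 * np.pi * 10.5 * t / N)

rect = np.abs(np.fft.rfft(x))                  # implicit rectangular window
hann = np.abs(np.fft.rfft(x * np.hanning(N)))  # Hann window

# Level far from the peak, in dB relative to the peak: the Hann figure
# is far lower, because its sidelobes decay much faster than sin(x)/x.
print(20 * np.log10(rect[200] / rect.max()))
print(20 * np.log10(hann[200] / hann.max()))
```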
There is always more than one way to look at these phenomena, of course. But I don't think the case can be made that bandlimiting has anything to do with Gibbs.
Luckily, we live in a physical universe where mathematical oddities like infinite-bandwidth signals cannot exist, so this isn't an actual issue. Any signal that contains infinite bandwidth only exists because it has sampling artifacts; you would, necessarily, be attempting to reconstruct errors. There are many "tricks" for dealing with such flawed signals. But yes, you can't fully reconstruct impossible signals with the FFT.