From what I have seen, Ethernet ports always have a small isolation transformer for each twisted pair, between the connector and the PHY. Usually four such transformers are combined in one small magnetics package. The insulation in the transformer is specified to withstand over a kilovolt of lightning-induced voltage -- that's one of the purposes of such galvanic isolation.
The data travels as a differential voltage within each twisted pair and is coupled magnetically by the transformer to the secondary winding. PoE power, by contrast, is applied between two different pairs, so within each pair it appears as a common-mode voltage. The transformer blocks this common-mode component entirely; in devices designed to support PoE, the PoE circuitry taps the mid-point of the primary windings to access the supplied voltage.
So at first glance, it seems that if 48 volts is applied between the twisted pairs of a non-PoE device, the voltage would simply be blocked by the transformer. But since there is widespread concern about this, there must be more to the story -- maybe somebody who has actually worked with these circuits can explain why it is more complicated than it seems?
Edit: Found an answer. It seems that at least some non-PoE Ethernet jack designs terminate the common-mode signals to a common ground through 75-ohm resistors (the so-called Bob Smith termination). In such a design, if 48 V were applied between the twisted pairs, those resistors would have to dissipate far more power than they are rated for and would burn out. So there is a real concern with dumb PoE injectors and at least some non-PoE devices.
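To see why the resistors fail, here is a rough back-of-the-envelope sketch. It assumes the simplest worst case: each pair's centre tap goes through its own 75-ohm resistor straight to a shared ground node with no DC-blocking capacitor in the path (real Bob Smith networks often include a high-voltage capacitor, which would block DC and change the picture):

```python
# Estimate dissipation in the 75-ohm common-mode termination resistors
# of a hypothetical non-PoE port if 48 V is applied between two pairs.
# Assumed DC path: 48 V source -> pair A centre tap -> 75 ohm -> common
# ground node -> 75 ohm -> pair B centre tap -> back to the source,
# i.e. two 75-ohm resistors in series.

V = 48.0            # PoE supply voltage between the two pairs, in volts
R = 75.0            # termination resistor per pair, in ohms
I = V / (2 * R)     # current through the two series resistors, in amps
P_each = I**2 * R   # power dissipated in each resistor, in watts

print(f"current = {I * 1000:.0f} mA")
print(f"per-resistor dissipation = {P_each:.2f} W")
# current = 320 mA
# per-resistor dissipation = 7.68 W
```

Nearly 8 W in parts that are typically tiny chip resistors rated for a small fraction of a watt, which is consistent with reports of them burning out.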
https://electronics.stackexchange.com/questions/459169/how-c...