Apple may have a bit of a lead in getting it actually deployed end-to-end, but given the number of times I've heard "AI accelerator" in reference to mobile processors, I'm pretty sure silicon with 'NPUs' is all over the place already, and if it isn't, it certainly
will be, for better or worse. I've got a laptop with a Ryzen 7040, which apparently has an XDNA NPU in it. I haven't a damn clue how to use it, but there is apparently a driver for it in Linux[1]. It's hard to think of a mobile chipset launch from any vendor that hasn't talked about AI performance in some regard; even the Rockchip ARM processors seem to have "AI engines".
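
For what it's worth, the closest I've gotten to "using" it is checking that the driver is even there. A rough sketch, with the caveat that the module name (amdxdna) and the /dev/accel node are my best guess from skimming the xdna-driver repo, not something I've confirmed end to end:

    from pathlib import Path

    def xdna_driver_visible() -> bool:
        # the xdna-driver repo appears to build a kernel module called amdxdna
        loaded = "amdxdna" in Path("/proc/modules").read_text()
        # it seems to register as a DRM "accel" device, so a /dev/accel/accelN
        # node should show up once it's bound to the NPU
        accel = Path("/dev/accel")
        has_node = accel.exists() and any(accel.glob("accel*"))
        return loaded and has_node

    print("XDNA driver looks loaded" if xdna_driver_visible() else "no sign of it here")

That only tells you the kernel side is alive, of course; actually running a model on the thing is a whole other story.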
This is one of those places where Apple's vertical integration has a clear benefit. But even as a bit of a skeptic about "AI" technology, I think there's a good chance that accelerated ML inference is going to be one of the next battlegrounds for mobile processor performance and capability, if that battle hasn't started already.
[1]: https://github.com/amd/xdna-driver