But in general I have gone for generalist, not specialist, in my career.
But there are diminishing returns on how deep you go down the rabbit hole. Of course there's always more to learn, but with C you fairly quickly leave the language itself and move into layers of compiler and hardware trivia (good to know nonetheless, but often not really relevant for being productive in C), whereas in other higher-level languages you're still working your way through the standard library ;)
It's not as big a problem these days, with hardware becoming less heterogeneous: almost everything is little-endian now, much of it 64-bit and at least 32-bit, and we can mostly rely on POSIX being there. Most new code also uses stdint.h and is explicit about word widths (int32_t and friends) rather than guessing what int happens to be, and follows good conventions there.
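A minimal sketch of the kind of thing I mean (assuming a hosted platform with C99's stdint.h/inttypes.h available):

    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    /* Fixed-width types keep the layout explicit no matter what the
       platform's native int/long sizes are. */
    static int32_t counter = 0;                 /* exactly 32 bits everywhere */
    static uint64_t big = UINT64_C(1) << 40;    /* exactly 64 bits everywhere */

    int main(void) {
        /* PRId32/PRIu64 pick the right printf format for each type. */
        printf("counter=%" PRId32 " big=%" PRIu64 "\n", counter, big);
        return 0;
    }

On a conforming C99+ toolchain this behaves the same whether int is 16, 32, or 64 bits natively, which is exactly the portability headache it papers over.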
But venture off the beaten path into odd microcontrollers or retro machines, or try to port older code, or whatever ... and there's glass hidden in the grass all over.
C also exposes a model of the machine that looks low-level, but behind the scenes a modern processor does all sorts of branch prediction, pipelining, caching and so on that can blow up your assumptions.
What looks like clever, hand-optimized C code can actually end up running really slowly on a modern machine, and vice versa.
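A toy illustration of one flavor of this (the array size is arbitrary, and exact numbers depend on the machine): both functions below do the exact same arithmetic, but the second one strides across rows instead of along them, so nearly every access misses the cache and it will typically run several times slower, even though nothing in the C source hints at that.

    #include <stddef.h>
    #include <stdio.h>

    #define N 2048
    static double a[N][N];

    /* Walks memory in the order C lays it out (row-major), so cache
       lines and the hardware prefetcher work in our favor. */
    double sum_rows(void) {
        double s = 0.0;
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    /* Same arithmetic, but the inner loop jumps N*sizeof(double) bytes
       per step, so most accesses are cache misses. */
    double sum_cols(void) {
        double s = 0.0;
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }

    int main(void) {
        printf("%f %f\n", sum_rows(), sum_cols());
        return 0;
    }

Nothing in the language's "portable assembler" model tells you those two loops differ; the memory hierarchy underneath does.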