Log(n) is unbounded, yes, but if a Turing machine halts in sub-linear time then there is some length N such that, on every input of that length, the TM halts before it reads every cell of the input tape. Therefore, it doesn't matter how the size of the input grows beyond that: the TM must halt in constant time. Make sense?
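To see the "unbounded but still sub-linear" point numerically: for any constant C, C·log2(n) eventually falls below n itself, and that crossover is exactly where a log-time machine can no longer have read its whole input. A quick sanity check with a constant of my own choosing (the 100 is arbitrary, not from anything above):

```python
import math

C = 100  # arbitrary constant; any C works, the crossover just moves
# First n where the C*log2(n) step budget is smaller than the input length n
N = next(n for n in range(2, 10**7) if C * math.log2(n) < n)

# At N (and well past it) the budget stays below the input length
assert C * math.log2(N) < N
assert all(C * math.log2(n) < n for n in (N, 10 * N, 1000 * N))
```

So even though log n grows forever, the step budget it allows is eventually too small to touch every cell.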
What makes a/your TM special that this is the case?
I don’t mean to take too much of your time though, maybe I’m too dumb for this.
Edit: I can sort of see how O(log n) might be impossible on a TM, or at least end up being O(n), but reducing it all the way to O(1) makes no sense to me
Now consider any input I, and let I_N be the truncation of I to its first N characters (i.e., I_N is the same as I except with all characters past the Nth replaced by blanks).
Then by our earlier statement defining N, the TM must halt on I_N in fewer than N steps. But the TM's behavior on I for the first N steps must be identical to its behavior on I_N: in N steps the head can have visited at most the first N cells, and the two inputs agree on those cells, so the machine can't possibly have read any character that differs. Thus the machine must halt in fewer than N steps on I as well. And since I was chosen arbitrarily, we've shown that the machine halts in fewer than N steps on any input, where N is a fixed constant.
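To make the "it can't have read any different characters" step concrete, here's a toy single-tape TM simulator. The machine `tm` in it is a throwaway example of my own (not anything from the thread): it has one rule, so it halts after at most one step no matter how long the tape is, and its behavior on a huge input matches its behavior on the truncation exactly.

```python
def run(tm, tape, max_steps=10**6):
    """Run tm (dict: (state, symbol) -> (state, symbol, move)) on tape.
    Returns (steps_taken, rightmost_cell_visited). Halts when no rule applies."""
    cells = dict(enumerate(tape))
    state, head, rightmost = 'start', 0, 0
    for step in range(max_steps):
        sym = cells.get(head, '_')       # '_' plays the blank symbol
        if (state, sym) not in tm:       # no transition defined: halt
            return step, rightmost
        state, cells[head], move = tm[(state, sym)]
        head += 1 if move == 'R' else -1
        rightmost = max(rightmost, head)
    raise RuntimeError('step budget exceeded')

# Toy machine: on 'a' in the start state, step right into 'done';
# 'done' has no rules, so the machine halts after one step.
tm = {('start', 'a'): ('done', 'a', 'R')}

I = 'a' * 1_000_000   # an arbitrarily long input
I_N = I[:2]           # its truncation: same first cells, blanks after
assert run(tm, I) == run(tm, I_N) == (1, 1)
```

Since the head never moves past cell 1, both the step count and the rightmost cell visited are identical on I and I_N, which is exactly the substitution the argument above makes.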