This is (technically) wrong, but not for the reasons I've seen others give so far. Your reasoning is basically fine; it's your definition of an average jackpot prize that isn't. If we have k lottery winners and we denote each individual prize as n_i, then the average prize is sum(n_1 ... n_k) / k. It's easy to see that this number can't be larger than every individual n_i (an average can never exceed the maximum), so it can't be the case that "you" won less than the average prize for every possible "you". Some winners win less than average and some win more, or they all win exactly the same amount.
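If it helps, here's a minimal check of that mean-vs-max fact, using completely made-up prize amounts:

```python
# Toy check with hypothetical prize amounts n_1 ... n_k: the average can never
# exceed the largest individual prize, so not every winner can be below average.
prizes = [1_000_000, 250_000, 250_000, 250_000]

average = sum(prizes) / len(prizes)
print(average, max(prizes))     # 437500.0 1000000
assert average <= max(prizes)   # holds for any list of numbers
```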
On the other hand, your analytically computed expected winnings, conditioned on the fact that you won, are indeed less than the analytically computed expected average prize, because a random winner is disproportionately likely to come from a drawing with an above-average number of winners. This is mathematically the same phenomenon as the class-size paradox: the average class size perceived by a randomly sampled student is larger than the actual average class size, because more of the students sit in the larger classes. That doesn't mean every class is larger than the average class, which is impossible. It just means that when you pick a student at random, the big classes are over-represented, so the student-weighted average class size gets pulled above the plain average.
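And here's a quick sketch of that second point, with an entirely made-up setup (a fixed jackpot split equally among 1 to 10 winners, chosen uniformly at random per drawing); the specific numbers don't matter, only the direction of the gap:

```python
import random

# Hypothetical lottery: a fixed jackpot split equally among a random number of winners.
JACKPOT = 1_000_000
N_DRAWINGS = 100_000

random.seed(0)
winner_counts = [random.randint(1, 10) for _ in range(N_DRAWINGS)]  # winners per drawing

# "Expected average prize": average the per-winner prize across drawings,
# weighting every drawing equally.
avg_prize_per_drawing = sum(JACKPOT / k for k in winner_counts) / N_DRAWINGS

# Expected prize of a randomly chosen *winner*: each drawing now counts once
# per winner, so crowded drawings are over-represented.
all_winner_prizes = [JACKPOT / k for k in winner_counts for _ in range(k)]
expected_prize_random_winner = sum(all_winner_prizes) / len(all_winner_prizes)

print(f"average per-winner prize over drawings: {avg_prize_per_drawing:,.0f}")
print(f"expected prize of a random winner:      {expected_prize_random_winner:,.0f}")
# The second number comes out noticeably smaller: a random winner is
# disproportionately likely to sit in a crowded drawing, just like a random
# student is disproportionately likely to sit in a large class.
```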