For anyone unconvinced, let's simulate this.
Instead of odds of 1 in 14 million, we'll use odds of 1 in 2 and just 8 players, so each random byte is one draw: bit n is set when player n wins.
First, let's check the average number of bits set in a random byte:
void Main()
{
    // 256K random bytes; each byte is one draw of 8 independent 1-in-2 players.
    byte[] buffer = new byte[256 * 1024];
    Random.Shared.NextBytes(buffer);

    // Average population count (i.e. number of winners) per byte.
    var avg = buffer.Average(b => System.Runtime.Intrinsics.X86.Popcnt.PopCount(b));
    Console.WriteLine(avg);
}
Okay, it bounces around 3.998 to 4.001, which seems right: with 8 independent 1-in-2 bits, we expect 4.
Now let's check the average number of bits set given that the low bit is 1 (i.e. player 1 has won!):
void Main()
{
    byte[] buffer = new byte[256 * 1024];
    Random.Shared.NextBytes(buffer);

    // Same as before, but only counting bytes where the low bit is set
    // (the draws in which player 1 won).
    var avg = buffer
        .Where(b => (b & 0x01) == 0x01)
        .Average(b => System.Runtime.Intrinsics.X86.Popcnt.PopCount(b));
    Console.WriteLine(avg);
}
Now it's ~4.500.
That's 1 + 3.5: one guaranteed win from player 1, plus the expected 7 × 0.5 = 3.5 wins from the other 7 players. With only 8 players, averaging over 7 others instead of 8 makes a noticeable difference.
But if you simulate with millions of players, you'll see that removing one person from the pool makes essentially no difference: 1 + (N - 1)/2 is all but indistinguishable from N/2.
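Here's a minimal sketch of that larger simulation (my own illustration, not part of the snippets above): player 1 is forced to win, each remaining player wins independently with 1-in-2 odds, and the pool size and trial count are arbitrary choices.

void Main()
{
    const int players = 1_000_000;   // pool size (arbitrary, just needs to be large)
    const int trials = 100;          // number of simulated draws (arbitrary)
    double totalWinners = 0;

    for (int t = 0; t < trials; t++)
    {
        // Player 1 is forced to win (the condition we're averaging under);
        // each of the other players wins independently with probability 1/2.
        long winners = 1;
        for (int p = 1; p < players; p++)
        {
            if (Random.Shared.Next(2) == 0)
                winners++;
        }
        totalWinners += winners;
    }

    var conditionalAvg = totalWinners / trials;   // expect ~1 + (players - 1) / 2
    var unconditionalAvg = players / 2.0;         // expect ~players / 2
    Console.WriteLine($"{conditionalAvg} vs {unconditionalAvg}");
}

Both numbers come out around 500,000, half a win apart, so conditioning on one known winner barely moves the average once the pool is large.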