Maybe that's just for dynamic pages. The most popular pages are probably cached and wouldn't be counted in that 20 req/sec. Just my own wild guess though.
That's not the bottleneck. Essentially there's an in-memory database (really just a set of hash tables). Data is lazily loaded off disk into memory, but most of the frequently needed items are loaded once at startup.
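A minimal sketch of the pattern described above, assuming the "database" is just a dict keyed by item id with one file per item on disk (the class and file layout here are hypothetical, not the actual system):

```python
import os

class LazyStore:
    """In-memory 'database': a hash table filled lazily from disk."""

    def __init__(self, data_dir):
        self.data_dir = data_dir
        self.cache = {}  # the in-memory hash table

    def get(self, key):
        if key not in self.cache:
            # First access: read from disk, then keep it in memory.
            path = os.path.join(self.data_dir, key)
            with open(path) as f:
                self.cache[key] = f.read()
        return self.cache[key]

    def preload(self, keys):
        # Frequently needed items can be loaded once at startup
        # so requests never touch disk for them.
        for k in keys:
            self.get(k)
```

After `preload`, every `get` on a hot item is a pure hash-table lookup, which is why disk I/O isn't the bottleneck.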
The bottleneck is the amount of garbage created while generating pages. IIRC there is some horribly inefficient handling of UTF-8 characters.
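To illustrate the kind of garbage pressure meant here (this is a generic sketch, not the actual code under discussion): processing a page one character at a time with naive string concatenation allocates a fresh string per step, whereas batching the pieces into a single join creates far fewer intermediates.

```python
def escape(ch):
    # Minimal per-character HTML escaping, as a stand-in for whatever
    # per-character work (e.g. UTF-8 handling) page generation does.
    return {"<": "&lt;", ">": "&gt;", "&": "&amp;"}.get(ch, ch)

def render_naive(text):
    out = ""
    for ch in text:
        out += escape(ch)  # allocates a new string every iteration
    return out

def render_batched(text):
    # One pass, one final allocation; far less garbage for the
    # collector to churn through.
    return "".join(escape(ch) for ch in text)
```

Both produce the same output, but the naive version's per-character allocations are exactly the sort of thing that makes the garbage collector the bottleneck.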