I do a fair amount of PHP (really, I run HHVM, but let's say modern PHP), and effectively nothing in that article affects me.
No one should be using mysql_* anything; no one should have been using it for years. It's deprecated, and it throws warnings when run. PHP's love of forced backward compatibility does cause some very interesting things to stick around.
Paamayim Nekudotayim isn't called that anymore, and hasn't been for a long time.
The author raises issues with things like @fopen(url): neither I nor anyone I know has ever done that. It makes absolutely no sense to.
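If I did need to open a URL with fopen, it would look more like this rough sketch: check the return value and fail loudly, rather than muting warnings with @. (The URL is a placeholder, and it assumes allow_url_fopen is enabled.)

```php
<?php
// Sketch only: handle fopen() failure explicitly instead of hiding the
// warning behind @. The URL is made up.
$url = 'https://example.com/data.json';

$handle = fopen($url, 'r');
if ($handle === false) {
    throw new RuntimeException("Could not open {$url}");
}

$body = stream_get_contents($handle);
fclose($handle);
```

In practice I'd reach for a proper HTTP client library rather than raw streams anyway.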
PHP has autoloading/"modules"/dependency injection/a proper package manager/etc (the article is from early 2012; from what I can see, Composer was largely popularised a bit after that).
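For anyone who hasn't seen it, this is roughly what that looks like (package names and versions here are purely illustrative):

```json
{
    "require": {
        "guzzlehttp/guzzle": "^7.0"
    },
    "autoload": {
        "psr-4": {
            "App\\": "src/"
        }
    }
}
```

```php
<?php
// One require pulls in Composer's autoloader; after that, your own App\
// classes and installed packages resolve on demand.
require __DIR__ . '/vendor/autoload.php';

$client = new \GuzzleHttp\Client();
```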
The author complains about no stack traces - sure, there won't be one for a bad function name in a single standalone file, but any time I've used a proper request library or followed anything like best practice, I've gotten a proper stack trace with file names, line numbers, etc.
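As a quick sketch of what I mean, any exception-based code gives you a full trace for free (the function here is made up):

```php
<?php
// Exceptions carry the full call stack, including file names and line numbers.
function fetchUser(int $id): array
{
    throw new RuntimeException("user {$id} not found");
}

try {
    fetchUser(42);
} catch (RuntimeException $e) {
    echo $e->getMessage(), "\n", $e->getTraceAsString(), "\n";
}
```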
--
There are a lot of complaints that range from somewhat odd to wildly strange, like "No Unicode support" and "PHP is naturally tied to Apache. Running it separately, or with any other webserver, requires just as much mucking around", which makes absolutely no sense to me. The author even says that if you want to run two versions of PHP, you rebuild Apache? That's absurd.
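For what it's worth, here's a rough sketch of the completely ordinary way to run PHP with no Apache anywhere: php-fpm behind nginx (the paths and socket name are distro-specific guesses on my part):

```nginx
# Sketch only: PHP behind nginx via php-fpm.
server {
    listen 80;
    root /var/www/app/public;
    index index.php;

    location / {
        try_files $uri /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        # Socket path varies by distro/version; two PHP versions just means
        # two FPM pools on two sockets - no webserver rebuild involved.
        fastcgi_pass unix:/run/php/php-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```

And for local development you don't need a webserver at all: `php -S localhost:8000 -t public` has shipped with PHP since 5.4.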
> 'The “bunch of files” approach, besides making routing a huge pain in the ass, also means you have to carefully whitelist or blacklist what stuff is actually available, because your URL hierarchy is also your entire code tree'
None of this makes sense either: unless you are doing some seriously outdated horrors, you have effectively one /index.php router and a public asset folder for the webroot.
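To make that concrete, here's a toy sketch of the front-controller layout I mean (a real app would hand this off to a framework's router; the routes are made up):

```php
<?php
// public/index.php - the single entry point; the webserver sends every
// request that isn't a real static file to this script.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

switch ($path) {
    case '/':
        echo 'home';
        break;
    case '/users':
        header('Content-Type: application/json');
        echo json_encode(['alice', 'bob']);
        break;
    default:
        http_response_code(404);
        echo 'not found';
}
```

Nothing outside public/ is ever reachable by URL, so there's no whitelisting or blacklisting of your code tree to worry about.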