To be fair, this has felt like the natural consequence of the "maximize capitalism without regard for the downsides" maxim the US seems to have been operating under for a long time. Corporations have been (indirectly) running the country for several decades at this point; it's just far more obvious and in your face now that a "businessman" sits as president.
I thought things would look up after the 2012 election, when people were looking for meaningful change. Unfortunately, a charismatic demagogue entered the scene and took power. Since then, we’ve been on the worst possible timeline, and I don’t see an easy way out of this mess. It’s going to take a lot of work for Americans to trust each other again, and for the rest of the world to trust us.