In general, Next.js has many layers of abstraction that 99.9999% of projects don't need. And the ones that do are probably better off building a bespoke solution from lower-level parts.
Next.js is easily the worst technology I've ever used.
Things will get far worse before they get better. Right now, online course platforms such as PluralSight are pushing Next.js in virtually every course related to React. I have no idea what ill-advised train of thought resulted in this sad state of affairs, but here we are.
The web covers such a broad range of problems that it's pretty absurd to think the same solution can cover everything.
I'm not so sure about that. We're seeing Next.js pushed as the successor to create-react-app even on react.dev[1], which as a premise is kind of stupid. There is definitely something wrong going on.
We do an exercise, 30 minutes tops, where the candidate creates a React project to show how to use useState, useEffect, etc. I help with whatever commands they want to use and allow Google/ChatGPT.
More than half of the candidates had no idea how to use React without Next.js, and some argued it was impossible, even after I told them otherwise.
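For reference, plain React without any framework takes only a few lines. Here's a minimal sketch (the Counter component and the Vite-style entry file are illustrative, not part of the actual exercise):

    // main.tsx: plain React with no framework, assuming an index.html
    // that contains <div id="root"> (e.g. a stock Vite react-ts template)
    import { useEffect, useState } from "react";
    import { createRoot } from "react-dom/client";

    function Counter() {
      const [count, setCount] = useState(0);

      // useEffect runs after each render where `count` changed
      useEffect(() => {
        document.title = `Clicked ${count} times`;
      }, [count]);

      return <button onClick={() => setCount(count + 1)}>Count: {count}</button>;
    }

    // No Next.js involved: mount the component directly into the page
    createRoot(document.getElementById("root")!).render(<Counter />);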
To many people, it's just basic logic: "everyone must want the latest React features, and the only way to get those is with Next, so everyone must want Next".
For me, lately, the interview question is: "Here's code that ChatGPT generated for [the previous interview question, adapted to the role we're hiring for]. What's wrong with it? What do you do now?" (ChatGPT may or may not have actually generated the code in question.)
Next.js is essentially the reference implementation and test bed.
Where people go wrong is thinking they need to default to client hydration, an inherently complex niche optimization enabled by a quirk of web tech.
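Concretely, hydration only matters if a server already rendered your HTML. A minimal sketch of the difference (App and the root element are placeholder names):

    // Hydration attaches React to HTML that was already rendered on the
    // server (e.g. via renderToString); client-only apps can skip it.
    import { hydrateRoot, createRoot } from "react-dom/client";
    import { App } from "./App"; // placeholder component

    // SSR case: reuse the server-rendered markup inside #root
    hydrateRoot(document.getElementById("root")!, <App />);

    // The common case, with no SSR involved:
    // createRoot(document.getElementById("root")!).render(<App />);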
It is more like a test of whether you can figure out random React minutiae (with Google/ChatGPT, if needed) when presented with a need, which isn't a bad approximation of how well you will do at finding any random minutiae as needs present themselves. React-based development doesn't require much original thought; the vast majority of the job really is just figuring out the minutiae of your dependencies to fit your circumstantial need.
For fun, I asked ChatGPT for an answer and it gave a perfectly good one back without hesitation. Even if you had no idea what React was beyond knowing it's a library for building UI components for the web, you should still be able to answer that particular question with ease.
When you're in a work meeting, do you put ChatGPT up on one laptop and Claude on another and just sit back for 30 minutes to an hour?
My point is that it's fishy how they push features that just so happen to be the value proposition of the only corporation that just so happens to be able to implement them.
One of the factors is that web dev pushes for a complete separation of concerns, which allows frontend developers to specialize in frontend work. It therefore becomes far easier to hire someone with a webdev background than with a win32/MFC background to do frontend work.
The number of applicants is also a big factor. There is far more demand for webdev than for pure GUI programming, so that's where the developers are. You can only hire people who show up, and if no one shows up then you need to scramble.
Frontend development is also by far the most expensive part of a project. Projects that use low-level native frameworks are forced to hire a team for each target platform, while adopting technologies that implement GUIs as webpages running in a WebView lets a project halve that cost. This is also why technologies like React Native shine.
Also, apps like Visual Studio Code prove that webview-based apps can be both nice to look at and performant.
It's not capabilities. It's mainly the economics.
It's like not knowing how to write a for loop or how to access an object's property in JavaScript.
Then came small web applications, and there were still no "front-end developers", since functionality could only run on the server.
It's only when AJAX was introduced in the mid-2000s that you could start to talk about "front-end developers".
By that time, win32 and MFC were old. We had Java, C# with the .NET Framework, etc.
It's also dismissive of market forces: developers have to pay bills, and are therefore easier to hire if they know the skill set that's in wide use.
I've never worked with or interviewed a single senior who wanted to use Next.