LLMs completely change the way people do things, in the same way that methamphetamine addictions completely change the way people do things.
Take a quick look at that summary graph. Then read the X axis labels, and laugh, and weep.
LLMs are basically cocaine addiction: they deliver the feeling of competence and success directly to your brain, while all the actual real-world evidence points the other way. They also genuinely work for some purposes, of course.
It's like nagging a kid into cleaning their room: if you also give them detailed instructions on what to do at each step, you might underestimate the time you spent, counting only the actual cleanup time the kid executed.
Part of the reason may be that the interactions are not recorded. The record of development consists of deliverables: source code, bug reports, documentation. The interactions with the AI vanish, just like editor keystrokes.