The bar has also been raised significantly. I had an interview recently where I solved the algorithm question very quickly, but didn't refactor/clean up my code perfectly and was rejected.
So with that in mind, I'll see you all at re:Invent.
While that's fine when doing the job, the purpose of the interview is to gauge your ability to understand and solve problems. AI can help you with that, but understanding how to do it yourself signals that you'll be able to solve other, more complex, wider-spanning problems.
Just like with a calculator - it's important for candidates to know _why_ something works, and to be able to demonstrate that, as much as it is for them to know the solution.
"Write this code, but don't read the API definition (like a normal developer would do in the course of their work)"
"Whiteboard this CRUD app, but don't verify you did it right using online sources (like a normal developer would do in the course of their work)"
"Type this function out in a text document so that you don't have the benefit of Intellisense (like a normal developer would have in the course of their work)"
"Design this algorithm, but don't pull up the research paper that describes it (like a normal developer would do in the course of their work)"
You're testing a developer under constraints that nobody actually has to work under. It's like asking a prospective carpenter to build you a doghouse without using a tape measure.
I've never been in a situation where I couldn't ask for clarification on something, except in interviews. I once asked an interviewer, "Is this how people normally work here? They just get a few sentences and plow ahead, without being able to ask for more details, clarifications, or use cases?" "Well, no, but you have to use your best judgement here."