I had almost this exact interview experience recently with a popular AI startup. The exercise was to build a search UI over a static array of dictionary terms. It was a frontend role, so I wired it up with `filter` and `startsWith` and spent most of my time polishing the UI and UX.
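For context, the search part boiled down to something like this (rough sketch, not the actual exercise code; the term list here is made up):

```ts
// Stand-in term list -- the real exercise supplied its own static array.
const terms: string[] = ["algorithm", "array", "binary search", "cache"];

function searchTerms(query: string): string[] {
  const q = query.trim().toLowerCase();
  if (q === "") return [];
  // Naive prefix match over the static array -- fine for a small dictionary.
  return terms.filter((term) => term.toLowerCase().startsWith(q));
}
```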
The final interview question was: “Okay, how do you make this more performant?” My answer was two-tiered:
- Short term: debounce input, cache results (roughly like the sketch after this list).
- Long term: use Algolia / Elastic, or collaborate with a backend engineer to index the data properly.
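To be concrete about the short-term part, something along these lines is what I meant (illustrative only; it assumes the `searchTerms` from the snippet above, and the names are mine):

```ts
// Cache results per query -- only worth it because the term list is static.
const cache = new Map<string, string[]>();

function cachedSearch(query: string): string[] {
  const hit = cache.get(query);
  if (hit) return hit;
  const results = searchTerms(query);
  cache.set(query, results);
  return results;
}

// Trailing-edge debounce: only run the search after the user pauses typing.
function debounce<T extends (...args: any[]) => void>(fn: T, waitMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

const onInput = debounce((query: string) => {
  console.log(cachedSearch(query)); // stand-in for the real UI render
}, 150);

onInput("bin"); // fires ~150ms after the last keystroke
```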
I got rejected anyway (even with a referral), which drove home OP's point: I wasn't being judged on problem solving so much as auditioning for the "senior eng" title.
With interview-assist tools and AI coding aids getting harder to detect, the gap between interview performance and actually delivering in the role is only going to widen. Curious how many of these "AI-assisted hires" will start hitting walls once they're outside the interview sandbox.