I want to reach for my tools when I want to use them.
Remember the term "smart" as applied to any device or software mode that made ~any assumptions beyond "stay on while trigger is held"? "AI" is the new "smart." Even expert systems, decision trees, and fulltext search are "AI" now.
Not really; I'm taking the hint. If they call a feature "AI", there's a 99% chance it's empty hype. If they call a feature "machine learning", there may be something useful in there.
Notice how Apple, even in this event, uses the term "machine learning" for some features (like some of their image processing stuff) and "AI" for others. Their usage of the terms more or less matches the line I draw between features I want and features I don't.
But that's not true of any other actor in the market. Everyone else, especially venture-backed companies trying to attract or retain investor interest, is still trying to find a justification for calling every single thing they sell "AI".
(And it's also not even true of Apple themselves as recently as six months ago. They were approaching their marketing this way too, right up until their whole "AI" team crashed and burned.)
Apple-of-H2-2025 is literally the only company for which your heuristic will spit out any useful information. For everyone else, you'll just end up with 100% false positives.
Anyway, it's not the same thing: I'm fine with machine learning that gives me better image search results; I'm not fine with machine learning that generates "art" or text. Everyone has collectively agreed to call the latter "AI" rather than machine learning, so the term is a useful distinction.
The same product could be produced five years ago or today, and the one produced five years ago would not be described as having "AI features", while the one produced today would.
(You can check for yourself: look at the online product listing for any mature "smart" device that got a new rev in the last three years. The Clapper would be described as an "AI" device today.)