Slight tangent - but I feel that our insistence on "explainable AI" might cause us to miss out on some fundamentally different ways of thinking and reasoning that AI could provide. Machines might reach conclusions in ways that simply can't be "dumbed down" to human reasoning. That doesn't mean those methods are esoteric or outside of logic; it just takes a few layers of derivation and statistics to make something really hard for us to comprehend.
replies(1):