Okay, and then what? There are already huge issues with getting home insurance in places like Florida. What will they do, force companies to offer it against their will and watch them go broke? Convince people that insurance is woke? These are real problems that aren't going away, regardless of their beliefs.