It’s not only that it discriminated against certain groups, but also that it has, in itself, an error rate high enough to make it unusable for any decision making. MAYBE to select people for screening, but that would push us further down into a dystopian future.
At their current performance, none of these AIs should be involved in anything this critical.