If a system is poorly understood, then by definition it cannot be factored into our predictions. When we say something is “unlikely”, we mean “it is unlikely given what we understand”. I don’t think it’s very useful to ask, “Well, is it unlikely given what we don’t understand?”, because that’s not a question that can be answered.