I honestly do not know a good way to monitor these platforms for CSAM. Obviously, they need to be monitored, but the choices, as far as I can tell, are either to pay people to look at traumatic material all day or to use AI to do it. The former sounds like either torture or titillation depending on who you are, neither of which is a good thing, and I'm not convinced the latter will work.