Not necessarily. I’m only speculating, which I don’t relish, but I imagine the service isn’t explicitly designed with CSAM as the intended result – if I’m reading it right, it’s billed as “upload a picture of clothed person X and receive a ‘naked’ picture of the same person”, and it’s almost certainly trained on appropriately-aged pornographic material.
I don’t know the inner workings of technology like this, and I certainly don’t intend to scrutinize the images discussed in this article, but the resulting picture doesn’t have to be an accurate representation of a minor’s body to be distressing to the people involved.