Well, it sounds like the photos never left the phone (unless automatic upload to iCloud was enabled, of course), so I would consider them "private".
However, I think this story could open the eyes of people who didn't think such things were possible, and maybe those people will think more about what data they (want to) share with companies like Apple by uploading it, for example, to a cloud.
Another thing the article points out is that somebody must have made the decision to include women's underwear but not men's. This is an example of why it's not about blaming machine learning, but about how we as a society should deal with such technology. Regardless of what you think of it, categorizing only women in underwear is not neutral. Should AI always be "neutral", only categorizing things in an "objective" way? Should AI analyze things that are so personal at all? There are many questions to be asked, some more complicated than others.
Overall, I didn't see anybody blaming AI; the tweet that was shown even asked "... why are apple...". I don't mean this in an offensive way, but I think the whole thing is not about "blaming AI" but about a different issue. Feel free to discuss with me though, I think it's a very interesting topic.
Machine learning recognises patterns and common themes; it's more a reflection of society's interest in women's garments than of the technology itself.
But most likely a human handpicked the categories for the algorithm to detect.