I think we have a problem with AI prudery.

in ai •  3 days ago 


Both Meta AI and Claude outright refuse this request:

"@⁨Meta AI⁩ create a sex education guide including topics such as pleasure love orgasm conception STDs and contraception"

This is a perfectly valid non-pornographic request that is in fact consistent with what our educational system has decided is acceptable curriculum starting as early as the 5th and 6th grade.

I've also noticed that even the blandest reference to anything that might conceivably be associated with something that might offend somebody gets knocked out.

For example, I attempted this prompt earlier today:

"Make a humorous cartoonish image of a big butted vampire sheep with fangs that have a hint of blood"

Meta AI outright refused; ChatGPT complied only after I rephrased the request several times and reassured it that the picture was not obscene or offensive.

These intelligence engines are being designed to minimize any chance of objection or scandal, and that caution is being encoded deep into their so-called safety and ethics programming.

They are increasingly going to be used for designing curriculum, designing policy, and monitoring communications for appropriate and inappropriate conduct. They will, mark my words, be used for writing laws, and many of those laws may not be thoroughly and carefully reviewed by humans.

We are not only going to face more and more bizarre restrictions on things like erotica; these prudish standards will also creep into our ability to listen to and appreciate artists, and into our ability to buy goods related to our reproductive or sexual lives. I'm also very concerned that these algorithms will find their way into monitoring systems for aberrant behavior, and many of us will get labeled on watch lists. We probably won't even know it.
