It’s the wrong algorithm!
I used to be on Facebook to stay in touch with my kids and grandkids, but even they are no longer on that platform.
Facebook has quite perverse and inconsistent rules about what is and is not acceptable. It allows the distribution of false information on all manner of topics, but show a female nipple and your account gets suspended. For a naturist like me, Facebook is not a particularly welcoming place, as any naturist content is deemed inappropriate.
For me, the final straw with Facebook was the invasive advertising. Now it may just have been a coincidence, but my partner and I were in a furniture store talking about coffee tables, and I described to her one that I had seen which was shaped like a semi-submerged hippopotamus, with the water represented by a sheet of glass. The bulk of the hippo was below the glass, with its snout, ears and butt partially above it. The following day, I was presented with an ad for that very coffee table, despite having only spoken about it with my partner. I accept it may have been pure coincidence, but it was an unusual item, and the chances of an ad for it randomly appearing the day after we discussed it seem phenomenally slim at best.
My kids and grandkids use other platforms like WhatsApp and Instagram to communicate, and so I use these to keep in touch with them.
Instagram is really just an extension of Facebook, and it appears to apply a similarly peculiar approach to censorship.
Twitter doesn’t seem to filter content, except for excluding nudity from profile and header images, although many accounts appear to flout this rule with no immediate consequence. Occasionally the platform has a purge of accounts it considers in breach of its guidelines, but enforcement of its own rules is intermittent, as if it is reacting to individual complaints rather than applying its rules across the board. Twitter can be toxic and overwhelming for newcomers. While naturists have found a home there, we spend a lot of time trying to separate the overly sexualised content from genuine naturist content. For some of us, our block list is longer than our list of followers or those we are following.
There are other social media platforms that do allow for more freedom of expression, but these can be a minefield to navigate and often expose you to more extreme content.
Another strategy available to naturists is the use of dedicated sites that use either a paywall or an invitation-only membership structure to filter out the content that many find inconsistent with their idea of the naturist philosophy. The risk is that the paywall or the invitation model acts as a barrier to access, limiting the success of these sites.
Social media organisations can be separated into two categories: those that apply some kind of censorship, supposedly based on community standards, and those that are free from any censorship.
For sites that choose censorship, it would be infeasible to have real people wade through the enormous volume of content that is continuously posted and updated. Automating the process seems to be the only practical way to manage the task.
The problem to date is that the algorithms used to censor content struggle to identify non-sexual nudity or naked art, lumping both into the same category as pornography. Artificial Intelligence (A.I.) has a long way to go, and its judgement leaves a lot to be desired. Throw in some complicated social rules, such as male nipples being fine while female nipples are not, and it is no wonder that computers struggle to make consistent censorship rulings.
The big issue with computers trying to analyse an image is one of context. Humans can easily tell the difference between an image of a naked person lying on a bed reading a book, and one of a naked person lying on a bed reading a book and pleasuring themselves. My understanding was that a computer would struggle with this distinction: computers can be taught to identify body parts, but as yet they seem unable to identify sexual intent.
In a recent discussion online, someone assured me that A.I. is capable of making the distinction, and I have no reason to doubt their assertion. But if A.I. can distinguish simple non-sexual nudity from pornography, why does it consistently demonstrate that it doesn’t?
Is this because the people programming the A.I. don’t understand the difference between nudity and sex?
There was a revealing 2020 documentary called Coded Bias about the development of A.I. in facial recognition. MIT Media Lab researcher Joy Buolamwini discovered that facial recognition does not see dark-skinned faces accurately, so she embarked on a journey to discover why. Her investigation revealed that most of the images used to train the systems were of white male faces. The outcome was that the facial recognition systems used by law enforcement in the U.S. were very poor at correctly identifying women or dark-skinned faces.
The machine-learning algorithms were as biased as the people programming them.
Perhaps the people programming the A.I. to detect inappropriate content simply don’t understand the difference between a naked body and sexualised content. If the coding is biased, then the software will be unable to make valid judgements. And if the software is self-learning, then surely an error as significant as this, made so early on, means that A.I. will never learn to distinguish the nuances between simple nudity and sexualised nudity.
“Forget artificial intelligence – in the brave new world of big data, it’s artificial idiocy we should be looking out for.”
Thank you for reading, have a comfortable day.