Discussion about this post

Peter Stokes:

The specialised AIs (for finding cancers and the like) are a great step forward. The generalised AIs (like ChatGPT) worry me because they don’t seem to have boundaries and will cheerfully present fiction as fact. Herein lies the problem: an AI is still, at origin, a computer program, born of a computer programmer, who is human. The rules of engagement, flaws and biases of the supposedly indifferent AI are therefore not so indifferent.

There also seems to be a media bias towards describing any computer program these days as AI, blurring perception and understanding of what AI actually is.

The cherry on top is the ready acceptance by society and business (and government!!) of the computer as infallible. Computer says “no” and that’s that. The human dimension is vanishing from decision-making. So add a flawed program to unthinking acceptance of its output and we get ... ? (And in case a real-life example is needed, check out the UK’s ongoing scandal over the Post Office’s Horizon computer system.)

Nakedist:

Training AIs is ongoing, not a one-and-done affair. For example, it was recently discovered that illegal images of children were in a dataset used to train image-generation models (the LAION-5B dataset behind Stable Diffusion). The maintainers were able to search the data for the images, remove them, and make the cleaned data available for retraining. In a neural network, the training data and the trained network are separate artifacts, so it's never too late to "fix" an AI by cleaning the data and retraining, but there have to be 1) a will to make changes and 2) agreement on what the changes should be. When it comes to nudity, I'm not sure we have either, as some people DO believe that nudity equals pornography, and they have no will to change a system that perpetuates that viewpoint.
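
To make the data/model separation concrete, here is a minimal sketch using a toy NumPy logistic-regression model. This is purely illustrative (no real system trains this way at this scale, and the "flagged" marker is an assumed stand-in for an audit): the dataset and the learned weights are separate objects, so flagged records can be dropped and the weights relearned from scratch.

```python
# Toy sketch: trained weights are derived from the dataset but stored
# separately from it, so removing flagged records and retraining yields
# a model untouched by them. Illustrative only, not a real pipeline.
import numpy as np

rng = np.random.default_rng(0)

def train(X, y, lr=0.1, epochs=200):
    """Fit logistic-regression weights from scratch on the given data."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)   # gradient step
    return w

# Toy dataset plus a per-record "flagged" marker standing in for content
# that a later audit finds should never have been used.
X = rng.normal(size=(500, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
flagged = rng.random(500) < 0.02           # ~2% of records fail the audit

w_original = train(X, y)                   # model trained on everything

# "Fixing" the model: drop the flagged records and retrain. The old
# weights are simply discarded; nothing from the flagged data persists.
keep = ~flagged
w_retrained = train(X[keep], y[keep])

print("weights before:", w_original)
print("weights after :", w_retrained)
```

The catch, of course, is cost: for a toy model retraining is free, while for a large model it is expensive, which is exactly why the will to make changes matters.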
