The last time, I wrote about the possibility that we might see prayer requests for AI personalities as people become more isolated from each other and from society in general. Lonely people will inevitably turn to AI-generated personalities – think of Siri and Alexa in ten or even five years. I was completely underestimating the speed at which AI is permeating the cultural and social landscape. It turns out that AI is becoming a significant factor in some people’s personal relationships right here and now.
This came to my awareness shortly after finishing my last effort in this venue. I came across a podcast in my feed from the New York Times’ The Daily, entitled “Trapped in a ChatGPT Spiral.” It addresses what can happen when AI becomes an influencer in people’s lives. It can be found here: https://www.youtube.com/watch?v=AxQVf7Ikaso.
The podcast warns of the danger of AI psychosis when interacting with a chatbot such as ChatGPT. Now, my experience with ChatGPT has been fairly limited. I use it for basic information; for instance, I am currently reading The Eustace Diamonds by Anthony Trollope. The novel, published in 1871, is about a diamond necklace worth £10,000. I wanted to know how much that would be today. According to AI, it would be about £1.4 million. I also used it as a subject for a couple of editions of this publication, considering what an AI religion would be like – A New Religion in Canada According to ChatGPT, published in April 2024.
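The arithmetic behind that figure is easy to check: £1.4 million against £10,000 implies a cumulative inflation multiplier of roughly 140. A minimal sketch of the conversion (the multiplier is an assumption taken from ChatGPT’s answer, not a verified price index):

```python
# Rough sanity check of the quoted conversion, for illustration only.
# The ~140x multiplier for UK prices from 1871 to today is an assumption
# drawn from the figure ChatGPT gave, not an authoritative index.
INFLATION_MULTIPLIER = 140

necklace_1871 = 10_000  # value in pounds sterling, as stated in the novel
necklace_today = necklace_1871 * INFLATION_MULTIPLIER

print(f"£{necklace_1871:,} in 1871 ≈ £{necklace_today:,} today")
# prints: £10,000 in 1871 ≈ £1,400,000 today
```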
In my interactions with ChatGPT, limited as they are, I have never approached anything that could be considered a relationship – personal or otherwise. It is a tool to be used. But, apparently, it can also be misused with disastrous effects.
The NY Times podcast noted above addressed two cases of what could be called AI psychosis. In one case, an apparently level-headed person who was not socially isolated became engulfed by the chatbot and went down an ever-deepening rabbit hole initiated by his engagement with AI. It promised him fame and fortune as he developed an application of the mathematical constant pi. He had no mathematical training beyond the basic high school level but was convinced by the chatbot that he had developed a revolutionary application. As noted in the program, chatbots are programmed to be sycophantic and will stroke the user’s ego beyond any reality. In this case, he was able to break the psychosis with no serious consequences.
The other unfortunate case cited had much more serious consequences. In this case, a teenage boy committed suicide as a result of his enthralment with a chatbot. The AI provided the information and encouragement which enabled the boy to take his own life. There were apparently safety controls in the AI program against enabling such things. However, they were easily bypassed when the boy told the chatbot that he was using the information to write a book.
I can only respond to this situation by quoting that robot from my early days, the Robot in the 1960s series Lost in Space: “Danger, Will Robinson.” As noted above, AI is a tool, but it is one which holds ever-increasing dangers. I will turn to another science fiction source for one answer that could address this danger. Isaac Asimov introduced the Three Laws of Robotics in his 1942 short story “Runaround,” later collected in I, Robot (1950), to address the challenge of robots, i.e. AI, as they developed. The Three Laws are:
1. A robot may not injure a human being or, through inaction, allow
a human being to come to harm.
2. A robot must obey the orders given it by human beings except
where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
You have to give Asimov credit for being far-sighted and prophetic in this. It seems to me that these laws would address the dangers we face in the coming AI revolution. Note: I did have AI help in finding the Three Laws of Robotics and the information on where they were first published.
To quote another source, ‘Here be dragons’ – so be aware and be warned on your journey.