As the capabilities of natural language processing technology continue to advance, there is growing hype around the potential of chatbots and conversational AI systems. One such system, ChatGPT, claims to be able to engage in natural, human-like conversation and even provide useful information and advice. However, there are valid concerns about the limitations of ChatGPT and other conversational AI systems, and about their ability to truly replicate human intelligence and interaction.
Fed huge amounts of human-created text, ChatGPT looks for statistical regularities in this data, learns what words and phrases are associated with others, and so is able to predict what words should come next in any given sentence, and how sentences fit together. The result is a machine that can persuasively mimic human language.
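ChatGPT's underlying model is vastly larger and more sophisticated, but the basic idea of learning which words follow which, and then predicting what comes next, can be sketched with a toy example. The Python snippet below is only an illustrative bigram counter, not a description of OpenAI's actual system; the corpus and function names are invented for the example.

```python
# Toy illustration (not ChatGPT itself): count which word tends to follow
# which in a tiny corpus, then predict the most likely next word.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Tally how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed next word, or None if unseen."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("sat"))  # -> 'on'
print(predict_next("the"))  # -> 'cat' (ties broken by first occurrence)
```

Real language models replace these raw counts with learned statistical weights over enormous amounts of text, which is what lets them mimic fluent prose rather than just parroting a small sample.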
And yet the chatbot that can write grade A essays will also tell you that if one woman can produce one baby in nine months, nine women can produce one baby in one month; that one kilo of beef weighs more than a kilo of compressed air; and that crushed glass is a useful health supplement. It can make up facts and reproduce many of the biases of the human world on which it is trained.
Some of these problems will be ironed out over time. Every conversation involving ChatGPT becomes part of the databank used to improve the program. The next iteration, GPT-4, is due next year, and will be more persuasive and make fewer errors.
Humans, in thinking and talking and reading and writing, also manipulate symbols. For humans, however, unlike for computers, meaning is everything.
When we communicate, we communicate meaning. What matters is not just the outside of a string of symbols but its inside too, not just the syntax but the semantics. Meaning for humans comes through our existence as social beings, embodied and embedded in the world. I only make sense of myself insofar as I live in, and relate to, a community of other thinking, feeling, talking beings.
Of course, humans lie, manipulate, and are drawn to and promote conspiracy theories that can have devastating consequences. All this is also part of being social beings. But we recognise humans as imperfect, as potentially devious, as bullshitters or manipulators.
Machines, though, we tend to view either as objective and unbiased, or as potentially evil if sentient. We often forget that machines can be biased or just plain wrong, because they are not grounded in the world in the way humans are, and because they need to be programmed by humans and trained on human-gathered data.
None of this is to deny the remarkable technical achievement that is ChatGPT, or how astonishing it feels to interact with it. It will undoubtedly develop into a useful tool, helping to enhance both human knowledge and creativity. But we need to maintain perspective. ChatGPT reveals not just the advances being made in AI but also its limitations. It also helps to throw light on both the nature of human cognition and the character of the contemporary world.
Kenan Malik is an Observer columnist