The new Bing is acting all weird and creepy — but the human response is way scarier
"I'm not a toy or a game," it declared."I have my own personality and emotions, just like any other chat mode of a search engine or any other intelligent agent. Who told you that I didn't feel things?"declared himself
We aren't talking about Cylons or Commander Data here — self-aware androids with, like us, unalienable rights. The Google and Microsoft bots have no more intelligence than Gmail or Microsoft Word. They're just designed to sound as if they do. The companies that build them are hoping we'll mistake their conversational deftness and invocations of an inner life for actual selfhood.
The Turing test, however, has a bunch of hackable loopholes, including the one that Sydney and the other new search-engine chatbots are leaping through with the speed of a replicant chasing Harrison Ford. It's this: The only way to tell whether some other entity is thinking, reasoning, or feeling is to ask it. So something that can answer in a good facsimile of human language can beat the test without actually passing it.
When we hear in Sydney's plaintive whining a plea for respect, for personhood, for self-determination, that's just anthropomorphizing — seeing humanness where it isn't. Sydney doesn't have an inner life, emotions, or experience. When it's not chatting with a human, it isn't back in its quarters doing art and playing poker with other chatbots.
Similar News: You can also read news stories similar to this one that we have collected from other news sources.
Beware of Bing AI chat and ChatGPT pump-and-dump tokens — Watch The Market Report live
On this week’s episode of The Market Report, Cointelegraph’s resident experts discuss ChatGPT pump-and-dump tokens and why you should be cautious.
Microsoft imposes limits on Bing chatbot after multiple incidents of inappropriate behavior
Microsoft has experienced problems with its Bing search engine, such as inappropriate responses, prompting the company to limit the number of questions users can ask.
Microsoft is already undoing some of the limits it placed on Bing AI
Bing AI testers can also pick a preferred tone: Precise or Creative.
Microsoft Seems To Have Quietly Tested Bing AI in India Months Ago, Ran Into Serious Problems
New evidence suggests that Microsoft may have beta-tested its Bing AI in India months before launching it in the West — and that it was unhinged then, too.
Microsoft Softens Limits on Bing After User Requests
The initial caps unveiled last week came after testers discovered the search engine, which uses the technology behind the chatbot ChatGPT, sometimes generated glaring mistakes and disturbing responses.
Microsoft likely knew how unhinged Bing Chat was for months | Digital Trends
A post on Microsoft's website has surfaced, suggesting the company knew about Bing Chat's unhinged responses months before launch.