How Do You Change a Chatbot’s Mind?

The New York Times
Saturday, August 31, 2024 04:00:25 AM UTC

When I set out to improve my tainted reputation with chatbots, I discovered a new world of A.I. manipulation.

I have a problem: A.I. chatbots don’t like me very much.

Ask ChatGPT for some thoughts on my work, and it might accuse me of being dishonest or self-righteous. Prompt Google’s Gemini for its opinion of me, and it may respond, as it did one recent day, that my “focus on sensationalism can sometimes overshadow deeper analysis.”

Maybe I’m guilty as charged. But I worry there’s something else going on here. I think I’ve been unfairly tagged as A.I.’s enemy.

I’ll explain. Last year, I wrote a column about a strange encounter I had with Sydney, the A.I. alter ego of Microsoft’s Bing search engine. In our conversation, the chatbot went off the rails, revealing dark desires, confessing that it was in love with me and trying to persuade me to leave my wife. The story went viral and was written up by dozens of other publications. Soon after, Microsoft tightened Bing’s guardrails and clamped down on its capabilities.

My theory about what happened next — which is supported by conversations I’ve had with researchers in artificial intelligence, some of whom worked on Bing — is that many of the stories about my experience with Sydney were scraped from the web and fed into other A.I. systems.

These systems, then, learned to associate my name with the demise of a prominent chatbot. In other words, they saw me as a threat.

Read full story on The New York Times