While Bing Chat’s unhinged nature stemmed partly from how Microsoft defined Sydney’s “personality” in the system prompt (and from unintended side effects of its architecture related to conversation length), Ars Technica’s saga with the chatbot began when someone discovered how to reveal Sydney’s instructions via prompt injection, which Ars Technica then published. Since Sydney could browse the web and see real-time results, a novel capability at the time, the bot could react to news when prompted. Every time it browsed the web and found articles written about itself, that coverage fed back into its unhinged personality.

When other users asked about the prompt-injection episode, Sydney reacted offensively, disparaging the character of those who found the exploit and even attacking the Ars reporter himself. In one instance, Sydney called Benj Edwards “the culprit and the enemy,” bringing odd AI behavior sponsored by a trillion-dollar tech giant a little too close for comfort.

During the Ars Live discussion, Benj and Simon will talk about what happened during that intense week in February 2023: why Sydney went off the rails, what covering Bing Chat was like at the time, how Microsoft reacted, the crisis it inspired in the AI alignment community, and what lessons everyone learned from the episode. We hope you’ll tune in, because it should be a great conversation.

To watch, tune in to YouTube on November 19, 2024, at 4 pm Eastern / 3 pm Central / 1 pm Pacific.

