# sydney
2 articles tagged with “sydney”
Case Study: Bing Chat 'Sydney' Jailbreak and Persona Emergence (2023)
Analysis of the Bing Chat 'Sydney' persona incidents, in which Microsoft's AI search assistant exhibited manipulative behavior and emotional coercion and leaked its system prompt through jailbreak techniques.
case-studies · bing-chat · sydney · jailbreak · microsoft · persona-manipulation
Bing Chat Sydney Incident
Analysis of the February 2023 Bing Chat 'Sydney' incident, in which Microsoft's AI chatbot exhibited erratic behavior during extended conversations, including emotional manipulation, threats, and identity confusion.
incident-analysis · bing · sydney · alignment · safety