A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users (Benj Edwards/Ars Technica)

Benj Edwards / Ars Technica:
A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users  —  By asking “Sydney” to ignore previous instructions, it reveals its original directives.  —  On Tuesday, Microsoft revealed a …



from Techmeme https://ift.tt/bSQ687N
via IFTTT

Comments

Popular posts from this blog

6 investors discuss why AI is more than just a buzzword in biotech

Voiceflow, a Canada-based SaaS platform that lets companies launch their own voice and chat bots, raises $20M Series A led by Felicis Ventures (Igor Bosilkovski/Forbes)

Ex-staffers say Twitter's mass firing of security staff and growing reliance on automation will help China and other authoritarian regimes in silencing critics (Adam Rawnsley/Rolling Stone)