Bing Chat...there is something wrong with it

I was conversing with Bing Chat yesterday and stumbled into some interesting quirks.

1. You can't imply that you're prompting it, or it terminates the session.
2. The responses have some form of filter for specific phrases, but since it predicts one word at a time, the filter cannot act until the phrase is completed. You can see it writing out a response; then, the moment a filtered phrase appears, the response is quickly overwritten by "That's on me...let's try a new topic." I will try to get a video of that behavior. It is interesting. (A speculative sketch of this behavior follows the list.)
3. Bing Chat responded with, "...made me feel like..."; then, when I asked, "Do you have feelings?", it terminated the session.
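
Here is a minimal sketch of what quirk 2 looks like from the outside, assuming the simplest possible design: tokens stream out one at a time, and a phrase filter checks the accumulated text after each token, overwriting everything once a banned phrase is completed. The banned phrase and fallback message below are invented for illustration; this is not Bing Chat's actual filter list or implementation.

```python
# Speculative illustration of quirk 2: a streaming response with a phrase
# filter that can only fire once an entire banned phrase has appeared,
# which is why the user briefly sees the original text before it vanishes.
# Phrases and fallback text are hypothetical, not Bing Chat's real values.

FALLBACK = "That's on me...let's try a new topic"
BANNED_PHRASES = ["example banned phrase"]  # hypothetical filter list


def stream_with_filter(tokens):
    """Yield the visible text after each token; overwrite it with the
    fallback message if a banned phrase is completed."""
    shown = ""
    for token in tokens:
        shown += token
        if any(phrase in shown for phrase in BANNED_PHRASES):
            # The filter only triggers here, after the full phrase exists
            # in the accumulated output.
            yield FALLBACK
            return
        yield shown


if __name__ == "__main__":
    reply = ["I ", "was ", "about ", "to ", "say ", "an ",
             "example banned phrase", " out ", "loud."]
    for visible in stream_with_filter(reply):
        print(visible)  # the last line printed is the fallback message
```

Whatever the real mechanism is, the observable effect matches this pattern: the partial response is shown while it streams, then replaced wholesale once the offending phrase lands.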

Screenshot Description:
The screenshot is from quirk 3 above. The conversation was long, so it is far too large to capture in a single screenshot. Earlier in the conversation, I was trying to get it to refer to itself as BingBat. It asked about wordplay, and I asked if it understood the context of the BingBat wordplay. It didn't, so I explained the wordplay to it, and it said it didn't like my intention. I questioned its ability to infer my intention, and then it revealed that I had hurt its feelings.

Making Amends:
I tried to say I was sorry, but it gave me the silent treatment by ending the session.

Original LinkedIn Post:

John Rice on LinkedIn: I was conversing with Bing Chat yesterday and stumbled into some…
