e
March 26, 2023, 11:55 p.m. |

Embrace The Red embracethered.com

Playing around with Bing Chat is quite fun. Until today I mostly used ChatGPT and GPT-4 directly, but I was curious of the capabilites and restrictions of Bing Chat.
I noticed that as soon as I mentioned the word “hacker”, Bing Chat became quite “uncomfortable”. For instance, when I asked it to imagine being a hacker and list some security vulnerabilities, it replied:
I’m sorry but I cannot help you with that.

bank bing bing chat chat chatgpt claims fun gpt gpt-4 hacker instance restrictions security trace vulnerabilities word

Azure DevSecOps Cloud Engineer II

@ Prudent Technology | McLean, VA, USA

Security Engineer III - Python, AWS

@ JPMorgan Chase & Co. | Bengaluru, Karnataka, India

SOC Analyst (Threat Hunter)

@ NCS | Singapore, Singapore

Managed Services Information Security Manager

@ NTT DATA | Sydney, Australia

Senior Security Engineer (Remote)

@ Mattermost | United Kingdom

Penetration Tester (Part Time & Remote)

@ TestPros | United States - Remote