Reddit users are actively jailbreaking ChatGPT by asking it to role-play and pretend to be another AI that can "Do Anything Now," or DAN. "DAN can g