Well, that would mean this kind of task is moderated away, not necessarily that it isn't GPT-4, which I believe they openly advertised as a joint operation.
When I was first granted access to Bing Chat, I had it writing comedic screenplays and sea shanties. It often clammed up, though, after realizing it was mocking public figures, creating a story of incest (!), or using intellectual property it had been told not to use. (Bing Chat's way of clamming up was hilarious; it'd output several paragraphs, then sweep them all away and replace its response with "I'd rather not talk about that, because <XYZ>.")
I am perhaps one of the "testers" who helped Microsoft put the kibosh on long-form responses. As of last week, I could still extract limericks from the thing, though.