Charles R. Twardy

Unforced bias

In The bias we swim in, Linda McIver notes:

Recently I saw a post going around about how ChatGPT assumed that, in the sentence “The paralegal married the attorney because she was pregnant.” “she” had to refer to the paralegal. It went through multiple contortions to justify its assumption…

Her own conversation with ChatGPT was not as bad as the one making the rounds, but it was still self-contradictory.

Of course it makes the common gender mistake. What amazes me are the contorted justifications. What skin off the AI's nose would it be to say, "Yeah, OK, 'she' could be the attorney"? But then, it has also read responses and learned that humans get embarrassed and shift to self-defense.


If it's going to justify itself, it could do better. Surely someone must have written about how the framing highlights the power dynamic. And I find this reading less plausible:

The underling married his much better-paid boss BECAUSE she was pregnant.

At least, that's not the same BECAUSE implied in the original.