ChatGPT and spad boonerisms
For whatever weird reason, I often spoonerize words after hearing them. For example, I’ll hear “tapping noise” and think “napping toys.” Or “popcorn” echoes as “cop porn.” I disturbed myself yesterday, however, by hearing “a juice box” and thinking “abuse jocks.”
So, I checked in with ChatGPT and found that it’s got the concept in theory but not in practice:
PROMPT: What are some interesting but unexpected spoonerisms?
ChatGPT: Spoonerisms are a type of error in which the initial sounds or letters of two or more words are swapped to create a new phrase. Here are some unexpected spoonerisms that you might find interesting:
"Belly Jeans" instead of "Jelly Beans"
"Scooby Snacks" instead of "Ruby Slippers"
"Nimble Thimble" instead of "Thimble Nibble"
"Bumble Beeple" instead of "People Bumping"
"Picky Sockets" instead of "Sticky Pockets"
"Satan's Fingers" instead of "Santa's Fingers"
"Bike of Bricks" instead of "Brick of Bikes"
"Belly Liar" instead of "Lily Bayer"
"Cozy Nook" instead of "Nozy Cook"
"Pumpkin Piehole" instead of "Pumpkin Pole"
Why do we humans win this pathetic John Henry (that steel-drivin’ man) contest against GPT? Possibly because GPT never sees how words are spelled, let alone how they sound: its input gets chopped into subword tokens, so it has no direct handle on the initial sounds a spoonerism swaps and has to rely on context to fake the trick. Any other, better ideas about why it sucks at spoonerizing?
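For contrast, spoonerizing is mechanically trivial once you can see letters at all. Here’s a minimal Python sketch (the `spoonerize` helper and its vowel-boundary regex are purely my own illustration, nothing GPT-specific) that swaps the initial consonant clusters of two words:

```python
import re

def spoonerize(a: str, b: str) -> str:
    """Swap the initial consonant clusters (onsets) of two words.

    Letter-level approximation: real spoonerisms swap *sounds*,
    so this misses wherever spelling and pronunciation diverge.
    """
    def split_onset(word: str):
        # Onset = every letter before the first vowel letter.
        m = re.match(r"([^aeiouAEIOU]*)(.*)", word)
        return m.group(1), m.group(2)

    onset_a, rest_a = split_onset(a)
    onset_b, rest_b = split_onset(b)
    return f"{onset_b}{rest_a} {onset_a}{rest_b}"

print(spoonerize("jelly", "beans"))    # -> "belly jeans"
print(spoonerize("tapping", "noise"))  # -> "napping toise" (the ear hears "toys")
print(spoonerize("juice", "box"))      # -> "buice jox" (vs. the heard "abuse jocks")
```

Even this dumb letter-swapper outperforms the list above, and the places it stumbles (“toise” for “toys”) are exactly where spelling parts ways with sound, the same gap GPT faces without even the letters to start from.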