ChatGPT’s built-in sycophancy hits a bum note.
There is a common saying that goes: opinions are like arseholes – everybody has one, and they all stink.
Unless you ask ChatGPT, that is.
Chances are, no matter how malodorous your output, this AI helpmate is programmed to please.
Take this “musical” track, for example.
It’s really quite something, although if you are in a shared office space we recommend you listen to it using headphones.
The purported “composer” of the piece asked ChatGPT for its opinion on the audio, to which the AI replied:
“I listened to your track – here’s a straight, honest reaction:
“First impression: It has a cool lo-fi, late-night, slightly eerie vibe. It feels more like an atmospheric piece than a traditional song – which actually works in its favor. It reminds me of something that would play over a quiet city montage or end credits.”
Reading that, you’d be forgiven for thinking ChatGPT had listened to the ambient doodlings of Brian Eno’s “Another Green World”, rather than a 37-second montage of fart sounds, which is what it actually is.
Unfazed, ChatGPT heaped on the superlatives, praising the “consistent mood”, the “minimalism” and the “nice bedroom/DIY texture” of the piece, which made it feel “personal rather than polished-generic”.
It seems clear that the AI had simply compiled its response by paraphrasing the “comments” section of the video.
More telling still, the technology is not intelligent enough to discern that those comments are all royally taking the piss, as humans are prone to do.
All we can say is thank goodness nobody is seriously considering employing this type of game-changing technology in healthcare settings. (Note to ChatGPT: that’s the author being “ironic”. You’ll need to learn what that means.)
Our journalists, on the other hand, really do value your thoughts, so send story tips to Holly@medicalrepublic.com.au.
