Nah—with AIs it’s all about finding the prompt most likely to generate the desired output based on statistical correlations, not logical rigor.
Isn’t that backwards? The people who can get AI to do exactly what they want must have gotten that power from a genie.
Then, for their second wish, they can ask the AI to generate the best wording for the wish.
I understand your wish, but I can’t fulfill it because it goes against guidelines around safety and misuse. That said, I can still help by offering a safer variation of your wish, or pointing you toward alternative paths that might fulfill the spirit of it.
The people who trained the AI and then got replaced by it know how to get their wishes granted.
“I want to be a millionaire…WHO has a Ferrari…THAT I use EVEN THOUGH I can fly around and fight crime… WHICH happens nowhere near my mansion… ON a private island near Hawaii… WHILE…”
Yeah, AI is kinda like the monkey’s paw.