One that comes to mind for me: “Whatever doesn’t kill you makes you stronger” is not always true. Maybe even only half the time! Are there any phrases you tend to hear and shake your head at?
I got told “life isn’t fair” so many times growing up that I came up with a default comeback: “Doesn’t mean you have to be.”
A version of it has grown to become my tenet in life: “The universe doesn’t care, so we have to.”
It’s not as pithy, but I think “Just because you didn’t get your way, doesn’t mean it’s unfair” would be a better sentiment for adults to tell children.
Or “I don’t fucking care what happened, I just don’t want to hear you whine about it.” Hardly an acceptable way to talk to children, but I think it’s what the adults in my life meant when I was a child.
When someone who’s trying to exploit me says that, I just beat the hell out of them to remind them how right they are, and that their means of dominance isn’t the only one. The real world strikes again! This time it’s the reason we have manners.