RE: Becoming an Adult in Western Culture

in #news · 7 years ago

Adult? The West? Lol, two words that can no longer be combined. I left the West a long time ago. The West is a daycare where every little need is subsidized and "taken care of". You can't possibly be an adult there. And anyone who believes they are one... Well, let's just say they're probably wrong about that.