Alucard
Neetdom>wageslavery
Joined: May 10, 2018
Posts: 172
What happened? Having a tan was associated with peasants who spent their days working in fields here in the UK. It was undesirable. Everyone who actually had money shielded themselves from the sun in summer with umbrellas. Why was there a sudden shift to everyone wanting a tan? Do the normies not realize that tanning severely ages your skin and makes you look like a leather bag by the time you're 40? At least the Japanese still have it right. They still shield themselves from the sun in summer to preserve their natural skin tone.

I've only gotten a tan a few times in my entire life, and each time I didn't like how I looked with it. It brings out my freckles, it makes my skin look dirty, and I end up looking like a Turk. Other white people have actually tried to make me feel ashamed of my pale skin by saying I look unhealthy. It's not unhealthy. It's my natural skin colour, and yours too, you buffoon. If anything is unhealthy, it's spending hours out in the sun soaking up cancer-causing UV rays.

My dad especially pisses me off with this. He loves spending whole days sunbathing in summer and getting really tanned. He thinks he looks good with it, but it actually makes him look like shit, and it's probably why he hasn't aged well over the years. I'm 23 now (turning 24 in July) and people tell me I still look 18. Oh, the wonders of staying indoors.