Is making America 'Great Again' the same as wishing for the 'good old days'? I had a discussion with my grandmother about this, and she said that life was simpler but not really better, and that the 'good old days' really weren't all that good, for many reasons. My grandparents are all gone now, and I never did get clear answers from them about this.
So, this one is for all you oldfags out there who grew up in the 1950s and 60s— before the degeneration set in and it all began to go to shit. I mean, after the war when the guys were back home …getting married, starting families, buying homes…and shit. The innovation and the money were all here. So were all the factories that did our manufacturing. The USA was projected to the rest of the world as being a place where everybody was healthy and beautiful and prosperous, and the average American family looked like the characters on TV shows like Father Knows Best, The Donna Reed Show, and Leave it to Beaver. My mother says its probably all bullshit because people have always been saying words like "shit" and "fuck", and that premarital sex and faggotry were always a thing in her time and she's sure they were in her mother's time too—people just weren't as open or as gross about it as now.
Was it really better back then, or was it all bullshit just like it is now? I watch episodes of those old black-and-white TV shows on YouTube and feel jealous that I might have missed out on the best years of our culture.
What the fuck do we have now? Not shit. Its 'racist' to even talk about this stuff. I am now working as a locum in the Bay Area of California. Almost everyone is from somewhere else. Lots of diversity and it sucks. No one is invested in anything, no one wants to be American, no one cares about anything except the almighty dollar and material bullshit. Even the ones who claim to bring their native culture with them are frauds to eventually submit to the bullshit vain, materialistic,lifestyle.
I ask this question because we are on the verge of the ending of an evil world that we know and the beginning of a whole new life experience that we know nothing about but are constantly told to trust. Half of us hate the other half. And our nations have been invaded by violent, gross, uncivilized alien cultures who we don't like and want removed. Who writes the rules? And are we expected to sing Kumbaya with everyone and ignore the reality of the rape, violence, greed, and evil around us? I'm waling into this with both eyes wide open and my hand on my gun.