Do anons believe this country is ever going to be able to heal itself? Will we ever go back to the days when Americans loved each other? Without pointing fingers, I'm just tired of all the hate and disdain on both sides.
Can't we all just get along?