Where Did the Liberals Go?
Liberals in the 1960s would say, "Judge people by the content of their character, not the color of their skin; race doesn't matter." So-called liberals in 2017 now say, "All white people are inherently racist, and if you're a minority you're a victim; your race means everything." Oh, how far they have fallen since the days of Martin Luther King Jr. What happened to judging individuals on their merits? How have people come to believe in this tribalism? They say they are fighting racism, but are they not the racists themselves? They judge people only by the group those people belong to. This way of thinking only breeds hate in the world.
The left has destroyed liberalism. They no longer say, "I disagree with what you have to say, but I will defend your right to say it." Now they must shut down speech they do not like: for instance, the riots at UC Berkeley when conservative writer Ann Coulter was invited to speak, or the violence directed at supporters of President Trump. I believe it comes from the colleges, where students are taught that the United States is a shameful place built on hate and oppression, when the opposite is true. America has had its faults over the past 230 years, but in every instance it has worked to correct them. We ended slavery, and women have been elevated to equals for the first time in the history of civilization. We have stopped terrible tragedies and built a country that is unmatched in its freedom and prosperity.
While there are many more problems that need to be addressed, the trend in America has always stayed the same: the trend toward more freedom for the individual. The left threatens this with violence and oppressive behavior toward those they disagree with. The goal of America has been, and should always be, the promotion of individual liberty. We must not let the left destroy the very foundations on which our society is built.