Tag Archives: America

Racism Isn’t Dead. You’re in Denial

If you are a person who believes that you live in a post-racial America, either you have really bought into the hype or you are simply in denial. I can remember in 2008, when President Barack Obama was elected, the media immediately started saying that we were over racism. Those statements were far from […]

Read More

America or Amerikkka: Which Country Do You Live In?

For as long as I can remember, my parents have always taught me that inequality exists within this beautiful country we live in. I have been taught that there is a certain protocol that we, as blacks, have to follow in order to survive in America. Through self-study, I have also learned the hidden and […]

Read More