Nov 01, 2004

Are Americans Evil?

I adore the accent; it's cute, and so many of the Americans I've met are so nice. One of the reasons I always wanted to go to America was their generosity and hospitality, but that was the America I knew pre-9/11. Now I, and many of my colleagues, view America as the dominating empire on the planet, following in the footsteps of some of the great empires of history, which eventually collapsed, might I add. It's become a place of fear and hatred for anything foreign or alien, and much more… Why?

Is it the people, or is it the administrations? And why does the American media always feed its public fear, hatred and violence? I know it probably makes good news and keeps the ratings up, but is that really good journalism, or is there something else at work there? Hence Fox News.

I long for the America I knew pre-9/11, but she's changed. Usually change is for the better, but in this case I don't think so. Would you really want to lead a life and raise kids in an environment where guns are aplenty and people view you as an alien? I wouldn't. Then again, I only visited America twice before 9/11. This is just how I think many people view things from across the pond. Someone please prove me wrong…