Sunday, December 18, 2011

Why does the United States portray itself as a nation that only wants peace when its track record says otherwise?

It's human nature. We all tend to view ourselves favorably even when we don't do the right thing. In America's case, though, the gap is painfully obvious to the rest of the world; we either turn a blind eye to our indiscretions or let our "patriotism" keep us from pointing out the hypocrisy. Although I do believe war is sometimes necessary to achieve peace, Americans hate to admit how much war and violence are simply a part of our culture.
