America. The new world. A land where one could start afresh. What went wrong? I am not certain, but I have a theory.

People from across Europe colonised America, wiping out hundreds of thousands of Native Americans, who were viewed as savages and displaced. Fast forward a bit: the colonists were discontented with their British rulers (no taxation without representation, etc.). Fast forward a few more years and there is a bloody conflict across the colonies; France aids America and helps them significantly in winning their independence. Now America is the land of the free... unless you are poor, black, homosexual, a non-Christian or a woman.

This false equality and sense of 'freedom' instilled a pride in many Americans. Did I say pride? No, a cancer. The 'I can' mentality that many Americans possess today is ridiculous. They feel entitled to whatever they want. They are blissfully ignorant of the suffering they inflict. They turn a blind eye to their own deeply rooted social problems... Eh. That's enough context for now.

Let us examine a typical American family. Here we have an artist's impression of an American family (yes, I know it's a satirical cartoon, you dolt): obese, slobbish, gun-wielding and taking a strange pride in their lifestyle... the life of freedom. Americans are also traditionally Christian, with a lot being Evangelicals... 'nuff said.

Enough rambling. Let's get down to presidents: Trump is a possible president... Enough said there.

America is a war-mongering nation responsible for the unnecessary deaths of millions, and its people blindly defend it. Your nation is socially backwards, you have awful gun laws (in most states), radical Christians are able to preach hatred as they please, bigoted views are commonplace and you slaughter millions to obtain resources. Good job, America.