This is a bit of a controversial topic that I want your comments on. Personally, I think that America is, and always will be, the greatest country in the world. People come here seeking happiness and liberty. At the moment, though, I think the US has hit a dark moment. We are divided. People are burning flags, chanting death to police, and trying to impeach the president. I understand that you may not like POTUS, but that's not a reason to burn our flag.

On YouTube videos of US servicemen surprising their families, I see comments like "Americans coming back from killing innocent people in Afghanistan". Comments like that really piss me off, especially since my dad served in the military to defend freedom. So I just don't understand why there is so much anti-American sentiment at the moment. If you're not American, feel free to comment too, because it would be cool to get an outsider's perspective on this.

Also, I'm not sure whether people's opinions of the USA are based on meeting Americans rather than on the ideals of the US. I kind of understand why some people might think we're rude: on vacation in Europe, I could always tell which tourists were American because I could hear them, since we're so loud. I never really noticed it in the States, I guess because everyone is like that at home, whereas the locals in other countries are a lot more reserved and keep to themselves.
I don't understand the fascination with America as "The Land Of The Free". I've lived in Canada all my life and I have full access to healthcare, the size of my wallet does not decide whether I live or die, and I'm never restricted from pursuing whatever I want.