Let's be honest here: the only time Americans mention soccer is when the World Cup comes around. Why? Because Americans just want to be superior in everything. Sports, wars, you name it, America wants to dominate it. So do you agree or nah? Tell me your thoughts.
Soccer isn't a huge sport in the U.S. I think it's cool that people are pumped for the World Cup though. Don't necessarily see a problem with it.
I see it as our country just getting more involved in soccer. We were never really crazy about it before, but more people are growing up around it and it's spreading. I dunno though, that's just my opinion. Some people could also just be excited because they like to support their country. But I have no idea. I just live in an unknown suburb where there actually is quite a bit of soccer, as well as a bunch of other sports.
I hate it when people who normally don't give a crap about football suddenly turn into Gary Lineker when the World Cup starts...
I personally enjoy the sport; I just hate watching it on TV. I prefer to play it (even though I suck) or go to the actual event and watch. Watching sports on TV bores me, I get too distracted, and I always feel like I need new glasses, which may or may not be true. Although I just got new glasses, so I think I just somehow fail at watching TV.