Been seeing a lot about how the government passes shitty laws, lots of mass shootings, and expensive asf health care. I come from a developing nation and we were always told how America is great and whatnot. Are all states in America bad?
As an American who left, it looks batshit insane to me. Everything is crazy expensive, and they're passing restrictive laws that, if passed anywhere in Asia or Africa, would be run as "look at this backwards, shitty country" news stories.
I've got a trans kid. We're not returning any time soon. It seems unsafe for them to exist in the US for the foreseeable future.
But I've got US friends who feel the opposite. We visited a friend on Bainbridge a few years ago who really couldn't comprehend why everyone wouldn't want to live on their island.
Asia (where I am now) isn't really any less safe. India felt less safe when I visited recently, but everyone I know there said it really isn't. It depends on the area, and a lot of the fear (everywhere) is just media depictions and racism telling your brain to panic.
The real advantage of the US is just cash. You can make a lot more money there. They're rich. Money is good; it makes life easier. But it's also expensive there, so to save at any income level, you have to be thrifty.