Housing · May 23, 2018
Newgoodboi

Does anyone feel working in the US isn't worth it anymore?

I just feel that the only reason people come here is because of the high paying jobs. But everything else is ridiculous. I'm not comfortable with the thought of sending my future kids to school here. School shootings aren't a shocker anymore, and gun violence in general is ridiculous here. Healthcare is a joke. Can't even say we're safe politically, because our ridiculous politicians are making us both a target and a laughing stock.

I grew up here, I'm a citizen, I used to love this country. But after living 10 years abroad and viewing it from the outside, and then coming back and viewing everything from the inside, I can't help but ask what the fuck went wrong here. The bigger question I ask myself: do I want to bring my fiance here and raise my future kids here, when I feel threatened more than I feel safe? Is it worth it? Are these high paying salaries worth it? I don't know anymore. I don't know if others feel the same way, but I'm sick and tired of this country.
