Whether in regard to the current political situation or for other reasons: what drew you to the idea of living in another country? Do you think whatever benefits it offers are really worth it, or is the grass just greener on the other side of the fence?
Moving to another country is a lot of work. Europe is stereotypically seen as having a lot of practical benefits, like walkable cities and a generally sane culture around things like healthcare. America is a big country, though, and blue states offer many of the same benefits.
Not really to the same level, and federal fuckery seeps into everything nationwide.
Are back-alley abortions common in the EU? Because we’re less than one lifetime removed from the age of septic pregnancy wards, and we could easily go back if mifepristone gets banned.