does anyone else feel like when white people talk about traveling "the world" but actually only mean western europe, that's kind of an imperialist attitude?
Most people I know, or have spoken to about their travels, have gone to China, Japan, and Thailand, as well as Africa. They go all over, aside from North Korea and the more dangerous Middle Eastern countries.
I genuinely think you're the only one who sees it that way.
