shakahislop
Well-known member
"Where are you talking about? It's not obvious to me"

I guess it depends on one's definition of colony or colonialism, but if a country's major industry, source of income, and employer is a foreign firm, whose position also enables it to move political and social markets, then not much is missing other than the political branding, which would ironically make the relationship less likely to be exploitative by making it more obvious. There are Western companies whose exit from African countries would collapse those countries, and that gives them a lot of clout.