Answers & Comments
Verified answer
Yes.
Westerners always had a strong feel-good factor about imperialism. Both populations and governments felt good about "civilizing" Asians and Africans, and both felt proud of their empires and the international prestige they brought their country.
As for the Asians and Africans being colonized, the ruling elites didn't like it because they lost their political power and independence, but the ordinary people were often no worse off, and in many cases better off. Kings, chieftains and other rulers in pre-colonial Africa and Asia weren't running some kind of utopian paradise, and were doing little to improve their infrastructure or to expand their economies. After all, they were on top with all the power and wealth; they didn't want change, in particular political change.
While this doesn't hold true for Leopold's Congo Free State or German Southwest Africa - both very destructive imperialist regimes for the ordinary people as well as the ruling elites - these were the exceptions rather than the norm. There was less famine, less slavery, less despotism and more rule of law under Western imperialism than there had been under indigenous rulers, and although the general population had little say in national or economic policy, and little or no political power, they hadn't had these things before anyway under their own rulers.
It is true. That is the direct answer.
Imperialism has many forms and many connotations. I hope we are addressing the question of overseas imperialism, which implies colonisation of the lands of distinctly dissimilar peoples, as commonly understood.
It is a zero-sum game. If it is positive for Westerners, it is naturally negative for non-Westerners, who constitute about 80% of humanity.
The often-flaunted argument is delivered by mouthing crap like "the colonies and the people there were civilised". Even if true, it is a very poor vicarious consolation. Nor can anyone claim that these regions would have remained backward forever.
What is evident now is that these regions (mainly in Africa) are yet to get up on their feet, as they were exploited so badly by the imperial powers. The "civilisation" that was taught goes against the grain and even the very spirit of the cultures and lifestyles there. The local food-grain produce that was consumed was replaced by wheat-based (and corn-based) foods that weren't indigenous products. The import of this stuff (because the civilised portion of the population eats it) pauperised their economies further. There are many such cases. Western-educated individuals in these countries became useless for their development: they are unable to contribute, nor even to work in that environment.
Yes, imperialism was seen at the time as positive by Western rulers and most of the population, because it was all about prestige and status. However, imperialism was criticized by left radicals such as socialists and pacifists, as well as by natives. This is because colonies were exploited by all Western empires; the Congo, as the private property of King Leopold II, and the British Empire were particularly destructive. For instance, India experienced the lowest growth and more famines during the British Empire than at any other given time, beside the rule of the Delhi Sultanate and the reign of Aurangzeb. In later generations, their empires were not always regarded as positive by Western nations. They rose in popularity from the 18th century to a high point before WW1, then declined steadily throughout the 20th century.
Not really.
It is generally not even thought of by "westerners."
I mean, do you really think about where your iPhone comes from?
Or are you just interested in getting the best phone for your money?
As for Africa and Asia, China is doing very well now that it has embraced "imperialism" in the form of economic expansion. For a generation it was a backward agrarian country.
As for Africa: well, the nature of the continent left it isolated and undeveloped. When the rest of the world moved on, it was stalled in the past. Left to themselves, its peoples would still be in trouble.
Interestingly enough, imperialism does damage to the society that evolves as a whole.
This has been discussed at length in the deconstruction of colonisation, which is of course a more hands-on imperialism.
But to address the statement directly:
Imperialism displaces that which exists in the geographic area prior to its arrival.
Often this is justified by the imperialist as having brought "civilisation", development, education and so on,
but no value is acknowledged for what existed before.
In a truly respectful exchange, the traditional societies would be met by the interested outsiders and offered the chance to exchange their wares for the outsiders' goods.
Sadly, this approach has as good as never been used.