What Happened to “the West”?

Sebastian Ujică

It has become commonplace to speak of living in a “post-Western world.” Commentators typically invoke the phrase to herald the emergence of non-Western powers—most obviously China, but also Brazil, India, Indonesia, Turkey, and the Gulf states, among others. But alongside the “rise of the rest,” something equally profound is occurring: the demise of “the West” itself as a coherent and meaningful geopolitical entity. The West, understood as a unified political, economic, and security community, is coming apart.
