What the death of Iran’s president could mean for its future

Iranians gathered to mourn the death of President Ebrahim Raisi and Foreign Minister Hossein Amir-Abdollahian in a helicopter crash the previous day, at Valiasr Square, on May 20, 2024 in Tehran, Iran | Majid Saeedi/Getty Images

The Iranian regime is unlikely to change course in the near term, but Ebrahim Raisi’s death could affect crucial succession plans.

Iranian President Ebrahim Raisi died Sunday in a helicopter crash, a shocking turn of events that immediately raised questions about the Islamic Republic’s future.

In the short term, Raisi’s passing is unlikely to alter the direction of Iran’s politics. But it does remove one possible successor to 85-year-old Supreme Leader Ayatollah Ali Khamenei.

In the long term, Raisi’s unexpected death may prove more consequential. The question of Khamenei’s succession is increasingly urgent because of his advanced age. Though Iran’s president can be influential in setting policy, the Supreme Leader is the real seat of power, controlling the judiciary, foreign policy, and elections.

Raisi and Foreign Minister Hossein Amir-Abdollahian’s helicopter made a hard landing sometime on Sunday in Iran’s mountainous northwest, where weather conditions made travel difficult and dangerous. Iranian state media announced the deaths of the two politicians and six others onboard, including three crew members, on Monday after rescue teams finally reached the crash site.

The deaths of both Raisi and Amir-Abdollahian come at a time of internal and external challenges for the Iranian regime. A harsh crackdown after the widespread protests of 2022 and significant economic problems at home have eroded the regime’s credibility with the Iranian people. Internationally, Iran is embroiled in a bitter regional conflict with Israel as well as a protracted standoff with the US over its nuclear program.

In the near term, the first vice president, Mohammad Mokhber, will be the acting president as the country prepares to hold elections within the next 50 days as dictated by its constitution. (The Iranian government includes vice presidencies overseeing different government agencies, similar to US Cabinet-level secretaries; the first vice president is roughly equivalent to the US vice president.)

Raisi was considered a potential successor to Khamenei: he had already been vetted by the ruling clerics during his 2021 presidential run and was committed to the regime’s conservative policies. With his death, amid one of the regime’s most challenging periods, Iran’s long-term future is a little less certain.

Within Iran, succession is the biggest question

A hardline conservative cleric, Raisi always wore a black turban symbolizing his descent from the prophet Muhammad. His close relationship with the powerful Islamic Revolutionary Guard Corps (IRGC) fueled speculation that he could succeed Khamenei. The paramilitary force exerts significant sway over internal politics and also wields influence throughout the broader region through aligned groups and proxy forces in Iraq and Syria, as well as Hezbollah in Lebanon, the Houthis in Yemen, and Hamas in Gaza.

Raisi was initially elected in 2021 with 62 percent of the vote, though turnout was only 49 percent — the lowest in the history of the Islamic Republic and evidence of the crisis of legitimacy in which the government increasingly finds itself.

“People don’t want to legitimate the government by participating in what they consider either fraudulent or just non-representative political outcomes,” Firoozeh Kashani-Sabet, Walter H. Annenberg professor of history at the University of Pennsylvania, told Vox.

Over the course of his judicial career, Raisi was allegedly responsible for, or implicated in, some of the government’s most brutal repression and human rights abuses since the 1979 revolution, including serving on the so-called Death Committee, which was tasked with carrying out thousands of extrajudicial executions of political prisoners in the 1980s. During and after the Iran-Iraq war, a number of groups opposed the regime, some sided with Iraq, and there was even an attempt to attack Iran from Iraqi territory. To preserve the Islamic Republic’s legitimacy, Ayatollah Ruhollah Khomeini ordered a sweeping purge of the opposition; many of the dissidents who were arrested were chosen for execution arbitrarily.

Following the disputed 2009 election — which birthed the Green Movement, the most significant threat to the regime in decades — Raisi, then a high-level member of the judiciary, called for the punishment and even execution of people involved in the movement. And as president, he helped oversee the violent backlash to the Woman, Life, Freedom movement that erupted following the death in police custody of Mahsa Amini, a 22-year-old Kurdish woman arrested by the morality police for allegedly wearing her hijab improperly.

Raisi’s unpopularity, rooted in his repressive past and in worsening living standards for ordinary Iranians, has further eroded the government’s legitimacy, which may affect the upcoming presidential contest.

“On the one hand, bringing people to the ballot boxes is going to be difficult,” Ali Vaez, director of the Iran program at the International Crisis Group, told Vox. “On the other, I think [the Council of Guardians, which oversees elections in Iran] also don’t want, necessarily, the people to come to the ballot boxes. And they also don’t want to have an open election, because the entire focus of the leadership right now is on ideological conformity at the top, they don’t really care about legitimacy from below.”

That will mean a highly manicured list of candidates in the upcoming election. Though there are possibilities for some marginal change, Negar Mortazavi, a journalist and senior fellow at the Center for International Policy, said during a panel discussion Monday that there will be little room for any significant shift.

“[Raisi] could potentially be replaced by someone like Mohammad Bagher Ghalibaf,” the current speaker of the Parliament, who is not a cleric and may be less socially conservative, Mortazavi said. “So I see a little bit of openings in the enforcement of, for example, mandatory hijab, the lifestyle policing of young Iranians. That’s the one area that we could potentially see any policy change direction or enforcement of existing laws and regulations.”

But the next president, whoever it is, will likely be a caretaker and not the successor to Khamenei. The eventual successor — potentially Khamenei’s own son, Mojtaba — will be the conduit for power and policy in Iran over the coming decades. Iran’s political future will also be shaped by the IRGC, which has expanded its power, visibility, and centrality in recent years.

“What the [Iranian] deep state wants is a leader who’s no longer supreme, and is basically a frontman for the current office and the Revolutionary Guards to be able to preserve their vested economic and political interest in the system,” Vaez said. “There are clerics who would fit that profile — either Ayatollahs who are too old to be able to actually run their own affairs, and they certainly would not be able to run the country, or are too young and too inexperienced and lack constituency of their own.”

Iran’s international precariousness, explained

Raisi’s death comes as Iran is engaged in a deepening proxy war with Israel, waged particularly through Iran’s affiliated group in Lebanon, Hezbollah, as the Jewish state fights Hamas in Gaza. Iran’s allies in Yemen, the Houthis, have traded fire with US forces in the Red Sea, and Iranian-backed militias in Syria and Iraq have attacked US anti-terror installations in those countries.

In April, Iran launched hundreds of drones and missiles in retaliation for Israel’s assassination of an Iranian military official in Damascus, Syria earlier that month. It was the first time Iran had launched such an attack on Israeli territory from its own, and prompted further retaliation from Israel in the form of its own missile and drone attack.

Iran’s conflict with Israel is usually waged through allied non-state groups in its “axis of resistance” across the Middle East, like the militias in Syria and Iraq that attack American positions, or Hezbollah, which trades rocket fire with the Israeli military across the southern Lebanese border.

Those international efforts are not likely to change significantly in the near future following Raisi’s death. Amir-Abdollahian was close to the IRGC command, the Associated Press reported Monday, and the IRGC is likely to maintain significant sway over Iran’s internal and external affairs.

Deputy Foreign Minister Ali Bagheri Kani will take over as acting foreign minister until a new government is formed. His portfolio includes negotiation over Iran’s nuclear program, which will continue to be a critical part of its foreign policy agenda. Some experts fear that any uncertainty about Iran’s internal politics, given the nuclear stakes, elevates the risk of direct conflict between Iran and the US or Israel.

“Iran is already a nuclear weapons threshold state, and regional tensions are high,” Kelsey Davenport, director for nonproliferation policy at the Arms Control Association, said in a panel discussion Monday. “We’ve seen this uptick in Iranian statements about weaponization potential. So the risk of the United States or Israel miscalculating Iran’s nuclear intentions was already quite high, and any injection of domestic political turmoil increases the risk of misinterpreting Iranian actions. I think that the risk of miscalculation will remain.”

On the other hand, this period of turnover, during which the Iranian government’s priority will likely be to mitigate any risk of major change or upheaval, could present an opportunity for the international community and the Biden administration to de-escalate relations with Iran, particularly on its nuclear program, Davenport said.

“I think the Biden administration should be prepared to try to put a package on the table that incentivizes Iran to take some short-term steps that reduce proliferation risk,” she added.

Real change in Iran will come not from a single person but from systemic reform, Kashani-Sabet told Vox.

“Iran needs a new political framework; we need a new constitutional framework,” she said. “I think this is really the only way out for Iran — a constitutional framework that helps to forge a more participatory and inclusive political culture.”


Singer-songwriter Huey Lewis on seeing his songs come to life on stage
Singer-songwriter Huey Lewis joins "CBS Mornings" to talk about his new Broadway musical, "The Heart of Rock and Roll," and working through hearing loss.
1m
cbsnews.com
How a woke military has created a recruiting crisis — and put Americans in danger
Fox News host Pete Hegseth tackles the issue in his new book “The War on Warriors: Behind the Betrayal of the Men and Women Who Keep Us Free."
nypost.com
These Anti-Wrinkle Serums Soften Fine Lines and Combat Sun Damage
Scouted/The Daily Beast/Retailers.Scouted selects products independently. If you purchase something from our posts, we may earn a small commission.As we navigate the ever-evolving landscape of ‘anti-aging’ skincare products, searching for the right active serum to suit your specific skin goals can be challenging. Whether your aim is to soften fine lines and crow’s feet, remove UV-induced hyperpigmentation, or smooth out texture and the appearance of enlarged pores, there’s a targeted formula for everything nowadays. Of course, not all anti-aging serums are created equal.To help you narrow down the best one for you (and your skin type), we’ve rounded up some of our favorite skin-rejuvenating serums to help correct and prevent multiple signs of aging on the skin. From potent retinoid-forward serums to damage-erasing (and preventing) vitamin C formulas, these serums will help you achieve a radiant, youthful complexion.Read more at The Daily Beast.
thedailybeast.com
Man Shocks With 100-Burrito Meal Prep System That 'Changed the Game'
"For this specific video, it was one marathon of a day," Tom Walsh told Newsweek. "I made a little over 100 burritos."
newsweek.com
Donald Trump Rails Against Sentencing Date His Own Lawyer Agreed To
Defense asked for a "mid to late July" sentencing date, court transcripts show.
newsweek.com
Selena Gomez says she chooses to be friends with ‘levelheaded people’: ‘Girls are mean’
“It’s a cliché, but girls are mean,” the "Love On" singer, 31, said. “I love having levelheaded people around that couldn’t give two f--ks about what I do."
nypost.com
Trump Begs Supreme Court for Help as He Awaits Hush-Money Sentencing
Joe Camporeale/USA Today Sports via ReutersDonald Trump has called on the Supreme Court to weigh in on his hush-money case as his sentencing looms next month.The former president, who was convicted on 34 felony counts of falsifying business records, is set to be sentenced on July 11, four days before the beginning of the Republican National Convention in Milwaukee. He has vowed to appeal his history-making conviction on charges related to his efforts to unlawfully influence the 2016 election with a scheme to cover up a hush-money payment made to porn star Stormy Daniels.“The ‘Sentencing’ for not having done anything wrong will be, conveniently for the Fascists, 4 days before the Republican National Convention,” Trump wrote on his Truth Social platform on Sunday evening.Read more at The Daily Beast.
thedailybeast.com
Social Security Update: Why You Won't Be Getting a Payment This Week
Because of the number of weeks in the month, there are slight changes to the usual payment schedule in June.
newsweek.com
Florida Condo Owners in Race Against Time Before Hurricane Season
A new program will offer Florida condo associations the opportunity to get public funding to harden their buildings as hurricane season kicks off.
newsweek.com
One in Three Republicans Now Think Donald Trump Was Wrong Candidate Choice
A new poll has revealed changing attitudes to Trump from his Republican supporters.
newsweek.com
Michael Doulas visits Israel to show solidarity as war in Gaza continues
Actor Michael Douglas paid a solidarity visit to an Israeli kibbutz that was hit hard in the Oct. 7 Hamas attack that sparked Israel's war against the Islamic militant group.
cbsnews.com
Mohamed Hadid claims he’s the ‘victim’ in bitter feud with lender after filing fifth bankruptcy
Financially-strapped real estate developer Mohamed Hadid -- the celebrity dad of supermodels Gigi and Bella Hadid -- claimed he's the "victim" of a predatory lender after filing for bankruptcy over a prized California property, The Post has learned.
nypost.com
Family sues butcher who slaughtered pet pigs when he went to wrong house
Natalie and Nathan Gray say Port Orchard, Wash., butcher Jonathan Hines “recklessly” caused their family harm. Hines said he apologized to the Grays.
washingtonpost.com
Will ‘boots on the ground’ be the next red line crossed in Ukraine?
Until now, the West has ruled out sending troops to Ukraine. France’s Emmanuel Macron has other ideas.
washingtonpost.com
There's a man, a woman and a dog. But don't call 'Colin From Accounts' wacky
Harriet Dyer and Patrick Brammall created, star in and produce the Australian romantic comedy.
latimes.com
Aileen Cannon Playing 'Dangerous Game' in Donald Trump Trial: Attorney
Former President Donald Trump has been making statements that could put FBI lives at risk, said Joyce Vance.
newsweek.com
American complacency is Trump’s secret weapon
Popular culture instills the idea that good ultimately triumphs over evil. Real life begs to differ.
washingtonpost.com
China Claims Arrest of Spies Turned by US Ally
China's Ministry of State Security is continuing a monthslong campaign of spy wars against the West.
newsweek.com
Women Turn Up at Airport for Flight, Make Embarrassing Realization
Social media users were amused by the scene in the viral clip, with one wondering "how does this even happen."
1 h
newsweek.com
The campaign dichotomy in one newsletter :slightly_smiling_face:
In today’s edition … Hunter Biden’s trial set to start today … Sen. Menendez’s wife remains key figure in trial even in her absence.
1 h
washingtonpost.com
Europeans Are Watching the U.S. Election Very, Very Closely
American allies see a second Trump term as all but inevitable. “The anxiety is massive.”
1 h
theatlantic.com
Elon Musk, America’s richest immigrant, is angry about immigration. Can he influence the election?
The most financially successful immigrant in the U.S. — the third-richest person in the world — has frequently repeated his view that it is difficult to immigrate to the U.S. legally but “trivial and fast” to enter illegally.
1 h
latimes.com
Op-comic: What one doctor learned as a guinea pig for AI
I was skeptical of bringing artificial intelligence into the exam room, but it promised to reduce my screen time and shift the focus back to the patients.
1 h
latimes.com
What would the great George Balanchine do? L.A. ballet director thinks he has the answers
It's provocative to aspire to slip into the mind of one of ballet’s great masters, but Lincoln Jones sees it as a progression in his long devotion to George Balanchine’s art.
1 h
latimes.com
They cut their water bill by 90% and still have a 'showstopping' L.A. garden
A Los Angeles couple tore out 1,150 square feet of thirsty lawn, replacing it with a showstopping mix of low-water California native plants.
1 h
latimes.com
The U.S. Drought Monitor is a critical tool for the arid West. Can it keep up with climate change?
New research raises questions about the familiar map's ability to address long-term drying trends, including persistent dry spells across the American West.
1 h
latimes.com
Forget the trendy juice bars. This is the place to go for green juice
TK
1 h
latimes.com
Santa Monica sci-fi museum controversy: A child porn conviction, delays and angry ‘Star Trek’ fans
Questions surround Santa Monica’s Sci-Fi World as staff and volunteers quit and claim that its founder, who was convicted for possession of child pornography, remains active in the museum.
1 h
latimes.com
After 13 years, a homeless Angeleno broke into her old, vacant home and wants to stay forever
Maria Merritt has faced addiction, death of loved ones and other tragedies. A publicly owned home in El Sereno she had, lost, then regained gives her the strength to go on.
1 h
latimes.com
The transformative joys (and pains) of painting your own house
I self-impose and prolong my chaotic paint experiments because collectively, they form a promise: that one day I’ll be able to live happily in the house I’ve always wanted.
1 h
latimes.com
'Resident Alien' star Alan Tudyk is in no hurry to return to his home planet
'Mork and Mindy,' Looney Tunes and Mel Brooks all helped shape the actor as a young person.
1 h
latimes.com
WeHo Pride parade-goers talk joy and inclusivity, trans rights and a thread of fear
Threats against queer people didn't quell the joyful celebration at this year's West Hollywood Pride Parade.
1 h
latimes.com
Who should be the next LAPD chief? Public shrugs as city asks for input
As the Police Commission continues its citywide listening tour to hear about what residents want to see in the department's next leader, many of the stops have seen a low turnout.
1 h
latimes.com
Newsom finally gets moving on fixing California's homeowner insurance crisis
California Gov. Gavin Newsom has proposed urgency legislation to expedite the hiking of homeowner insurance rates. It’s about time. Because the alternative for many is no insurance at all.
1 h
latimes.com
Letters to the Editor: A lifeguard who can't tolerate the LGBTQ+ Pride flag shouldn't be a lifeguard
The lifeguard so upset by the presence of an LGBTQ+ Pride flag that he's suing L.A. County might want to find another line of work.
1 h
latimes.com
Letters to the Editor: California's new electricity billing scheme discourages conversation. That's crazy
A flat fee of $24.15 on most utility customers. Reduced per-kilowatt hour rates. How is this supposed to encourage power conservation?
1 h
latimes.com
Biden and Trump share a faith in import tariffs, despite inflation risks
Both candidates’ trade plans focus on tariffs on imported Chinese goods even as economists warn they could lead to higher prices.
1 h
washingtonpost.com
Caltrans' lapses contributed to 10 Freeway fire, Inspector General finds
For over 15 years, Caltrans failed to enforce safety at its property where a fire broke out last year, shutting down the 10 Freeway.
1 h
latimes.com
13 essential LGBTQ+ television shows (and a parade) to watch during Pride Month
Here’s a guide to queer TV shows, from 'Dead Boy Detectives' to 'Veneno' to 'The L Word,' to make your Pride Month merry.
1 h
latimes.com
Senate Democrats to unveil package to protect IVF as party makes reproductive rights push
The package comes as Senate Majority Leader Chuck Schumer has outlined plans for the chamber to put reproductive rights "front and center" this month.
1 h
cbsnews.com
Hunter Biden's federal gun trial to begin today
Hunter Biden faces three felony charges related to his purchase and possession of a gun while he was a drug user.
1 h
cbsnews.com
Home buyers beware: Buying a property with unpermitted structures can lead to hefty fines
California realtors advise that buyers understand a property's history and structure condition before finalizing their purchase, saving them the headache and cost of future fixes.
1 h
latimes.com
The internet peaked with “the dress,” and then it unraveled
If you were on the internet on February 26, 2015, you saw The Dress. Prompted by a comment on Tumblr, BuzzFeed writer Cates Holderness posted a simple low-quality image of a striped dress, with the headline “What Colors Are This Dress?” The answers: blue and black or white and gold. The URL: “help-am-i-going-insane-its-definitely-blue.” Do you really need me to tell you what happened next? In just a few days, the BuzzFeed post got 73 million page views, inspiring debate across the world. Seemingly every news outlet (including this one) weighed in on the phenomenon. How was it possible that this one image divided people so neatly into two camps? You either saw — with zero hint of variability — the dress as black and blue, or white and gold. There was no ambiguity. Only a baffling sense of indignation: How could anyone see it differently? Looking back, the posting of “the dress” represented the high-water mark of “fun” on the mid-2010s internet. Back then, the whole media ecosystem was built around social sharing of viral stories. It seemed like a hopeful path for media. BuzzFeed and its competitors Vice and Vox Media (which owns this publication) were once worth billions of dollars. The social-sharing ecosystem made for websites that would, for better or worse, simply ape each other’s most successful content, hoping to replicate a viral moment. It also fostered an internet monoculture. Which could be fun! Wherever you were on the internet, whatever news site you read, the dress would find YOU. It was a shared experience. As were so many other irreverent moments (indeed, the exact same day as the dress, you probably also saw news of two llamas escaping a retirement community in Arizona.) More from This Changed Everything The last 10 years, explained How the self-care industry made us so lonely Serial transformed true crime — and the way we think about criminal justice Since 2015, the engines of that monoculture have sputtered. Today, BuzzFeed’s news division no longer exists; the company’s stock is trading at around 50 cents a share (it debuted at about $10). Vice has stopped publishing on its website and laid off hundreds of staffers. Vox Media is still standing (woo!), but its reported value is a fraction of what it used to be (sigh). The dress brought us together. It was both a metaphor and a warning about how our shared sense of reality can so easily be torn apart. Whether you saw gold and white or black and blue, the meme revealed a truth about human perception. Psychologists call it naive realism. It’s the feeling that our perception of the world reflects its physical truth. If we perceive a dress as looking blue, we assume the actual pigments inside the dress generating the color are blue. It’s hard to believe it could be any other color. But it’s naive because this is not how our perceptual systems work. I’ve written about this a lot at Vox. The dress and other viral illusions like the similarly ambiguous “Yanny” vs. “Laurel” audio reveal the true nature of how our brains work. We’re guessing. As I reported in 2019: Much as we might tell ourselves our experience of the world is the truth, our reality will always be an interpretation. Light enters our eyes, sound waves enter our ears, chemicals waft into our noses, and it’s up to our brains to make a guess about what it all is. 
Perceptual tricks like … “the dress” … reveal that our perceptions are not the absolute truth, that the physical phenomena of the universe are indifferent to whether our feeble sensory organs can perceive them correctly. We’re just guessing. Yet these phenomena leave us indignant: How could it be that our perception of the world isn’t the only one? Scientists still haven’t figured out precisely why some people see the dress in one shade and some see it in another. Their best guess so far is that different people’s brains are making different assumptions about the quality of the light falling on the dress. Is it in bright daylight? Or under an indoor light bulb? Your brain tries to compensate for the different types of lighting to make a guess about the dress’s true color. Why would one brain assume daylight and another assume indoor bulbs? A weird clue has arisen in studies that try to correlate the color people assume the dress to be with other personal characteristics, like how much time they spend in daylight. One paper found a striking correlation: The time you naturally like to go to sleep and wake up — called a chronotype — could be correlated with dress perception. Night owls, or people who like to go to bed really late and wake up later in the morning, are more likely to see the dress as black and blue. Larks, a.k.a. early risers, are more likely to see it as white and gold. In 2020, I talked to Pascal Wallisch, a neuroscientist at New York University who has researched this topic. He thinks the correlation is rooted in life experience: Larks, he hypothesizes, spend more time in daylight than night owls. They’re more familiar with it. So when confronted with an ill–lit image like the dress, they are more likely to assume it is being bathed in bright sunlight, which has a lot of blue in it, Wallisch points out. As a result, their brains filter it out. Night owls, he thinks, are more likely to assume the dress is under artificial lighting, and filtering that out makes the dress appear black and blue. (The chronotype measure, he admits, is a little crude: Ideally, he’d want to estimate a person’s lifetime exposure to daylight.) Other scientists I talked to were less convinced this was the full answer (there are other potential personality traits and lifetime experiences that could factor in as well, they said). Even if there’s more to this story than chronotype, there’s an enduring lesson here. Our differing life experiences can set us up to make different assumptions about the world than others. Unfortunately, as a collective, we still don’t have a lot of self-awareness about this process. “Your brain makes a lot of unconscious inferences, and it doesn’t tell you that it’s an inference,” Wallisch told me. “You see whatever you see. Your brain doesn’t tell you, ‘I took into account how much daylight I’ve seen in my life.’” Moments like the dress are a useful check on our interpretations. We need intellectual humility to ask ourselves: Could my perceptions be wrong? The dress was an omen because, in many ways, since 2015, the internet has become a worse and worse place to do this humble gut check (not that it was ever a great place for it). It’s become more siloed. “You see whatever you see” Its users are seemingly less generous to one another (not that they were ever super generous!). Shaming and mocking are dominant conversational forms (though, yes, irreverence and fun can still be had). This all matters because our shared sense of reality has fractured in so many important ways. 
There were huge divides on how people perceived the pandemic, the vaccines that arose to help us through it, the results of the 2020 election. Not all of this is due to the internet, of course. A lot of factors influence motivated reasoning and motivated perceptions, the idea that we see what we want to see. There are leaders and influencers who stoke the flames of conspiracy and misinformation. But in a similar way to how our previous experiences can motivate us to see a dress in one shade or another, they can warp our perception of current events, too. Though, I will admit: Maybe my perception of a more siloed internet is off! It’s hard to gauge. Algorithm-based feeds today are more bespoke than ever before. I can’t know for sure whether my version of the social internet is like anyone else’s. My TikTok feed features a lot of people retiling their bathrooms. That can’t possibly be the average user’s experience, right? I have no idea if we’re all seeing the same things — and even less of an idea if we’re interpreting them the same way. More chaos is coming, I fear. AI tools are making it easier and easier to manipulate images and videos. Every day, it gets easier to generate content that plays into people’s perceptual biases and confirms their prior beliefs — and easier to warp perceptions of the present and possibly even change memories of the past. The dress represents, arguably, a simpler time on the internet, but also offers a mirror to some of our most frustrating psychological tendencies. What I wonder all the time is: What piece of content is out there, right now, generating different perceptual experiences in people, but we don’t even know we’re seeing it differently?
1 h
vox.com
How the self-care industry made us so lonely
Where were you the first time you heard the words “bath bomb?” What about “10-step skin care routine?” Perhaps you have, at some point, canceled plans in order to “unplug,” drink some tea, and take a bit of “me time.” Maybe you’ve ordered an assortment of candles meant to combat anxiety and stress or booked a rage room to exorcise your demons.  A warped notion of self-care has been normalized to the point where everyday activities like washing yourself and watching TV are now synonymous with the term. Generally understood as the act of lovingly nursing one’s mind and body, a certain kind of self-care has come to dominate the past decade, as events like the 2016 election and the Covid pandemic spurred collective periods of anxiety layered on top of existing societal harms. It makes sense that interest in how to quell that unease has steadily increased.  More from This Changed Everything The last 10 years, explained The internet peaked with “the dress,” and then it unraveled Serial transformed true crime — and the way we think about criminal justice Brands stepped forward with potential solutions from the jump: lotions, serums, journals, blankets, massagers, loungewear, meditation apps, tinctures. Between 2014 and 2016, Korean beauty exports to the US more than doubled. The Girls’ Night In newsletter was founded in 2017, with a mission to share “recommendations and night-in favorites … all focused on a topic that could use a bigger spotlight right now: downtime.” YouTube was soon saturated with videos of sponsored self-care routines. By 2022, a $5.6 trillion market had sprung to life under the guise of helping consumers buy their way to peace.  As the self-care industry hit its stride in America, so too did interest in the seemingly dire state of social connectedness. In 2015, a study was published linking loneliness to early mortality. In the years that followed, a flurry of other research illuminated further deleterious effects of loneliness: depression, poor sleep quality, impaired executive function, accelerated cognitive decline, cardiovascular disease, higher risk of coronary heart disease and stroke. US Surgeon General Vivek Murthy classified the prevalence of loneliness as an epidemic. By 2018, half of the country reported feeling lonely at least sometimes, according to a Cigna survey, a number that has only grown.  There is no singular driver of collective loneliness globally. A confluence of factors like smartphones, social media, higher rates of anxiety and depression, vast inequality, materialism, and jam-packed schedules have been identified as potentially spurring the crisis. But one practice designed to relieve us from the ills of the world — self-care, in its current form — has pulled us away from one another, encouraging solitude over connection.  How self-care became a commercial product The self-care of decades past was decidedly less individualistic and capitalist. In the 1950s, self-care was a term used in health care contexts: activities patients and their families could perform to promote their health and well-being separate from the care of medical professionals. “To me, self-care is a subjective and dynamic process aimed at maintaining health and preventing diseases or managing diseases when they appear,” says Michela Luciani, an assistant professor of nursing at the University of Milano-Bicocca. In this context, self-care can encompass everything from getting annual medical screenings to eating well.  
In the years that followed, the Black Panthers stressed the importance of caring for oneself as a political act amid the civil rights movement. Through community efforts like free food programs for children and families as well as free health clinics, the Black Panthers focused on collective well-being. “[This] image of caring for your people and self-care,” says Karla D. Scott, a professor of communication at Saint Louis University, “evoked the African phrase ‘I am because we are’: ubuntu.” For Black activists, partaking in rejuvenating rituals was crucial in order to survive within and to fight against racist, classist, and sexist systems. This approach to self-care is especially evident in the works of bell hooks and Audre Lorde, who is often referenced in the context of self-care: “Caring for myself is not self-indulgence,” she wrote, “it is self-preservation, and that is an act of political warfare.” This definition of self-care emphasizes the importance of engaging with others. Not only do we receive support from family, friends, and neighbors, but communing itself is a form of care. People report high levels of well-being while spending time with their friends, romantic partners, and children. Social interaction with trusted companions has been found to help stave off depression. Even chatting with acquaintances and strangers promotes happiness and belonging. Buy a new eyeshadow, a bullet journal, Botox, a vacation to fill the need for care that never seems to abate By the late 1960s, wellness entered the lexicon. Beyond simply avoiding illness, “wellness” as a concept centered the pursuit of a higher level of existence: a more emotional, spiritual, physical, and intellectual way of living. A wellness resource center opened in California in 1975; nearly a decade later, a wellness-focused newsletter from the University of California Berkeley helped legitimize the concept. This model of well-being features individuals, not communities, moving toward their “ever-higher potential of functioning,” as posited by Halbert L. Dunn, who helped popularize the contemporary idea of wellness. (Dunn also includes the “basic needs of man” — communication, fellowship with other people, and love — as integral to wellness.)  The ethos of wellness soon became synonymous with a sullied version of self-care, one that mapped neatly to the rising fitness culture of the ’80s through the early 2000s and the concept of “working on yourself.”  The Great Recession of 2008 marked a shift in how Americans viewed their health and well-being. In her book Fit Nation: The Gains and Pains of America’s Exercise Obsession, Natalia Mehlman Petrzela argues that fitness became “a socially acceptable form of conspicuous consumption” during this time when social media and boutique fitness classes allowed people to broadcast their lavish spending in pursuit of their health. Gwyneth Paltrow’s wellness brand Goop was founded the same year, espousing occasionally unfounded health advice and recommending (and selling) “aspirational products which embody and encourage restriction, control, and scarcity,” according to one academic paper. Commoditized self-care was here to stay, reaching mass saturation right around the time Trump was elected to office. 
Young people, disillusioned by polarized politics, saddled with astronomical student loan debt, and burned out by hustle culture, turned to skin care, direct-to-consumer home goods, and food and alcohol delivery — aggressively peddled by companies eager to capitalize on consumers’ stressors. While these practices may be restorative in the short term, they fail to address the systemic problems at the heart of individual despair.  Thus, a vicious, and expensive, cycle emerges: Companies market skin care products, for example, to prevent the formation of fine lines, supposedly a consequence of a stressful life. Consumers buy the lotions to solve this problem, lather themselves in solitude, and feel at peace for a little while. Once the anxiety, the exhaustion, and the insufficiency creeps in again, as it inevitably does, the routine begins anew. Buy a new eyeshadow, a bullet journal, Botox, a vacation to fill the need for care that never seems to abate.  Because buying things does not solve existential dread, we are then flooded with guilt for being unable to adequately tend to our minds and bodies. We just have to self-care harder, and so the consumerism masquerading as a practice that can fix something broken becomes another rote to-do list item. Individualistic approaches to wellness promote isolation This isn’t to say that solitary activities can’t be effective forms of self-care. Many people are easily depleted by social interaction and take solace in regular quiet evenings alone; solo time is indeed integral to a balanced social regimen. Conversely, people who are constantly surrounded by others can still feel lonely. However, when companies market genuinely vitalizing practices as individualized “solutions” to real problems (like burnout) requiring structural change (such as affordable child care), we increasingly look inward. “I worry that because of this ideology we live in, rugged individualism,” Scott says, “it lands in a way where folks feel that they’re deficient. It is deflating.” Pooja Lakshmin, a psychiatrist and clinical assistant professor at George Washington University, calls this self-soothing capitalist version of self-care “faux self-care” in her best-selling book Real Self-Care: A Transformative Program For Redefining Wellness. Faux self-care manifests in two ways: I deserve to splurge on Doordash and binge Netflix because I’m so burned out and I’m going to push myself so hard in this spin class because I need to be the best. Secluding oneself by summoning sustenance to our doorstep comes at the expense of the worker earning paltry wages to deliver you that food. The doors of our apartments quite literally separate those who can afford to “care” for themselves and those who cannot. While this form of restoration appears to be more isolating, the hyper-competitive version of faux self-care is equally as confining, Lakshmin says. “They’re not engaging or present,” she says. “They’re competing with themselves.”  While many surveys and reports outline a recent rise in loneliness, researchers lack sufficient longitudinal data to definitively say whether people are lonelier now than in the past, says Luzia Heu, an assistant professor in interdisciplinary social sciences at Utrecht University. However, people in wealthier societies have more opportunities to spend time alone now, she says, whether through remote work, living alone, or participating in solitary hobbies. “We spend more time alone and we are more isolated,” Heu says. 
“That is where people immediately assume that loneliness must also have increased a lot.” Whether or not loneliness has grown compared to historical accounts, recent statistics show that individuals are reporting higher levels of loneliness over the last decade, especially in the wake of the pandemic. “Self-care transformed into self-obsession”  America’s loneliness epidemic is multifaceted, but the rise of consumerist self-care that immediately preceded it seems to have played a crucial role in kicking the crisis into high gear — and now, in perpetuating it. You see, the me-first approach that is a hallmark of today’s faux self-care doesn’t just contribute to loneliness, it may also be a product of it. Research shows self-centeredness is a symptom of loneliness. But rather than reaching out to a friend, we focus on personalized self-care and wonder why we might not feel fulfilled. Another vicious cycle. “Instead of self-care being this mechanism to take care of yourself so that you can then show up for others,” says psychologist Maytal Eyal and co-founder of women’s health company Gather, “self-care transformed into self-obsession.”  The wellness industry wouldn’t be as lucrative if it didn’t prey on our insecurities. It must imagine new insufficiencies for us to fixate on, new elixirs and routines — like colostrum and 75 Hard — simultaneously meant to improve your mind and body by keeping them occupied in solitude.  That isolation is detrimental to the self and to society. When people are lonely, they tend to distrust others — they’re on the lookout for social threats and expect rejection. Being so disconnected and suspicious of their neighbors, their communities, and institutions could impact their propensity to cooperate with others and act in prosocial ways. A lack of social belonging has been linked to a person’s increased likelihood of voting for populist candidates. Similarly, social rejection can lead one toward extremist views. This is especially good news for political figures who wish to sow discontent and chaos. A secluded electorate is an unengaged one. Those in positions of power have it in their best interests to keep workers, neighbors, and citizens separate, self-centered, and distracted. As Scott mentioned, the tradition of American individualism doesn’t help. When people are told they are solely responsible for their own happiness and well-being, they increasingly seek it out via solitary means. If they’re lonely to begin with — if they feel disappointed in their relationships or don’t feel understood — they have a stronger tendency to withdraw, says Heu, the social and behavioral science professor. Perhaps they seek out a form of commodified self-care to cope, but “it’s not something that tackles the cause of your loneliness,” Heu says. “For many people, the cause of the loneliness will be something else.” For women, to whom self-care is most aggressively targeted, the source of their loneliness may be tied to the demands of their lives. Even when they earn the same as their male partners, women in heterosexual relationships still do the lion’s share of housework, according to a Pew Research Center study. Women also spend more time on caregiving than their husbands, the survey found. An expensive candle won’t ease the burdens of home life or allow for more time to connect with peers outside of the household.  
The narrative that the only one we can depend on, and thus should prioritize, is ourselves perpetuates the idea of the personal above the collective — and reinforces the notion of self-sufficiency. Self-care is individual, says Luciani, the nursing professor: No one else can force us to get enough sleep or go to the gym. But it shouldn’t be individualistic. “Self-care is influenced by the support from others,” she says, like a partner who cooks dinner and cares for the children while you lie down with a headache, or a friend who advocates for you at medical appointments. Communal self-care means creating space for others to tend to their needs and supporting them when necessary.  Despite the powerful forces working against us, we can reclaim self-care. We can choose to ignore compelling advertisements promising quick fixes. We can partake in revitalizing communal practices, whether they be a yoga class or a movie night with friends. We can avoid blaming ourselves for feeling stressed and scared and despondent in a violent, tumultuous, and unjust world. We can get to the root of our loneliness. True self-care involves connecting with others. Showing up for a friend in need or exchanging a few kind words with a stranger is more fulfilling than a face mask anyway. 
1 h
vox.com
The last 10 years, explained
The past decade was filled with so many unexpected turning points: moments big and small that we now understand to be truly important. These events ignited real change, warned of a not-so-far-off future, or had surprising effects that we couldn’t have imagined at the time. We started thinking about this particular time period because Vox just happened to turn 10 this year, but 2014 saw much more than the birth of our news organization. It was an incredibly divisive year kicking off an incredibly divisive decade. This was the year the police killings of Michael Brown and Eric Garner mainstreamed the Black Lives Matter movement; this was also the year of Gamergate, a harassment campaign that became entwined with the ascendant alt-right. It was a wildly online year, too, that set all sorts of attitudes and behaviors in motion (see: BLM and Gamergate, but also The Fappening and Kim Kardashian’s special brand of virality, below). Our reporters set out to explain the last 10 years of indelible moments — the good, the bad, the fascinating — in a series of pieces you can find across the site. If you want to understand how we got to where we are in 2024, read on. When nude leaks went from scandal to sex crime It’s been trendy lately to talk about how differently we now treat women, particularly famous women, than we did in the aughts. We talk about how today, we understand that it was wrong for tabloids to harass Britney Spears and publish all those upskirt photos and ask teen pop stars if their boobs were real on live TV.  There’s a specific moment, though, when we saw that much-remarked-upon evolution tip into reality, the purity culture of the 2000s coming up against the feminist outrage of the 2010s and crumbling.  More from This Changed Everything How the self-care industry made us so lonely The “racial reckoning” of 2020 set off an entirely new kind of backlash 10 big things we think will happen in the next 10 years The grossly named Fappening occurred on August 31, 2014, when one hacker’s stash of nearly 500 celebrity nudes (including Jennifer Lawrence, then at the height of her fame) leaked out to the mainstream internet. They became the fodder for a thousand op-eds about what was just beginning to be called revenge porn. (Ten years later, 2014’s cutting-edge term is now considered inaccurate, putting too much emphasis on the intent of the perpetrator and trivializing the severity of the crime being committed.) The previous decade had a playbook in place for talking about leaked photos of naked stars. You talked about them as something titillating for you, the viewer, to look at without apology, and something shameful for the woman (it was always a woman) pictured to apologize for.  For some media outlets, it seemed only natural to continue the playbook of the 2000s into 2014. “#JenniferLawrence phone was hacked her #nude pics leaked Check them out in all their gloriousness,” tweeted Perez Hilton, publicizing a post that reproduced the uncensored pictures of Lawrence.  But instead of getting the traffic windfall he might have expected, Perez was slammed with outrage across social media. He had to apologize for his post and replace it with a censored version. As Hilton and his cohort scrambled to catch up, the rest of the media was allying itself fiercely on the side of the hacking victims, denouncing anyone who looked at the leaked nudes. That included outlets that had previously covered every nipslip and upskirt photo to hit the internet with panting eagerness.  
“We have it so easy these days,” the pop culture website Complex had mused in 2012 in a roundup of recent celeb nude leaks. “Who do you want to see naked?”  When the Fappening happened two years later, Complex changed its mind. “Consider this,” the website declared. “These women, regardless of their public persona, are entitled to privacy and to express their sexuality however they wish. It’s their basic human right. These women have lives, too.” It’s hard to say exactly what swung the discourse quite so hard against the hackers this time around. Perhaps it was the ubiquity of camera phones, which had made nudes so inescapable: that feeling that it could happen to you. Perhaps it was because the media at the time was obsessed with Jennifer Lawrence, like everyone else was, and they wanted to be on her side. Perhaps the collective hive mind had just decided the time had come for feminism to trend upward. Whatever the reason, the press had now established a new narrative it could use to talk about sex crimes in the social media era, especially sex crimes that involved famous and beloved actresses. Three years later, it would put that knowledge to use to break a series of stories about Harvey Weinstein as the decade-old Me Too movement re-energized itself. Me Too saw reputational losses and criminal charges wielded against powerful men who for decades had been able to get away with sexual violence with impunity. It was able to do that because of what we all learned from The Fappening. —Constance Grady A fringe, racist essay foretold the fate of a MAGAfied Republican Party In 2016, a then-minor conservative writer named Michael Anton wrote what would become the defining case for electing Donald Trump. In Anton’s view, a Clinton victory would doom the country to collapse — primarily, albeit not exclusively, due to “the ceaseless importation of Third World foreigners with no tradition of, taste for, or experience in liberty.” Whatever Trump’s faults, he alone stood in the way of national suicide. Therefore, all true conservatives must get behind him. This sort of rhetoric may seem normal now: the kind of thing you hear every day from Trump and his deputies in the conquered Republican Party. At the time, it was immensely controversial — so much so that Anton originally published it under a pseudonym (Publius Decius Mus). But it became so influential on the pro-Trump right that Anton would be tapped for a senior post in President Trump’s National Security Council. The essay’s emergence as the canonical case for Trumpism marked a turning point: the moment when the conservative movement gave into its worst impulses, willing to embrace the most radical forms of politics in the name of stopping social change. The anti-establishment Trumpers have become the establishment The title of Anton’s essay, “The Flight 93 Election,” points to its central conceit. United Airlines Flight 93 was the one flight on September 11 that did not hit its ultimate target, crashing in a field in Pennsylvania thanks to a passenger uprising. Anton argued that Americans faced a choice analogous to that of Flight 93’s passengers: either “charge the cockpit” (elect Trump) or “die” (elect Hillary).  Anton spends much of his essay castigating the conservative movement — what he calls “Conservatism, Inc” or the “Washington Generals” of politics — for refusing to acknowledge that immigration has made the electoral stakes existential. Trump “alone,” per Anton, “has stood up to say: I want to live. I want my party to live. 
I want my country to live. I want my people to live. I want to end the insanity.” The racism in Anton’s view of “Third World foreigners” is unmistakable. Yet there is no doubt that his basic theses are now widespread among the Republican Party and conservative movement. The anti-establishment Trumpers have become the establishment. Anton’s essay was ahead of the curve, clearly articulating where the movement was heading under Trump. “The Flight 93 Election” marked the moment in which the unstated premises of the conservative movement’s most radical wings came out into the open. That those premises are now widely shared goes to show what the movement has become — and why Anton, and many others like him, would later rationalize an attempt to overturn an American election. —Zack Beauchamp The number that made the extinction crisis real Scientists have known for decades that plants and animals worldwide are in peril — the tigers and frogs, wildflowers and beetles. But it wasn’t until recently that the true gravity of the problem, dubbed the biodiversity crisis, started sinking in with the public.  That shift happened largely thanks to a single number, published in 2019.  In spring of that year, an intergovernmental group of scientists dedicated to wildlife research, known as IPBES, released a report that found that roughly one million species of plants and animals are threatened with extinction. In other words, much of the world’s flora and fauna is at risk of disappearing for good.  “The health of ecosystems on which we and all other species depend is deteriorating more rapidly than ever,” Robert Watson, IPBES’s director for strategic development and former chair, said when the report was published. “We are eroding the very foundations of our economies, livelihoods, food security, health and quality of life worldwide.” Extinction is far from the only important metric for measuring the health of the planet. Some scientists argue that it obscures other signs of biodiversity loss, such as shrinking wildlife populations, that typically occur long before a species goes extinct.  Yet this number marked an evolution in the public’s understanding of biodiversity loss.  Extinction is an easy concept to grasp, and it’s visceral. And so the number — calculated based on estimates of the total number of species on Earth, and how threatened different groups of them are — hit especially hard. It not only raised awareness but inspired an unprecedented wave of conservation action.  World leaders have since used the IPBES number to justify major efforts to protect nature, including a historic global deal, agreed on by roughly 190 countries in 2022, to halt the decline of wildlife and ecosystems. It has also been cited by government hearings, state resolutions, corporate actions, and hundreds of scientific papers — not to mention countless news reports.  The concept of biodiversity loss is vague. This number made it concrete, and more urgent than ever. —Benji Jones One state’s chilling ban was the beginning of the end for abortion access in America In May 2019, Alabama banned almost all abortions. It was the most aggressive abortion law passed by a state in decades, and clearly flouted the protections set forth in Roe v. Wade.  With no exceptions for rape or incest, the Alabama law hit a new level of restrictiveness amid a slate of state abortion bans passed in 2018 and 2019. 
These measures marked a major change in anti-abortion strategy: After 10 years of pushing smaller restrictions aimed at closing clinics or requiring waiting periods for patients, abortion opponents had begun aiming squarely at the landmark 1973 Supreme Court decision establishing Americans’ right to terminate a pregnancy.  Emboldened by Donald Trump’s presidency and two new conservative Supreme Court justices, these activists believed that the constitutional right to an abortion was finally vulnerable. They were correct.  The Alabama abortion ban was expressly designed as a challenge to Roe, with sponsor and Alabama state Rep. Terri Collins telling the Washington Post, “What I’m trying to do here is get this case in front of the Supreme Court so Roe v. Wade can be overturned.”   At first, Alabama’s ban, along with six-week bans in Georgia and elsewhere, were tied up in lower courts. In 2020, however, Justice Ruth Bader Ginsburg died and a third Trump nominee, Amy Coney Barrett, was confirmed, creating a rock-solid conservative majority on the Supreme Court. Less than two years later, the court held in Dobbs v. Jackson Women’s Health Organization that Roe “must be overruled.” With the federal right to an abortion gone, Alabama’s ban went into effect. While it was once the most restrictive in the country, now more than a dozen other states have instituted near-total bans. Several more have imposed gestational limits at 15 weeks or earlier. Alabama was once again on the front lines of reproductive health restrictions in February of this year, after a  judge ruled that frozen embryos used in IVF count as “children” under state law.   The landscape of reproductive health law in America has been utterly remade, and anti-abortion activists are far from finished. While the Alabama ban once seemed to many like radical legislation that would never survive the courts, it was in fact an early look at where the country was headed, and at the extreme circumstances under which millions of Americans are living today.  —Anna North  Avengers: Endgame forced an entirely new era of storytelling There will probably never be another movie like Avengers: Endgame, the 2019 film with a lead-up that wholly altered the movie industry and even the way stories are told. For over a decade, Marvel told one central story — Earth’s mightiest heroes working to defeat the great villain Thanos — through the MCU’s plethora of interlocking superhero blockbusters. In that era, each film, with its Easter eggs and credit scenes, built toward the culmination known as Endgame.  By signaling to its audience that all 23 movies mattered to the larger story, Marvel ensured each was a financial success, including a slew — Black Panther, Captain Marvel, Avengers: Infinity War — of billion-dollar worldwide box offices. Marvel’s grand design made Endgame the second-biggest movie in history. It’s not surprising that seemingly everyone in Hollywood tried to replicate this triumph, often at the expense of creative achievement. Studio heads hoped that they could grow and cash in on properties with extant popularity, the way Marvel had with its comic book characters, and began investing in sequels and spinoffs of proven IP. Marvel’s parent company Disney developed countless new Star Wars projects and capitalized on its hits like Frozen and Moana by lining up continuations of those stories. The company created Disney+ not just to sell its existing properties, but to house its avalanche of spinoff TV shows. 
Amazon’s take on Tolkien’s Lord of the Rings franchise and HBO’s interest in multiple Game of Thrones spinoffs could certainly be seen as trying to capture Marvel’s magic. Across the board, the people in charge of the purse strings became less interested in original ideas, as well as in mid-budget films; it was blockbuster or… bust.

Marvel changed what kind of stories were being told, but also how they were being told. Competitors became convinced that audiences wanted a connected cinematic universe, a format that mirrored comic book structure. Warner Bros., which owns the rights to Superman, Batman, and Wonder Woman, tried its hand at creating the DC superhero universe. Universal also played with the idea, teasing a linked movie world featuring classic monsters like Dracula. Neither of those fully panned out — some movies were critical flops, others didn’t find large audiences — signaling how difficult it is to pull off what Marvel had done. This tactic spread beyond the big studios too; indie darling A24, for example, has tapped into connected worlds and multiverses to tell expansive stories.

Marvel’s other innovations have also lodged themselves firmly in the pop culture firmament. Easter eggs — embedding “secrets” into art — are commonplace today (see: Swift, Taylor), and foster fan loyalty. Post-credits scenes have been added to all kinds of films.

Perhaps the real testament to Endgame’s singularity, though, is that it wasn’t only rivals who were unable to replicate what Marvel was able to do. None of the studio’s post-Endgame movies have had pre-Endgame box office results, and Marvel is no longer an unstoppable force. The studio’s cinematic universe looks as vulnerable as ever.

What Marvel didn’t realize was that Endgame was truly the end of the game. In its wake — for better or worse — we’re left with new ideas about what kind of stories we tell and why we tell them.

—Alex Abad-Santos

The manifesto that changed China’s place in the world

Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era — a definitive manifesto known as Xi Jinping Thought for short — first appeared in China in 2017, laying out the ideology and priorities not just of China’s president Xi, but of the entire Chinese Communist party-state under him. Xi’s policies and consolidation of power didn’t start with the document itself; they were developed over time, starting even before Xi became president in 2012. But given the opacity of the Chinese political apparatus and increasing censorship, the compiled doctrine provided a historic window into how Xi sees the world and his own place in it.

And what he wants is a dominant China that harks back to its former greatness, with himself at the center. Xi Jinping Thought is, according to the document, the roadmap for “a Chinese solution for world peace and human progress, and of landmark significance in the history of the rejuvenation of the Chinese nation, the history of the development of Marxism, and the progress of human society.”

Rather than lay low and just use its economic growth and the decline of US influence to propel China to world power status — as the country’s former president Deng Xiaoping advocated — Xi Jinping Thought articulates a vision that harnesses military development and aggressive diplomacy as critical factors in China’s dominance.
That has translated to China deploying its increasing military might to assert dominance in the South China Sea and the Taiwan Strait and cracking down on pro-democracy protests in Hong Kong, in addition to further opening the economy and becoming a global investment powerhouse via the Belt and Road initiative. It has also meant taking significant geopolitical leadership positions — expanding the BRICS economic bloc, brokering a deal to restore relations between Iran and Saudi Arabia, and attempting to negotiate peace between Ukraine and Russia.

Arguably, China and the US would have ended up on a collision course with or without Xi Jinping Thought. But that tension really kicked into higher gear after the introduction of the doctrine at the start of Xi’s second term, according to Neil Thomas, Chinese politics fellow at the Asia Society — and after the US itself started to explicitly attempt to contain China’s rise. “Looking from Beijing, you start to see this big pushback,” starting with the Trump administration’s trade war and continuing under Biden. “That has fed into a much more securitized view of the world in China,” Thomas said, as well as the notion that geopolitics “was increasingly zero sum.”

Lately the aggressive policy seems to be faltering due to China’s economic troubles. Thomas says Xi has “become increasingly aware of the costs of war [to] diplomacy and has adjusted his tactics to pursue the same ambitious strategic goals but with a more sensible strategy that focuses more on making friends than making enemies.” That has not deterred the US military buildup in Asia to counter China, though diplomatic relations between the two countries have warmed somewhat in recent months. But cutting down the bluster doesn’t mean a change in priorities, just a change in tactics — for now, anyway.

—Ellen Ioanes

The 2016 election made us realize we know nothing about class

Since Donald Trump eked out an Electoral College victory in 2016 and reshaped the GOP, journalists, academics, and politicians have been trying to explain what, exactly, happened. One prevailing narrative is that Trump spoke directly to a forgotten voting bloc — poor and working-class white people, especially those living in rural America. There’s a kernel of truth in that theory; Trump did indeed outperform previous Republican candidates among that demographic. But the stereotype of the average Trump voter that’s been born out of that narrative — the blue-collar union worker who hasn’t seen a meaningful raise in decades — is misleading at best.

In fact, one of the lessons of 2016 was that there is no universal definition of what constitutes a “working-class” voter, and that class solidarity is still deeply misunderstood. As it turns out, Trump’s biggest, most reliable voting bloc wasn’t the downtrodden white worker; it was largely white people from middle- and high-income households. When voters were divided up by income in various exit polls, Trump managed to beat Joe Biden in only one of three tiers: those making over $100,000 a year.

Trump’s win wasn’t a high-water mark for the role of class in elections like many thought, but rather for the media focus on the role of class in elections. Even so, we haven’t really figured out how to measure our class divides or even talk about them.
This lack of clarity has underscored a big problem in American politics: We have categories that are used as proxies for class — like someone’s college education level or union membership status — but they are imprecise substitutes that blur the bigger picture of the US electorate. As a result, analysis has exaggerated, or even distorted, reality, painting the Democratic Party, for example, as a political organization that’s growing more and more elitist and out of touch, and the GOP as the party that’s winning over the working class.

That’s despite the fact that Democrats have embraced the most ambitious anti-poverty agenda since Lyndon Johnson’s presidency — championing programs that are, by and large, supported by the poor — while Republicans continue to advocate for programs that almost exclusively benefit the wealthiest members of American society.

Trump’s victory may have turned people’s attention to class politics, but there’s still a long way to go before Americans get a clearer picture of how class will shape, or even determine, the election in November — and those in the years to come.

—Abdallah Fayyad

A photograph of a 3-year-old refugee’s death altered global opinion on migrants

There are certain photographs that stop the world. The “Tank Man” of Tiananmen Square. A nine-year-old girl, on fire from napalm during the Vietnam War. A migrant woman in 1936 California. To this list we can add the image of the body of a three-year-old refugee boy, face down in the sand of Bodrum, Turkey, after drowning in the Mediterranean Sea on September 2, 2015.

Alan Kurdi was fleeing the Syrian civil war, one of an estimated one million people who were seeking safe refuge in Europe. Minutes after Kurdi and his family left the Turkish city of Bodrum in the early hours, hoping to reach the Greek island of Kos and European territory, their overloaded rubber dinghy capsized. Kurdi, along with his brother Ghalib and mother Rehana, slipped beneath the waves. That morning the Turkish photographer Nilüfer Demir came upon what she would later call a “children’s graveyard” on the beach. Alan’s body had washed up on the shore, his sneakers still on his tiny feet. Demir took the photograph.

Alan Kurdi was one of an estimated 3,700 asylum seekers who drowned in the eastern Mediterranean that year, desperately trying to reach Europe. But Demir’s photograph, shared on social media by Peter Bouckaert of Human Rights Watch, spread to every corner of the world, where it was viewed by an estimated 20 million people. At a moment when Europeans seemed unsure whether to accept the unprecedented flow of asylum seekers, the image of a three-year-old left to die on the very edge of Europe galvanized political leaders, opening up a route for hundreds of thousands of refugees to find safety in the EU.

But the story doesn’t end there, for the compassion for asylum seekers generated by Kurdi’s image proved to have a short half-life. In the years since 2015, Europe has largely turned against asylum seekers, tightening its borders and closing off the Mediterranean. A little more than a year after Kurdi’s death, Donald Trump would win the White House, leading to a sharp reduction in the number of asylum seekers admitted into the US. That same year the UK voted for Brexit, in large part over concerns about immigration and asylum policy. The European Parliament elections held later this year are expected to cement policies that will make the EU even less welcoming to migrants and asylum seekers.
Yet with some 114 million people around the world forcibly displaced from their homes, nothing will stop the flow of refugees. We know there will be more Alan Kurdis in the future. And they will likely be met with less compassion than his photographed death generated.

—Bryan Walsh

What Kim Kardashian wrought when she “broke the internet”

One of the most circulated images of the past decade is of a reality star’s rear end. In November 2014, Paper Magazine unveiled its winter issue starring Kim Kardashian, with a photoshoot centered around her most notable asset and the ambitious goal of “break[ing] the internet.” On one cover, Kardashian creates a champagne fountain with her curvaceous body, unleashing foam into a glass perched on her Photoshopped backside. (The image is a recreation of controversial photographer Jean-Paul Goude’s 1976 “Carolina Beaumont, New York” photo, but drew even more fraught comparisons to Sarah Baartman, an enslaved South African woman who was made into a freak-show attraction in 19th-century Europe for her large buttocks.) The other cover, however — where Kardashian flashes her impossibly small waist and cartoonishly round butt — is what we mainly associate with the issue. She’s wearing nothing but pearls and a self-aware smile. What was once a source of mockery for Kardashian in tabloids had now become the culture’s most coveted possession.

Lest we forget, these photos arrived at the tail end of a year all about butts. White artists like Miley Cyrus, Iggy Azalea, and even Taylor Swift were incorporating twerking into their music videos and performances. Hit songs like Meghan Trainor’s “All About That Bass,” Jennifer Lopez’s “Booty,” and Nicki Minaj’s “Anaconda” were exalting curvy bodies. These moments contributed to the “slim-thick” physique becoming more accepted and desired outside Black and brown communities. (Twerking and voluptuous “video vixens” have long been features of rap videos.) However, it was Kardashian and, later, her sisters, who would come to represent the social complications this “trend” posed regarding the fetishization of Black bodies, cultural appropriation, and plastic surgery.

The American Society of Plastic Surgeons found a 90 percent increase in Brazilian butt lift procedures from 2015 to 2019. The surgery, where patients’ stomach fat is injected into their butts, has a sordid history embedded in Brazil’s eugenics movement and the hyper-sexualization of the mixed-race Black woman, known as the “mulata.” BBLs have also garnered headlines for their deadly health risks, mainly a result of fat embolisms. Nevertheless, it became hard not to notice the number of Instagram influencers who had apparently gotten the surgery or, at least, were digitally enhancing their butts.

Then, just as quickly, it seemed like the tide had turned once again for the “ideal” female body. In 2022, a controversial New York Post article declared that “heroin chic” was back in. Social media observers also began noticing that Kardashian was suddenly a lot smaller. At the same time, the diabetes drug Ozempic emerged as Hollywood’s latest weight-loss craze. Thus, the media eagerly questioned whether the BBL era was “over,” despite the surgery’s persisting popularity.

The question illuminated the ways Black people — their culture, their aesthetics, their literal bodies — are objectified and easily discarded under the white gaze.
As Rachel Rabbit White wrote, “to celebrate the supposed ‘end of the BBL’ is synonymous with the desire to kill the ways in which Black women, especially Black trans women, and especially Black trans sex workers, have shaped the culture.” Writer Ata-Owaji Victor pondered where the rejection of this trend leaves “Black women and people who naturally have the ‘BBL’ body.” The answer is seemingly: in the same position Black women have always been put — useful until they’re not.

—Kyndall Cunningham

The sweeping strike that put power back in teachers’ hands

In 2018, roughly 20,000 educators went on strike in West Virginia, protesting low pay and high health care costs. Their historic nine-day labor stoppage led to a 5 percent pay increase for teachers and school support staff.

With organizers galvanized by the victory in West Virginia, labor actions in states like Oklahoma, Kentucky, North Carolina, Colorado, and Arizona soon followed. According to federal statistics, more than 375,000 education workers engaged in work stoppages in 2018, bringing the total number of strikers that year to 485,000 — the largest since 1986. The uprising sparked national attention and enthusiasm both about the future of school politics and the possibility of resurging worker activism more broadly. It went by the shorthand “Red for Ed” — a reference to the red clothing educators and their allies wore every time they took to the streets.

The momentum continued the next year: In 2019, more than half of all workers in the US who went on strike came from the education sector, with new teacher actions spreading to states like Arkansas, Indiana, and Illinois.

To be sure, the movement didn’t create lasting change in all aspects of education policy. Average teacher pay has stayed flat for decades, and fewer people are entering the teaching profession. Union membership writ large has continued to decline. And despite educators’ pushback against school privatization, conservatives managed to push through new expansions of public subsidies for private and religious schools following the pandemic.

But the teacher uprising earned the support of parents and the public, who reported in surveys strong backing for the educators’ organizing and for increased teacher pay. This strengthened support likely helped explain why parents largely stood by their kids’ teachers during the tough months of the pandemic, when educators again banded together for stronger mitigation standards to reduce the spread of Covid-19.

During the Obama era, a powerful bipartisan coalition for education reform spent much of its time attacking educators and their unions — a scapegoat for public education’s problems that most people ultimately did not buy. Red for Ed changed the national political narrative around teachers, and in many ways was a fatal nail in the coffin for that movement.

—Rachel Cohen

Malaria in Maryland (and Florida, and Texas, and Arkansas) showed that the future of climate change is now

Last year, for the first time in two decades, mosquitoes transmitted malaria on American soil. The geographic range was unprecedented, with cases in Florida, Texas, Maryland, and Arkansas. 2023 was the hottest year on record since 1850, and for the mosquitoes that spread malaria, heat is habitat; the US cases occurred amid an uptick in malaria infections on a global scale.
Scientists have been warning us for years that without more public health resources, climate change was bound to push infectious threats into environments and populations unprepared for their consequences. Malaria’s reappearance in the US signaled to many that the future has arrived.

Wild weather turns previously inhospitable areas into newly suitable habitat for lots of so-called vector insects. It’s not just different species of mosquitoes whose migration is changing disease trends. Ticks — different species of which spread diseases like Lyme, babesiosis, and ehrlichiosis — have progressively moved into new parts of the US in recent years as those regions have warmed. Changing weather patterns also cause many of these insects to reproduce in higher numbers in their usual habitats.

Insect habitats aren’t the only ones affected by climate change. Weather is pushing animals that serve as disease reservoirs into new environments, which can lead to more “spillover” events where germs get spread from one species to another. That’s thought to explain, at least in part, the fatal borealpox infection transmitted to an Alaska man by a vole bite last year; it’s also a concern when it comes to rabies transmission. Extreme and unseasonable heat waves are also turning a progressively larger part of the US into newly comfortable digs for fungi — including molds that cause severe lung and other infections in healthy people. Warming fresh and sea waters more frequently become home to noxious blooms of toxic algae and bacteria. What’s more, the heat is kicking pathogens’ evolution into overdrive: The microorganisms that can survive it are more likely than ever to also survive in our bodies, making them more likely to cause disease — and harder to fight.

As with many health risks, the consequences of climate-related infectious threats land hardest on the people with the fewest resources — and are almost incomparably worse in lower-resource countries than inside the US.

There’s a lot we still don’t understand about how climate change interacts with communicable diseases, including malaria. Some of the shifts caused by severe weather may reduce certain risks even as they amplify others. And disentangling the effects of severe weather from changes in policy, behavior, and human immunity, especially during and after a pandemic, is a formidable task. Still, the comeback — or debut — of peculiar pathogens on American shores makes understanding these links viscerally urgent. Our warming planet isn’t going to wait until we’ve reformed and funded our public health system, seamlessly integrated disease surveillance into health care, renewed public trust in vaccines, and realigned incentives for novel antibiotic production before the fallout of climate change quite literally bites us in the ass.

—Keren Landman

Letting language models learn like children tipped the AI revolution

Imagine you have a little kid. You want to teach them all about the world. So you decide to strap them to a chair all day, every day, and force them to stare at endless pictures of objects while you say, “That’s a banana, that’s a car, that’s a spaceship, that’s…”

That’s not (I hope!) how you would actually teach a kid, right? And yet it’s the equivalent of how researchers initially tried to teach AI to understand the world. Until a few years ago, researchers were training AIs using a method called “supervised learning.” That’s where you feed the AI carefully labeled datasets.
It actually yielded some decent results, like teaching AI models to tell apart a banana and a spaceship. But it’s very labor-intensive because humans have to label every bit of data.

Then some researchers tried a different method: “unsupervised learning,” where the AI learns more like a real child does, by exploring the world freely, vacuuming up tons of unlabeled data, and gradually picking out the patterns in it. It figures out that bananas are those yellow oblong-shaped things without ever explicitly being told that. It turns out this leads to much more powerful AI models, like OpenAI’s ChatGPT and Google’s Gemini, which can explain complicated topics better and use language more naturally than the older, clunkier models.

Of course, AIs are not actually kids, and there’s a lot we still don’t understand about what’s happening inside the models. Yet when these companies realized that the key to unlocking progress wasn’t spoon-feeding AI every bit of information but letting it play around until it figured things out, they ushered in the AI revolution we’re seeing today.

Alison Gopnik, a developmental psychologist at Berkeley, was an early voice arguing that studying kids can give us useful hints about how to build intelligent machines. She’s compared children and AIs — for instance, by putting four-year-olds and AIs in the same online environments to see how each is able to learn — and found that the kids make much better inferences. Others are catching on. A team at NYU released a study this year in which a baby wore a helmet camera, and whatever the baby saw and heard provided the training data for an AI model. From a total of just 61 hours of data, the AI learned how to match words to the objects they refer to — the word “banana,” say, to that yellow oblong fruit.

Researchers are pinpointing some of the qualities that make kids such amazing learning machines: They’re embodied, they’re curious, and they’re able to interact socially with others. Perhaps that’s why researchers are now trying to create embodied multimodal AIs that can take in not just text, but sights, sounds, touch, and movement. They are, maybe without realizing it, embarking on an effort to replicate what evolution already did in making babies.

—Sigal Samuel

The drug that supercharged a crisis also spelled a synthetic destiny

When doctors began liberally prescribing opium and morphine to Civil War veterans for everything from amputations to diarrhea, they inadvertently kicked off the opioid epidemic over 150 years ago. It wasn’t until pharmaceutical companies started pushing prescription opioids as painkillers in the 1990s that the problem escalated to a national emergency. By 2015, pills and heroin had already made the opioid epidemic the deadliest drug crisis in US history. Then came the fentanyl boom.

The synthetic and extremely potent opioid, introduced in the 1960s, has been used as pain medication for decades. In 2016, it became responsible for the majority of overdose deaths. It pushed the number of US drug overdose deaths above 100,000 in 2022, more than doubling 2015’s toll. Because of fentanyl’s potency, it takes much less to overdose: A fatal dose fits on the tip of a sharpened pencil. Fentanyl put an already dire crisis into hyperdrive. But its spread also marked a deadlier, more prolific era of drugs where synthetics reign supreme.

Fentanyl’s rise hinges on its synthetic nature. It can be made from just a few chemicals, while heroin and opium require the slow cultivation of poppy flowers.
Compared to oxycodone — considered a “semi-synthetic” because its production involves chemically modifying natural opioids rather than brewing them from scratch — fentanyl is roughly 60 times more potent. Fentanyl is also up to 50 times stronger than heroin, which makes smuggling it much easier, since doses require far less of the actual drug. In the mid-2010s, Mexican cartels trafficking opioids began “cutting” drugs with fentanyl to save money, since it provided a similar high with less volume. In some cities where heroin use was widespread, suppliers have altogether replaced it with fentanyl, leaving users little choice but to switch.

Before fentanyl, overdose deaths were concentrated among opioid users. But fentanyl can be found as a filler in cocaine and MDMA supplies, spreading the overdose crisis into new terrain. Variations of fentanyl — of which there are now more than 1,400 — are already making their way into the illicit drug supply. Take carfentanil, which was developed to sedate large animals like elephants, but is now showing up in thousands of human overdoses. Carfentanil is estimated to be 100 times more potent than fentanyl itself.

Pure synthetics like fentanyl are where drug development is headed. Despite progress along many measurable dimensions, life in the 21st century will remain painful and unhealthy and full of ways to kill us. The incentive to continue developing legions of new synthetic drugs will stay as strong as ever, which will continue unearthing cheaper and easier-to-make substances. As those make their way to patients, the risk of adding novel, more powerful drugs to the illicit drug supply will follow.

Rising awareness of fentanyl’s harms has driven some progress, from reducing production and investing in harm reduction strategies like testing strips to combating
The overlooked conflict that altered the nature of war in the 21st century
On the second day of the 2020 Armenia-Azerbaijan war, the Armenian military posted a video of one of its surface-to-air missile systems shooting down a surprising enemy aircraft: an Antonov AN-2 biplane.

As it turned out, it wasn’t a sign of desperation on Azerbaijan’s part that its military was flying a plane first produced in the Soviet Union in 1947, and today used mostly for crop-dusting. Azerbaijan had converted several AN-2s into unmanned aircraft and used them as so-called bait drones. After the Armenians shot down the planes, revealing the positions of their anti-aircraft systems, their forces came under attack from more modern drones.

It seems strangely fitting that what was also known as the Second Nagorno-Karabakh War, a conflict that has been called “the first war won primarily with unmanned systems” and even the “first postmodern conflict,” could also end up being the last one in which biplanes played a significant role.

The conflict between these two former Soviet republics in the Caucasus, on the border between Europe and Asia, was the culmination of tensions that had been building for more than 25 years and intercommunal dynamics that were far older than that. It was in some sense a throwback to a traditional type of war — two nation-state armies fighting over disputed territory — that was far more prevalent in previous centuries.

But it was also a hypermodern war where unmanned systems played an unprecedented role on the battlefield, and social media played an unprecedented role off it. Though it got relatively little coverage in the international media at the time — coming as it did at the height of the Covid-19 pandemic, a wave of global protests, and a bitter US presidential election campaign — it was in some ways a preview of the much larger war that would break out in Ukraine just two years later, and may yet be seen as the harbinger of a new and potentially devastating era of international conflict.

A frozen conflict heats up

The Armenia-Azerbaijan dispute is one of the so-called frozen conflicts left over from the collapse of the Soviet Union. Nagorno-Karabakh, often referred to as Artsakh by Armenians, is an ethnically Armenian region within the borders of neighboring Azerbaijan. Violence in the region erupted in the 1980s when authorities in Nagorno-Karabakh demanded to be transferred to Armenia. (At the time, all were part of the Soviet Union.)

After the Soviet collapse, when both Armenia and Azerbaijan became independent, full-scale war broke out, resulting in more than 30,000 deaths and the displacement of hundreds of thousands of people, mainly Azeris. The first war ended with a Russian-brokered ceasefire in 1994 that left Nagorno-Karabakh as a semi-independent — but internationally unrecognized — territory surrounded by Azerbaijan, and Armenia retained control of some of the nearby areas. Effectively, it was an Armenian victory.

In the years that followed, the ceasefire was frequently violated by both sides and the underlying issues were never resolved. Then on September 27, 2020, Azerbaijan’s forces launched a rapid dawn offensive, beginning 44 days of war.

This time, it was a resounding success for Azerbaijan, which retook all of the Armenian-held territory around Nagorno-Karabakh as well as about a third of the territory itself.
At least 6,500 people were killed before the two sides agreed to a Russian-monitored ceasefire, which left only a winding mountain road connecting Armenia and Karabakh. (Though Russia, the preeminent military power in the region, is a traditional ally of Armenia, it has been hedging its bets more in recent years, particularly since the 2018 protests that brought a Western-inclined, democratic government to power in Armenia.)

Finally, in 2023 — with Russia distracted and bogged down by its war in Ukraine — Azerbaijan launched a blockade of Nagorno-Karabakh, eventually seizing the region and causing the majority of its Armenian population to flee. The Republic of Nagorno-Karabakh was dissolved in 2024.

A glimpse of the future of war

What made Azerbaijan’s rapid victory possible? One major factor was Turkey’s strong military support for Azerbaijan, a fellow Muslim, Turkic-speaking nation that Turkey saw as a key ally in extending its influence into the Caucasus. Another related factor was Azerbaijan’s deployment of unmanned drones, particularly the Turkish-made Bayraktar TB-2 attack drone, as well as several models of exploding drones purchased from Israel. These weapons proved stunningly effective at destroying the tanks and air defense systems of the Armenian and Nagorno-Karabakh forces.

“The Armenians and Nagorno-Karabakh had their forces dug in in advantageous positions, and they might have won if this war had unfolded the way it did in 1994, but it didn’t,” Sam Bendett, a senior fellow at the Center for a New American Security and expert on drone warfare, told Vox. “The Azeris understood that they couldn’t dislodge the Armenians in any other way than to send drones rather than piloted aircraft.”

As much as this war was fought under the global radar, these tactics caused a tectonic shift in the prevailing perception of drones as a weapon. From the beginning of the 20-year-long US war on terrorism, unmanned aircraft played an important role, but they were primarily multimillion-dollar machines like the Predator and Reaper, employed mostly for remote strikes on specific targets away from declared battlefields.

The Nagorno-Karabakh war showed how large numbers of simple, replaceable drones could turn the tide on the battlefield in a conventional war. As the military analyst Michael Kofman wrote at the time, “Drones are relatively cheap, and this military technology is diffusing much faster than cost-effective air defense or electronic warfare suitable to countering them.”

Lessons learned in the Nagorno-Karabakh conflict were employed in the Ukraine war, when Ukrainian forces made effective use of cheap drones — including, once again, the Turkish TB-2 — to negate the invading Russians’ advantages in mass and firepower. Over time, the evolving use of masses of cheap drones for strikes and surveillance by both sides in Ukraine has made traditional maneuver warfare vastly more difficult, another dynamic predicted by the conflict in Nagorno-Karabakh. Two years into the war, drones are one major reason why the front line often appears stuck in place.

Another way Nagorno-Karabakh seemed to be a harbinger of conflicts to come was in the role of social media in shaping global perceptions of the war.
As the media scholar Katy Pearce wrote in 2020, “Armenians and Azerbaijanis in country and those who have settled elsewhere have long battled on social media, and this escalated during the war … For Armenians and Azerbaijanis, whether still in the region or part of the wider diaspora, social media provided a way to participate, and feel engaged.”

As with Ukraine two years later, this was a war with an extraordinary amount of battlefield footage available to the public, and that footage was captured by the participants themselves via drone camera or smartphone, rather than by conventional (and more impartial) war reporters. This allowed both sides to shape public perceptions of what was happening on the battlefield, a phenomenon we’re seeing again with the Israel-Hamas war and the way social media images have driven coverage of that conflict. Journalists attempting to write objectively about the conflict often came under attack online from partisans who objected to what they saw as biased or unduly negative coverage.

For Armenia, this may have backfired. When Prime Minister Nikol Pashinyan finally signed the ceasefire deal, he faced mass protests and accusations that he had sold out the country, in part because many Armenians hadn’t actually believed they were losing the war — until they lost the war.

A new age of conquest?

Azerbaijan’s offensive was not a straightforward land grab. Nagorno-Karabakh’s independence was not recognized by any country on earth — technically, not even Armenia — and as far as international law was concerned, Armenian troops were occupying part of Azerbaijan’s territory. There are many such unresolved border disputes and unrecognized semi-sovereign territories around the world today.

Still, as Thomas De Waal, a senior fellow at the Carnegie Endowment for International Peace and author of one of the definitive books on the conflict, told Vox, “Azerbaijan’s war of 2020 broke a pattern in European security where the assumption was that all these unresolved conflicts across Europe had to be resolved peacefully. Azerbaijan rewrote the rulebook, used force, and as far as it was concerned, got away with it.”

De Waal suggests the relatively muted international reaction to the war — the US called for a ceasefire but did not sanction Azerbaijan despite calls from some members of Congress to do so — may have been one of a number of factors that led Russia’s government to believe, two years later, that “there was a more permissive international environment for the use of force and there wasn’t going to be as much pushback [to invading Ukraine] as there might have been a decade before.”

Was this brief conflict in the Caucasus a sign of a larger shift? In recent decades, wars of territorial conquest have been rare, and successful ones even rarer. The best-known examples — North Korea’s attempt to conquer South Korea in 1950, or Saddam Hussein’s invasion of Kuwait in 1990 — have prompted massive international interventions to protect international borders. Wars within states, sometimes drawing in international intervention, have been more common.

“For a very long time after the Second World War, there was a pretty widespread understanding on how the use of force is not a legitimate means of resolving territorial disputes,” Nareg Seferian, a US-based Armenian political analyst and writer, told Vox.
“I don’t think many people realize that until at least the First World War, if not the Second, that was just a really normal thing.” The bloody and ongoing international conflict in Ukraine is in many ways quite rare. If that starts to change, a month-and-a-half-long war in the Caucasus in 2020 could eventually be remembered as a pivotal turning point — not just in how wars are fought, but why.
The “racial reckoning” of 2020 set off an entirely new kind of backlash
It took less than a day for the world to start rallying for George Floyd in late May 2020. The events that led to Floyd’s murder unfolded over hours, but a viral 10-minute video recording of the deadly encounter with Minneapolis police officer Derek Chauvin was enough to send floods of people nationwide into the streets for months.

In the weeks after Floyd’s killing, the number of Americans who said they believe racial discrimination is a big problem and that they support the Black Lives Matter movement spiked. As books about racial injustice flew off bookstore shelves, corporate leaders, politicians, and celebrities pledged to fight racism. The events of 2020 disturbed America’s collective conscience, and the movement for justice captivated millions. Until it didn’t.

In retrospect, there were signs of brewing right-wing resistance all along. While many peacefully protested, others called for the protesters’ defeat. Arkansas Republican Sen. Tom Cotton demanded that the US military be brought in to fight “insurrectionists, anarchists, rioters, and looters.” As police officers used tear gas and rubber bullets to disperse crowds across the country, President Donald Trump deployed the National Guard to “dominate the streets” and defend “life and property,” sending thousands of troops and federal law enforcement officers to control protesters in Washington, DC; Portland, Oregon; and other cities.

Some Americans who wanted to stamp out the unrest took it upon themselves to practice vigilantism. One of them, Kyle Rittenhouse, fatally shot two unarmed men and wounded another when he brought an AR-15-style rifle to protests in Kenosha, Wisconsin. (Rittenhouse was later acquitted of all homicide charges.)

Though the mass mobilization of 2020 brought hope, it’s clear today that it also marked a turning point for backlash as the mirage of progress morphed into seemingly impenetrable resistance. Historically, backlash has embodied a white rejection of racial progress. Over the past few years, the GOP has built on that precedent and expanded its reach.

The right watched progressives rally for change and immediately fought back with the “Big Lie” of a stolen election. In many of the states that Biden flipped in 2020, Republicans rushed to ban ballot drop boxes, absentee ballots, and mobile voting units, the methods that allowed more people to vote. Since then, we’ve seen the passage of dozens of regressive laws, including anti-protest laws, anti-LGBTQ laws, and anti-diversity, equity, and inclusion laws. In state after state, these bans were coupled with incursions against reproductive rights, as some conservatives announced plans to take over every American institution from the courts to the schools to root out liberalism and progress.

“[The backlash] came like a multi-front war on democracy, a multi-front war on liberalism, a multi-front war on a multicultural democracy,” said historian Carol Anderson, who has examined backlash in books such as White Rage and We Are Not Yet Equal. “It knocked some folks back on their heels.”

A brief history of backlash in America

Backlash politics have long defined the country. The term “backlash” gained popularity in politics after John F. Kennedy proposed the Civil Rights Act of 1963.
“Transferred to the world of politics, the white backlash aptly describes the resentment of many white Americans to the speed of the great Negro revolution, which has been gathering momentum since the first rash of sit-ins in early 1960,” said a 1964 article in Look magazine.

The phenomenon, however, goes back to Reconstruction beginning in the 1860s, when white lawmakers claimed that equality for freed Black Americans threatened them, according to Larry B. Glickman, a historian at Cornell University who is writing a book about backlash since Reconstruction. Lawmakers instituted literacy tests and poll taxes while white agitators used violence and intimidation, all to prevent Black Americans from participating as full citizens.

“There’s a backlash impulse in American politics,” Glickman said. “I think 2020 is important because it gets at another part of backlash, which is the fear that social movements for equality and justice might set off a stronger counter-reaction.”

The protests of 2020 did. And though race is still at the core of the post-George Floyd backlash, many Republicans have gone to new lengths to conceal this element.

“One of the things that the civil rights movement accomplished was to make being overtly racist untenable,” said Anderson. “Today they say, ‘I can do racist stuff, but don’t call me racist.’”

For Anderson, backlash is about instituting state-level policies that undermine African Americans’ advancement toward their citizenship rights. By early 2021, alongside the effort to “stop the steal,” legislation that would limit or block voting access, give police greater protections, and control the teaching of concepts such as racial injustice began spreading across Republican-controlled state legislatures — all in the name of protecting America.

“They cover [voter suppression] with the fig leaf of election integrity, with the fig leaf of trying to protect democracy, and with the fig leaf of stopping massive rampant voter fraud,” Anderson said. And, she said, laws banning the teaching of history get covered “with the fig leaf of stopping indoctrination.”

That coordinated legislation was a direct response to potential racial gains for Black Americans and other marginalized groups. “After the death of George Floyd in 2020, it seemed like all of our institutions suddenly shifted overnight,” conservative activist Christopher Rufo said in a 2022 interview. Rufo’s answer was to release a series of reports about diversity training programs in the federal government and critical race theory, which, he argued, “set off a massive response, or really, revolt amongst parents nationwide.”

“Race is key,” said Glickman. “When the term backlash was popularized, it was often called the ‘white backlash.’ It was very clear that it was understood as resentment. The campaign for Black equality was moving too fast and going too far. I still think that’s at the root of many backlash movements.”

The new era of backlash is grievance-driven

That racial resentment has taken on a particularly acrid temperament since Floyd’s death. At the 2023 Conservative Political Action Conference, Trump, facing a litany of criminal and civil charges, stood on stage and told the audience, “I am your warrior. I am your justice. And for those who have been wronged and betrayed, I am your retribution.”

Trump’s words summarized the political discourse that has spread since the killing of George Floyd and highlighted the absence of a formal Republican policy agenda.
“[What he said was] not policy,” said historian John Huntington, author of the book Far Right Vanguard: The Radical Roots of Modern Conservatism. “It was just vengeance for some sort of perceived wrongs.” He added, “policy has taken a backseat to cultural grievances.”

What Huntington calls out as “endless harangues against very nebulous topics like critical race theory or wokeness or whatever the current catchphrase is right now” are an important marker of this new era. “A key element of the current backlash we’re seeing is a politics of grievance,” he says. “‘I have been wronged somehow by the liberals or whoever, and Trump is going to help me get even with these people that I don’t like.’”

Glickman calls this backlash tactic an “inversion” or “elite victimization”: “It’s a reversal that happens in backlash language where privileged white people take the historical position of oppressed people — often African Americans but sometimes other oppressed groups — and they speak from that vantage point.”

To be sure, Republicans have passed dozens of laws through state legislatures to do everything from restricting voting to banning trans athletes from participating in sports. But for Huntington, these reactionary laws don’t amount to legitimate policy. “It’s very difficult to convince people to build a society rather than trying to tear down something that’s already existing,” he said. “Critiquing is easy. Building is hard.” Nationally, Republicans passed only 27 laws despite holding 724 votes in 2023.

Though other backlash movements in history, such as the response to desegregation or the Confederacy, have involved violence, today’s backlash features a greater embrace of it from the Republican Party as a whole, according to Huntington. “But nowadays, the GOP, having moored themselves to Trump, have very much kind of implicitly embraced this politics of violence,” Huntington said.

The January 6 insurrection, and how Trump and other Republicans have expressed a desire to pardon insurrectionists, is emblematic of how the party has aligned itself with a much more radical idea of how to gain and keep power. “If you’re embracing the politics of violence in order to gain power,” said Huntington, “that illustrates a dark turn in American politics.”

Still, no backlash is forever. The events of 2020 triggered a particularly virulent right-wing response, but many such movements have failed, including various stages of this one.

“Backlashes have been very effective at mobilizing opposition to movements for equality, but I don’t think that they’re necessarily successful,” said Glickman. “I would say the jury’s still out.” They “are often seen as automatic and inevitable and sort of mechanistic and unstoppable. But I don’t think that,” he added. “Backlashes are political movements made up of human beings who were asserting their agency, and sometimes they’re successful and sometimes they’re not successful. I think we’ve blown up the backlash sometimes as this all-powerful phenomenon.”

This current backlash certainly isn’t achieving all of its goals. Trump lost in 2020, and the decision to overturn Roe v. Wade has prompted a backlash to the backlash, with voters in several states choosing to protect abortion rights through constitutional amendments.

With all their force and fire, backlashes can fail to anticipate pushback from people committed to democratic values.
“The mobilization is really quiet,” Anderson said. “We are so focused on the flames that we miss the kindling … we miss the folks who are quietly, doggedly going about the work of democracy.” 
On D-Day, the U.S. Conquered the British Empire
For most Americans, D-Day remains the most famous battle of World War II. It was not the end of the war against Nazism. At most, it was the beginning of the end. Yet it continues to resonate 80 years later, and not just because it led to Hitler’s defeat. It also signaled the collapse of the European empires and the birth of an American superpower that promised to dedicate its foreign policy to decolonization, democracy, and human rights, rather than its own imperial prestige.

It is easy to forget what a radical break this was. The term superpower was coined in 1944 to describe the anticipated world order that would emerge after the war. Only the British empire was expected to survive as the standard-bearer of imperialism, alongside two very different superpower peers: the Soviet Union and the United States. Within weeks of D-Day, however, the British found themselves suddenly and irrevocably overruled by their former colony.

That result was hardly inevitable. When the British and the Americans formally allied in December 1941, the British empire was unquestionably the senior partner in the relationship. It covered a fifth of the world’s landmass and claimed a quarter of its people. It dominated the air, sea, and financial channels on which most global commerce depended. And the Royal Navy maintained its preeminence, with ports of call on every continent, including Antarctica.

The United States, by contrast, was more of a common market than a nation-state. Its tendency toward isolationism has always been overstated. But its major foreign-policy initiatives had been largely confined to the Western Hemisphere and an almost random collection of colonies (carefully called “territories”), whose strategic significance was—at best—a point of national ambivalence.

In the two years after Pearl Harbor, the British largely dictated the alliance’s strategic direction. In Europe, American proposals to take the fight directly to Germany by invading France were tabled in favor of British initiatives, which had the not-incidental benefit of expanding Britain’s imperial reach across the Mediterranean and containing the Soviet Union (while always ensuring that the Russians had enough support to keep three-quarters of Germany’s army engaged on the Eastern Front).

Things changed, however, in November 1943, when Winston Churchill and Franklin D. Roosevelt held a summit in Cairo. The British again sought to postpone the invasion of France in favor of further operations in the Mediterranean. The debate quickly grew acrimonious. At one point, Churchill refused to concede on his empire’s desire to capture the Italian island of Rhodes. George Marshall, the usually stoic U.S. Army chief of staff, shouted at the prime minister, “Not one American is going to die on that goddamned beach!” Another session was forced to end abruptly after Marshall and his British counterpart, Sir Alan Brooke, nearly came to blows.

With the fate of the free world hanging in the balance, a roomful of 60-year-old men nearly broke out into a brawl because by November 1943, America had changed. It was producing more than twice as many planes and seven times as many ships as the whole British empire. British debt, meanwhile, had ballooned to nearly twice the size of its economy.
Most of that debt was owed to the United States, which leveraged its position as Britain’s largest creditor to gain access to outposts across the British empire, from which it built an extraordinary global logistics network of its own.

Having methodically made their country into at least an equal partner, the Americans insisted on the invasion of France, code-named “Operation Overlord.” The result was a compromise, under which the Allies divided their forces in Europe. The Americans would lead an invasion of France, and the British would take command of the Mediterranean.

Six months later, on June 6, 1944, with the D-Day invasion under way, the British empire verged on collapse. Its economic woes were exacerbated by the 1.5 million Americans, and 6 million tons of American equipment, that had been imported into the British Isles to launch Operation Overlord. Its ports were jammed. Inflation was rampant. Its supply chains and its politics were in shambles. By the end of June 1944, two of Churchill’s ministers were declaring the empire “broke.”

The British continued to wield considerable influence on world affairs, as they do today. But after D-Day, on the battlefields of Europe and in international conference rooms, instead of setting the agenda, the British found themselves having to go along with it.

In July 1944, at the Bretton Woods Conference, the British expectation that global finance would remain headquartered in London and transacted at least partially in pounds was frustrated when the International Monetary Fund and what would become the World Bank were headquartered in Washington and the dollar became the currency of international trade. In August 1944, America succeeded in dashing British designs on the eastern Mediterranean for good in favor of a second invasion of France from the south. In September 1944, the increasingly notional British command of Allied ground forces in Europe was formally abandoned. In February 1945, at a summit in Yalta, Churchill had little choice but to acquiesce as the United States and the Soviet Union dictated the core terms of Germany’s surrender, the division of postwar Europe, and the creation of a United Nations organization with a mandate for decolonization.

How did this happen so quickly? Some of the great political historians of the 20th century, such as David Reynolds, Richard Overy, and Paul Kennedy, have chronicled the many political, cultural, and economic reasons World War II would always have sounded the death knell of the European imperial system. Some British historians have more pointedly blamed the Americans for destabilizing the British empire by fomenting the forces of anti-colonialism (what D. Cameron Watt called America’s “moral imperialism”).

Absent from many such accounts is why Britain did not even try to counterbalance America’s rise or use the extraordinary leverage it had before D-Day to win concessions that might have better stabilized its empire. The French did precisely that with far less bargaining power at their disposal, and preserved the major constituents of their own empire for a generation longer than the British did. The warning signs were all there. In 1941, Germany’s leading economics journal predicted the rise of a “Pax Americana” at Britain’s expense.
“England will lose its empire,” the article gloatingly predicted, “to its partner across the Atlantic.”

The American defense-policy scholar and Atlantic contributing writer Kori Schake recently made a persuasive case that Britain came to accept the role of junior partner in the Atlantic alliance, rather than seek to balance American power, because the two countries had become socially, politically, and economically alike in all the ways that mattered. Britain, in other words, had more to lose by confrontation. And so it chose friendship.

The argument makes sense to a point, especially given how close the United Kingdom and the United States are today. But the remembered warmth of the “special relationship” in the 1940s is largely a product of nostalgia. British contempt for American racism and conformist consumerism seethed especially hot with the arrival in the U.K. of 1.5 million Americans. And American contempt for the British class system and its reputation for violent imperialism equally made any U.S. investment in the war against Germany—as opposed to Japan—a political liability for Roosevelt.

The British elite had every intention of preserving the British empire and European colonialism more generally. In November 1942, as Anglo-American operations began in North Africa, Churchill assured France that its colonies would be returned and assured his countrymen, “I have not become the King’s First Minister in order to preside over the liquidation of the British Empire.”

The British assumed that America’s rise was compatible with that goal because they grossly miscalculated American intentions. This was on stark display in March 1944, just over two months before D-Day, when Britain’s Foreign Office circulated a memorandum setting out the empire’s “American policy.” Given how naive the Americans were about the ways of the world, it said, Britain should expect them to “follow our lead rather than that we follow theirs.” It was therefore in Britain’s interest to foster America’s rise so that its power could be put to Britain’s use. “They have enormous power, but it is the power of the reservoir behind the dam,” the memo continued. “It must be our purpose not to balance our power against that of America, but to make use of American power for purposes which we regard as good” and to “use the power of the United States to preserve the Commonwealth and the Empire, and, if possible, to support the pacification of Europe.”

It is easy to see why members of Britain’s foreign-policy elite, still warmed by a Victorian afterglow, might discount Americans’ prattling on about decolonization and democracy as empty wartime rhetoric. If anything, they thought, Americans’ pestering insistence on such ideals proved how naive they were. Churchill often grumbled with disdain about Americans’ sentimental affection for—as he put it—the “chinks” and “pigtails” fighting against Japan in China, scornful of the American belief that they could be trusted to govern themselves.

And the face America presented to London might have compounded the misapprehension. Roosevelt was expected to choose George Marshall to be the American commander of Operation Overlord, a position that would create the American equivalent of a Roman proconsul in London. Instead, he picked Dwight Eisenhower.

Roosevelt’s reasons for choosing Eisenhower remain difficult to pin down. The president gave different explanations to different people at different times.
But Eisenhower was the ideal choice for America’s proconsul in London and Europe more generally, if the goal was to make a rising American superpower seem benign.

Eisenhower had a bit of cowboy to him, just like in the movies. He was also an Anglophile and took to wearing a British officer’s coat when visiting British troops in the field. He had a natural politician’s instinct for leaving the impression that he agreed with everyone. And he offered the incongruous public image of a four-star general who smiled like he was selling Coca-Cola.

He was also genuinely committed to multilateralism. Eisenhower had studied World War I closely and grew convinced that its many disasters—in both its fighting and its peace—were caused by the Allies’ inability to put aside their own imperial prestige to achieve their common goals. Eisenhower’s commitment to Allied “teamwork,” as he would say with his hokey Kansas geniality, broke radically from the past and seemed hopelessly naive, yet was essential to the success of operations as high-risk and complex as the D-Day invasion.

Eisenhower, for his part, was often quite deft in handling the political nature of his position. He knew that to be effective, to foster that teamwork, he could never be seen as relishing the terrifying economic and military power at his disposal, or the United States’ willingness to use it. “Hell, I don’t have to go around jutting out my chin to show the world how tough I am,” he said privately.

On D-Day, Eisenhower announced the invasion without mentioning the United States once. Instead, he said, the landings were part of the “United Nations’ plan for the liberation of Europe, made in conjunction with our great Russian allies.” While the invasion was under way, Eisenhower scolded subordinates who issued reports on the extent of French territory “captured.” The territory, he chided them, had been “liberated.”

The strategy worked. That fall, with Paris liberated, only 29 percent of French citizens polled felt the United States had “contributed most in the defeat of Germany,” with 61 percent giving credit to the Soviet Union. Yet, when asked where they would like to visit after the war, only 13 percent were eager to celebrate the Soviet Union’s contributions in Russia itself. Forty-three percent said the United States, a country whose air force had contributed to the deaths of tens of thousands of French civilians in bombing raids.

In rhetoric and often in reality, the United States has continued to project its power, not as an empire, but on behalf of the “United Nations,” “NATO,” “the free world,” or “mankind.” The interests it claims to vindicate as a superpower have also generally not been its imperial ambition to make America great, but the shared ideals enshrined soon after the war in the UN Charter and the Universal Declaration of Human Rights.

Had the D-Day invasion failed, those ideals would have been discredited. Unable to open the Western Front in France, the Allies would have had no choice but to commit to Britain’s strategy in the Mediterranean. The U.S. military, and by extension the United States, would have lost all credibility. The Soviets would have been the only meaningful rival to German power on the European continent.
And there would have been no reason for the international politics of national prestige and imperial interest to become outmoded.

Instead, on D-Day, American soldiers, joined by British soldiers and allies from nearly a dozen countries, embarked on a treacherous voyage from the seat of the British empire to the shores of the French empire on a crusade that succeeded in liberating the Old World from tyranny. It was a victory for an alliance built around the promise, at least, of broadly shared ideals rather than narrow national interests. That was a radical idea at the time, and it is becoming a contested one today. D-Day continues to resonate as much as it does because, like the battles of Lexington and Concord, it is an almost-too-perfect allegory for a decisive turning point in America’s national story: the moment when it came into its own as a new kind of superpower, one that was willing and able to fight for a freer world.