Singer-songwriter Huey Lewis on seeing his songs come to life on stage
Singer-songwriter Huey Lewis joins "CBS Mornings" to talk about his new Broadway musical, "The Heart of Rock and Roll," and working through hearing loss.
cbsnews.com
American complacency is Trump’s secret weapon
Popular culture instills the idea that good ultimately triumphs over evil. Real life begs to differ.
washingtonpost.com
Europeans Are Watching the U.S. Election Very, Very Closely
American allies see a second Trump term as all but inevitable. “The anxiety is massive.”
theatlantic.com
Elon Musk, America’s richest immigrant, is angry about immigration. Can he influence the election?
The most financially successful immigrant in the U.S. — the third-richest person in the world — has frequently repeated his view that it is difficult to immigrate to the U.S. legally but “trivial and fast” to enter illegally.
latimes.com
Op-comic: What one doctor learned as a guinea pig for AI
I was skeptical of bringing artificial intelligence into the exam room, but it promised to reduce my screen time and shift the focus back to the patients.
latimes.com
What would the great George Balanchine do? L.A. ballet director thinks he has the answers
It's provocative to aspire to slip into the mind of one of ballet’s great masters, but Lincoln Jones sees it as a progression in his long devotion to George Balanchine’s art.
latimes.com
They cut their water bill by 90% and still have a 'showstopping' L.A. garden
A Los Angeles couple tore out 1,150 square feet of thirsty lawn, replacing it with a showstopping mix of low-water California native plants.
latimes.com
The U.S. Drought Monitor is a critical tool for the arid West. Can it keep up with climate change?
New research raises questions about the familiar map's ability to address long-term drying trends, including persistent dry spells across the American West.
latimes.com
Forget the trendy juice bars. This is the place to go for green juice
TK
latimes.com
Santa Monica sci-fi museum controversy: A child porn conviction, delays and angry ‘Star Trek’ fans
Questions surround Santa Monica’s Sci-Fi World as staff and volunteers quit and claim that its founder, who was convicted of possessing child pornography, remains active in the museum.
latimes.com
After 13 years, a homeless Angeleno broke into her old, vacant home and wants to stay forever
Maria Merritt has faced addiction, the deaths of loved ones and other tragedies. A publicly owned home in El Sereno that she had, lost, then regained gives her the strength to go on.
latimes.com
The transformative joys (and pains) of painting your own house
I self-impose and prolong my chaotic paint experiments because collectively, they form a promise: that one day I’ll be able to live happily in the house I’ve always wanted.
latimes.com
'Resident Alien' star Alan Tudyk is in no hurry to return to his home planet
'Mork and Mindy,' Looney Tunes and Mel Brooks all helped shape the actor as a young person.
latimes.com
WeHo Pride parade-goers talk joy and inclusivity, trans rights and a thread of fear
Threats against queer people didn't quell the joyful celebration at this year's West Hollywood Pride Parade.
latimes.com
Who should be the next LAPD chief? Public shrugs as city asks for input
As the Police Commission continues its citywide listening tour to hear what residents want in the department's next leader, many of the stops have drawn low turnout.
latimes.com
Newsom finally gets moving on fixing California's homeowner insurance crisis
California Gov. Gavin Newsom has proposed urgency legislation to expedite the hiking of homeowner insurance rates. It’s about time. Because the alternative for many is no insurance at all.
latimes.com
Letters to the Editor: A lifeguard who can't tolerate the LGBTQ+ Pride flag shouldn't be a lifeguard
The lifeguard so upset by the presence of an LGBTQ+ Pride flag that he's suing L.A. County might want to find another line of work.
latimes.com
Letters to the Editor: California's new electricity billing scheme discourages conservation. That's crazy
A flat fee of $24.15 on most utility customers. Reduced per-kilowatt-hour rates. How is this supposed to encourage power conservation?
latimes.com
Biden and Trump share a faith in import tariffs, despite inflation risks
Both candidates’ trade plans focus on tariffs on imported Chinese goods even as economists warn they could lead to higher prices.
washingtonpost.com
Caltrans' lapses contributed to 10 Freeway fire, Inspector General finds
For over 15 years, Caltrans failed to enforce safety at its property where a fire broke out last year, shutting down the 10 Freeway.
latimes.com
13 essential LGBTQ+ television shows (and a parade) to watch during Pride Month
Here’s a guide to queer TV shows, from 'Dead Boy Detectives' to 'Veneno' to 'The L Word,' to make your Pride Month merry.
latimes.com
Senate Democrats to unveil package to protect IVF as party makes reproductive rights push
The package comes as Senate Majority Leader Chuck Schumer has outlined plans for the chamber to put reproductive rights "front and center" this month.
cbsnews.com
Hunter Biden's federal gun trial to begin today
Hunter Biden faces three felony charges related to his purchase and possession of a gun while he was a drug user.
cbsnews.com
Home buyers beware: Buying a property with unpermitted structures can lead to hefty fines
California realtors advise buyers to understand a property's history and structural condition before finalizing a purchase, saving them the headache and cost of future fixes.
latimes.com
The internet peaked with “the dress,” and then it unraveled
If you were on the internet on February 26, 2015, you saw The Dress. Prompted by a comment on Tumblr, BuzzFeed writer Cates Holderness posted a simple low-quality image of a striped dress, with the headline “What Colors Are This Dress?” The answers: blue and black or white and gold. The URL: “help-am-i-going-insane-its-definitely-blue.” Do you really need me to tell you what happened next?

In just a few days, the BuzzFeed post got 73 million page views, inspiring debate across the world. Seemingly every news outlet (including this one) weighed in on the phenomenon. How was it possible that this one image divided people so neatly into two camps? You either saw — with zero hint of variability — the dress as black and blue, or white and gold. There was no ambiguity. Only a baffling sense of indignation: How could anyone see it differently?

Looking back, the posting of “the dress” represented the high-water mark of “fun” on the mid-2010s internet. Back then, the whole media ecosystem was built around social sharing of viral stories. It seemed like a hopeful path for media. BuzzFeed and its competitors Vice and Vox Media (which owns this publication) were once worth billions of dollars.

The social-sharing ecosystem made for websites that would, for better or worse, simply ape each other’s most successful content, hoping to replicate a viral moment. It also fostered an internet monoculture. Which could be fun! Wherever you were on the internet, whatever news site you read, the dress would find YOU. It was a shared experience. As were so many other irreverent moments (indeed, the exact same day as the dress, you probably also saw news of two llamas escaping a retirement community in Arizona).

Since 2015, the engines of that monoculture have sputtered. Today, BuzzFeed’s news division no longer exists; the company’s stock is trading at around 50 cents a share (it debuted at about $10). Vice has stopped publishing on its website and laid off hundreds of staffers. Vox Media is still standing (woo!), but its reported value is a fraction of what it used to be (sigh).

The dress brought us together. It was both a metaphor and a warning about how our shared sense of reality can so easily be torn apart.

Whether you saw gold and white or black and blue, the meme revealed a truth about human perception. Psychologists call it naive realism. It’s the feeling that our perception of the world reflects its physical truth. If we perceive a dress as looking blue, we assume the actual pigments inside the dress generating the color are blue. It’s hard to believe it could be any other color.

But it’s naive because this is not how our perceptual systems work. I’ve written about this a lot at Vox. The dress and other viral illusions like the similarly ambiguous “Yanny” vs. “Laurel” audio reveal the true nature of how our brains work. We’re guessing. As I reported in 2019:

Much as we might tell ourselves our experience of the world is the truth, our reality will always be an interpretation. Light enters our eyes, sound waves enter our ears, chemicals waft into our noses, and it’s up to our brains to make a guess about what it all is.
Perceptual tricks like … “the dress” … reveal that our perceptions are not the absolute truth, that the physical phenomena of the universe are indifferent to whether our feeble sensory organs can perceive them correctly. We’re just guessing. Yet these phenomena leave us indignant: How could it be that our perception of the world isn’t the only one?

Scientists still haven’t figured out precisely why some people see the dress in one shade and some see it in another. Their best guess so far is that different people’s brains are making different assumptions about the quality of the light falling on the dress. Is it in bright daylight? Or under an indoor light bulb? Your brain tries to compensate for the different types of lighting to make a guess about the dress’s true color.

Why would one brain assume daylight and another assume indoor bulbs? A weird clue has arisen in studies that try to correlate the color people assume the dress to be with other personal characteristics, like how much time they spend in daylight. One paper found a striking correlation: The time you naturally like to go to sleep and wake up — called a chronotype — could be correlated with dress perception. Night owls, or people who like to go to bed really late and wake up later in the morning, are more likely to see the dress as black and blue. Larks, a.k.a. early risers, are more likely to see it as white and gold.

In 2020, I talked to Pascal Wallisch, a neuroscientist at New York University who has researched this topic. He thinks the correlation is rooted in life experience: Larks, he hypothesizes, spend more time in daylight than night owls. They’re more familiar with it. So when confronted with an ill-lit image like the dress, they are more likely to assume it is being bathed in bright sunlight, which has a lot of blue in it, Wallisch points out. As a result, their brains filter it out. Night owls, he thinks, are more likely to assume the dress is under artificial lighting, and filtering that out makes the dress appear black and blue. (The chronotype measure, he admits, is a little crude: Ideally, he’d want to estimate a person’s lifetime exposure to daylight.)

Other scientists I talked to were less convinced this was the full answer (there are other potential personality traits and lifetime experiences that could factor in as well, they said). Even if there’s more to this story than chronotype, there’s an enduring lesson here. Our differing life experiences can set us up to make different assumptions about the world than others.

Unfortunately, as a collective, we still don’t have a lot of self-awareness about this process. “Your brain makes a lot of unconscious inferences, and it doesn’t tell you that it’s an inference,” Wallisch told me. “You see whatever you see. Your brain doesn’t tell you, ‘I took into account how much daylight I’ve seen in my life.’”

Moments like the dress are a useful check on our interpretations. We need intellectual humility to ask ourselves: Could my perceptions be wrong?

The dress was an omen because, in many ways, since 2015, the internet has become a worse and worse place to do this humble gut check (not that it was ever a great place for it). It’s become more siloed. Its users are seemingly less generous to one another (not that they were ever super generous!). Shaming and mocking are dominant conversational forms (though, yes, irreverence and fun can still be had). This all matters because our shared sense of reality has fractured in so many important ways.
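(As an aside, here is a minimal sketch of the lighting-assumption idea described above, with made-up RGB values of my own rather than anything from the article or from perception research: the same observed pixel yields a very different surface estimate depending on which illuminant the viewer's brain assumes and divides out.)

```python
# Minimal sketch of illuminant discounting ("color constancy").
# All RGB values below are invented for illustration only.

def discount_illuminant(observed_rgb, assumed_illuminant_rgb):
    """Estimate a surface color by dividing out the light the viewer
    assumes is falling on it (a von Kries-style correction)."""
    return tuple(
        round(channel / light, 2)
        for channel, light in zip(observed_rgb, assumed_illuminant_rgb)
    )

# One ambiguous, dimly lit pixel from the photo (hypothetical values).
observed = (0.55, 0.45, 0.60)

# Viewer A's brain assumes cool, blue-heavy daylight and removes it:
# the leftover surface estimate drifts warmer and lighter (toward white/gold).
print(discount_illuminant(observed, assumed_illuminant_rgb=(0.6, 0.6, 0.9)))

# Viewer B's brain assumes warm indoor light and removes that instead:
# the same pixel now reads as a darker, bluer surface (toward black/blue).
print(discount_illuminant(observed, assumed_illuminant_rgb=(0.9, 0.8, 0.6)))
```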
There were huge divides on how people perceived the pandemic, the vaccines that arose to help us through it, the results of the 2020 election. Not all of this is due to the internet, of course. A lot of factors influence motivated reasoning and motivated perceptions, the idea that we see what we want to see. There are leaders and influencers who stoke the flames of conspiracy and misinformation. But in a similar way to how our previous experiences can motivate us to see a dress in one shade or another, they can warp our perception of current events, too.

Though, I will admit: Maybe my perception of a more siloed internet is off! It’s hard to gauge. Algorithm-based feeds today are more bespoke than ever before. I can’t know for sure whether my version of the social internet is like anyone else’s. My TikTok feed features a lot of people retiling their bathrooms. That can’t possibly be the average user’s experience, right? I have no idea if we’re all seeing the same things — and even less of an idea if we’re interpreting them the same way.

More chaos is coming, I fear. AI tools are making it easier and easier to manipulate images and videos. Every day, it gets easier to generate content that plays into people’s perceptual biases and confirms their prior beliefs — and easier to warp perceptions of the present and possibly even change memories of the past.

The dress represents, arguably, a simpler time on the internet, but also offers a mirror to some of our most frustrating psychological tendencies. What I wonder all the time is: What piece of content is out there, right now, generating different perceptual experiences in people, but we don’t even know we’re seeing it differently?
vox.com
How the self-care industry made us so lonely
Where were you the first time you heard the words “bath bomb”? What about “10-step skin care routine”? Perhaps you have, at some point, canceled plans in order to “unplug,” drink some tea, and take a bit of “me time.” Maybe you’ve ordered an assortment of candles meant to combat anxiety and stress or booked a rage room to exorcise your demons.

A warped notion of self-care has been normalized to the point where everyday activities like washing yourself and watching TV are now synonymous with the term. Generally understood as the act of lovingly nursing one’s mind and body, a certain kind of self-care has come to dominate the past decade, as events like the 2016 election and the Covid pandemic spurred collective periods of anxiety layered on top of existing societal harms. It makes sense that interest in how to quell that unease has steadily increased.

Brands stepped forward with potential solutions from the jump: lotions, serums, journals, blankets, massagers, loungewear, meditation apps, tinctures. Between 2014 and 2016, Korean beauty exports to the US more than doubled. The Girls’ Night In newsletter was founded in 2017, with a mission to share “recommendations and night-in favorites … all focused on a topic that could use a bigger spotlight right now: downtime.” YouTube was soon saturated with videos of sponsored self-care routines. By 2022, a $5.6 trillion market had sprung to life under the guise of helping consumers buy their way to peace.

As the self-care industry hit its stride in America, so too did interest in the seemingly dire state of social connectedness. In 2015, a study was published linking loneliness to early mortality. In the years that followed, a flurry of other research illuminated further deleterious effects of loneliness: depression, poor sleep quality, impaired executive function, accelerated cognitive decline, cardiovascular disease, higher risk of coronary heart disease and stroke. US Surgeon General Vivek Murthy classified the prevalence of loneliness as an epidemic. By 2018, half of the country reported feeling lonely at least sometimes, according to a Cigna survey, a number that has only grown.

There is no singular driver of collective loneliness globally. A confluence of factors like smartphones, social media, higher rates of anxiety and depression, vast inequality, materialism, and jam-packed schedules have been identified as potentially spurring the crisis. But one practice designed to relieve us from the ills of the world — self-care, in its current form — has pulled us away from one another, encouraging solitude over connection.

How self-care became a commercial product

The self-care of decades past was decidedly less individualistic and capitalist. In the 1950s, self-care was a term used in health care contexts: activities patients and their families could perform to promote their health and well-being separate from the care of medical professionals. “To me, self-care is a subjective and dynamic process aimed at maintaining health and preventing diseases or managing diseases when they appear,” says Michela Luciani, an assistant professor of nursing at the University of Milano-Bicocca. In this context, self-care can encompass everything from getting annual medical screenings to eating well.
In the years that followed, the Black Panthers stressed the importance of caring for oneself as a political act amid the civil rights movement. Through community efforts like free food programs for children and families as well as free health clinics, the Black Panthers focused on collective well-being. “[This] image of caring for your people and self-care,” says Karla D. Scott, a professor of communication at Saint Louis University, “evoked the African phrase ‘I am because we are’: ubuntu.” For Black activists, partaking in rejuvenating rituals was crucial in order to survive within and to fight against racist, classist, and sexist systems.

This approach to self-care is especially evident in the works of bell hooks and Audre Lorde, who is often referenced in the context of self-care: “Caring for myself is not self-indulgence,” she wrote, “it is self-preservation, and that is an act of political warfare.”

This definition of self-care emphasizes the importance of engaging with others. Not only do we receive support from family, friends, and neighbors, but communing itself is a form of care. People report high levels of well-being while spending time with their friends, romantic partners, and children. Social interaction with trusted companions has been found to help stave off depression. Even chatting with acquaintances and strangers promotes happiness and belonging.

By the late 1960s, wellness entered the lexicon. Beyond simply avoiding illness, “wellness” as a concept centered the pursuit of a higher level of existence: a more emotional, spiritual, physical, and intellectual way of living. A wellness resource center opened in California in 1975; nearly a decade later, a wellness-focused newsletter from the University of California Berkeley helped legitimize the concept. This model of well-being features individuals, not communities, moving toward their “ever-higher potential of functioning,” as posited by Halbert L. Dunn, who helped popularize the contemporary idea of wellness. (Dunn also includes the “basic needs of man” — communication, fellowship with other people, and love — as integral to wellness.)

The ethos of wellness soon became synonymous with a sullied version of self-care, one that mapped neatly to the rising fitness culture of the ’80s through the early 2000s and the concept of “working on yourself.”

The Great Recession of 2008 marked a shift in how Americans viewed their health and well-being. In her book Fit Nation: The Gains and Pains of America’s Exercise Obsession, Natalia Mehlman Petrzela argues that fitness became “a socially acceptable form of conspicuous consumption” during this time when social media and boutique fitness classes allowed people to broadcast their lavish spending in pursuit of their health. Gwyneth Paltrow’s wellness brand Goop was founded the same year, espousing occasionally unfounded health advice and recommending (and selling) “aspirational products which embody and encourage restriction, control, and scarcity,” according to one academic paper. Commoditized self-care was here to stay, reaching mass saturation right around the time Trump was elected to office.
Young people, disillusioned by polarized politics, saddled with astronomical student loan debt, and burned out by hustle culture, turned to skin care, direct-to-consumer home goods, and food and alcohol delivery — aggressively peddled by companies eager to capitalize on consumers’ stressors. While these practices may be restorative in the short term, they fail to address the systemic problems at the heart of individual despair.

Thus, a vicious, and expensive, cycle emerges: Companies market skin care products, for example, to prevent the formation of fine lines, supposedly a consequence of a stressful life. Consumers buy the lotions to solve this problem, lather themselves in solitude, and feel at peace for a little while. Once the anxiety, the exhaustion, and the insufficiency creep in again, as they inevitably do, the routine begins anew. Buy a new eyeshadow, a bullet journal, Botox, a vacation to fill the need for care that never seems to abate.

Because buying things does not solve existential dread, we are then flooded with guilt for being unable to adequately tend to our minds and bodies. We just have to self-care harder, and so the consumerism masquerading as a practice that can fix something broken becomes another rote to-do list item.

Individualistic approaches to wellness promote isolation

This isn’t to say that solitary activities can’t be effective forms of self-care. Many people are easily depleted by social interaction and take solace in regular quiet evenings alone; solo time is indeed integral to a balanced social regimen. Conversely, people who are constantly surrounded by others can still feel lonely. However, when companies market genuinely vitalizing practices as individualized “solutions” to real problems (like burnout) requiring structural change (such as affordable child care), we increasingly look inward. “I worry that because of this ideology we live in, rugged individualism,” Scott says, “it lands in a way where folks feel that they’re deficient. It is deflating.”

Pooja Lakshmin, a psychiatrist and clinical assistant professor at George Washington University, calls this self-soothing capitalist version of self-care “faux self-care” in her best-selling book Real Self-Care: A Transformative Program for Redefining Wellness. Faux self-care manifests in two ways: I deserve to splurge on DoorDash and binge Netflix because I’m so burned out, and I’m going to push myself so hard in this spin class because I need to be the best. Secluding oneself by summoning sustenance to our doorstep comes at the expense of the worker earning paltry wages to deliver that food. The doors of our apartments quite literally separate those who can afford to “care” for themselves and those who cannot. While this form of restoration appears to be more isolating, the hyper-competitive version of faux self-care is equally confining, Lakshmin says. “They’re not engaging or present,” she says. “They’re competing with themselves.”

While many surveys and reports outline a recent rise in loneliness, researchers lack sufficient longitudinal data to definitively say whether people are lonelier now than in the past, says Luzia Heu, an assistant professor in interdisciplinary social sciences at Utrecht University. However, people in wealthier societies have more opportunities to spend time alone now, she says, whether through remote work, living alone, or participating in solitary hobbies. “We spend more time alone and we are more isolated,” Heu says.
“That is where people immediately assume that loneliness must also have increased a lot.” Whether or not loneliness has grown compared to historical accounts, recent statistics show that individuals are reporting higher levels of loneliness over the last decade, especially in the wake of the pandemic.

“Self-care transformed into self-obsession”

America’s loneliness epidemic is multifaceted, but the rise of consumerist self-care that immediately preceded it seems to have played a crucial role in kicking the crisis into high gear — and now, in perpetuating it. You see, the me-first approach that is a hallmark of today’s faux self-care doesn’t just contribute to loneliness, it may also be a product of it. Research shows self-centeredness is a symptom of loneliness. But rather than reaching out to a friend, we focus on personalized self-care and wonder why we might not feel fulfilled. Another vicious cycle. “Instead of self-care being this mechanism to take care of yourself so that you can then show up for others,” says Maytal Eyal, a psychologist and co-founder of women’s health company Gather, “self-care transformed into self-obsession.”

The wellness industry wouldn’t be as lucrative if it didn’t prey on our insecurities. It must imagine new insufficiencies for us to fixate on, new elixirs and routines — like colostrum and 75 Hard — simultaneously meant to improve your mind and body by keeping them occupied in solitude.

That isolation is detrimental to the self and to society. When people are lonely, they tend to distrust others — they’re on the lookout for social threats and expect rejection. Being so disconnected and suspicious of their neighbors, their communities, and institutions could impact their propensity to cooperate with others and act in prosocial ways. A lack of social belonging has been linked to a person’s increased likelihood of voting for populist candidates. Similarly, social rejection can lead one toward extremist views.

This is especially good news for political figures who wish to sow discontent and chaos. A secluded electorate is an unengaged one. Those in positions of power have it in their best interests to keep workers, neighbors, and citizens separate, self-centered, and distracted.

As Scott mentioned, the tradition of American individualism doesn’t help. When people are told they are solely responsible for their own happiness and well-being, they increasingly seek it out via solitary means. If they’re lonely to begin with — if they feel disappointed in their relationships or don’t feel understood — they have a stronger tendency to withdraw, says Heu, the social and behavioral science professor. Perhaps they seek out a form of commodified self-care to cope, but “it’s not something that tackles the cause of your loneliness,” Heu says. “For many people, the cause of the loneliness will be something else.”

For women, to whom self-care is most aggressively targeted, the source of their loneliness may be tied to the demands of their lives. Even when they earn the same as their male partners, women in heterosexual relationships still do the lion’s share of housework, according to a Pew Research Center study. Women also spend more time on caregiving than their husbands, the survey found. An expensive candle won’t ease the burdens of home life or allow for more time to connect with peers outside of the household.
The narrative that the only one we can depend on, and thus should prioritize, is ourselves perpetuates the idea of the personal above the collective — and reinforces the notion of self-sufficiency. Self-care is individual, says Luciani, the nursing professor: No one else can force us to get enough sleep or go to the gym. But it shouldn’t be individualistic. “Self-care is influenced by the support from others,” she says, like a partner who cooks dinner and cares for the children while you lie down with a headache, or a friend who advocates for you at medical appointments. Communal self-care means creating space for others to tend to their needs and supporting them when necessary.  Despite the powerful forces working against us, we can reclaim self-care. We can choose to ignore compelling advertisements promising quick fixes. We can partake in revitalizing communal practices, whether they be a yoga class or a movie night with friends. We can avoid blaming ourselves for feeling stressed and scared and despondent in a violent, tumultuous, and unjust world. We can get to the root of our loneliness. True self-care involves connecting with others. Showing up for a friend in need or exchanging a few kind words with a stranger is more fulfilling than a face mask anyway. 
vox.com
The last 10 years, explained
The past decade was filled with so many unexpected turning points: moments big and small that we now understand to be truly important. These events ignited real change, warned of a not-so-far-off future, or had surprising effects that we couldn’t have imagined at the time.

We started thinking about this particular time period because Vox just happened to turn 10 this year, but 2014 saw much more than the birth of our news organization. It was an incredibly divisive year kicking off an incredibly divisive decade. This was the year the police killings of Michael Brown and Eric Garner mainstreamed the Black Lives Matter movement; this was also the year of Gamergate, a harassment campaign that became entwined with the ascendant alt-right. It was a wildly online year, too, that set all sorts of attitudes and behaviors in motion (see: BLM and Gamergate, but also The Fappening and Kim Kardashian’s special brand of virality, below).

Our reporters set out to explain the last 10 years of indelible moments — the good, the bad, the fascinating — in a series of pieces you can find across the site. If you want to understand how we got to where we are in 2024, read on.

When nude leaks went from scandal to sex crime

It’s been trendy lately to talk about how differently we now treat women, particularly famous women, than we did in the aughts. We talk about how today, we understand that it was wrong for tabloids to harass Britney Spears and publish all those upskirt photos and ask teen pop stars if their boobs were real on live TV.

There’s a specific moment, though, when we saw that much-remarked-upon evolution tip into reality, the purity culture of the 2000s coming up against the feminist outrage of the 2010s and crumbling.

The grossly named Fappening occurred on August 31, 2014, when one hacker’s stash of nearly 500 celebrity nudes (including Jennifer Lawrence, then at the height of her fame) leaked out to the mainstream internet. They became the fodder for a thousand op-eds about what was just beginning to be called revenge porn. (Ten years later, 2014’s cutting-edge term is now considered inaccurate, putting too much emphasis on the intent of the perpetrator and trivializing the severity of the crime being committed.)

The previous decade had a playbook in place for talking about leaked photos of naked stars. You talked about them as something titillating for you, the viewer, to look at without apology, and something shameful for the woman (it was always a woman) pictured to apologize for.

For some media outlets, it seemed only natural to continue the playbook of the 2000s into 2014. “#JenniferLawrence phone was hacked her #nude pics leaked Check them out in all their gloriousness,” tweeted Perez Hilton, publicizing a post that reproduced the uncensored pictures of Lawrence.

But instead of getting the traffic windfall he might have expected, Perez was slammed with outrage across social media. He had to apologize for his post and replace it with a censored version. As Hilton and his cohort scrambled to catch up, the rest of the media was allying itself fiercely on the side of the hacking victims, denouncing anyone who looked at the leaked nudes. That included outlets that had previously covered every nipslip and upskirt photo to hit the internet with panting eagerness.
“We have it so easy these days,” the pop culture website Complex had mused in 2012 in a roundup of recent celeb nude leaks. “Who do you want to see naked?”

When the Fappening happened two years later, Complex changed its mind. “Consider this,” the website declared. “These women, regardless of their public persona, are entitled to privacy and to express their sexuality however they wish. It’s their basic human right. These women have lives, too.”

It’s hard to say exactly what swung the discourse quite so hard against the hackers this time around. Perhaps it was the ubiquity of camera phones, which had made nudes so inescapable: that feeling that it could happen to you. Perhaps it was because the media at the time was obsessed with Jennifer Lawrence, like everyone else was, and they wanted to be on her side. Perhaps the collective hive mind had just decided the time had come for feminism to trend upward.

Whatever the reason, the press had now established a new narrative it could use to talk about sex crimes in the social media era, especially sex crimes that involved famous and beloved actresses. Three years later, it would put that knowledge to use to break a series of stories about Harvey Weinstein as the decade-old Me Too movement re-energized itself. Me Too saw reputational losses and criminal charges wielded against powerful men who for decades had been able to get away with sexual violence with impunity. It was able to do that because of what we all learned from The Fappening. —Constance Grady

A fringe, racist essay foretold the fate of a MAGAfied Republican Party

In 2016, a then-minor conservative writer named Michael Anton wrote what would become the defining case for electing Donald Trump. In Anton’s view, a Clinton victory would doom the country to collapse — primarily, albeit not exclusively, due to “the ceaseless importation of Third World foreigners with no tradition of, taste for, or experience in liberty.” Whatever Trump’s faults, he alone stood in the way of national suicide. Therefore, all true conservatives must get behind him.

This sort of rhetoric may seem normal now: the kind of thing you hear every day from Trump and his deputies in the conquered Republican Party. At the time, it was immensely controversial — so much so that Anton originally published it under a pseudonym (Publius Decius Mus). But it became so influential on the pro-Trump right that Anton would be tapped for a senior post in President Trump’s National Security Council.

The essay’s emergence as the canonical case for Trumpism marked a turning point: the moment when the conservative movement gave in to its worst impulses, willing to embrace the most radical forms of politics in the name of stopping social change.

The title of Anton’s essay, “The Flight 93 Election,” points to its central conceit. United Airlines Flight 93 was the one flight on September 11 that did not hit its ultimate target, crashing in a field in Pennsylvania thanks to a passenger uprising. Anton argued that Americans faced a choice analogous to that of Flight 93’s passengers: either “charge the cockpit” (elect Trump) or “die” (elect Hillary).

Anton spends much of his essay castigating the conservative movement — what he calls “Conservatism, Inc” or the “Washington Generals” of politics — for refusing to acknowledge that immigration has made the electoral stakes existential. Trump “alone,” per Anton, “has stood up to say: I want to live. I want my party to live.
I want my country to live. I want my people to live. I want to end the insanity.”

The racism in Anton’s view of “Third World foreigners” is unmistakable. Yet there is no doubt that his basic theses are now widespread among the Republican Party and conservative movement. The anti-establishment Trumpers have become the establishment. Anton’s essay was ahead of the curve, clearly articulating where the movement was heading under Trump.

“The Flight 93 Election” marked the moment in which the unstated premises of the conservative movement’s most radical wings came out into the open. That those premises are now widely shared goes to show what the movement has become — and why Anton, and many others like him, would later rationalize an attempt to overturn an American election. —Zack Beauchamp

The number that made the extinction crisis real

Scientists have known for decades that plants and animals worldwide are in peril — the tigers and frogs, wildflowers and beetles. But it wasn’t until recently that the true gravity of the problem, dubbed the biodiversity crisis, started sinking in with the public.

That shift happened largely thanks to a single number, published in 2019.

In spring of that year, an intergovernmental group of scientists dedicated to wildlife research, known as IPBES, released a report that found that roughly one million species of plants and animals are threatened with extinction. In other words, much of the world’s flora and fauna is at risk of disappearing for good.

“The health of ecosystems on which we and all other species depend is deteriorating more rapidly than ever,” Robert Watson, IPBES’s director for strategic development and former chair, said when the report was published. “We are eroding the very foundations of our economies, livelihoods, food security, health and quality of life worldwide.”

Extinction is far from the only important metric for measuring the health of the planet. Some scientists argue that it obscures other signs of biodiversity loss, such as shrinking wildlife populations, that typically occur long before a species goes extinct. Yet this number marked an evolution in the public’s understanding of biodiversity loss.

Extinction is an easy concept to grasp, and it’s visceral. And so the number — calculated based on estimates of the total number of species on Earth, and how threatened different groups of them are — hit especially hard. It not only raised awareness but inspired an unprecedented wave of conservation action.

World leaders have since used the IPBES number to justify major efforts to protect nature, including a historic global deal, agreed on by roughly 190 countries in 2022, to halt the decline of wildlife and ecosystems. It has also been cited by government hearings, state resolutions, corporate actions, and hundreds of scientific papers — not to mention countless news reports.

The concept of biodiversity loss is vague. This number made it concrete, and more urgent than ever. —Benji Jones

One state’s chilling ban was the beginning of the end for abortion access in America

In May 2019, Alabama banned almost all abortions. It was the most aggressive abortion law passed by a state in decades, and clearly flouted the protections set forth in Roe v. Wade. With no exceptions for rape or incest, the Alabama law hit a new level of restrictiveness amid a slate of state abortion bans passed in 2018 and 2019.
These measures marked a major change in anti-abortion strategy: After 10 years of pushing smaller restrictions aimed at closing clinics or requiring waiting periods for patients, abortion opponents had begun aiming squarely at the landmark 1973 Supreme Court decision establishing Americans’ right to terminate a pregnancy. Emboldened by Donald Trump’s presidency and two new conservative Supreme Court justices, these activists believed that the constitutional right to an abortion was finally vulnerable. They were correct.

The Alabama abortion ban was expressly designed as a challenge to Roe, with sponsor and Alabama state Rep. Terri Collins telling the Washington Post, “What I’m trying to do here is get this case in front of the Supreme Court so Roe v. Wade can be overturned.”

At first, Alabama’s ban, along with six-week bans in Georgia and elsewhere, was tied up in lower courts. In 2020, however, Justice Ruth Bader Ginsburg died and a third Trump nominee, Amy Coney Barrett, was confirmed, creating a rock-solid conservative majority on the Supreme Court. Less than two years later, the court held in Dobbs v. Jackson Women’s Health Organization that Roe “must be overruled.”

With the federal right to an abortion gone, Alabama’s ban went into effect. While it was once the most restrictive in the country, now more than a dozen other states have instituted near-total bans. Several more have imposed gestational limits at 15 weeks or earlier. Alabama was once again on the front lines of reproductive health restrictions in February of this year, after a judge ruled that frozen embryos used in IVF count as “children” under state law.

The landscape of reproductive health law in America has been utterly remade, and anti-abortion activists are far from finished. While the Alabama ban once seemed to many like radical legislation that would never survive the courts, it was in fact an early look at where the country was headed, and at the extreme circumstances under which millions of Americans are living today. —Anna North

Avengers: Endgame forced an entirely new era of storytelling

There will probably never be another movie like Avengers: Endgame, the 2019 film with a lead-up that wholly altered the movie industry and even the way stories are told. For over a decade, Marvel told one central story — Earth’s mightiest heroes working to defeat the great villain Thanos — through the MCU’s plethora of interlocking superhero blockbusters. In that era, each film, with its Easter eggs and credit scenes, built toward the culmination known as Endgame.

By signaling to its audience that all 23 movies mattered to the larger story, Marvel ensured each was a financial success, including a slew — Black Panther, Captain Marvel, Avengers: Infinity War — of billion-dollar worldwide box offices. Marvel’s grand design made Endgame the second-biggest movie in history.

It’s not surprising that seemingly everyone in Hollywood tried to replicate this triumph, often at the expense of creative achievement. Studio heads hoped that they could grow and cash in on properties with extant popularity, the way Marvel had with its comic book characters, and began investing in sequels and spinoffs of proven IP. Marvel’s parent company Disney developed countless new Star Wars projects and capitalized on its hits like Frozen and Moana by lining up continuations of those stories. The company created Disney+ not just to sell its existing properties, but to house its avalanche of spinoff TV shows.
Amazon’s take on Tolkien’s Lord of the Rings franchise and HBO’s interest in multiple Game of Thrones spinoffs could certainly be seen as trying to capture Marvel’s magic. Across the board, the people in charge of the purse strings became less interested in original ideas, as well as in mid-budget films; it was blockbuster or… bust.

Marvel changed what kind of stories were being told, but also how they were being told. Competitors became convinced that audiences wanted a connected cinematic universe, a format that mirrored comic book structure. Warner Bros., which owns the rights to Superman, Batman, and Wonder Woman, tried its hand at creating the DC superhero universe. Universal also played with the idea, teasing a linked movie world featuring classic monsters like Dracula. Neither of those fully panned out — some movies were critical flops, others didn’t find large audiences — signaling how difficult it is to execute what Marvel had. This tactic spread beyond the big studios too; indie darling A24, for example, has tapped into connected worlds and multiverses to tell expansive stories.

Marvel’s other innovations have also lodged themselves firmly in the pop culture firmament. Easter eggs — embedding “secrets” into art — are commonplace today (see: Swift, Taylor), and foster fan loyalty. Post-credits scenes have been added to all kinds of films.

Perhaps the real testament to Endgame’s singularity, though, is that it wasn’t only rivals who were unable to replicate what Marvel was able to do. None of the studio’s post-Endgame movies have had pre-Endgame box office results, and Marvel is no longer an unstoppable force. The studio’s cinematic universe looks as vulnerable as ever.

What Marvel didn’t realize was that Endgame was truly the end of the game. In its wake — for better or worse — we’re left with new ideas about what kind of stories we tell and why we tell them. —Alex Abad-Santos

The manifesto that changed China’s place in the world

Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era — a definitive manifesto known as Xi Jinping Thought for short — first appeared in China in 2017, laying out the ideology and priorities not just of China’s president Xi, but of the entire Chinese Communist party-state under him. Xi’s policies and consolidation of power didn’t start with the document itself; they were developed over time, starting even before Xi became president in 2012. But given the opacity of the Chinese political apparatus and increasing censorship, the compiled doctrine provided a historic window into how Xi sees the world and his own place in it.

And what he wants is a dominant China that harks back to its former greatness, with himself at the center. Xi Jinping Thought is, according to the document, the roadmap for “a Chinese solution for world peace and human progress, and of landmark significance in the history of the rejuvenation of the Chinese nation, the history of the development of Marxism, and the progress of human society.”

Rather than lay low and just use its economic growth and the decline of US influence to propel China to world power status — as the country’s former president Deng Xiaoping advocated — Xi Jinping Thought articulates a vision that harnesses military development and aggressive diplomacy as critical factors in China’s dominance.
That has translated to China deploying its increasing military might to assert dominance in the South China Sea and the Taiwan Strait and cracking down on pro-democracy protests in Hong Kong, in addition to further opening the economy and becoming a global investment powerhouse via the Belt and Road initiative. It has also meant taking significant geopolitical leadership positions — expanding the BRICS economic bloc, brokering a deal to restore relations between Iran and Saudi Arabia, and attempting to negotiate peace between Ukraine and Russia.

Arguably, China and the US would have ended up on a collision course with or without Xi Jinping Thought. But that tension really kicked into higher gear after the introduction of the doctrine at the start of Xi’s second term, according to Neil Thomas, Chinese politics fellow at the Asia Society — and after the US itself started to explicitly attempt to contain China’s rise. “Looking from Beijing, you start to see this big pushback,” starting with the Trump administration’s trade war and continuing under Biden. “That has fed into a much more securitized view of the world in China,” Thomas said, as well as the notion that geopolitics “was increasingly zero sum.”

Lately the aggressive policy seems to be faltering due to China’s economic troubles. Thomas says Xi has “become increasingly aware of the costs of war [to] diplomacy and has adjusted his tactics to pursue the same ambitious strategic goals but with a more sensible strategy that focuses more on making friends than making enemies.” That has not deterred the US military buildup in Asia to counter China, though diplomatic relations between the two countries have warmed somewhat in recent months. But cutting down the bluster doesn’t mean a change in priorities, just a change in tactics — for now, anyway. —Ellen Ioanes

The 2016 election made us realize we know nothing about class

Since Donald Trump eked out an electoral college victory in 2016 and reshaped the GOP, journalists, academics, and politicians have been trying to explain what, exactly, happened. One prevailing narrative is that Trump spoke directly to a forgotten voting bloc — poor and working-class white people, especially those living in rural America.

There’s a kernel of truth in that theory; Trump did indeed outperform previous Republican candidates among that demographic. But the stereotype of the average Trump voter that’s been born out of that narrative — the blue-collar union worker who hasn’t seen a meaningful raise in decades — is misleading at best. In fact, one of the lessons of 2016 was that there is no universal definition of what constitutes a “working-class” voter, and that class solidarity is still deeply misunderstood.

As it turns out, Trump’s biggest, most reliable voting bloc wasn’t the downtrodden white worker; it was largely white people from middle- and high-income households. When voters were divided up by income in various exit polls, Trump was only able to beat Joe Biden in one of three tiers: those making over $100,000 a year.

Trump’s win wasn’t a high-water mark for the role of class in elections like many thought, but rather for the media focus on the role of class in elections. Even still, we haven’t really figured out how to measure our class divides or even talk about them.
This lack of clarity has underscored a big problem in American politics: We have categories that are used as proxies for class — like someone’s college education level or union membership status — but they are imprecise substitutes that blur the bigger picture of the US electorate. As a result, analysis has exaggerated, or even distorted, reality, painting the Democratic Party, for example, as a political organization that’s growing more and more elitist and out of touch, and the GOP as the party that’s winning over the working class.

That’s despite the fact that Democrats have embraced the most ambitious anti-poverty agenda since Lyndon Johnson’s presidency — championing programs that are, by and large, supported by the poor — while Republicans continue to advocate for programs that almost exclusively benefit the wealthiest members of American society.

Trump’s victory may have turned people’s attention to class politics, but there’s still a long way to go before Americans get a clearer picture of how class will shape, or even determine, the election in November — and those in the years to come. —Abdallah Fayyad

A photograph of a 3-year-old refugee’s death altered global opinion on migrants

There are certain photographs that stop the world. The “Tank Man” of Tiananmen Square. A nine-year-old girl, on fire from napalm during the Vietnam War. A migrant woman in 1936 California. To this list we can add the image of the body of a three-year-old refugee boy, face down in the sand of Bodrum, Turkey, after drowning in the Mediterranean Sea on September 2, 2015.

Alan Kurdi was fleeing the Syrian civil war, one of an estimated million people who were seeking safe refuge in Europe. Minutes after Kurdi and his family left the Turkish city of Bodrum in the early hours, hoping to reach the Greek island of Kos and European territory, their overloaded rubber dinghy capsized. Kurdi, along with his brother Ghalib and mother Rehana, slipped beneath the waves.

That morning the Turkish photographer Nilüfer Demir came upon what she would later call a “children’s graveyard” on the beach. Alan’s body had washed up on the shore, his sneakers still on his tiny feet. Demir took the photograph.

Alan Kurdi was only one of an estimated 3,700 asylum seekers who drowned in the eastern Mediterranean that year, desperately trying to reach Europe. But Demir’s photograph, shared on social media by Peter Bouckaert of Human Rights Watch, spread to every corner of the world, where it was viewed by an estimated 20 million people. At a moment when Europeans seemed unsure whether to accept the unprecedented flow of asylum seekers, the image of a three-year-old left to die on the very edge of Europe galvanized political leaders, opening up a route for hundreds of thousands of refugees to find safety in the EU.

But the story doesn’t end there, for the compassion for asylum seekers generated by Kurdi’s image proved to have a short half-life. In the years since 2015, Europe has largely turned against asylum seekers, tightening its borders and closing off the Mediterranean. A little more than a year after Kurdi’s death, Donald Trump would win the White House, leading to a sharp reduction in the number of asylum seekers admitted into the US. That same year the UK voted for Brexit, in large part over concerns about immigration and asylum policy. The European Parliament elections held later this year are expected to cement policies that will make the EU even less welcoming to migrants and asylum seekers.
Yet with some 114 million people around the world forcibly displaced from their homes, nothing will stop the flow of refugees. We know there will be more Alan Kurdis in the future. And they will likely be met with less compassion than his photographed death generated. —Bryan Walsh

What Kim Kardashian wrought when she “broke the internet”

One of the most circulated images of the past decade is of a reality star’s rear end. In November 2014, Paper Magazine unveiled its winter issue starring Kim Kardashian with a photoshoot centered around her most notable asset and the ambitious goal of “break[ing] the internet.” On one cover, Kardashian creates a champagne fountain with her curvaceous body, unleashing foam into a glass perched on her Photoshopped backside. (The image is a recreation of controversial photographer Jean-Paul Goude’s 1976 “Carolina Beaumont, New York” photo, but drew even more fraught comparisons to Sarah Baartman, an enslaved South African woman who was made into a freak-show attraction in 19th-century Europe for her large buttocks.) The other cover, however — where Kardashian flashes her impossibly small waist and cartoonishly round butt — is what we mainly associate with the issue. She’s wearing nothing but pearls and a self-aware smile. What was once a source of mockery for Kardashian in tabloids had now become the culture’s most coveted possession.

Lest we forget, these photos arrived at the tail end of a year all about butts. White artists like Miley Cyrus, Iggy Azalea, and even Taylor Swift were incorporating twerking into their music videos and performances. Hit songs like Meghan Trainor’s “All About That Bass,” Jennifer Lopez’s “Booty,” and Nicki Minaj’s “Anaconda” were exalting curvy bodies. These moments contributed to the “slim-thick” physique becoming more accepted and desired outside Black and brown communities. (Twerking and voluptuous “video vixens” have long been features of rap videos.) However, it was Kardashian and, later, her sisters, who would come to represent the social complications this “trend” posed regarding the fetishization of Black bodies, cultural appropriation, and plastic surgery.

The American Society of Plastic Surgeons found a 90 percent increase in Brazilian butt lift procedures from 2015 to 2019. The surgery, where patients’ stomach fat is injected into their butts, has a sordid history embedded in Brazil’s eugenics movement and the hyper-sexualization of the mixed-race Black woman, known as the “mulata.” BBLs have also garnered headlines for their deadly health risks, mainly a result of fat embolisms. Nevertheless, it became hard not to notice the number of Instagram influencers who had apparently gotten the surgery or, at least, were digitally enhancing their butts.

Then, just as quickly, it seemed like the tide had turned once again for the “ideal” female body. In 2022, a controversial New York Post article declared that “heroin chic” was back in. Social media observers also began noticing that Kardashian was suddenly a lot smaller. At the same time, the diabetes drug Ozempic emerged as Hollywood’s latest weight-loss craze. Thus, the media eagerly questioned whether the BBL era was “over,” despite the surgery’s persisting popularity.

The question illuminated the ways Black people — their culture, their aesthetics, their literal bodies — are objectified and easily discarded under the white gaze.
As Rachel Rabbit White wrote, "to celebrate the supposed 'end of the BBL' is synonymous with the desire to kill the ways in which Black women, especially Black trans women, and especially Black trans sex workers, have shaped the culture." Writer Ata-Owaji Victor pondered where the rejection of this trend leaves "Black women and people who naturally have the 'BBL' body." The answer is seemingly: in the same position Black women have always been put — useful until they're not. —Kyndall Cunningham
The sweeping strike that put power back in teachers' hands
In 2018, roughly 20,000 educators went on strike in West Virginia, protesting low pay and high health care costs. Their historic nine-day labor stoppage led to a 5 percent pay increase for teachers and school support staff. With organizers galvanized by the victory in West Virginia, labor actions in states like Oklahoma, Kentucky, North Carolina, Colorado, and Arizona soon followed. According to federal statistics, more than 375,000 education workers engaged in work stoppages in 2018, bringing the total number of strikers that year to 485,000 — the largest since 1986. The uprising sparked national attention and enthusiasm both about the future of school politics and the possibility of resurging worker activism more broadly. It went by the shorthand "Red for Ed" — a reference to the red clothing educators and their allies wore every time they took to the streets. The momentum continued the next year: In 2019, more than half of all workers in the US who went on strike came from the education sector, with new teacher actions spreading to states like Arkansas, Indiana, and Illinois. To be sure, the movement didn't create lasting change in all aspects of education policy. Average teacher pay has stayed flat for decades, and fewer people are entering the teaching profession. Union membership writ large has continued to decline. And despite educators' pushback against school privatization, conservatives managed to push through new expansions of public subsidies for private and religious schools following the pandemic. But the teacher uprising earned the support of parents and the public, who reported in surveys strong backing for the educators' organizing and for increased teacher pay. This strengthened support likely helped explain why parents largely stood by their kids' teachers during the tough months of the pandemic, when educators again banded together for stronger mitigation standards to reduce the spread of Covid-19. During the Obama era, a powerful bipartisan coalition for education reform spent much of its time attacking educators and their unions — a scapegoat for public education's problems that most people ultimately did not buy. Red for Ed changed the national political narrative around teachers, and in many ways was a final nail in the coffin for that movement. —Rachel Cohen
Malaria in Maryland (and Florida, and Texas, and Arkansas) showed that the future of climate change is now
Last year, for the first time in two decades, mosquitoes transmitted malaria on American soil. The geographic range was unprecedented, with cases in Florida, Texas, Maryland, and Arkansas. 2023 was the hottest year on record since 1850, and for the mosquitoes that spread malaria, heat is habitat; the US cases occurred amid an uptick in malaria infections on a global scale.
Scientists have been warning us for years that without more public health resources, climate change was bound to push infectious threats into environments and populations unprepared for their consequences. Malaria’s reappearance in the US signaled to many that the future has arrived.  Wild weather turns previously inhospitable areas into ones newly suitable for lots of so-called vector insects to live. It’s not just different species of mosquitoes whose migration is changing disease trends. Ticks — different species of which spread diseases like Lyme, babesiosis, and ehrlichiosis — have progressively moved into new parts of the US in recent years as they’ve warmed. Changing weather patterns also cause many of these insects to reproduce in higher numbers in their usual habitats.  Insect habitats aren’t the only ones affected by climate change. Weather is pushing animals that serve as disease reservoirs into new environments, which can lead to more “spillover” events where germs get spread from one species to another. That’s thought to explain, at least in part, the fatal borealpox infection transmitted to an Alaska man by a vole bite last year; it’s also a concern when it comes to rabies transmission. Extreme and unseasonable heat waves are also turning a progressively large part of the US into newly comfortable digs for fungi — including molds that cause severe lung and other infections in healthy people. Warming fresh and sea waters more frequently become home to noxious blooms of toxic algae and bacteria. What’s more, the heat is kicking pathogens’ evolution into overdrive: The microorganisms that can survive it are more likely than ever to also survive in our bodies, making them more likely to cause disease — and harder to fight. As with many health risks, the consequences of climate-related infectious threats land hardest on the people with the fewest resources — and are almost incomparably worse in lower-resource countries than inside the US.  There’s a lot we still don’t understand about how climate change interacts with communicable diseases, including malaria. Some of the shifts caused by severe weather may reduce certain risks even as they amplify others. And disentangling the effects of severe weather from changes in policy, behavior, and human immunity, especially during and after a pandemic, is a formidable task. Still, the comeback — or debut — of peculiar pathogens on American shores makes understanding these links viscerally urgent. Our warming planet isn’t going to wait until we’ve reformed and funded our public health system, seamlessly integrated disease surveillance into health care, renewed public trust in vaccines, and realigned incentives for novel antibiotic production before the fallout of climate change quite literally bites us in the ass.  —Keren Landman Letting language models learn like children tipped the AI revolution Imagine you have a little kid. You want to teach them all about the world. So you decide to strap them to a chair all day, every day, and force them to stare at endless pictures of objects while you say, “That’s a banana, that’s a car, that’s a spaceship, that’s…” That’s not (I hope!) how you would actually teach a kid, right? And yet it’s the equivalent of how researchers initially tried to teach AI to understand the world. Until a few years ago, researchers were training AIs using a method called “supervised learning.” That’s where you feed the AI carefully labeled datasets. 
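To make that labeled-data setup concrete, here is a minimal, illustrative sketch of supervised learning: a toy classifier nudged toward the right answer on a handful of hand-labeled examples. The feature names, numbers, and update rule are simplifications invented for this example, not how any production model is actually built.

```python
# A toy "supervised learning" loop: every example arrives with a human-written
# label, and the model learns only from those labeled pairs. The features,
# labels, and update rule are illustrative, not any real lab's system.

import random

# Hand-labeled data: ([yellowness, metallic-ness], label)
labeled_data = [
    ([0.9, 0.1], "banana"),
    ([0.8, 0.2], "banana"),
    ([0.1, 0.9], "spaceship"),
    ([0.2, 0.8], "spaceship"),
]

def predict(weights, features):
    # Score the features; call it a banana if the score is positive.
    score = sum(w * x for w, x in zip(weights, features))
    return "banana" if score > 0 else "spaceship"

# Start with random weights, then nudge them whenever a labeled example
# is classified wrong (the classic perceptron update).
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
for _ in range(100):
    for features, label in labeled_data:
        if predict(weights, features) != label:
            direction = 1 if label == "banana" else -1
            weights = [w + direction * x for w, x in zip(weights, features)]

print(predict(weights, [0.85, 0.15]))  # a new yellow, non-metallic object -> "banana"
```

The point of the sketch is the bottleneck it exposes: every single training example above needed a human to supply the label first, which is exactly what made this approach so labor-intensive at scale.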
It actually yielded some decent results, like teaching AI models to tell apart a banana and a spaceship. But it’s very labor-intensive because humans have to label every bit of data. Then some researchers tried a different method: “unsupervised learning,” where the AI learns more like a real child does, by exploring the world freely, vacuuming up tons of unlabeled data, and gradually picking out the patterns in it. It figures out that bananas are those yellow oblong-shaped things without ever explicitly being told that. Turns out this leads to much more powerful AI models, like OpenAI’s ChatGPT and Google’s Gemini, which can explain complicated topics better and use language more naturally than the older, clunkier models. Of course, AIs are not actually kids, and there’s a lot we still don’t understand about what’s happening inside the models. Yet when these companies realized that the key to unlocking progress wasn’t spoon-feeding AI every bit of information but letting it play around until it figured things out, they ushered in the AI revolution we’re seeing today. Alison Gopnik, a developmental psychologist at Berkeley, was an early voice arguing that studying kids can give us useful hints about how to build intelligent machines. She’s compared children and AIs — for instance, by putting four-year-olds and AIs in the same online environments to see how each is able to learn — and found that the kids make much better inferences. Others are catching on. A team at NYU released a study this year in which a baby wore a helmet camera, and whatever the baby saw and heard provided the training data for an AI model. From a total of just 61 hours of data, the AI learned how to match words to the objects they refer to — the word “banana,” say, to that yellow oblong fruit. Researchers are pinpointing some of the qualities that make kids such amazing learning machines: they’re embodied, they’re curious, and they’re able to interact socially with others. Perhaps that’s why the researchers are now trying to create embodied multimodal AIs that can take in not just text, but sights, sounds, touch, and movement. They are, maybe without realizing it, embarking on an effort to replicate what evolution already did in making babies. —Sigal Samuel  The drug that supercharged a crisis also spelled a synthetic destiny  When doctors began liberally prescribing opium and morphine to Civil War veterans for everything from amputations to diarrhea, they inadvertently kicked off the opioid epidemic over 150 years ago. It wasn’t until pharmaceutical companies started pushing prescription opioids as painkillers in the 1990s that the problem escalated to a national emergency. By 2015, pills and heroin had already made the opioid epidemic the deadliest drug crisis in US history. Then came the fentanyl boom.  The synthetic and extremely potent opioid, introduced in the 1960s, has been used as pain medication for decades. In 2016, it became responsible for the majority of overdose deaths. It pushed the number of US drug overdoses above 100,000 in 2022, more than doubling 2015’s death toll. Because of fentanyl’s potency, it takes much less to overdose: A fatal dose fits on the tip of a sharpened pencil. Fentanyl put an already dire crisis into hyperdrive. But its spread also marked a deadlier, more prolific era of drugs where synthetics reign supreme.  Fentanyl’s rise hinges on its synthetic nature. It can be made from just a few chemicals, while heroin and opium require the slow cultivation of poppy flowers. 
Compared to Oxycodone — considered a “semi-synthetic” because its production involves chemically modifying natural opioids rather than brewing them from scratch — fentanyl is roughly 60 times more potent.  Fentanyl is also up to 50 times stronger than heroin, which makes smuggling it much easier since doses require far less of the actual drug. In the mid-2010’s, Mexican cartels trafficking opioids began “cutting” drugs with fentanyl to save money, since it provided a similar high with less volume. In some cities where heroin use was widespread, suppliers have altogether replaced it with fentanyl, leaving users little choice but to switch.  Before fentanyl, overdose deaths were concentrated among opioid users. But fentanyl can be found as a filler in cocaine and MDMA supplies, spreading the overdose crisis into new terrain. Variations of fentanyl — of which there are now more than 1,400 — are already making their way into the illicit drug supply. Take carfentanil, which was developed to sedate large animals like elephants, but is now showing up in thousands of human overdoses. Carfentanil is estimated to be 100 times more potent than fentanyl itself. Pure synthetics like fentanyl are where drug development is headed. Despite progress along many measurable dimensions, life in the 21st century will remain painful and unhealthy and full of ways to kill us. The incentive to continue developing legions of new synthetic drugs will stay strong as ever, which will continue unearthing cheaper and easier to make substances. As those make their way to patients, the risk of adding novel, more powerful drugs to the illicit drug supply will follow.  Rising awareness of fentanyl’s harms has driven some progress, from reducing production and investing in harm reduction strategies like testing strips to combating
vox.com
The overlooked conflict that altered the nature of war in the 21st century
On the second day of the 2020 Armenia-Azerbaijan war, the Armenian military posted a video of one of its surface-to-air missile systems shooting down a surprising enemy aircraft: an Antonov AN-2 biplane. As it turned out, it wasn't a sign of desperation on Azerbaijan's part that its military was flying a plane first produced in the Soviet Union in 1947, and today used mostly for crop-dusting. Azerbaijan had converted several AN-2s into unmanned aircraft and used them as so-called bait drones. After the Armenians shot down the planes, revealing the positions of their anti-aircraft systems, their forces came under attack from more modern drones. It seems strangely fitting that what was also known as the Second Nagorno-Karabakh War, a conflict that has been called "the first war won primarily with unmanned systems" and even the "first postmodern conflict," could also end up being the last one in which biplanes played a significant role. The conflict between these two former Soviet republics in the Caucasus, on the border between Europe and Asia, was the culmination of tensions that had been building for more than 25 years and intercommunal dynamics that were far older than that. It was in some sense a throwback to a traditional type of war — two nation-state armies fighting over disputed territory — that was far more prevalent in previous centuries. But it was also a hypermodern war where unmanned systems played an unprecedented role on the battlefield, and social media played an unprecedented role off it. Though it got relatively little coverage in the international media at the time — coming as it did at the height of the Covid-19 pandemic, a wave of global protests, and a bitter US presidential election campaign — it was in some ways a preview of the much larger war that would break out in Ukraine just two years later, and may yet be seen as the harbinger of a new and potentially devastating era of international conflict.
A frozen conflict heats up
The Armenia-Azerbaijan dispute is one of the so-called frozen conflicts left over from the collapse of the Soviet Union. Nagorno-Karabakh, often referred to as Artsakh by Armenians, is an ethnically Armenian region within the borders of neighboring Azerbaijan. Violence in the region erupted in the 1980s when authorities in Nagorno-Karabakh demanded to be transferred to Armenia. (At the time, all were part of the Soviet Union.) After the Soviet collapse, when both Armenia and Azerbaijan became independent, full-scale war broke out, resulting in more than 30,000 deaths and the displacement of hundreds of thousands of people, mainly Azeris. The first war ended with a Russian-brokered ceasefire in 1994 that left Nagorno-Karabakh as a semi-independent — but internationally unrecognized — territory surrounded by Azerbaijan, and Armenia retained control of some of the nearby areas. Effectively, it was an Armenian victory. In the years that followed, the ceasefire was frequently violated by both sides and the underlying issues never resolved. Then on September 27, 2020, Azerbaijan's forces launched a rapid dawn offensive, beginning 44 days of war. This time, Azerbaijan won a resounding victory, retaking all of the Armenian-held territory around Nagorno-Karabakh as well as about a third of the territory itself.
At least 6,500 people were killed before the two sides agreed to a Russian-monitored ceasefire and only a winding mountain road was left to connect Armenia and Karabakh. (Though Russia, the preeminent military power in the region, is a traditional ally of Armenia, it has been hedging its bets more in recent years, particularly since the 2018 protests that brought a Western-inclined, democratic government to power in Armenia.) Finally, in 2023 — with Russia distracted and bogged down by its war in Ukraine — Azerbaijan launched a blockade of Nagorno-Karabakh, eventually seizing the region and causing the majority of its Armenian population to flee. The Republic of Nagorno-Karabakh was dissolved in 2024.
A glimpse of the future of war
What made Azerbaijan's rapid victory possible? One major factor was Turkey's strong military support for Azerbaijan, a fellow Muslim, Turkic-speaking nation that Turkey saw as a key ally in extending its influence into the Caucasus. Another related factor was Azerbaijan's deployment of unmanned drones, particularly the Turkish-made Bayraktar TB-2 attack drone, as well as several models of exploding drones purchased from Israel. These weapons proved stunningly effective at destroying the tanks and air defense systems of the Armenian and Nagorno-Karabakh forces. "The Armenians and Nagorno-Karabakh had their forces dug in in advantageous positions, and they might have won if this war had unfolded the way it did in 1994, but it didn't," Sam Bendett, a senior fellow at the Center for a New American Security and expert on drone warfare, told Vox. "The Azeris understood that they couldn't dislodge the Armenians in any other way than to send drones rather than piloted aircraft." As much as this war was fought under the global radar, these tactics caused a tectonic shift in the prevailing perception of drones as a weapon. From the beginning of the 20-year-long US war on terrorism, unmanned aircraft played an important role, but they were primarily multimillion-dollar machines like the Predator and Reaper, employed mostly for remote strikes on specific targets away from declared battlefields. The Nagorno-Karabakh war showed how large numbers of simple, replaceable drones could turn the tide on the battlefield in a conventional war. As the military analyst Michael Kofman wrote at the time, "Drones are relatively cheap, and this military technology is diffusing much faster than cost-effective air defense or electronic warfare suitable to countering them." Lessons learned in the Nagorno-Karabakh conflict were employed in the Ukraine war, when Ukrainian forces made effective use of cheap drones — including, once again, the Turkish TB-2 — to negate the invading Russians' advantages in mass and firepower. Over time, the evolving use of masses of cheap drones for strikes and surveillance by both sides in Ukraine has made traditional maneuver warfare vastly more difficult, another dynamic predicted by the conflict in Nagorno-Karabakh. Two years into the war, drones are one major reason why the front line often appears stuck in place. Another way Nagorno-Karabakh seemed to be a harbinger of conflicts to come was in the role of social media in shaping global perceptions of the war.
As the media scholar Katy Pearce wrote in 2020, “Armenians and Azerbaijanis in country and those who have settled elsewhere have long battled on social media, and this escalated during the war … For Armenians and Azerbaijanis, whether still in the region or part of the wider diaspora, social media provided a way to participate, and feel engaged.”  As with Ukraine two years later, this was a war with an extraordinary amount of battlefield footage that was available to the public, and where that footage was captured by the participants themselves via drone camera or smartphone, rather than conventional (and more impartial) war reporters. This allowed both sides to shape public perceptions of what was happening on the battlefield, a phenomenon we’re seeing again with the Israel-Hamas war and the way social media images have driven coverage of that conflict. Journalists attempting to write objectively about the conflict often came under attack online from partisans who objected to what they saw as biased or unduly negative coverage.  For Armenia, this may have backfired. When Prime Minister Nikol Pashinyan finally signed the ceasefire deal, he faced mass protests and accusations that he had sold the country, in part because many Armenians hadn’t actually believed they were losing the war — until they lost the war.  A new age of conquest?  Azerbaijan’s offensive was not a straightforward land grab. Nagorno-Karabakh’s independence was not recognized by any other country on earth — technically, not even Armenia — and as far as international law was concerned, Armenian troops were occupying part of Azerbaijan’s territory. There are many such unresolved border disputes and unrecognized semi-sovereign territories around the world today.  Still, as Thomas De Waal, a senior fellow at the Carnegie Endowment for International Peace and author of one of the definitive books on the conflict, told Vox, “Azerbaijan’s war of 2020 broke a pattern in European security where the assumption was that all these unresolved conflicts across Europe had to be resolved peacefully. Azerbaijan rewrote the rulebook, used force, and as far as it was concerned, got away with it.”  De Waal suggests the relatively muted international reaction to the war — the US called for a ceasefire but did not sanction Azerbaijan despite calls from some members of Congress to do so — may have been one of a number of factors that led Russia’s government to believe, two years later, that “there was a more permissive international environment for the use of force and there wasn’t going to be as much pushback [to invading Ukraine] as there might have been a decade before.” Was this brief conflict in the Caucasus a sign of a larger shift? In recent decades, wars of territorial conquest have been rare, and successful ones even rarer. The best-known examples — North Korea’s attempt to conquer South Korea in 1950, or Saddam Hussein’s invasion of Kuwait in 1990 — have prompted massive international interventions to protect international borders. Wars within states, sometimes drawing in international intervention, have been more common.  “For a very long time after the Second World War, there was a pretty widespread understanding on how the use of force is not a legitimate means of resolving territorial disputes,” Nareg Seferian, a US-based Armenian political analyst and writer, told Vox. 
“I don’t think many people realize that until at least the First World War, if not the Second, that was just a really normal thing.” The bloody and ongoing international conflict in Ukraine is in many ways quite rare. If that starts to change, a month-and-a-half-long war in the Caucasus in 2020 could eventually be remembered as a pivotal turning point — not just in how wars are fought, but why.
vox.com
The “racial reckoning” of 2020 set off an entirely new kind of backlash
It took less than a day for the world to start rallying for George Floyd in late May 2020. The events that led to Floyd’s murder unfolded over hours, but a viral 10-minute video recording of the deadly encounter with Minneapolis police officer Derek Chauvin was enough to send floods of people nationwide into the streets for months.  In the weeks after Floyd’s killing, the number of Americans who said they believe racial discrimination is a big problem and that they support the Black Lives Matter movement spiked. As books about racial injustice flew off of bookstore shelves, corporate leaders, politicians, and celebrities pledged to fight racism. The events of 2020 disturbed America’s collective conscience, and the movement for justice captivated millions. Until it didn’t.   In retrospect, there were signs of brewing right-wing resistance all along. While many peacefully protested, others called for their defeat. Arkansas Republican Sen. Tom Cotton demanded that the US military be brought in to fight “insurrectionists, anarchists, rioters, and looters.” As police officers used tear gas and rubber bullets to disperse crowds across the country, President Donald Trump deployed the National Guard to “dominate the streets” and defend “life and property,” sending thousands of troops and federal law enforcement officers to control protesters in Washington, DC; Portland, Oregon; and other cities.  More from This Changed Everything The last 10 years, explained Serial transformed true crime — and the way we think about criminal justice The overlooked conflict that altered the nature of war in the 21st century Some Americans who wanted to stamp out the unrest took it upon themselves to practice vigilantism. One of them, Kyle Rittenhouse, fatally shot two unarmed men and wounded another when he brought an AR-15-style rifle to protests in Kenosha, Wisconsin. (Rittenhouse was later acquitted of all homicide charges.) Though the mass mobilization of 2020 brought hope, it’s clear today that it also marked a turning point for backlash as the mirage of progress morphed into seemingly impenetrable resistance. Historically, backlash has embodied a white rejection of racial progress. Over the past few years, the GOP has built on that precedent and expanded its reach.  The right watched progressives rally for change and immediately fought back with the “Big Lie” of a stolen election. In many of the states that Biden flipped in 2020, Republicans rushed to ban ballot drop boxes, absentee ballots, and mobile voting units, the methods that allowed more people to vote. Since then, we’ve seen the passage of dozens of regressive laws, including anti-protest laws, anti-LGBTQ laws, and anti-diversity, equity, and inclusion laws. In state after state, these bans were coupled with incursions against reproductive rights, as some conservatives announced plans to take over every American institution from the courts to the schools to root out liberalism and progress. “[The backlash] came like a multi-front war on democracy, a multi-front war on liberalism, a multi-front war on a multicultural democracy,” said historian Carol Anderson, who has examined backlash in books such as White Rage and We Are Not Yet Equal. “It knocked some folks back on their heels.”  A brief history of backlash in America Backlash politics have long defined the country. The term “backlash” gained popularity in politics after John F. Kennedy proposed the Civil Rights Act of 1963. 
“Transferred to the world of politics, the white backlash aptly describes the resentment of many white Americans to the speed of the great Negro revolution, which has been gathering momentum since the first rash of sit-ins in early 1960,” said a 1964 article in Look magazine. The phenomenon, however, goes back to Reconstruction beginning in the 1860s, when white lawmakers claimed that equality for freed Black Americans threatened them, according to Larry B. Glickman, a historian at Cornell University who is writing a book about backlash since Reconstruction. Lawmakers instituted literacy tests and taxes at the polls while white agitators used violence and intimidation, all to prevent Black Americans from participating as full citizens.   “There’s a backlash impulse in American politics,” Glickman said. “I think 2020 is important because it gets at another part of backlash, which is the fear that social movements for equality and justice might set off a stronger counter-reaction.” The protests of 2020 did. And though race is still at the core of the post-George Floyd backlash, many Republicans have gone to new lengths to conceal this element.  “One of the things that the civil rights movement accomplished was to make being overtly racist untenable,” said Anderson. “Today they say, ‘I can do racist stuff, but don’t call me racist.’” For Anderson, backlash is about instituting state-level policies that undermine African Americans’ advancement toward their citizenship rights. By early 2021, alongside the effort to “stop the steal,” legislation that would limit or block voting access, give police protection, and control the teaching of concepts such as racial injustice began spreading across Republican-controlled state legislatures — all in the name of protecting America.  “They cover [voter suppression] with the fig leaf of election integrity, with the fig leaf of trying to protect democracy, and with the fig leaf of stopping massive rampant voter fraud,” Anderson said. And, she said, laws banning the teaching of history get covered “with the fig leaf of stopping indoctrination.” That coordinated legislation was a direct response to potential racial gains for Black Americans and other marginalized groups. “After the death of George Floyd in 2020, it seemed like all of our institutions suddenly shifted overnight,” conservative activist Christopher Rufo said in a 2022 interview. Rufo’s answer was to release a series of reports about diversity training programs in the federal government and critical race theory, which, he argued, “set off a massive response, or really, revolt amongst parents nationwide.”  “Race is key,” said Glickman. “When the term backlash was popularized, it was often called the ‘white backlash.’ It was very clear that it was understood as resentment. The campaign for Black equality was moving too fast and going too far. I still think that’s at the root of many backlash movements.”  The new era of backlash is grievance-driven  That racial resentment has since taken on a particularly acrid temperament since Floyd’s death. At the 2023 Conservative Political Action Conference, Trump, facing a litany of criminal and civil charges, stood on stage and told the audience, “I am your warrior. I am your justice. And for those who have been wronged and betrayed, I am your retribution.”  Trump’s words summarized the political discourse that has spread since the killing of George Floyd and highlighted the absence of a formal Republican policy agenda. 
“[What he said was] not policy,” said historian John Huntington, author of the book Far Right Vanguard: The Radical Roots of Modern Conservatism. “It was just vengeance for some sort of perceived wrongs.” He added, “policy has taken a backseat to cultural grievances.” What Huntington calls out as “endless harangues against very nebulous topics like critical race theory or wokeness or whatever the current catchphrase is right now” is an important marker of this new era. “A key element of the current backlash we’re seeing is a politics of grievance,” he says. “‘I have been wronged somehow by the liberals or whoever, and Trump is going to help me get even with these people that I don’t like.’” Glickman calls this backlash tactic an “inversion” or “elite victimization”: “It’s a reversal that happens in backlash language where privileged white people take the historical position of oppressed people — often African Americans but sometimes other oppressed groups — and they speak from that vantage point.” To be sure, Republicans have passed dozens of laws through state legislatures to do everything from restricting voting to banning trans athletes from participating in sports. But for Huntington, these reactionary laws don’t amount to legitimate policy. “It’s very difficult to convince people to build a society rather than trying to tear down something that’s already existing,” he said. “Critiquing is easy. Building is hard.” Nationally, Republicans only passed 27 laws despite holding 724 votes in 2023. Though other backlash movements in history, such as the response to desegregation or the Confederacy, have involved violence, today’s backlash also features a greater embrace of it from the Republican Party as a whole, according to Huntington. “But nowadays, the GOP, having moored themselves to Trump, have very much kind of implicitly embraced this politics of violence,” Huntington said. The January 6 insurrection, and how Trump and other Republicans have expressed a desire to pardon insurrectionists, is emblematic of how the party has aligned itself with a much more radical idea of how to gain and keep power. “If you’re embracing the politics of violence in order to gain power,” said Huntington, “that illustrates a dark turn in American politics.” Still, no backlash is forever. The events of 2020 triggered a particularly virulent right-wing response, but many such movements have failed, including various stages of this one. “Backlashes have been very effective at mobilizing opposition to movements for equality, but I don’t think that they’re necessarily successful,” said Glickman. “I would say the jury’s still out.” They “are often seen as automatic and inevitable and sort of mechanistic and unstoppable. But I don’t think that,” he added. “Backlashes are political movements made up of human beings who were asserting their agency, and sometimes they’re successful and sometimes they’re not successful. I think we’ve blown up the backlash sometimes as this all-powerful phenomenon.” This current backlash certainly isn’t achieving all of its goals. Trump lost in 2020, and the decision to overturn Roe v. Wade has prompted a backlash to the backlash, with voters in several states choosing to protect abortion rights through constitutional amendments. With all their force and fire, backlashes can fail to anticipate pushback from people committed to democratic values.
“The mobilization is really quiet,” Anderson said. “We are so focused on the flames that we miss the kindling … we miss the folks who are quietly, doggedly going about the work of democracy.” 
vox.com
Serial transformed true crime — and the way we think about criminal justice
In 1966, Truman Capote’s In Cold Blood all but created the true crime genre. Nearly 50 years later, radio journalist Sarah Koenig decided the case of a Baltimore high school student, Adnan Syed, convicted of murdering his teenage ex-girlfriend Hae Min Lee, needed a second look.  With its high production values, conversational style, and a storyline unfolding in real time across episodes, 2014’s Serial fueled a new wave of interest in true crime and transformed podcasting. Its first season — with its piano-plinking earworm of an opening theme and endless parodies — was once the most downloaded podcast in the world at 300 million, a number that now feels almost quaint thanks to the influence Serial has had on the entire medium.  But Serial’s most consequential effect was on the criminal justice system itself. More from This Changed Everything The last 10 years, explained The “racial reckoning” of 2020 set off an entirely new kind of backlash The internet peaked with “the dress,” and then it unraveled Before the landmark series, the main way we received our pop culture narratives about crime came through police procedurals like Law & Order and high-profile investigations like that of O.J. Simpson or JonBenét Ramsey, where the accompanying media circus often overshadowed the facts; serious deconstruction of individual cases was relegated to niche internet forums or the occasional prestige documentary. Even in more routine circumstances, police departments typically controlled the stories around criminal investigations, choosing what the public got to know and when they knew it. This grip on information often meant the media had no choice but to parrot the police narrative of a case — a framing mirrored by the onscreen “copaganda” of procedurals and other scripted shows. Serial changed that by ushering in an age of increased scrutiny over the narratives we’re fed about policing and by making millions of listeners more fundamentally aware of the limits and flaws of the justice system. From that awareness has come serious action that arguably helped free Serial’s own subject. Much has been made of the ways in which the true crime podcasting boom may have normalized the more negative stereotypes of the genre: obsessed fans harassing suspects and thinking they know better than authorities, or boozed-up white women joking about murder as millions of fans laugh along without regard for victims or survivors. To be sure, thorny complications can arise, but little attention has been given to the positive outcomes of this kind of collectivism when it’s applied to an unjust system. True crime podcasts, starting with Serial and the high-profile podcasts that followed, “have offered a critical lens through which to scrutinize the procedures and decision-making in the criminal justice system,” Kent Bausman, a criminologist and sociology professor at Maryville University, told Vox in an email. “They have enlightened the public consciousness about the convoluted machinations of the system and revealed with great clarity the human experience of miscarriages of justice.” Bausman noted that true crime podcasts frequently provide insight into “everything from the production and use of false confessions and the inherent problems that exist regarding the use of forensic evidence in the courtroom.” Bausman pointed out that organizations like the Innocence Project have existed for decades, yet it’s only recently that they’ve become better known as a result of the true crime explosion. 
We’ve gained a broader cultural awareness of the factors that lead to the wrongful convictions that the Innocence Project and its peers help overturn — things like false confessions, police misconduct, bad forensics, and false testimony at trial. Additionally, terms like “Missing White Woman Syndrome” and “Missing and Murdered Indigenous Women” have sprung up to encompass an entire range of police inadequacies when it comes to the racial and socioeconomic gaps between “perfect” victims and forgotten ones. True crime has “revealed with great clarity the human experience of miscarriages of justice” True crime fans are now loud advocates for thorough investigations. They’re more knowledgeable about shady criminal justice techniques, from entrapment and “Mr. Big” operations to Brady violations and the Reid technique. There’s an increased familiarity with nonprofits that help law enforcement solve cases, from Texas EquuSearch to the DNA Doe project, as well as those that seek criminal justice reform, like End the Backlog. Several of these organizations build upon what is perhaps the biggest recent breakthrough in criminal investigations: forensic genealogy. The use of familial DNA to catch culprits has revolutionized crime-solving amid the true crime boom. In 2018, when forensic genealogy led to the capture of the Golden State Killer, the true crime world greeted the announcement like sports fans might celebrate winning the World Series — a comparison that captures the complicated nature of a genre that makes entertainment out of tragedy. Wider concerns about genealogical privacy and private companies sharing user information quickly followed. That collectivity and the sense that a “true crime community” exists also largely came about because of Serial.  After Serial, millions of people became amateur detectives. Legions of fans have made themselves an invaluable part of the crime-solving process via social media, as well as longtime true crime forums like Websleuths. They’ve pored over cases until they’ve become nigh experts themselves, drummed up tips to law enforcement, generated new interest in cold cases, and often all but led authorities by the nose to conclusions they should have reached long ago; in one famous case, this latter scenario played out before the ears of millions of listeners after an amateur sleuth made his own podcast to draw attention to the Kristin Smart case and forced his local cops to pay attention.  Journalist-led true crime podcasts have also had a direct impact on the cases they’ve investigated in the intervening years — like In the Dark, which helped free its season two subject, Curtis Flowers, from death row in 2019. In 2022, the runaway hit Murdaugh Murders helped catalyze the re-investigation of the death of Stephen Smith, which is widely believed to be connected to the byzantine crimes of Alex Murdaugh.  Not all criminal investigations benefit from millions of newly minted amateur sleuths diving into the fray. Bausman warns it can in fact “commodify both offenders and victims for the public’s amusement.” He also pointed out that despite the renewed attention true crime podcasts can bring to stagnant investigations, the clearance rates for homicide cold cases have not increased due to this influence.  Still, Serial continues to have an outsized impact on our cultural understanding of the criminal justice system, and this sea change ultimately came full circle back to Adnan Syed.  
In 2022, Syed’s hometown of Baltimore revisited dozens of convictions as part of a larger overall effort by Maryland to atone for decades of draconian sentences handed out to juvenile and young offenders, many of whom spent their entire adult lives in prison with no opportunity for parole. This is just one example of how prosecutorial divisions across the country are reexamining wrongful and unfair convictions through what are known as conviction integrity units and sentencing review units. These programs are part of the normalization of criminal justice reform that has come amid an enormous shift in attitudes about prosecutions in the decade since Serial aired. It was a dogged pursuit of local criminal justice reform that allowed Syed to finally walk free, though the flashier “whodunit” aspects of his case that initially attracted Koenig also delivered a twist. Syed’s case review uncovered new evidence, including two new suspects, that cast reasonable doubt on his trial and conviction. Prosecutors dropped all charges against Syed just days later; they later walked this back on a technicality. Those nuances also reflect a post-Serial shift in public advocacy and focus: on the rights of victims and their families in cases like this one. Although his case is still in limbo, Syed remains out of jail, his conviction stayed until Hae Min Lee’s family’s concerns can be resolved. It’s the kind of messy, satisfyingly unsatisfying conclusion that befits both Serial itself and the evolved criminal justice era we’re in — one in which answers rarely come easily, but for perhaps the first time, all of us are looking.
vox.com
10 big things we think will happen in the next 10 years
Every year, the staff at Vox’s Future Perfect section makes predictions for events we think are likely to happen (or not happen) in the year ahead. Big deal, right? Reporters and pundits make predictions all the time. What we think sets our efforts apart from the usual journalistic prognostication is that we put a numerical probability on each of our predictions, which we use to signal how confident we are in what we’re forecasting. That matters, because numbers are more meaningful and more comparable than words like “definitely,” “maybe,” or “probably.” More from This Changed Everything The last 10 years, explained The overlooked conflict that altered the nature of war in the 21st century The internet peaked with “the dress,” and then it unraveled Usually when we make our forward-looking predictions at the end of the year, we check back the following year to see how we’ve done. That’s partly because we think holding ourselves accountable is just good epistemic practice. (If we’re wrong, we should acknowledge it.) But we also do it because the best way to become a superforecaster is through forecasting, checking the results, and forecasting again.  This time we’re doing something a little different. To mark our 10th anniversary, Vox’s staff dug into the turning points of the last decade — those moments in the news when history shifted course. Future Perfect took the opportunity to try to proactively predict what might be some of the meaningful turning points of the next decade. So come back for our 20th anniversary and see how we did. (Unless, of course, prediction seven comes true, in which case we may all have other things to deal with.) —Bryan Walsh Less than 7 percent of the world will be under the World Bank’s International Poverty Line, now $2.15 a day (70 percent) Progress against global poverty has been extremely rapid in recent decades. From 1990 to 2019, the extreme poverty rate, as measured by the World Bank, fell from 38 percent to 8.9 percent globally: The decline was especially fast in East Asia (in particular China) and South Asia (in particular India). In East Asia, the rate fell from 65.4 percent in 1990, the highest of any world region, to a measly 1.2 percent in 2019. The economic disruptions of the pandemic were a major setback across the globe, but poverty reduction seems to finally be getting back on track, at least in South Asia. The bad news is that today, most poor people live in sub-Saharan Africa, and while poverty there fell in recent decades, it fell more slowly than in Asia. Weak, coup-prone states, persistent infectious disease burdens, and recurrent war and violence have made it harder for African countries to build large export-based industries like China, or robust service sectors like India’s.  The most recent projections I’ve seen by World Bank researchers suggest that the world’s extreme poverty rate will fall to 6.8 percent by 2030. At that pace of progress, the rate should still be over 6 percent by 2034. The final steps toward eliminating extreme poverty will be the hardest, and progress will likely be slower than it’s been in the past couple decades. It’s important also to remember that the $2.15 per day poverty rate used by the World Bank is very low. The US poverty rate is closer to $30 per day; by that standard, the vast majority of the world will still live in poverty in 2034. But I have faith we’ll still make progress. 
—Dylan Matthews A level 4 autonomous vehicle will be for sale in the US to ordinary customers (80 percent) The autonomy metric developed by SAE (the organization formerly known as the Society of Automotive Engineers) measures on a 0 to 5 scale how capable cars are at self-driving. Zero means the driving is totally manual; 1 means that one element, like the speed of the car, is automated, as in the adaptive cruise control systems common in new cars as of 2024; 2 means that multiple elements, like both speed and steering, are controlled by computer, as in Tesla Autopilot (which, to be clear, is not actually an autopilot). Level 4 is where things start to get interesting. It signifies a car that can self-drive all the time, provided it’s only used in a certain type of environment (say, only in a city, or only on paved roads). This is where the technology goes from a convenience to a game-changer, and where scenarios like pushing a button to have your car find a parking spot by itself, or commuting to work while typing at a laptop and not even looking through the front windshield, start to become possible. We already have level 4 vehicles. Sort of. On a trial basis. As of 2023, Waymo, the self-driving division of Alphabet (formerly Google) and by far the industry leader, was operating level 4 taxis in San Francisco and Phoenix. It has expanded to Los Angeles this year and is opening in Austin soon.  Until very recently, Waymo has limited itself to surface streets, and its vehicles are pointedly not for sale. They’re for rent, and even then in a small handful of places. I would very much like to use a self-driving car on my periodic 500-mile road trips from DC to New Hampshire, but alas, we’re still a ways off. I’m guessing that by 2034, we’ll be there. Waymo and other companies are gathering massive amounts of video, radar, and other data they can use to train cars to operate in diverse environments. Meanwhile the economic advantages of self-driving vehicles — in terms of labor costs, parking costs, idleness, and more — are enormous.  The fact that Waymo can run vehicles reliably in several cities — albeit cities that don’t have much experience with snow and similar adverse weather — is a very encouraging sign. There’s a gap between that and hopping into my 2035 model year Honda Civic, pulling up an address on my phone, and having the Civic chauffeur me there like a sultan. But the gap is narrowing, fast. —Dylan Matthews World life expectancy will exceed 75 (60 percent) Here, I am mirroring the UN’s World Population Prospects report, which estimates that in 2034, global life expectancy at birth will reach 75.2 years. That number will mask considerable inequality, with North America at 81.7 and Africa at 65.6. But as with global extreme poverty, this projection also relies on decades of past progress. When UN population data began in 1950, the world had a life expectancy of 46.5 years. Over the next thirty years, it rose to over 60. In 2021, in the midst of a global pandemic pushing the number down, it was 71.  As the world gets better at conquering infectious diseases like tuberculosis, HIV/AIDS, and malaria, and as countries grow richer and their residents better able to afford medical treatment and healthier living conditions, lifespans are growing. As with poverty, progress in life expectancy is becoming more gradual. In the 1960s alone, global lifespans shot up by over 8 years. We’ve picked enough of the low-hanging fruit that we’re not going to see progress that fast again. 
But we still have a ways to go in preventing deaths from easily treatable diseases, which means that global life expectancies have not stopped their rise yet. —Dylan Matthews US adult obesity rates will decline 5 percentage points (65 percent) In its 2017–2018 survey, the CDC’s authoritative National Health and Nutrition Examination Survey found that the US adult obesity rate was 42.4 percent. (Obesity in the survey means having a body mass index — a metric that has come in for its share of criticism — of 30 or more.) While the agency could not finish its scheduled 2019-2020 survey because of Covid-19, it took the data it had collected by March of 2020 and combined it with the previous cycle to update their estimates.  They found that the obesity rate among US adults over 20 had fallen to 41.9 percent as of that terrible spring — a time before the market for Ozempic and a promising new class of anti-obesity medications had begun to grow exponentially. There is still plenty of uncertainty about what the long-term health effects of these drugs will be. The future holds too many unknowns: How many people get prescriptions? Will patients stick with the drugs? Will they be able to afford them?  But Wegovy et al. are about to become even more commonplace than they already are, now that Medicare has decided to cover them for heart conditions, which about four in 10 Medicare patients have. Millions of prescriptions will be written.  New drugs are also in the works: Ozempic was a breakthrough because it was “just” a weekly injection (and very effective). Pill forms of semaglutide are now being tested for weight loss. Even if some people do stop taking their medicine, one large new study (not yet peer-reviewed) suggests many of them may be able to keep the weight off. There have been momentary dips in obesity before, while the long-term trend has stayed stubbornly upward. In 2000, the adult obesity rate was 30.5 percent. Now, it’s north of 40 percent. But extrapolate the improvement from 2017–2018 to 2019–2020 and the United States could potentially make up a lot of that ground over the next decade. And this time, we have the most effective anti-obesity medications ever developed. —Dylan Scott The US will slaughter 11.5 billion or more land animals per year (70 percent) In 2022, the US hit a grim milestone: It was the first year the country had slaughtered more than 10 billion land animals for food. I predict that in 2033 it will slaughter 11.5 billion or more land animals.  It’s not so much my prediction — earlier this year, the US Department of Agriculture published its meat, dairy, and egg production projections through 2033. The agency didn’t project the number of animals that’ll be raised for food each year but rather the amount of meat, eggs, and dairy that will be produced by weight, which I converted into the number of animals that’ll be slaughtered. I’m putting my confidence level at 70 percent because there are a number of unpredictable domestic and international factors that could influence the final number: economic recessions, zoonotic diseases (like the ongoing bird flu outbreaks), consumer trends, the cost of farming inputs (like fertilizer and animal feed), and agricultural and trade policy. But the most important factor, by far, will be demand for poultry, as chickens comprise over 90 percent of the total number of animals raised for food in the US. And that’s expected to continue to rise well into the next decade — and half-century.  
The USDA also projects that US per capita consumption of red and white meat combined will rise from 226.8 pounds in 2022 to 235.4 pounds in 2033 — a near 4 percent increase. Given the enormous ecological and climate impact of meat and dairy production, this is all headed in the opposite direction of what climate scientists say we must do: drastically reduce livestock numbers and transition to a plant-rich food system.  If the US were to align its agricultural policy with its climate policy, we’d move in that direction instead. I’m not a betting man, but if I were — given the power of the meat lobby — I don’t expect that to happen by 2034. —Kenny Torrella Implantable brain-computer interfaces will be used for human augmentation, not just medical need (65 percent)  Over the next decade, cutting-edge technology will open the floodgates for human augmentation, which could fundamentally change what it means to be human. From CRISPR to biohacking, there are a growing number of methods people might want to use to enhance themselves.  But I want to make a prediction specifically about brain-computer interfaces (BCIs): By 2034, someone will have an implantable BCI not to help with paralysis or disease, but simply because they want to be a smarter or faster version of themselves.  So far, the FDA has greenlighted brain chips for people with medical conditions like paralysis, allowing them to type or text with just their thoughts. But it’s no secret that some of the makers of these chips — like Elon Musk’s Neuralink — want to go further. Musk has said that the ultimate goal is “to achieve a symbiosis with artificial intelligence.” And Neuralink is explicit about its dual mission: to “create a generalized brain interface to restore autonomy to those with unmet medical needs today and unlock human potential tomorrow.” BCIs exist on a spectrum of invasiveness, with Neuralink at the most extreme end (the device is implanted directly into the brain) and wearables on the other end (they use EEG sensors and other tech that sits on the scalp or skin, so there’s no need for surgery; some are already on the consumer market).  Between these two ends of the spectrum, there are BCIs that are less invasive than Neuralink’s but are nevertheless implantable — they may go in the skull, but not in the brain itself. This type, not a Neuralink device, will probably be used for enhancement purposes first, and I will consider my prediction right if this type is in use by 2034.  I know even that may seem extreme. And I wouldn’t hazard such a guess in the realm of, say, CRISPR, because when it comes to changing our genetics the scientific community has been appropriately cautious in calling for moratoria. But if the ruling mantra in biomedicine is “first, do no harm,” the ruling mantra in tech is “move fast and break things.” And with companies like Meta and Apple also exploring brain-reading technology, appetite is growing: The global BCI market is expected to top $8 billion within the next 8 years. So I think this is a solid possibility. —Sigal Samuel A nuclear weapon will be used (20 percent) My Future Perfect colleague Dylan Matthews, who is the best forecaster I know, wrote recently that one key to accurately predicting future events is to first establish a “base rate.” A base rate is the rate at which some event has been known to happen in the past, which is a useful starting point for estimating how likely it is to happen in the future. 
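For readers who want the arithmetic behind that idea, here is a small sketch of how a per-year base rate translates into the chance of seeing an event at least once over a longer horizon. The inputs mirror the nuclear-use figures discussed in the next paragraph (one year with wartime use out of 79 years of the atomic age, projected over a decade); the exact percentages shift slightly depending on rounding, and the calculation assumes each year is independent, which is a deliberate simplification.

```python
# Turn a per-year base rate into the chance of at least one occurrence over a
# multi-year horizon. The inputs mirror the nuclear-use example below; treating
# each year as independent with a constant rate is a strong simplification.

years_with_event = 1      # years in which the event happened (1945)
years_observed = 79       # years since the first atomic test in 1945
horizon = 10              # years to project over

annual_base_rate = years_with_event / years_observed          # roughly 1.3 percent
chance_over_horizon = 1 - (1 - annual_base_rate) ** horizon   # roughly 12 percent

print(f"annual base rate: {annual_base_rate:.1%}")
print(f"chance over {horizon} years: {chance_over_horizon:.1%}")
```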
The base rate for nuclear weapon use in war in a given year is, thankfully, very low. (If it weren't, I likely wouldn't be here writing this.) Nuclear weapons have been available to at least one country since the US successfully tested the first atomic bomb at Trinity Site in New Mexico on July 16, 1945. In all the time since, an atomic weapon has been used in war twice — in 1945, by the US at Hiroshima and Nagasaki. That's one year out of 79, which equates to a base rate of 1.2 percent. Extend that over the next decade, and the base rate grows to a little more than 11 percent. And even that's likely putting it high — for the last 78 out of those 79 years, nuclear weapons have been used in war precisely zero times, which should make us more confident that we'll get through the next decade nuclear-free. So why do I think there's a 1 in 5 chance we'll see a nuclear weapon deployed in war over the next decade? It's based on my reading of trends in the future of war, arms control, and international relations — all of which are moving in increasingly dangerous directions. War between states, which had basically ceased over the past several decades, is back once more. Russia's invasion of Ukraine in 2022 made nuclear weapons a live issue again, in part because Vladimir Putin was not shy about making vague atomic threats, but also because nuclear weapons ultimately defined the contours of the conflict. There's a limit to how much NATO can defend Ukraine precisely because Putin controls the single biggest nuclear arsenal in the world. And Putin himself is constrained — though by how much, he purposefully keeps ambiguous — by NATO's own nuclear weapons, and its stated policy of treating an attack on one member state as an attack on all. So far the balance has held, but that's a risky place to be. And even as the war in Ukraine raises the atomic temperature, the arms control treaties that have successfully reduced the nuclear threat since the end of the Cold War are unraveling one by one. The New START treaty, which aimed to constrain the strategic nuclear arsenals of both the US and Russia, is currently set to expire in 2026. Both the US and Russia have withdrawn from the Intermediate-Range Nuclear Forces Treaty, which reduced short-range nuclear missile numbers. Add the open animosity between Russia and the West to the growing arsenal of China, which is mostly outside the current arms control regime, and you have a three-body problem that spells heightened nuclear danger. Most worrying of all, the nuclear taboo seems weaker than ever. Over the past several months, Iran — a country that is perpetually on the threshold of nuclear weapons — launched attacks on two nuclear-armed countries, Pakistan and Israel. North Korea, which has resisted every attempt to constrain its arsenal, now has dozens of nuclear weapons, many of which are capable of striking the US. Should Donald Trump return to the White House and make good on his promises to close the US nuclear umbrella, the risk will grow that countries in Europe, the Middle East, and Asia will seek their own nuclear weapons. That future, as President John F. Kennedy said in 1961, will be one where we all live under a "nuclear sword of Damocles." I usually hope my predictions will turn out correct. This one, I pray, won't.
—Bryan Walsh The number of people forcibly displaced around the world according to the UN will exceed 200 million by 2034 (70 percent) In May 2022, the UN refugee agency reported that the number of people forcibly displaced from their homes globally had passed 100 million for the first time in history. Since then, the number has only grown, reaching 114 million people as of early 2024, including those displaced within their countries and those forced to flee across international borders. The causes are manifold, with war as a major driver. More than seven in 10 international refugees come from just five countries — Syria, Ukraine, Venezuela, Afghanistan, and South Sudan — most of which are embroiled in some kind of serious conflict. For countries like Cuba, severe economic disruption can force people to flee in search of a better life, or just survival. Environmental factors — including climate-driven droughts, famines, and wildfires — are driving others to leave their homes in places like southern Africa. And human rights abuses, such as those perpetrated against the Rohingya minority in Myanmar, are another factor. Predicting that the number of displaced people will reach 200 million between now and 2034 means assuming that the total will rise by at least 75 percent. I think that’s a more than reasonable bet. For one thing, while global population growth has been slowing, the population is still growing at a little below 1 percent per year, which means the world will likely have hundreds of millions more people by 2034. More importantly, the bulk of that growth will be in places like sub-Saharan Africa — a region that is already highly vulnerable to climate disasters and political disruption. That means more people will likely be in harm’s way, even as that harm continues to grow. Nor do I expect the rich nations that displaced people are trying to reach to suddenly become more welcoming — even if, precisely because fertility rates are falling so rapidly, it would likely be in their long-term economic interest to do so. The very fact that President Joe Biden is considering stringent border policies that sound like something out of the Trump playbook underscores just how toxic migration has become as a political issue. I don’t see that changing. Lastly, another driving factor is actually a sign of global success. It sounds paradoxical, but it’s true — economic growth in poor nations can increase outward migration. As even the poorest people in the world become richer than they used to be, more people will have the means to escape their countries. The last decade has seen a startling increase in displacement and migration. I’m willing to bet that the next decade will see even more. —Bryan Walsh Global energy emissions will peak (85 percent) Finally, some actual good news. With the exception of 2020, when pandemic lockdowns froze economic activity, energy-related greenhouse gas emissions have mostly gone in one direction: up. Last year, global energy emissions reached a record high of 37.4 billion metric tons of CO2. But the peak is in sight, and it may be coming sooner than you think. The International Energy Agency (IEA) has predicted that energy emissions will top out in 2025, a forecast echoed by the energy intelligence company Rystad. Behind those estimates are predictions that global coal consumption — which has been at a record level, driven by demand from developing countries — will peak in the next few years, as will demand for oil.
Together those two fuels contribute more than 50 percent of global energy emissions. Demand for natural gas continues to grow, but because lower-carbon gas substitutes for coal, that is still a net win for emissions. Add that to the rapid increase in solar and wind energy, along with renewed growth in nuclear energy, and there’s a more than reasonable case that the multi-decade era of growing energy emissions could finally be over. Of course, nothing is guaranteed. As artificial intelligence ramps up, the demand for electricity to power all those generative searches and data centers could skyrocket. Already, after years of relatively flat growth, US electricity demand is projected to spike, due in large part to AI demand. And there are other potential obstacles; should the EV revolution falter, oil demand could continue to rise for years into the future. The renewable energy revolution in the US will require an equally revolutionary shift in how the country permits energy projects — and that’s far from guaranteed. Nonetheless, I’m willing to bet that energy emissions will peak, and soon. That doesn’t mean we’ve won the war against climate change — not by a long shot. Emissions will have to peak soon and then drop, rapidly, to avert some of the more dangerous scenarios around climate change. Still, for as long as I’ve been alive, the indicators around climate change have only gone in one direction: worse. Don’t underestimate the psychological benefit of finally turning one around. —Bryan Walsh The US car fleet will be 10 percent EV (65 percent) The global car market is on the cusp of its greatest revolution since, perhaps, the Model T first brought automobility to the American middle class. To decarbonize the transportation sector, the world’s second-highest greenhouse gas emitter, governments are pushing to speed up the availability and mass adoption of electric cars. Last year, according to preliminary data, about 18 percent of new cars sold around the world were electric, up from 14 percent in 2022 and less than 5 percent in 2020. The world’s highest electric car adoption rates are in China, where about a third of new cars are electric, and in Western Europe, but US sales have been rising steeply, too, hitting 10 percent of new cars in 2023, more than four times the share just two years prior. EVs’ year-over-year share of new car sales gives us a sense of how quickly they’re going mainstream, but it’s the overall makeup of cars on US roads that matters for carbon emissions. At the end of 2022, EVs made up a more sobering 1.2 percent of the national car fleet. As new federal and state fuel economy rules aiming to sharply reduce vehicle emissions take effect, that number will climb. One leading estimate projected that 10 percent of cars on American roads will be electric by 2030. To reach that, we’d need to add more than 20 million new EVs over the next six years. I don’t think it’ll happen that quickly, but I do think the share will hit 10 percent within a decade. With interest rates elevated until who knows when, Americans are holding on to old cars for longer than ever — and the average age of cars on the road has been steadily increasing for decades, a trend that’s unlikely to go away. There have also been recent signs, as Vox’s Umair Irfan has reported, that Americans’ enthusiasm for EVs is flagging.
US electric car sales in 2023 fell short of some forecasters’ expectations, as consumers reported being wary of high price tags, a lack of convenient charging stations, and reliability issues, particularly in cold weather. This year, following concerns from the auto industry, the EPA’s new car emissions standards were scaled back from what the agency proposed last year. The hurdles — like equipping power grids to handle tens of millions of massive EV batteries — are considerable. Mass adoption will ultimately depend on carmakers and policymakers making electric cars as cheap and convenient as their fossil-burning counterparts. —Marina Bolotnikova
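To give a rough sense of the fleet math above, here is a minimal Python sketch. It is purely illustrative: the total fleet size of roughly 280 million vehicles is an assumption not given in the piece, while the 1.2 percent and 10 percent shares are the figures cited above.

```python
# Purely illustrative fleet math. Assumption not in the piece: a US fleet of
# roughly 280 million vehicles. Figures from the piece: EVs were about 1.2
# percent of the fleet at the end of 2022, and one estimate projects a
# 10 percent EV fleet share by 2030 (about six years out).
fleet_size = 280_000_000   # assumed total US vehicle fleet (approximate)
current_share = 0.012      # EV share of the fleet, end of 2022
target_share = 0.10        # projected fleet share by 2030
years = 6

evs_needed = fleet_size * (target_share - current_share)
print(f"additional EVs needed: {evs_needed / 1e6:.0f} million")          # ~25 million
print(f"implied pace: {evs_needed / years / 1e6:.1f} million per year")  # ~4 million a year
```

On those assumptions, the implied pace is several times the recent annual rate of US EV sales, which is on the order of a million vehicles a year — one way to see why the author expects the 10 percent mark closer to 2034 than 2030.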
vox.com
This Changed Everything
Vox has been explaining for an entire decade. We’ve published thousands upon thousands of articles, videos, and podcasts about nearly every subject imaginable, illuminating all the important and interesting and confounding things that give us a better understanding of who we are and how our world works. It’s been a wild time! To say so much has happened would be a gross understatement. Our staff has taken the opportunity of our 10th anniversary to look back at the moments that defined the past decade. Rather than rehashing how, say, the Covid pandemic impacted public health in America, we’ve focused our attention on the unexpected — both small-scale incidents that foretold big-deal changes and huge events that had surprising consequences. There are stories on everything from the rise of self-care to the death of George Floyd, a not-at-all-comprehensive list of pivotal turning points from reporters across the newsroom, accompanying Today, Explained episodes, and a text and video forecast of what the next decade could look like, from Future Perfect. The last 10 years, explained The (wild! scary! surprising!) moments that mattered. How the self-care industry made us so lonely The commodification of an activist concept turned a revitalizing practice into an isolating one. by Allie Volpe The “racial reckoning” of 2020 set off an entirely new kind of backlash We are living in an era of conservative grievance politics. by Fabiola Cineas The overlooked conflict that altered the nature of war in the 21st century From drones to social media, the war between Armenia and Azerbaijan was a preview of Ukraine and the conflicts to come. by Joshua Keating The internet peaked with “the dress,” and then it unraveled The ominously perfect meme marked the splintering of our shared reality. by Brian Resnick Serial transformed true crime — and the way we think about criminal justice The show that helped to free Adnan Syed completely upended how much the average person knows about US legal and prison systems. by Aja Romano 10 big things we think will happen in the next 10 years Obesity will go down, electric cars will go up, and a nuclear bomb might just fall. Credits Editorial Director: Julia Rubin | Project Manager: Nathan Hall Reporters: Alex Abad-Santos, Zack Beauchamp, Fabiola Cineas, Rachel Cohen, Kyndall Cunningham, Abdallah Fayyad, Constance Grady, Ellen Ioanes, Oshan Jarow, Benji Jones, Josh Keating, Whizy Kim, Keren Landman, Dylan Matthews, Ian Millhiser, Anna North, Christian Paz, Brian Resnick, Aja Romano, Sigal Samuel, Dylan Scott, Allie Volpe Editors: Marina Bolotnikova, Melinda Fakuade, Meredith Haggerty, Caroline Houck, Libby Nelson, Alanna Okun, Lavanya Ramanathan, Izzie Ramirez, Patrick Reis, Paige Vega, Elbert Ventura, Bryan Walsh Art Director: Paige Vickers | Illustrator: Hudson Christie Managing Editor, Audio & Video: Natalie Jennings Audio: Amina al-Sadi, Laura Bullard, Rob Byers, Victoria Chamberlin, David Herman, Miranda Kennedy, Noel King, Andrea Kristinsdottir, Amanda Lewellyn, Sean Rameswaram Video: Rajaa Elidrissi, Adam Freelander, Dean Peterson, Joey Sendaydiego, Catherine Spangler Style & Standards: Elizabeth Crane, Anouck Dussaud, Kim Eggleston, Caity PenzeyMoog, Sarah Schweppe Audience Lead: Gabby Fernandez | Audience: Shira Tarlo, Kelsi Trinidad  Special thanks: Bill Carey, Nisha Chittal, Swati Sharma
vox.com
The B-17 blew apart in an instant. The memory has burned for 80 years.
For waist gunner Mel Jenner, a friend’s farewell in the skies over occupied France has echoed since 1944.
washingtonpost.com
My Husband Unearthed His Biological Family’s History. Oh No.
We need to have an eye out for the risks.
slate.com
Help! My Father-in-Law Insists on Kissing Me on the Lips.
How many colds can I fake?!
slate.com
How to Keep Watch
With smartphones in our pockets and doorbell cameras cheaply available, our relationship with video as a form of proof is evolving. We often say “pics or it didn’t happen!”—but meanwhile, there’s been a rise in problematic imaging including deepfakes and surveillance systems, which often reinforce embedded gender and racial biases. So what is really being revealed with increased documentation of our lives? And what’s lost when privacy is diminished?In this episode of How to Know What’s Real, staff writer Megan Garber speaks with Deborah Raji, a Mozilla fellow, whose work is focused on algorithmic auditing and evaluation. In the past, Raji worked closely with the Algorithmic Justice League initiative to highlight bias in deployed AI products.Listen to the episode here:Listen and subscribe here: Apple Podcasts | Spotify | YouTube | Pocket CastsThe following is a transcript of the episode:Andrea Valdez: You know, I grew up as a Catholic, and I remember the guardian angel was a thing that I really loved that concept when I was a kid. But then when I got to be, I don’t know, maybe around seven or eight, like, your guardian angel is always watching you. At first it was a comfort, and then it turned into kind of like a: Are they watching me if I pick my nose? Do they watch me?Megan Garber: And are they watching out for me, or are they just watching me?Valdez: Exactly. Like, are they my guardian angel or my surveillance angel? Surveillance angel.Valdez: I’m Andrea Valdez. I’m an editor at The Atlantic.Garber: And I’m Megan Garber, a writer at The Atlantic. And this is How to Know What’s Real.Garber: I just got the most embarrassing little alert from my watch. And it’s telling me that it is, quote, “time to stand.”Valdez: Why does it never tell us that it’s time to lie down?Garber: Right. Or time to just, like, go to the beach or something? And it’s weird, though, because I’m realizing I’m having these intensely conflicting emotions about it. Because in one way, I appreciate the reminder. I have been sitting too long; I should probably stand up. But I don’t also love the feeling of just sort of being casually judged by a piece of technology.Valdez: No, I understand. I get those alerts, too. I know it very well. And you know, it tells you, “Stand up; move for a minute. You can do it.” Uh, you know, you can almost hear it going, like, “Bless your heart.”Garber: “Bless your lazy little heart.” The funny thing, too, about it is, like, I find myself being annoyed, but then I also fully recognize that I don’t really have a right to be annoyed, because I’ve asked them to do the judging.Valdez: Yes, definitely. I totally understand. I mean, I’m very obsessed with the data my smartwatch produces: my steps, my sleeping habits, my heart rate. You know, just everything about it. I’m just obsessed with it. And it makes me think—well, I mean, have you ever heard of the quantified-self movement?Garber: Oh, yeah.Valdez: Yeah, so quantified self. It’s a term that was coined by Wired magazine editors around 2007. And the idea was, it was this movement that aspired to be, quote, unquote, “self-knowledge through numbers.” And I mean, it’s worth remembering what was going on in 2007, 2008. You know, I know it doesn’t sound that long ago, but wearable tech was really in its infancy. And in a really short amount of time, we’ve gone from, you know, Our Fitbit to, as you said, Megan, this device that not only scolds you for not standing up every hour—but it tracks your calories, the decibels of your environment. 
You can even take an EKG with it. And, you know, when I have my smartwatch on, I’m constantly on guard to myself. Did I walk enough? Did I stand enough? Did I sleep enough? And I suppose it’s a little bit of accountability, and that’s nice, but in the extreme, it can feel like I’ve sort of opted into self-surveillance.Garber: Yes, and I love that idea in part because we typically think about surveillance from the opposite end, right? Something that’s done to us, rather than something that we do to ourselves and for ourselves. Watches are just one example here, right? There’s also smartphones, and there’s this broader technological environment, and all of that. That whole ecosystem, it all kind of asks this question of “Who’s really being watched? And then also, who’s really doing the watching?”Valdez: Mm hmm. So I spoke with Deb Raji, who is a computer scientist and a fellow at the Mozilla Foundation. And she’s an expert on questions about the human side of surveillance, and thinks a lot about how being watched affects our reality.—Garber: I’d love to start with the broad state of surveillance in the United States. What does the infrastructure of surveillance look like right now?Deborah Raji: Yeah. I think a lot of people see surveillance as a very sort of “out there in the world,” physical-infrastructure thing—where they see themselves walking down the street, and they notice a camera, and they’re like, Yeah, I’m being surveilled. Um, which does happen if you live in New York, especially post-9/11: like, you are definitely physically surveilled. There’s a lot of physical-surveillance infrastructure, a lot of cameras out there. But there’s also a lot of other tools for surveillance that I think people are less aware of.Garber: Like Ring cameras and those types of devices?Raji: I think when people install their Ring product, they’re thinking about themselves. They’re like, Oh, I have security concerns. I want to just have something to be able to just, like, check who’s on my porch or not. And they don’t see it as surveillance apparatus, but it ends up becoming part of a broader network of surveillance. And then I think the one that people very rarely think of—and again, is another thing that I would not have thought of if I wasn’t engaged in some of this work—is online surveillance. Faces are sort of the only biometric; uh, I guess, you know, it’s not like a fingerprint. Like, we don’t upload our fingerprints to our social media. We’re very sensitive about, like, Oh, you know, this seems like important biometric data that we should keep guarded. But for faces, it can be passively collected and passively distributed without you having any awareness of it. But also, we’re very casual about our faces. So we upload it very freely onto the internet. And so, you know, immigration officers—ICE, for example—have a lot of online-surveillance tools, where they’ll monitor people’s Facebook pages, and they’ll use sort of facial recognition and other products to identify and connect online identities, you know, across various social-media platforms, for example.Garber: So you have people doing this incredibly common thing, right? Just sharing pieces of their lives on social media. And then you have immigration officials treating that as actionable data. Can you tell me more about facial recognition in particular?Raji: So one of the first models I actually built was a facial-recognition project. And so I’m a Black woman, and I noticed right away that there were not a lot of faces that look like mine. 
And I remember trying to have a conversation with folks at the company at the time. And it was a very strange time to be trying to have this conversation. This was like 2017. There was a little bit of that happening in the sort of natural-language processing space. Like, people were noticing, you know, stereotyped language coming out of some of these models, but no one was really talking about it in the image space as much—that, oh, some of these models don’t work as well for darker-skinned individuals or other demographics. We audited a bunch of these products that were these facial-analysis products, and we realized that these systems weren’t working very well for those minority populations. But also definitely not working for the intersection of those groups. So like: darker skin, female faces.Garber: Wow.Raji: Some of the ways in which these systems were being pitched at the time were sort of selling these products and pitching it to immigration officers to use to identify suspects.Garber: Wow.Raji: And, you know, imagine something that’s not 70 percent accurate, and it’s being used to decide, you know, if this person aligns with a suspect for deportation. Like, that’s so serious.Garber: Right.Raji: You know, since we’ve published that work, we had just this—it was this huge moment. In terms of: It really shifted the thinking in policy circles, advocacy circles, even commercial spaces around how well those systems worked. Because all the information we had about how well these systems worked, so far, was on data sets that were disproportionately composed of lighter-skin men. Right. And so people had this belief that, Oh, these systems work so well, like 99 percent accuracy. Like, they’re incredible. And then our work kind of showed, well, 99 percent accuracy on lighter-skin men.Garber: And could you talk a bit about where tech companies are getting the data from to train their models?Raji: So much of the data required to build these AI systems are collected through surveillance. And this is not hyperbole, right? Like, the facial-recognition systems, you know, millions and millions of faces. And these databases of millions and millions of faces that are collected, you know, through the internet, or collected through identification databases, or through, you know, physical- or digital-surveillance apparatus. Because of the way that the models are trained and developed, it requires a lot of data to get to a meaningful model. And so a lot of these systems are just very data hungry, and it’s a really valuable asset.Garber: And how are they able to use that asset? What are the specific privacy implications about collecting all that data?Raji: Privacy is one of those things that we just don’t—we haven’t been able to get to federal-level privacy regulation in the States. There’s been a couple states that have taken initiative. So California has the California Privacy Act. Illinois has a BIPA, which is sort of a Biometric Information Privacy Act. So that’s specifically about, you know, biometric data like faces. In fact, they had a really—I think BIPA’s biggest enforcement was against Facebook and Facebook’s collection of faces, which does count as biometric data. So in Illinois, they had to pay a bunch of Facebook users a certain settlement amount. Yeah. So, you know, there are privacy laws, but it’s very state-based, and it takes a lot of initiative for the different states to enforce some of these things, versus having some kind of comprehensive national approach to privacy.
That’s why enforcement or setting these rules is so difficult. I think something that’s been interesting is that some of the agencies have sort of stepped up to play a role in terms of thinking through privacy. So the Federal Trade Commission, FTC, has done these privacy audits historically on some of the big tech companies. They’ve done this for quite a few AI products as well—sort of investigating the privacy violations of some of them. So I think that that’s something that, you know, some of the agencies are excited about and interested in. And that might be a place where we see movement, but ideally we have some kind of law.Garber: And we’ve been in this moment—this, I guess, very long moment—where companies have been taking the “ask for forgiveness instead of permission” approach to all this. You know, so erring on the side of just collecting as much data about their users as they possibly can, while they can. And I wonder what the effects of that will be in terms of our broader informational environment.Raji: The way surveillance and privacy works is that it’s not just about the information that’s collected about you; it’s, like, your entire network is now, you know, caught in this web, and it’s just building pictures of entire ecosystems of information. And so, I think people don’t always get that. But yeah; it’s a huge part of what defines surveillance.__Valdez: Do you remember Surveillance Cameraman, Megan?Garber: Ooh. No. But now I’m regretting that I don’t.Valdez: Well, I mean, I’m not sure how well it was known, but it was maybe 10 or so years ago. There was this guy who had a camera, and he would take the camera and he would go and he’d stop and put the camera in people’s faces. And they would get really upset. And they would ask him, “Why are you filming me?” And, you know, they would get more and more irritated, and it would escalate. I think the meta-point that Surveillance Cameraman was trying to make was “You know, we’re surveilled all the time—so why is it any different if someone comes and puts a camera in your face when there’s cameras all around you, filming you all the time?”Garber: Right. That’s such a great question. And yeah, the sort of difference there between the active act of being filmed and then the sort of passive state of surveillance is so interesting there.Valdez: Yeah. And you know, that’s interesting that you say active versus passive. You know, it reminds me of the notion of the panopticon, which I think is a word that people hear a lot these days, but it’s worth remembering that the panopticon is an old idea. So it started around the late 1700s with the philosopher named Jeremy Bentham. And Bentham, he outlined this architectural idea, and it was originally conceptualized for prisons. You know, the idea was that you have this circular building, and the prisoners live in cells along the perimeter of the building. And then there’s this inner circle, and the guards are in that inner circle, and they can see the prisoners. But the prisoners can’t see the guards. And so the effect that Bentham was hoping this would achieve is that the prisoners would never know if they’re being watched—so they’d always behave as if they were being watched.Garber: Mm. And that makes me think of the more modern idea of the watching-eyes effect. This notion that simply the presence of eyes might affect people’s behavior. And specifically, images of eyes.
Simply that awareness of being watched does seem to affect people’s behavior.Valdez: Oh, interesting.Garber: You know, beneficial behavior, like collectively good behavior. You know, sort of keeping people in line in that very Bentham-like way.Valdez: We have all of these, you know, eyes watching us now—I mean, even in our neighborhoods and, you know, at our apartment buildings. In the form of, say, Ring cameras or other, you know, cameras that are attached to our front doors. Just how we’ve really opted into being surveilled in all of the most mundane places. I think the question I have is: Where is all of that information going?Garber: And in some sense, that’s the question, right? And Deb Raji has what I found to be a really useful answer to that question of where our information is actually going, because it involves thinking of surveillance not just as an act, but also as a product.—Raji: For a long time when you—I don’t know if you remember those, you know, “complete the picture” apps, or, like, “spice up my picture.” They would use generative models. You would kind of give them a prompt, which would be, like—your face. And then it would modify the image to make it more professional, or make it better lit. Like, sometimes you’ll get content that was just, you know, sexualizing and inappropriate. And so that happens in a nonmalicious case. Like, people will try to just generate images for benign reasons. And if they choose the wrong demographic, or they frame things in the wrong way, for example, they’ll just get images that are denigrating in a way that feels inappropriate. And so I feel like there’s that way in which AI for images has sort of led to just, like, a proliferation of problematic content.Garber: So not only are those images being generated because the systems are flawed themselves, but then you also have people using those flawed systems to generate malicious content on purpose, right?Raji: One that we’ve seen a lot is sort of this deepfake porn of young people, which has been so disappointing to me. Just, you know, young boys deciding to do that to young girls in their class; it really is a horrifying form of sexual abuse. I think, like, when it happened to Taylor Swift—I don’t know if you remember; someone used the Microsoft model, and, you know, generated some nonconsensual sexual images of Taylor Swift—I think it turned that into a national conversation. But months before that, there had been a lot of reporting of this happening in high schools. Anonymous young girls dealing with that, which is just another layer of trauma, because you’re like—you’re not Taylor Swift, right? So people don’t pay attention in the same way. So I think that that problem has actually been a huge issue for a very long time.—Garber: Andrea, I’m thinking of that old line about how if you’re not paying for something in the tech world, there’s a good chance you are probably the product being sold, right? But I’m realizing how outmoded that idea probably is at this point. Because even when we pay for these things, we’re still the products. And specifically, our data are the products being sold. So even with things like deepfakes—which are typically defined as, you know, using some kind of machine learning or AI to create a piece of manipulated media—even they rely on surveillance in some sense.
And so you have this irony where these recordings of reality are now also being used to distort reality.Valdez: You know, it makes me think of Don Fallis: this philosopher who talked about the epistemic threat of deepfakes and that it’s part of this pending infopocalypse. Which sounds quite grim, I know. But I think the point that Fallis was trying to make is that with the proliferation of deepfakes, we’re beginning to maybe distrust what it is that we’re seeing. And we talked about this in the last episode. You know, “seeing is believing” might not be enough. And I think we’re really worried about deepfakes, but I’m also concerned about this concept of cheap fakes, or shallow fakes. So cheap fakes or shallow fakes—it’s, you know, you can tweak or change images or videos or audio just a little bit. And it doesn’t actually require AI or advanced technology to create. So one of the more infamous instances of this was in 2019. Maybe you remember there was a video of Nancy Pelosi that came out where it sounded like she was slurring her words.Garber: Oh, yeah, right. Yeah.Valdez: Really, the video had just been slowed down using easy audio tools, and just slowed down enough to create that perception that she was slurring her words. So it’s a quote, unquote “cheap” way to create a small bit of chaos.Garber: And then you combine that small bit of chaos with the very big chaos of deepfakes.Valdez: Yeah. So one, the cheap fake is: It’s her real voice. It’s just slowed down—again, using, like, simple tools. But we’re also seeing instances of AI-generated technology that completely mimics other people’s voices, and it’s becoming really easy to use now. You know, there was this case recently that came out of Maryland where there was a high-school athletic director, and he was arrested after he allegedly used an AI voice simulation of the principal at his school. And he allegedly simulated the principal’s voice saying some really horrible things, and it caused all this blowback on the principal before investigators, you know, looked into it. Then they determined that the audio was fake. But again, it was just a regular person that was able to use this really advanced-seeming technology that was cheap, easy to use, and therefore easy to abuse.Garber: Oh, yes. And I think it also goes to show how few sort of cultural safeguards we have in place right now, right? Like, the technology will let people do certain things. And we don’t always, I think, have a really well-agreed-upon sense of what constitutes abusing the technology. And you know, usually when a new technology comes along, people will sort of figure out what’s acceptable and, you know, what will bear some kind of safety net. Um, and will there be a taboo associated with it? But with all of these new technologies, we just don’t have that. And so people, I think, are pushing the bounds to see what they can get away with.Valdez: And we’re starting to have that conversation right now about what those limits should look like. I mean, lots of people are working on ways to figure out how to watermark or authenticate things like audio and video and images.Garber: Yeah. And I think that that idea of watermarking, too, can maybe also have a cultural implication. You know, like: If everyone knows that deepfakes can be tracked, and easily, that is itself a pretty good disincentive from creating them in the first place, at least with an intent to fool or do something malicious.Valdez: Yeah. But.
In the meantime, there’s just going to be a lot of these deepfakes and cheap fakes and shallow fakes that we’re just going to have to be on the lookout for.—Garber: Is there new advice that you have for trying to figure out whether something is fake?Raji: If it doesn’t feel quite right, it probably isn’t. A lot of these AI images don’t have a good sense of, like, spatial awareness, because it’s just pixels in, pixels out. And so there’s some of these concepts that we as humans find really easy, but these models struggle with. I advise people to be aware of, like—sort of trust your intuition. If you’re noticing weird artifacts in the image, it probably isn’t real. I think another thing, as well, is who posts.Garber: Oh, that’s a great one; yeah.Raji: Like, I mute very liberally on Twitter; uh, any platform. I definitely mute a lot of accounts that I notice [are] caught posting something. Either like a community note or something will reveal that they’ve been posting fake images, or you just see it and you recognize the design of it. And so I just knew that kind of content. Don’t engage with those kind of content creators at all. And so I think that that’s also like another successful thing on the platform level. Deplatforming is really effective if someone has sort of three strikes in terms of producing a certain type of content. And that’s what happened with the Taylor Swift situation—where people were disseminating these, you know, Taylor Swift images and generating more images. And they just went after every single account that did that—you know, completely locked down her hashtag. Like, that kind of thing where they just really went after everything. Um, and I think that that’s something that we should just do in our personal engagement as well.—Garber: Andrea, that idea of personal engagement, I think, is such a tricky part of all of this. I’m even thinking back to what we were saying before—about Ring and the interplay we were getting at between the individual and the collective. In some ways, it’s the same tension that we’ve been thinking about with climate change and other really broad, really complicated problems. This, you know, connection between personal responsibility, but also the outsized role that corporate and government actors will have to play when it comes to finding solutions. Mm hmm. And with so many of these surveillance technologies, we’re the consumers, with all the agency that that would seem to entail. But at the same time, we’re also part of this broader ecosystem where we really don’t have as much control as I think we’d often like to believe. So our agency has this giant asterisk, and, you know, consumption itself in this networked environment is really no longer just an individual choice. It’s something that we do to each other, whether we mean to or not.Valdez: Yeah; you know, that’s true. But I do still believe in conscious consumption so much as we can do it. Like, even if I’m just one person, it’s important to me to signal with my choices what I value. And in certain cases, I value opting out of being surveilled so much as I can control for it. You know, maybe I can’t opt out of facial recognition and facial surveillance, because that would require a lot of obfuscating my face—and, I mean, there’s not even any reason to believe that it would work. But there are some smaller things that I personally find important; like, I’m very careful about which apps I allow to have location sharing on me. You know, I go into my privacy settings quite often. 
I make sure that location sharing is something that I’m opting into on the app while I’m using it. I never let apps just follow me around all the time. You know, I think about what chat apps I’m using, if they have encryption; I do hygiene on my phone around what apps are actually on my phone, because they do collect a lot of data on you in the background. So if it’s an app that I’m not using, or I don’t feel familiar with, I delete it.Garber: Oh, that’s really smart. And it’s such a helpful reminder, I think, of the power that we do have here. And a reminder of what the surveillance state actually looks like right now. It’s not some cinematic dystopia. Um, it’s—sure, the cameras on the street. But it’s also the watch on our wrist; it’s the phones in our pockets; it’s the laptops we use for work. And even more than that, it’s a series of decisions that governments and organizations are making every day on our behalf. And we can affect those decisions if we choose to, in part just by paying attention.Valdez: Yeah, it’s that old adage: “Who watches the watcher?” And the answer is us.__Garber: That’s all for this episode of How to Know What’s Real. This episode was hosted by Andrea Valdez and me, Megan Garber. Our producer is Natalie Brennan. Our editors are Claudine Ebeid and Jocelyn Frank. Fact-check by Ena Alvarado. Our engineer is Rob Smierciak. Rob also composed some of the music for this show. The executive producer of audio is Claudine Ebeid, and the managing editor of audio is Andrea Valdez.Valdez: Next time on How to Know What’s Real: Thi Nguyen: And when you play the game multiple times, you shift through the roles, so you can experience the game from different angles. You can experience a conflict from completely different political angles and re-experience how it looks from each side, which I think is something like, this is what games are made for. Garber: What we can learn about expansive thinking through play. We’ll be back with you on Monday.
theatlantic.com
On D-Day, the U.S. Conquered the British Empire
For most Americans, D-Day remains the most famous battle of World War II. It was not the end of the war against Nazism. At most, it was the beginning of the end. Yet it continues to resonate 80 years later, and not just because it led to Hitler’s defeat. It also signaled the collapse of the European empires and the birth of an American superpower that promised to dedicate its foreign policy to decolonization, democracy, and human rights, rather than its own imperial prestige.It is easy to forget what a radical break this was. The term superpower was coined in 1944 to describe the anticipated world order that would emerge after the war. Only the British empire was expected to survive as the standard-bearer of imperialism, alongside two very different superpower peers: the Soviet Union and the United States. Within weeks of D-Day, however, the British found themselves suddenly and irrevocably overruled by their former colony.That result was hardly inevitable. When the British and the Americans formally allied in December 1941, the British empire was unquestionably the senior partner in the relationship. It covered a fifth of the world’s landmass and claimed a quarter of its people. It dominated the air, sea, and financial channels on which most global commerce depended. And the Royal Navy maintained its preeminence, with ports of call on every continent, including Antarctica.The United States, by contrast, was more of a common market than a nation-state. Its tendency toward isolationism has always been overstated. But its major foreign-policy initiatives had been largely confined to the Western Hemisphere and an almost random collection of colonies (carefully called “territories”), whose strategic significance was—at best—a point of national ambivalence.In the two years after Pearl Harbor, the British largely dictated the alliance’s strategic direction. In Europe, American proposals to take the fight directly to Germany by invading France were tabled in favor of British initiatives, which had the not-incidental benefit of expanding Britain’s imperial reach across the Mediterranean and containing the Soviet Union (while always ensuring that the Russians had enough support to keep three-quarters of Germany’s army engaged on the Eastern Front).Things changed, however, in November 1943, when Winston Churchill and Franklin D. Roosevelt held a summit in Cairo. The British again sought to postpone the invasion of France in favor of further operations in the Mediterranean. The debate quickly grew acrimonious. At one point, Churchill refused to concede on his empire’s desire to capture the Italian island of Rhodes. George Marshall, the usually stoic U.S. Army chief of staff, shouted at the prime minister, “Not one American is going to die on that goddamned beach!” Another session was forced to end abruptly after Marshall and his British counterpart, Sir Alan Brooke, nearly came to blows.With the fate of the free world hanging in the balance, a roomful of 60-year-old men nearly broke out into a brawl because by November 1943, America had changed. It was producing more than twice as many planes and seven times as many ships as the whole British empire. British debt, meanwhile, had ballooned to nearly twice the size of its economy. 
Most of that debt was owed to the United States, which leveraged its position as Britain’s largest creditor to gain access to outposts across the British empire, from which it built an extraordinary global logistics network of its own.[From the April 2023 issue: The age of American naval dominance is over]Having methodically made their country into at least an equal partner, the Americans insisted on the invasion of France, code-named “Operation Overlord.” The result was a compromise, under which the Allies divided their forces in Europe. The Americans would lead an invasion of France, and the British would take command of the Mediterranean.Six months later, on June 6, 1944, with the D-Day invasion under way, the British empire verged on collapse. Its economic woes were exacerbated by the 1.5 million Americans, and 6 million tons of American equipment, that had been imported into the British Isles to launch Operation Overlord. Its ports were jammed. Inflation was rampant. Its supply chains and its politics were in shambles. By the end of June 1944, two of Churchill’s ministers were declaring the empire “broke.”The British continued to wield considerable influence on world affairs, as they do today. But after D-Day, on the battlefields of Europe and in international conference rooms, instead of setting the agenda, the British found themselves having to go along with it.In July 1944, at the Bretton Woods Conference, the British expectation that global finance would remain headquartered in London and transacted at least partially in pounds was frustrated when the International Monetary Fund and what would become the World Bank were headquartered in Washington and the dollar became the currency of international trade. In August 1944, America succeeded in dashing British designs on the eastern Mediterranean for good in favor of a second invasion of France from the south. In September 1944, the more and more notional British command of Allied ground forces in Europe was formally abandoned. In February 1945, at a summit in Yalta, Churchill had little choice but to acquiesce as the United States and the Soviet Union dictated the core terms of Germany’s surrender, the division of postwar Europe, and the creation of a United Nations organization with a mandate for decolonization.How did this happen so quickly? Some of the great political historians of the 20th century, such as David Reynolds, Richard Overy, and Paul Kennedy, have chronicled the many political, cultural, and economic reasons World War II would always have sounded the death knell of the European imperial system. Some British historians have more pointedly blamed the Americans for destabilizing the British empire by fomenting the forces of anti-colonialism (what D. Cameron Watt called America’s “moral imperialism”).Absent from many such accounts is why Britain did not even try to counterbalance America’s rise or use the extraordinary leverage it had before D-Day to win concessions that might have better stabilized its empire. The French did precisely that with far less bargaining power at their disposal, and preserved the major constituents of their own empire for a generation longer than the British did. The warning signs were all there. In 1941, Germany’s leading economics journal predicted the rise of a “Pax Americana” at Britain’s expense. 
“England will lose its empire,” the article gloatingly predicted, “to its partner across the Atlantic.”[Read: How Britain falls apart]The American defense-policy scholar and Atlantic contributing writer Kori Schake recently made a persuasive case that Britain came to accept the role of junior partner in the Atlantic alliance, rather than seek to balance American power, because the two countries had become socially, politically, and economically alike in all the ways that mattered. Britain, in other words, had more to lose by confrontation. And so it chose friendship.The argument makes sense to a point, especially given how close the United Kingdom and the United States are today. But the remembered warmth of the “special relationship” in the 1940s is largely a product of nostalgia. British contempt for American racism and conformist consumerism seethed especially hot with the arrival in the U.K. of 1.5 million Americans. And American contempt for the British class system and its reputation for violent imperialism equally made any U.S. investment in the war against Germany—as opposed to Japan—a political liability for Roosevelt.The British elite had every intention of preserving the British empire and European colonialism more generally. In November 1942, as Anglo-American operations began in North Africa, Churchill assured France that its colonies would be returned and assured his countrymen, “I have not become the King’s First Minister in order to preside over the liquidation of the British Empire.”The British assumed that America’s rise was compatible with that goal because they grossly miscalculated American intentions. This was on stark display in March 1944, just over two months before D-Day, when Britain’s Foreign Office circulated a memorandum setting out the empire’s “American policy.” Given how naive the Americans were about the ways of the world, it said, Britain should expect them to “follow our lead rather than that we follow theirs.” It was therefore in Britain’s interest to foster America’s rise so that its power could be put to Britain’s use. “They have enormous power, but it is the power of the reservoir behind the dam,” the memo continued. “It must be our purpose not to balance our power against that of America, but to make use of American power for purposes which we regard as good” and to “use the power of the United States to preserve the Commonwealth and the Empire, and, if possible, to support the pacification of Europe.”It is easy to see why members of Britain’s foreign-policy elite, still warmed by a Victorian afterglow, might discount Americans’ prattling on about decolonization and democracy as empty wartime rhetoric. If anything, they thought, Americans’ pestering insistence on such ideals proved how naive they were. Churchill often grumbled with disdain about Americans’ sentimental affection for—as he put it—the “chinks” and “pigtails” fighting against Japan in China, scornful of the American belief that they could be trusted to govern themselves.And the face America presented to London might have compounded the misapprehension. Roosevelt was expected to choose George Marshall to be the American commander of Operation Overlord, a position that would create the American equivalent of a Roman proconsul in London. Instead, he picked Dwight Eisenhower.Roosevelt’s reasons for choosing Eisenhower remain difficult to pin down. The president gave different explanations to different people at different times. 
But Eisenhower was the ideal choice for America’s proconsul in London and Europe more generally, if the goal was to make a rising American superpower seem benign.Eisenhower had a bit of cowboy to him, just like in the movies. He was also an Anglophile and took to wearing a British officer’s coat when visiting British troops in the field. He had a natural politician’s instinct for leaving the impression that he agreed with everyone. And he offered the incongruous public image of a four-star general who smiled like he was selling Coca-Cola.He was also genuinely committed to multilateralism. Eisenhower had studied World War I closely and grew convinced that its many disasters—in both its fighting and its peace—were caused by the Allies’ inability to put aside their own imperial prestige to achieve their common goals. Eisenhower’s commitment to Allied “teamwork,” as he would say with his hokey Kansas geniality, broke radically from the past and seemed hopelessly naive, yet was essential to the success of operations as high-risk and complex as the D-Day invasion.Eisenhower, for his part, was often quite deft in handling the political nature of his position. He knew that to be effective, to foster that teamwork, he could never be seen as relishing the terrifying economic and military power at his disposal, or the United States’ willingness to use it. “Hell, I don’t have to go around jutting out my chin to show the world how tough I am,” he said privately.On D-Day, Eisenhower announced the invasion without mentioning the United States once. Instead, he said, the landings were part of the “United Nations’ plan for the liberation of Europe, made in conjunction with our great Russian allies.” While the invasion was under way, Eisenhower scolded subordinates who issued reports on the extent of French territory “captured.” The territory, he chided them, had been “liberated.”The strategy worked. That fall, with Paris liberated, only 29 percent of French citizens polled felt the United States had “contributed most in the defeat of Germany,” with 61 percent giving credit to the Soviet Union. Yet, when asked where they would like to visit after the war, only 13 percent were eager to celebrate the Soviet Union’s contributions in Russia itself. Forty-three percent said the United States, a country whose Air Force had contributed to the deaths of tens of thousands of French civilians in bombing raids.In rhetoric and often in reality, the United States has continued to project its power, not as an empire, but on behalf of the “United Nations,” “NATO,” “the free world,” or “mankind.” The interests it claims to vindicate as a superpower have also generally not been its imperial ambition to make America great, but the shared ideals enshrined soon after the war in the UN Charter and the Universal Declaration of Human Rights.Had the D-Day invasion failed, those ideals would have been discredited. Unable to open the Western Front in France, the Allies would have had no choice but to commit to Britain’s strategy in the Mediterranean. The U.S. military, and by extension the United States, would have lost all credibility. The Soviets would have been the only meaningful rival to German power on the European continent. 
And there would have been no reason for the international politics of national prestige and imperial interest to become outmoded.Instead, on D-Day, American soldiers joined by British soldiers and allies from nearly a dozen countries embarked on a treacherous voyage from the seat of the British empire to the shores of the French empire on a crusade that succeeded in liberating the Old World from tyranny. It was a victory for an alliance built around the promise, at least, of broadly shared ideals rather than narrow national interests. That was a radical idea at the time, and it is becoming a contested one today. D-Day continues to resonate as much as it does because, like the battles of Lexington and Concord, it is an almost-too-perfect allegory for a decisive turning point in America’s national story: the moment when it came into its own as a new kind of superpower, one that was willing and able to fight for a freer world.
theatlantic.com
Billions in taxpayer dollars now go to religious schools via vouchers
The rapid expansion of state voucher programs follows court decisions that have eroded the separation between church and state.
washingtonpost.com
Claudia Sheinbaum to Become First Woman President After Mexico’s Bloodiest Ever Election
Claudia Sheinbaum secured a landslide victory and will become the first female president of Mexico, according to official projections Monday. The 61-year-old climate scientist and former mayor of Mexico City took at least 58.3 percent of the vote, according to early results, a lead of almost 30 points over her main competitor, businesswoman Xóchitl Gálvez. Sheinbaum, a leftist, will also become the first person with Jewish heritage to lead the predominantly Catholic country when she takes over from her mentor and Mexico’s incumbent President Andres Manuel López Obrador in October. Mexico’s ruling coalition was also on course to potentially seal a two-thirds supermajority in both houses of Congress, which would allow it to pass constitutional reforms without requiring the support of the opposition. Read more at The Daily Beast.
thedailybeast.com
Used paperbacks change lives behind bars, even with growing prison book bans
Rejected books include manuals like “Nutrition for Dummies” as well as thousands of other titles such as “Malcolm X Speaks: Selected Speeches and Statements.”
washingtonpost.com
What Is Removed Through the Use of a Depilatory?
Test your wits on the Slate Quiz for June 3, 2024.
slate.com
Slate Crossword: A Detective Might Connect Them to Close a Case (Four Letters)
Ready for some wordplay? Sharpen your skills with Slate’s puzzle for June 3, 2024.
slate.com
The Key to Going After Donald Trump Now That He’s Been Convicted
Biden's response is still too muted.
slate.com
Gun-Toting Republican Candidate Raps On 'Free Trump' Song
Valentina Gomez, who is running to be the GOP's candidate for Missouri's secretary of state, has dropped a verse on the new Hi-Rez track "America First."
newsweek.com
North Korea Says 3,500 Trash Balloons Launched Across Border
South Korean residents were told to remain indoors to avoid touching the items.
1 h
newsweek.com
Everything you need to know about the Hunter Biden federal gun case
Hunter Biden, 54, is poised to become the first child of a sitting president to stand trial on criminal charges, over his alleged illegal possession of a firearm while addicted to narcotics.
1 h
nypost.com
Ukraine's F-16s Can Strike Inside Russia, NATO Ally Confirms
The Defense Minister for the Netherlands outlined the sole condition for Kyiv's use of the 24 combat jets heading to Ukraine for strikes on Russia.
1 h
newsweek.com
Five Huge New Construction Projects Coming To The US
Four projects in Texas and one in Florida are soon set to begin as part of a $1.2 billion bundle of contracts.
1 h
newsweek.com