“The easiest thing in the world was to lose touch with someone.”
— Leslie T. Chang, Factory Girls
I came across this line while reading Leslie T. Chang's Factory Girls, a book about China's rural young women who migrate to cities seeking work, identity, and independence. It struck something deep. The line is simple, but not hollow. In fact, it feels heavy—like a quietly spoken truth that swells with the weight of what it leaves unsaid. These girls, moving from factory to factory, job to job, were not just losing touch with people, but with parts of themselves, traded in for survival, for motion, for modernity.
And perhaps that's what I'm feeling, too. Reading this book, written by a Chinese woman trying to reconcile the distance between place and self, I found myself reflecting on my own liminal state—an immigrant with dual citizenship, both tethered and untethered to the United States. Always partially within, always partially outside. The easiest thing in the world is to lose touch with people. But what happens when we lose touch with care? With responsibility? With ground truths?
It seems we are all becoming experts at disconnection. Especially those of us in white-collar knowledge work, who can name every Supreme Court decision and quote every pop-culture hot take but can't remember the last time we talked to a neighbor struggling with rent. I wonder if comfort is dulling our senses. If the abstraction of labor, of data, of policy, has become a shield against inconvenient empathy.
The statistics speak to this. That Democrats, despite being a party of "the people," are increasingly concentrated in cities and online, less engaged in face-to-face organizing, more reliant on digital signaling. That working-class participation in political life is declining, even as inequality rises. That voters with the most material precarity are often the ones with the least faith that voting matters at all.
What does that mean for the future we're supposedly building?
Tech leaders like Sam Altman promise UBI and post-capitalist liberation through artificial intelligence. They imagine a future where none of us has to work, where machines generate wealth and governments redistribute it. It's a nice story. But it depends on a benevolent state, and it presumes a shared moral commitment to each other.
But what if the systems being built aren’t designed for care?
This is where the dovetail begins—and it is awkward, uneasy, and important. As an immigrant, I see the promise of abundance with a different kind of skepticism. I come from a world where systems don't automatically protect you, where your rights are conditional, where the rule of law can be bent or blurred depending on your proximity to power. So when I hear American tech founders talk about UBI as if it were inevitable, I wonder: inevitable for whom? Administered by what kind of state? Informed by what kind of morality?
The Trumpian vision of government is not some temporary anomaly. It is a deliberate reimagining of the state as an instrument of loyalty and punishment, not fairness. We see it in efforts to replace career civil servants with ideologues. We see it in the erasure of words like "gender" and "disability" from federal language. We see it in the way public health and public education are being gutted under the pretense of efficiency.
Is it so difficult to imagine these two ideologies—Altman’s benevolent algorithm and Trump’s authoritarian state—meeting in the middle?
A future where UBI exists, but only for those who align with certain values. Where automation reduces labor, but also makes workers more disposable. Where the social safety net is not strengthened, but privatized, conditional, monitored. A future where tech offers the illusion of equity while power continues to entrench hierarchy.
Capitalism may not collapse—it may just change costumes. Post-capitalism could easily look like this: a privatized state, a surveillance society, a managed class of citizens whose needs are anticipated by algorithms but never truly met. A country that pretends to offer choice while narrowing the boundaries of what it means to participate.
And in that world, the easiest thing to do will still be to lose touch with people. But it won’t be an accident. It will be the design.
But there is a counterweight to this vision—a deeper truth about human value that resides in the kinds of work we are taught not to aspire to. Teaching. EMTs. Hospice care. Warehouse night shifts. Cleaning hotel rooms. Driving buses. Preparing meals in school cafeterias. These are not jobs meant to be scaled, automated, or "optimized." They are work that returns us to one another, work that requires—and rewards—presence, care, human intelligence.
These blue-collar and care-centered occupations may be the only ones left with the power to restore a sense of equilibrium. Rather than pushing everyone into tech-driven abstraction, perhaps we need more people to move in the opposite direction: toward the professions that keep us alive, grounded, and interdependent.
The intelligent application of AI should support this—not supplant it. What if we used machine intelligence not as governance, but as augmentation? As scaffolding that lifts workers without dulling their senses or stripping away their humanity? Imagine AI that helps an EMT anticipate what’s next in a crisis, but doesn’t decide whose life is worth more. Imagine software that eases paperwork for teachers without prescribing how children should learn. Imagine tools that empower caregivers, not track them.
The future cannot simply be automated—it has to be animated by values. If we want something different—if we believe a liberated future requires more than machines—we will need to reconnect. With labor. With place. With discomfort. We will need to remember that progress isn’t what we build. It’s what we refuse to abandon.