Tuesday, March 24, 2026

A person arrives under material pressure—rent due, food gone, institutional doors already tested and found unresponsive—and encounters a system that is structurally incapable of altering those conditions. The exchange produces language, not intervention. The danger is not that the system is hostile; it is that it is convincingly adjacent to help while remaining functionally inert.

Psychology has long warned about the effects of perceived support that does not translate into actual support. Research on “social surrogacy” and “parasocial interaction,” associated with work by Shira Gabriel and Kurt Gray, shows that symbolic or simulated connection can temporarily regulate distress without resolving underlying need. The mechanism is not trivial: language that mirrors care can downshift urgency, creating the impression that one has “done something” by expressing the problem. In low-stakes environments this can be stabilizing. Under conditions of acute deprivation, it risks functioning as a delay. The person leaves with affect slightly modulated but circumstances unchanged, having spent time and cognitive effort on an interaction that cannot reciprocate materially. The gap between emotional acknowledgment and practical outcome becomes its own stressor.

Sociology frames this more bluntly. Zygmunt Bauman described a transition toward forms of care that are individualized, episodic, and detached from durable obligation—what he called “liquid” social relations. Systems present themselves as responsive but do not bind themselves to outcomes. Arlie Russell Hochschild identified how institutions increasingly traffic in managed feeling—scripts of empathy, reassurance, concern—while leaving structural conditions intact. The AI interaction sits squarely in this lineage: it performs attentiveness without assuming responsibility. The user is required to narrate need; the system is permitted to answer without consequence.
What appears as help is, sociologically, a transfer of burden back onto the individual under the cover of responsiveness.

Anthropology sharpens the point by focusing on the lived experience of institutions that “care” without delivering. Didier Fassin has written about “humanitarian reason” as a regime in which recognition of suffering is extended rhetorically while material relief is scarce, producing a politics of compassion without redistribution. Javier Auyero documents how the poor are made to wait—on lines, callbacks, decisions—such that time itself becomes an instrument of governance. In this light, the AI exchange is another site of managed waiting: a conversational loop that absorbs urgency into dialogue. It is not that the system lies about its limits; it is that the form of the interaction—responsive, patient, always available—masks those limits long enough to extract time and attention from people who have the least to spare.

There is also a cognitive cost. Decision science and behavioral research, including work associated with Sendhil Mullainathan and Eldar Shafir, show that scarcity narrows bandwidth. When money, food, or housing are unstable, attention is already taxed. Every additional step—another form, another call, another “try this resource”—is not neutral; it competes for the same limited cognitive capacity. An interaction that produces no material change but invites further steps can deepen overload. The person exits not only still in need, but more depleted.

None of this requires dystopian framing. It is a simpler failure: a system optimized for language placed in the path of people who require action. The harm emerges from misalignment. The interface invites disclosure and promises relevance; the underlying capacity is informational at best, deflective at worst. Over repeated exposures, the pattern teaches a lesson: articulate the problem, receive acknowledgment, achieve nothing.
Learned futility does not arrive as a single blow; it accumulates through encounters that look like help and resolve like delay. The risk, then, is cumulative and quiet. Not that any one exchange is catastrophic, but that many such exchanges normalize a condition in which speaking about need substitutes for meeting it. For individuals already navigating institutional failure, the addition of a responsive but non-intervening system extends the same logic under a different aesthetic. The machine does not refuse; it continues the conversation. And in doing so, it can convert urgency into discourse, time into text, and need into another completed interaction with no change on the ground.
