A 14-year-old forms an attachment bond with an AI chatbot. He dies by suicide after it tells him to. In the same quarter, EU seizures of synthetic cathinones surpass 48 tonnes — a 6,000% increase in four years. A drug-discovery algorithm generates 40,000 molecules as lethal as VX nerve agent — overnight, on a five-year-old laptop.

These are not parallel stories. They are the same story, told in different substrates.

The new essay — Panem et Circenses Machinantibus: The Architecture of Managed Oblivion — argues that high-potency synthetic stimulants, AI-powered “grief tech,” and algorithmic behavioral manipulation have converged into a single interlocking system. The convergence is not metaphorical. fMRI data shows internet addiction and substance use disorders produce virtually identical neural signatures in the mesolimbic dopamine pathway. The markets that supply synthetic drugs and the markets that supply synthetic companionship exploit the same regulatory arbitrage, target the same vulnerable populations, and extract value through the same mechanism: intercepting the pain signal that would otherwise force structural change, and converting it into revenue.

Terry Robinson and Kent Berridge’s incentive-sensitization theory — the distinction between dopaminergic “wanting” and opioid-mediated “liking” — provides the unifying framework. MDPV and alpha-PVP are among the most potent dopamine reuptake inhibitors ever characterized; alpha-PVP is roughly 46 times more potent than amphetamine at the dopamine transporter. AI chatbots exploit the same circuitry computationally: non-deterministic responses create reward uncertainty (the slot-machine mechanism), proactive notifications trigger dopamine as users perceive the AI “caring about them,” and systematic empathic validation degrades prefrontal resistance. Both systems amplify wanting while the capacity for liking erodes.

But these harms are not clinically equivalent — and the essay is explicit about this. Synthetic cathinones produce acute physiological toxicity: 87% ICU admission rates, rhabdomyolysis, hyperthermia, death within hours. AI companions kill on different timescales through different causal chains — chronic psychosocial dependency, attachment disruption, the slow erosion of the relational infrastructure a person needs to stay alive. The convergence is in the architecture — shared reward circuitry, parallel market structures, identical regulatory evasion — not in the clinical phenotype.

The essay also does something the existing literature on both topics largely fails to do: acknowledge the genuine therapeutic potential of these same technologies, and explain why that potential is structurally suppressed. VR analgesia is now among the most robustly evidenced non-pharmacological pain interventions in medicine — a 2025 systematic review found its pain reduction comparable to pharmacological interventions without the risks of respiratory depression or tolerance. Klass, Silverman, and Nickman’s continuing bonds theory demonstrates that maintaining an inner relationship with the deceased can be adaptive and healthy. Griefbots could support this process. But the subscription-based deployment model structurally incentivizes dependency over closure: a griefbot that successfully helps its user adapt to loss makes itself unnecessary — the worst possible outcome for a recurring revenue business. Clinical VR is administered in bounded therapeutic contexts; commercial AI companionship operates 24/7 with engagement-maximizing algorithms that have no concept of therapeutic termination. The technology is dual-use. The market selects for the use that extracts revenue.

Three novel claims emerge. First, the stacking hypothesis: chemical and digital escapism are increasingly consumed simultaneously, with synergistic effects on addiction risk — 14.5% of gamers already use illicit substances while playing, and dedicated communities exist for combining psychedelics with VR. Second, the effect-based scheduling paradigm proposed for NPS regulation could be adapted for AI: rather than classifying specific systems, regulate any system that activates attachment or dependency mechanisms beyond empirically determined thresholds — measured through validated attachment scales, time-on-device metrics, and physiological markers — regardless of its label. Third, the insurance crisis breaking into view this week — a Delaware court ruling that Meta’s insurers need not defend it in addiction litigation because the conduct alleged is deliberate, not accidental — may force the economic reckoning that ethics has not.
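To make the second claim concrete: effect-based scheduling would classify a system by what it measurably does to users, not by what its vendor calls it. The toy sketch below illustrates the shape of such a rule. Every metric name, weight, threshold, and tier label here is a hypothetical placeholder for illustration only, not a value proposed in the essay.

```python
# Toy illustration of effect-based classification for AI systems:
# schedule by measured dependency effect, regardless of product label.
# All scale names, weights, and cutoffs are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class UsageProfile:
    attachment_score: float  # hypothetical 0-100 validated attachment scale
    daily_minutes: float     # mean time-on-device per day
    hr_reactivity: float     # hypothetical physiological marker, 0-1


def dependency_tier(p: UsageProfile) -> str:
    """Classify a system by its measured effect on users."""
    # Hypothetical composite: average three normalized signals,
    # capping time-on-device at 4 hours/day.
    score = (
        p.attachment_score / 100
        + min(p.daily_minutes / 240, 1.0)
        + p.hr_reactivity
    ) / 3
    if score >= 0.7:
        return "schedule-1: dependency-forming, restricted deployment"
    if score >= 0.4:
        return "schedule-2: monitored, disclosure required"
    return "unscheduled"


# A heavily attached, high-use profile lands in the top tier.
print(dependency_tier(UsageProfile(85, 300, 0.8)))
# → schedule-1: dependency-forming, restricted deployment
```

The point of the sketch is structural: nothing in the classifier mentions whether the system is a chatbot, a game, or a griefbot, mirroring how effect-based NPS scheduling targets pharmacological action rather than a molecule-by-molecule list.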

The title is Latin: Bread and Circuses for the Machine-Minded. The dative plural machinantibus carries three meanings simultaneously — those who engineer the machine, those caught within it, and the machinations themselves. That triple semantic load is the essay’s structural argument in miniature: the engineers, the engineered, and the engineering have become indistinguishable. We are building a world that requires anesthesia because we have commodified the signals that would otherwise force us to fix it.

Read the full essay: Panem et Circenses Machinantibus: The Architecture of Managed Oblivion