Even the Fed Has Caught the Singularity Bug
When central banks start sketching sci-fi GDP curves, you know AI hype has gone too far.
I expect AI hype from Silicon Valley, not from the Federal Reserve Bank of Dallas.
Yet here we are: the Dallas Fed recently published a chart titled “AI Scenarios” with two extraordinary projections for U.S. economic growth. One line, labeled “Singularity: Benign,” is a fantasy curve rocketing off the chart toward infinity. The other, “Singularity: Extinction,” plunges straight to zero GDP.
Neither line is based on a model, a regression, or even a spreadsheet. They’re simply drawn narrative arcs in a research brief that’s supposed to be about real economic data.
And that’s the problem. Here is the chart:

[Figure: Dallas Fed “AI Scenarios” chart, showing GDP-per-capita projections]
When the Fed starts drawing fan fiction
The chart is, technically, part of a blog post about AI’s potential economic impact. But the visuals tell a different story. You can see 150 years of steady per-capita GDP growth, roughly 1.9% a year, through world wars, oil shocks, and the Internet revolution. Then, right after 2027, the lines explode into science fiction: one toward transcendence, the other toward annihilation.
It’s not clear why the Fed is publishing illustrations of extinction scenarios. These aren’t risk models or tail-event simulations. They’re simply stories with axes.
The orange “trend GDP” line, based on a century and a half of data, is the serious one. The green “AI-boosted trend” (a modest 0.2% uptick for a decade) is reasonable. Everything beyond that is a cartoon of belief: what happens when policymakers start importing Silicon Valley’s metaphors instead of testing them.
Why it matters
For most of its history, the Federal Reserve has been a temple of empiricism. It’s where enthusiasm goes to get measured. But now, even the Fed seems to feel the gravitational pull of AI exceptionalism, the idea that this time really is different, that the exponential line won’t bend back to the mean.
That’s not analysis; that’s contagion.
When a central bank puts “Singularity: Extinction” on a chart, it’s signaling something deeper than curiosity. It’s admitting that narrative has invaded the room where empiricism used to live.
We’ve reached a strange place in public discourse: AI has become both theology and macroeconomics.
The slow facts of the real economy
Step back from the fever. Real data tells a simpler, steadier story.
GDP per capita has hugged the same growth path since the Reconstruction era.
Unemployment and underemployment remain near pre-AI levels.
Income distribution looks almost unchanged from before ChatGPT’s release.
No productivity boom, no macro disruption, no visible “singularity” in the numbers.
That doesn’t mean AI isn’t transformative. It just means transformations are slow, diffusion takes years, complementary processes take longer, and institutional inertia absorbs most shocks.
If history is a guide, any visible economic impact from LLMs will look more like the spreadsheet than the steam engine: gradual, distributed, and hard to measure until we stop noticing.
The politics of belief
This is, at its core, a political problem.
Institutions like the Fed aren’t supposed to amplify cultural myths; they’re supposed to discipline them. But the AI singularity narrative is now so dominant that even empiricists are sketching speculative futures to stay relevant.
It’s not just economists. Every sector now faces pressure to frame its work in the language of “AI transformation.” Universities. Law firms. City governments. Nobody wants to sound like they’re missing the revolution, even if the revolution is mostly vapor.
That’s how belief becomes policy and policy becomes distortion.
When official documents start echoing the same language as VC pitch decks and futurist YouTube channels, it signals a deeper shift: the merging of credibility and spectacle.
The real lesson of the chart
If you look past the red and purple fantasy lines, the real story is sitting quietly underneath them: the orange line.
That 1.9% annual growth trend, astonishingly stable for 150 years, represents the stubborn reality of modern economies. It’s what happens when innovation, policy, demography, and luck average out.
The Fed’s own chart accidentally demonstrates that reality is hard to bend.
And that’s where the “AI economy” probably lives: not in the stratosphere of singularity, but in the narrow band between ceiling and floor, constrained by data, computation, diffusion, and human coordination.
That’s not doom. It’s discipline.
What this tells us about ourselves
The real story here isn’t that AI might end civilization or make us immortal. It’s that serious institutions have started narrating like futurists.
And when that happens, the line between governing and imagining starts to blur.
AI isn’t just a technological event; it’s a cultural one. It’s teaching our bureaucracies to dream and our scientists to speculate. The risk isn’t extinction or transcendence; it’s distraction.
When central bankers start drawing the Singularity, it’s not the machines that are misaligned — it’s us.
About the Author
Sean Richey, Ph.D., is a Professor of Political Science at Georgia State University specializing in AI information environments and digital political communication.
Expert Witness & Consulting Services
Dr. Richey provides expert witness testimony, case review and analysis for counsel, survey methodology evaluation, and policy consulting on AI-related information environments. Visit my website or email consulting@seanrichey.com.


