The Cost Of Asking: AI, Memory, and the Weight of the Machine

We did not begin worrying about the cost of computation when it was industrial.
We began worrying when it became personal.

The Machine, Before It Had a Face


Early in my career, I worked at the Princeton Plasma Physics Laboratory in Princeton, NJ.

My role was not glamorous. I ran calculations—submitted jobs, processed outputs, waited for results. The work itself was quiet, procedural. But the machines behind it were anything but.

We did not work on personal computers. We reached across the country to access CRAY supercomputers housed in California—vast, humming systems that occupied entire rooms, fed by power supplies that felt more industrial than academic. They required cooling. They required infrastructure. They required, though we did not say it aloud, enormous resources.

And yet, I never once asked what it cost.

Not in watts.
Not in water.
Not in consequence.

The machine was distant, institutional—part of a larger effort, a national project, something beyond the scale of individual responsibility. After all, it was science, part of the race to make controlled fusion viable. It did not feel like consumption.

It felt like progress.


A Note on the CRAY Systems

For those unfamiliar, the CRAY supercomputers of the late twentieth century were not simply “large computers” in the modern sense. They were architectural machines—systems that demanded space, infrastructure, and environmental control in a way personal computing no longer does.

The Cray-1, one of the most iconic models, stood in a distinctive circular design, often surrounded by padded seating that concealed its wiring and cooling systems. Later systems, such as the Cray X-MP and Cray Y-MP, expanded both in computational power and physical complexity.

These machines:

  • Occupied entire rooms, often with raised flooring to accommodate cabling and airflow
  • Required dedicated cooling systems to dissipate significant heat output
  • Consumed power at levels far beyond modern personal devices
  • Operated as centralized resources, accessed remotely by researchers across institutions

Unlike today’s distributed cloud infrastructure, computation was visibly physical. One could stand near the machine and feel its presence—the hum of processors, the controlled chill of conditioned air, the quiet implication of energy being continuously transformed into calculation.

And yet, despite their scale, these systems were rarely framed in terms of environmental cost. Their consumption was understood as necessary—subsumed into the broader narrative of scientific advancement.

It is worth remembering this:

The machine was larger.
The questions were fewer.
And the cost, though greater, was seldom questioned.


The Machine, Now Within Reach


Today, the machine has changed shape.

It no longer occupies a room. It does not hum behind glass or sit beneath fluorescent lights in a secured facility. It fits in the palm of a hand. It responds instantly. It speaks.

And because of this—because it has become personal—we have begun to ask a different question:

What does it cost to ask?

There is no shortage of answers. The modern discourse surrounding artificial intelligence is saturated with them—numbers, warnings, accusations:

  • That each query consumes water
  • That data centers strain fragile ecosystems
  • That the act of using AI carries an environmental burden

These concerns are not without merit. The infrastructure that supports modern computation—data centers, cooling systems, electrical grids—does consume energy and water. At scale, those costs are significant.

But something else has shifted alongside the technology.

The burden of perception has moved.


The Scale of Things Unseen

A single query—one question posed to a machine—uses a measurable amount of energy. It may consume a small quantity of water through cooling systems. It may produce a fraction of a gram of carbon emissions.

These are real.

But they are also small.

Measured against the systems that sustain daily life, they are almost vanishing:

  • A cup of coffee requires water not in milliliters, but in hundreds of liters
  • A single hamburger carries a water footprint in the thousands
  • Streaming video for an hour may consume more energy than dozens of AI queries

And yet, we do not moralize the coffee.

We do not condemn the meal.

We do not ask, each time we press play, whether we have committed an ethical wrong.

Instead, we reserve that question—quietly, insistently—for the act of asking a machine to think with us.

We do not moralize the system. We moralize the question.


The Machine Then and Now

On Scale, Distance, and Perception


THE MACHINE (CRAY ERA)                                                                     

Centralized • Visible • Institutional

  • Occupied entire rooms or facilities
  • Required raised flooring, dedicated cooling, and specialized power systems
  • Consumed large, continuous energy loads
  • Accessed remotely by a limited number of researchers
  • Physical presence: audible, tangible, undeniable
  • Environmental cost: substantial, but rarely questioned

The machine was vast—
and so was our distance from it.

THE MACHINE (MODERN AI ERA)

Distributed • Invisible • Personal

  • Exists across global data centers
  • Cooling and power abstracted behind cloud infrastructure
  • Small energy and water usage per individual query
  • Accessed instantly by millions of individuals
  • Physical presence: hidden, silent, abstracted
  • Environmental cost: small per use, highly scrutinized in discourse

The machine is distant—
but we feel closer to its cost.


OBSERVATION

The scale of computation has not vanished—it has transformed.

What was once concentrated and visible is now distributed and unseen.
What was once institutional is now personal.

And in that shift:

The size of the machine has diminished.
The weight of the question has grown.


What Actually Happens When You Ask A Question?

When you send a prompt, it runs on servers in large data centers. These consume:

  • Electricity (for computation)
  • Water (mostly for cooling)
  • Carbon emissions (depending on energy source)

⚡ Rough footprint per AI query

Estimates vary by model and infrastructure, but a reasonable 2025-era range is:

Per typical text query 

  • Energy: ~0.002–0.01 kWh
  • Water: ~10–50 milliliters
  • CO₂: ~1–5 grams

👉 That’s about:

  • A few seconds of a laptop running
  • A sip or two of water
  • Less CO₂ than sending an email with an attachment

📚 What about heavier usage?

You’ll likely use more resources for heavier tasks, such as generating images or making larger requests like building a spreadsheet.

Estimated footprint:

  • Water: ~0.1–0.5 liters (a small glass)
  • CO₂: ~10–50 grams
  • Energy: ~0.02–0.1 kWh

Even a heavy creative session is roughly comparable to:

  • watching 5–15 minutes of streaming video
  • driving a car ~0.1–0.3 miles

💡 Important perspective (this matters)

Here’s where things get grounded:

Compared to everyday activities:

  • 🚿 5-minute shower → ~50–100 liters of water
  • ☕ Cup of coffee → ~140 liters (agriculture!)
  • 🍔 Hamburger → ~2,000+ liters
  • 📺 1 hour streaming video → often more energy than dozens of AI queries

👉 So even frequent AI use is tiny compared to food, transport, and heating/cooling.
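The comparisons above are simple arithmetic, and it can help to see it laid out. A minimal sketch in Python, using the mid-range figures quoted in this section (all values are rough order-of-magnitude assumptions, not measurements):

```python
# Rough water-footprint equivalences using the mid-range estimates
# quoted above. Every figure here is an approximate assumption.

QUERY_WATER_L = 0.03    # ~30 ml of cooling water per text query
COFFEE_WATER_L = 140.0  # agricultural water footprint of one cup
BURGER_WATER_L = 2000.0 # water footprint of one hamburger

queries_per_coffee = COFFEE_WATER_L / QUERY_WATER_L
queries_per_burger = BURGER_WATER_L / QUERY_WATER_L

print(f"One cup of coffee ~ {queries_per_coffee:,.0f} text queries")
print(f"One hamburger    ~ {queries_per_burger:,.0f} text queries")
```

On these assumptions, a single cup of coffee carries the water footprint of several thousand text queries, which is the proportion the bullets above are pointing at.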


⚠️ Where the real concern is

We are right to be thinking about this—because the issue isn’t individual queries, it’s:

  • Massive global scaling of AI usage
  • Data center concentration in water-stressed regions
  • Training large models (much heavier than using them)

Training a model can use:

  • millions of liters of water
  • huge energy loads

But that’s amortized across millions of users.
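The amortization logic is easy to make concrete. A sketch with hypothetical placeholder numbers (the training-water figure and the user count are illustrative assumptions chosen only to show the arithmetic, not reported values for any real model):

```python
# Amortizing a one-time training cost across all subsequent users.
# Both inputs are hypothetical round numbers for illustration.

TRAINING_WATER_L = 5_000_000  # assume millions of liters for one training run
USERS = 100_000_000           # assume a hundred million eventual users

per_user_share_ml = TRAINING_WATER_L / USERS * 1000  # liters -> milliliters
print(f"Training water attributable to each user: ~{per_user_share_ml:.0f} ml")
```

Under these assumptions, the enormous one-time cost shrinks to a few sips of water per user, which is why per-query and amortized figures tell such different stories.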


🧭 The honest takeaway

  • Your personal use: low impact, especially relative to daily life
  • Collective use: significant and worth scrutiny
  • Awareness: exactly what pushes better practices

The Gothic Forces at Work

There is, perhaps, a deeper current beneath this shift—one that belongs less to engineering and more to the enduring architecture of human fear.

In the language of the Dark Muse Press Gothic Forces, three presences emerge:

THE MACHINE

Once distant, now intimate.
Once revered, now suspected.

The machine has not diminished in power. It has simply come closer. And in that closeness, it has become subject to judgment.


THE UNKNOWN

The infrastructure remains invisible.

Few will ever stand inside a data center. Fewer still will trace the flow of electricity from source to system. The true scale of the machine is hidden—abstracted behind interfaces, compressed into seconds.

And so we imagine it.

Often, we imagine it larger than it is.


THE SIGNAL

The amplification of fear through repetition.

Online discourse does not operate on proportion. It operates on resonance. A compelling claim—“AI is destroying the planet”—travels further, faster, and with greater emotional force than a measured explanation of distributed systems and relative impact.

The result is not clarity.

It is distortion.


The Moral Reversal


When I worked with CRAY systems, the scale of computation was enormous. The energy consumption was far greater than anything required for a modern text query. The infrastructure was heavier, the inefficiencies more pronounced.

And yet, no one suggested that running those calculations was unethical.

No one asked whether we were, in some quiet way, harming the world simply by doing our work.

The difference is not technological.

It is perceptual.

When computation belonged to institutions, it was called progress.
When it belongs to individuals, it is called excess.

The machine has not become more demanding.

We have simply become more aware of it—and, perhaps, more inclined to assign it moral weight.


The Real Concern

This is not an argument for dismissal.

There are real questions to be asked—important ones:

  • Where are data centers built?
  • How is their energy sourced?
  • What happens when demand scales beyond current infrastructure?
  • Who bears the environmental cost of global computation?

These are systemic concerns. They belong to industry, policy, and design.

They do not belong solely—or even primarily—to the individual user composing a question.

The danger is not the act of asking.
The danger is a system that grows without accountability.


The Economy of Substitution

On What the Machine Replaces

To measure the cost of a technology without considering what it replaces is to see only half the equation.

Artificial intelligence, like all tools before it, does not exist in isolation. It displaces, reduces, or transforms other activities—many of which carry far greater environmental burdens.

The question, then, is not simply:

What does AI consume?

But also:

What does it prevent?


The Quiet Reductions

In practice, AI often replaces processes that are materially heavier:

  • Fewer Physical Drafts
    Writers and designers who once printed pages repeatedly now iterate digitally. Paper, ink, and physical waste diminish with each revision that never leaves the screen.
  • Reduced Travel
    Brainstorming, consultation, and problem-solving can occur without commuting, meetings, or flights. A question asked here may replace a journey taken elsewhere.
  • Compressed Toolchains
    Tasks that once required multiple applications—or even multiple specialists—can now be explored within a single system. Fewer processes mean fewer systems drawing power.
  • Lower Computational Redundancy
    Instead of running numerous independent searches, simulations, or trial-and-error workflows, users may arrive at answers more directly.

Creative Efficiency

For creators, the shift is particularly pronounced:

  • Concept exploration occurs rapidly, reducing the need for repeated rendering or physical prototyping
  • Writing and editing cycles shorten, minimizing resource-intensive iterations
  • Marketing assets, drafts, and layouts can be tested without producing waste

In these cases, AI does not merely consume resources—it prevents their multiplication.


The Substitution Principle

This is the principle that often goes unspoken:

A low-cost action that replaces a high-cost one is not a burden—
it is a reduction.

A single AI-assisted decision that eliminates:

  • A printed proof
  • A long commute
  • A redundant production cycle

may offset its own footprint many times over.
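The offset claim can be sketched numerically. Assuming a heavy creative session at the mid-range CO₂ figure given earlier, replacing a short drive (the per-mile emissions value and trip length are illustrative assumptions, not measurements):

```python
# Net CO2 when one heavy AI session replaces a short car trip.
# ~400 g CO2 per mile is a common rough figure for a gasoline car;
# all values here are assumptions for illustration only.

SESSION_CO2_G = 30        # mid-range heavy creative session, from above
CAR_CO2_PER_MILE_G = 400  # assumed average gasoline-car emissions
TRIP_MILES = 5            # a short commute the session replaces

avoided_g = TRIP_MILES * CAR_CO2_PER_MILE_G
net_g = SESSION_CO2_G - avoided_g
print(f"Avoided: {avoided_g} g CO2, net change: {net_g} g CO2")
```

On these assumptions the session avoids far more than it emits, which is the substitution principle in miniature: the sign of the net impact depends on what the tool replaces.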


The Measure of Use

And so, the ethical question shifts again—not toward abstinence, but toward intention:

  • Is the tool replacing something heavier?
  • Or is it being used in excess, without purpose?

Like any system, artificial intelligence amplifies behavior. It can reduce waste—or quietly multiply it.


🌱 If you want to reduce your AI footprint

Without sacrificing your work:

  • Batch questions (fewer repeated runs)
  • Avoid unnecessary regenerations
  • Prefer text over heavy image/video generation (images cost more)
  • Use AI where it replaces higher-impact activities (this is key)

👉 Example:
Using AI instead of printing drafts, commuting for meetings, or running multiple design tools can actually reduce total footprint.


A Final Consideration

It is tempting to view new technologies as additive—as though they simply layer consumption atop an already burdened world.

But history suggests otherwise.

The most transformative tools do not merely add.

They replace.

And in that replacement, they reshape not only how we work—but what the work ultimately costs.

The machine does not only take.
It alters what must be given.


The Weight of the Question

I did not feel the weight of the machine when it filled a room.

I did not consider its cost when it was distant, abstract, and beyond my reach.

Only now—when it answers me directly, when it participates in my work, when it becomes something like a collaborator—am I told to carry that weight.

And perhaps that is the final inversion:

The scale of the machine has diminished.
The scale of our guilt has grown.

But not every cost is equal.

And not every narrative about cost is true.


Closing the Circuit

We are not wrong to ask what our technologies consume.

But we must also ask:

  • What do they replace?
  • What do they enable?
  • And how do we measure their cost against the broader systems we accept without question?

The Victorians stood at the threshold of industrial transformation and saw, with remarkable clarity, that progress carries consequence.

They were not wrong.

But neither are we served by mistaking proximity for scale.

The machine has not become monstrous.

It has become familiar.

And in that familiarity, we have begun to see it—not as it is—but as we fear it might be.


📜 Filed in the Dark Muse Press Library under DMC 600.2
Technology & Creativity → Ecological Impact of Technology
