AI's Environmental Footprint
Understanding the energy costs of AI systems and what healthcare organizations should consider when deploying these technologies.
As healthcare increasingly adopts AI, what are the environmental implications—and how should clinicians and institutions factor sustainability into AI deployment decisions?
Overview
AI systems require substantial computational resources, which translates to significant energy consumption. For clinicians evaluating AI tools, understanding these environmental costs is part of responsible technology adoption—particularly as healthcare organizations increasingly prioritize sustainability.
This appendix provides context on AI's environmental footprint without advocating for or against AI adoption. The goal is informed decision-making.
The Scale of AI Energy Use
Training vs. Inference
AI energy costs come in two forms:
- Training: The one-time (or periodic) process of building the model. Training a large language model such as GPT-4 requires massive computational resources; published estimates suggest a single training run can consume as much electricity as hundreds of homes use in a year.
- Inference: The ongoing cost of running the model to generate outputs. Every query to ChatGPT and every image analyzed by a diagnostic AI consumes energy. At scale, cumulative inference energy often exceeds the energy used for training.
Early estimates suggested that a ChatGPT query used roughly ten times the energy of a Google search. More recent analyses indicate the gap has narrowed significantly, and modern AI queries may use energy comparable to a traditional search. At massive scale, however, even small per-query costs compound into substantial infrastructure demands.
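To illustrate how modest per-query costs add up, here is a minimal back-of-envelope sketch in Python. Every figure in it (energy per query, query volume, the household comparison point) is an illustrative assumption, not a measured value for any particular system.

```python
# Back-of-envelope sketch: how small per-query energy costs compound at scale.
# All figures are illustrative assumptions, not measurements of any real system.

WH_PER_QUERY = 0.3        # assumed energy per AI query, in watt-hours
QUERIES_PER_DAY = 50_000  # assumed daily AI queries across a large health system
DAYS_PER_YEAR = 365

annual_kwh = WH_PER_QUERY * QUERIES_PER_DAY * DAYS_PER_YEAR / 1000

# Rough comparison point: an average U.S. household uses on the order of
# 10,000 kWh of electricity per year.
household_equivalents = annual_kwh / 10_000

print(f"~{annual_kwh:,.0f} kWh per year, "
      f"roughly {household_equivalents:.1f} average households")
```

Swapping in higher query volumes or a per-query figure an order of magnitude larger changes the result proportionally, which is why deployment scale matters as much as per-query efficiency.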
Data Center Impact
AI workloads are driving rapid expansion of data centers, which require:
- Electricity: For computation and cooling systems
- Water: Many data centers use evaporative cooling, consuming millions of gallons annually
- Hardware: GPUs and specialized AI chips have their own manufacturing footprint
Major tech companies have seen their carbon emissions rise despite renewable energy commitments, largely due to AI infrastructure expansion.
Healthcare-Specific Considerations
Imaging AI
Radiology and pathology AI systems process large image files, a computationally intensive task. A hospital that runs AI analysis on every chest X-ray or CT scan incurs continuous inference costs. The benefit (faster diagnosis, reduced workload) must be weighed against the resource use.
Ambient Documentation
AI scribes that transcribe and summarize clinical encounters require continuous speech-to-text processing plus LLM summarization. For a health system with thousands of daily encounters, this represents substantial ongoing computation.
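As a rough sense of scale, the sketch below estimates the ongoing compute for an ambient documentation deployment. All of the inputs (encounter volume, audio length, per-minute and per-summary energy) are hypothetical planning assumptions, not vendor figures.

```python
# Rough sketch of ongoing compute for ambient documentation (AI scribes).
# Every input is a hypothetical planning assumption, not vendor or measured data.

ENCOUNTERS_PER_DAY = 5_000    # assumed daily recorded encounters in a health system
MINUTES_PER_ENCOUNTER = 15    # assumed audio length per encounter
WH_PER_AUDIO_MINUTE = 0.1     # assumed speech-to-text energy per audio minute
WH_PER_SUMMARY = 1.0          # assumed LLM summarization energy per encounter

daily_wh = (ENCOUNTERS_PER_DAY * MINUTES_PER_ENCOUNTER * WH_PER_AUDIO_MINUTE
            + ENCOUNTERS_PER_DAY * WH_PER_SUMMARY)
annual_kwh = daily_wh * 365 / 1000

print(f"Estimated scribe compute: ~{annual_kwh:,.0f} kWh per year")
```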
Clinical Decision Support
Real-time CDS tools that query AI models for every patient interaction (alerts, suggestions, risk scores) generate high inference volumes. The clinical value per query varies—some alerts are ignored, others change care.
Frameworks for Thinking About This
Net Benefit Analysis
Consider AI's environmental cost in context (a rough worked comparison follows this list):
- Does the AI reduce other resource use? (e.g., fewer unnecessary tests, shorter hospital stays, reduced travel for telehealth)
- What's the alternative's footprint? (e.g., flying pathology slides across the country vs. digital analysis)
- Does improved diagnosis prevent downstream resource-intensive care?
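To make the framework concrete, here is a minimal sketch comparing the footprint of digital slide analysis against shipping physical slides for remote review. The case volume, per-case compute, grid carbon intensity, and per-shipment emissions are all hypothetical assumptions chosen only to show the structure of the comparison.

```python
# Illustrative net-benefit sketch: digital pathology AI vs. shipping physical slides.
# All numbers are hypothetical assumptions used to show the comparison, not real data.

CASES_PER_YEAR = 2_000

# Option A: digital analysis (compute footprint only)
KWH_PER_CASE_COMPUTE = 0.05   # assumed energy per AI slide analysis
KG_CO2_PER_KWH = 0.4          # assumed grid carbon intensity
digital_kg_co2 = CASES_PER_YEAR * KWH_PER_CASE_COMPUTE * KG_CO2_PER_KWH

# Option B: overnight courier of physical slides (transport footprint only)
KG_CO2_PER_SHIPMENT = 2.0     # assumed air-freight emissions per shipment
shipping_kg_co2 = CASES_PER_YEAR * KG_CO2_PER_SHIPMENT

print(f"Digital analysis: ~{digital_kg_co2:,.0f} kg CO2e per year")
print(f"Slide shipping:   ~{shipping_kg_co2:,.0f} kg CO2e per year")
```

A real assessment would include many more terms (data storage, scanner energy, embodied hardware emissions), but even this toy comparison shows why the alternative's footprint belongs in the calculation.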
Proportionality
Not all AI use cases are equal:
- High value: AI that catches cancers, prevents sepsis deaths, or enables care access may justify significant resource use
- Lower value: AI for convenience features with marginal clinical benefit deserves more scrutiny
Institutional Responsibility
Individual clinicians have limited control over infrastructure choices. But institutions can:
- Select vendors with strong sustainability practices
- Choose cloud providers powered by renewable energy
- Right-size AI deployments (not everything needs the largest model)
- Consider on-premise vs. cloud trade-offs
What Clinicians Can Do
Ask Questions
When evaluating AI tools, consider asking vendors:
- Where does computation happen? What's the data center's energy source?
- Is this the right-sized model for the task, or is it using a general-purpose LLM when a smaller specialized model would suffice?
- What's the per-query resource footprint?
Use Thoughtfully
This isn't about avoiding AI—it's about intentional use:
- Use AI when it adds clinical value, not just because it's available
- Consider whether the task requires the most powerful model or if simpler tools suffice
- Batch queries when possible rather than multiple small requests
Advocate Institutionally
Push for sustainability to be part of AI procurement criteria. Health systems increasingly have sustainability officers and carbon reduction commitments—AI infrastructure should be part of those conversations.
Perspective
Healthcare itself has a substantial environmental footprint—estimated at 8-10% of U.S. greenhouse gas emissions. AI is one factor among many, including building operations, supply chains, and pharmaceutical manufacturing.
The question isn't whether AI has environmental costs (it does) but whether its benefits justify those costs and whether we're deploying it thoughtfully. The same critical thinking you apply to clinical AI effectiveness should extend to its broader impacts.
AI's environmental footprint is real and growing. As clinicians, you're not responsible for solving this at the infrastructure level, but being informed allows you to advocate for responsible deployment and make thoughtful choices about the tools you use.
Reflection Questions
- How would you weigh environmental costs against clinical benefits when evaluating an AI tool for your practice?
- Does your institution have sustainability criteria for technology procurement? Should AI be included?
- Are there AI tools you currently use where a simpler, less resource-intensive alternative might suffice?