EcoTech Framework

Empowering Developers to Measure, Predict, and Reduce AI's Environmental Impact
Date: Fall 2024

THE ECOTECH FRAMEWORK: AN EMISSIONS CHECK FOR AI

This project, undertaken by students at The Ohio State University’s Computer Science and Engineering program in collaboration with 99P Labs, tackles one of the most pressing challenges in modern technology: the environmental impact of artificial intelligence. Dubbed “EcoTech Framework,” the initiative came together to address the rising energy demands of large-scale AI models. Drawing parallels to an “emissions check” for vehicles, EcoTech offers developers a practical way to monitor and reduce the carbon footprint of training neural networks—ultimately striking a balance between sustainability, performance, and cost.

EcoTech’s core functionality revolves around measuring, predicting, and evaluating the energy usage of AI model training in real time. A Python-based tracker collects power consumption data from processors (CPUs and GPUs), converts it into emissions metrics, and streams these details to a centralized React dashboard via WebSockets. On-premise and cloud-hosted training processes are both supported, allowing developers to visualize carbon outputs whether they’re using local hardware or Azure VMs. By tapping into open-source libraries like Eco2AI, EcoTech translates raw energy consumption figures into a tangible indicator of environmental impact. These measurements even factor in region-specific carbon intensities, highlighting how training location influences sustainability outcomes.
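The core pipeline described above — integrate sampled power draw into energy, then scale by a region-specific carbon intensity — can be sketched in a few lines of Python. The intensity figures and region names below are illustrative assumptions for the sketch, not EcoTech's actual data (the real tool sources these via Eco2AI):

```python
# Sketch of the tracker's energy-to-emissions conversion.
# CARBON_INTENSITY values are illustrative assumptions, not real grid data.
CARBON_INTENSITY_KG_PER_KWH = {
    "us-east": 0.38,        # hypothetical kg CO2 per kWh
    "eu-west": 0.23,
    "asia-southeast": 0.58,
}

def energy_from_power_samples(samples_watts, interval_s):
    """Integrate instantaneous power draw (watts), sampled at a fixed
    interval in seconds, into total energy in kilowatt-hours."""
    joules = sum(samples_watts) * interval_s   # W * s = J
    return joules / 3.6e6                      # 1 kWh = 3.6e6 J

def emissions_kg(energy_kwh, region):
    """Convert energy consumed into kg of CO2 for the training region."""
    return energy_kwh * CARBON_INTENSITY_KG_PER_KWH[region]

# Example: a GPU drawing a steady 300 W for one hour (3600 one-second
# samples) consumes 0.3 kWh; in the hypothetical "us-east" region that
# corresponds to 0.3 * 0.38 = 0.114 kg of CO2.
kwh = energy_from_power_samples([300] * 3600, interval_s=1)
kg = emissions_kg(kwh, "us-east")
```

In the actual system, results like `kg` would be pushed over a WebSocket to the React dashboard rather than printed locally.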

A standout element is EcoTech’s “Prediction” feature, which relies on FLOPs (floating point operations) and dataset size to approximate how much energy—expressed in kilowatt-hours—an upcoming training run might consume. While the data informing these calculations is inherently sparse and somewhat theoretical, EcoTech’s team views it as a meaningful first step toward more accurate resource planning. From the developer’s perspective, the tool’s user-friendly “Similes” feature also translates abstract consumption metrics (like total CO2) into everyday comparisons, reinforcing the real-world magnitude of training AI models. An interactive global map, built into the dashboard, further illustrates how carbon intensity changes by country, encouraging data scientists to consider geographic and infrastructure factors when setting up compute environments.
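The prediction and "Similes" ideas can be illustrated with simple arithmetic. Both constants below (hardware efficiency in FLOPs per watt, kg of CO2 per km driven) are hypothetical placeholders standing in for whatever figures EcoTech actually uses:

```python
# Hypothetical sketch of FLOPs-based energy prediction and a "Similes"
# translation; the constants are assumptions for illustration only.

def predicted_energy_kwh(total_flops, hw_flops_per_watt):
    """Estimate a run's energy: FLOPs divided by hardware efficiency
    (FLOPs per watt) yields watt-seconds, i.e. joules; 3.6e6 J = 1 kWh."""
    joules = total_flops / hw_flops_per_watt
    return joules / 3.6e6

def co2_simile(co2_kg, kg_per_km=0.12):
    """Translate kg of CO2 into an everyday comparison, assuming a
    rough ~0.12 kg CO2 per km for a petrol car."""
    km = co2_kg / kg_per_km
    return f"{co2_kg:.1f} kg CO2 is roughly {km:.0f} km of driving"

# Example: 3.6e18 FLOPs on hardware delivering 1e12 FLOPs per watt
# works out to 3.6e6 J, i.e. exactly 1.0 kWh.
kwh = predicted_energy_kwh(3.6e18, 1e12)
```

Because real-world FLOP counts and hardware efficiencies vary widely, the actual feature treats such estimates as rough planning figures rather than guarantees, which matches the team's own caveat about sparse data.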

EcoTech’s interface goes beyond just a streaming dashboard. A “Comparison” tool saves previous experiments in a MySQL database so users can track improvements over time, evaluating how changes in model size, data preprocessing, or hardware usage influence carbon emissions. Additionally, an experimental “Sustainability Report” awards letter grades (A through F) to reflect a given setup’s relative efficiency and environmental impact. Though still at an MVP stage, EcoTech also hints at future ambitions such as deeper integration into software development environments (e.g., a VS Code extension) or broader public benchmarking, which would allow developers to share and compare emissions data across the AI community.
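The letter-grade idea behind the Sustainability Report can be sketched as a simple threshold lookup. The thresholds below are invented for illustration; EcoTech's actual grading scale is not specified in the write-up:

```python
# Illustrative sketch of the "Sustainability Report" A-F grade.
# These emission thresholds are assumptions, not EcoTech's real scale.
GRADE_THRESHOLDS = [        # (max kg CO2 for a training run, grade)
    (0.5, "A"),
    (2.0, "B"),
    (5.0, "C"),
    (10.0, "D"),
]

def sustainability_grade(co2_kg):
    """Map a run's total emissions to a letter grade, A through F."""
    for limit, grade in GRADE_THRESHOLDS:
        if co2_kg <= limit:
            return grade
    return "F"
```

In the full tool, grades like these would be stored alongside each experiment's record in the MySQL database, so the Comparison view can show whether a configuration change moved a setup up or down the scale.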

Despite its limitations—particularly around precise GPU monitoring in cloud clusters—EcoTech lays the groundwork for a more transparent AI ecosystem, one that recognizes the costs of large-scale computation in a resource-constrained world. The students’ hands-on experiences in building a containerized, Docker-based client-server architecture highlighted both the challenges and the immense promise of quantifying environmental impact within everyday machine learning workflows. Their efforts dovetail with a broader industry push for sustainable AI, as more and more organizations seek to align powerful new models with responsible, eco-friendly practices.

Stay Connected

Follow our journey on Medium and LinkedIn.