


Let's make AI Green(er): Challenges and opportunities


Tushar Sharma

January 23, 2026


Transcript

  1. Software engineering / Machine learning: • Source code analysis • Software quality • Code smell detection and refactoring • Developers' productivity • Program comprehension • Machine learning for software engineering • Software engineering for machine learning • Binary symbol reconstruction • Program comprehension for decompiled binaries. Green AI: • Sustainable machine learning • Energy hotspots and refactorings • Energy-efficient code representation. Tools and platforms. Dr. Tushar Sharma, [email protected], SMART lab, Dalhousie University. https://web.cs.dal.ca/~tushar/smart/
  2. Prompt: "Create an image of Baby JJ from Cocomelon having a birthday party with Baby Shark, where they're surrounded by colorful balloons, a big rainbow cake, and sparkly presents" → AI Model. But it comes at a cost: 0.012 kWh ≈ charging a mobile phone once. https://arxiv.org/pdf/2311.16863v1
  3. Impact on environment and cost. Llama cost of training: 2048 A100 GPUs for 23 days, electricity cost $53K. Operational cost: ChatGPT spends $700K per day. Google PaLM was trained on 6144 TPU v4 chips (two TPU v4 pods); Meta AI's OPT was trained on 992 A100 GPUs. https://www.economist.com/technology-quarterly/2020/06/11/the-cost-of-training-machines-is-becoming-a-problem
  4. How green are LLMs? GPT-3: • 1,287 MWh • 550 tons of carbon • 120 years' worth of electricity usage of a single-family American household
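The slide's figures can be sanity-checked with back-of-envelope arithmetic. The household consumption value below is an assumption (a commonly cited US average of roughly 10,700 kWh per year); it is not stated on the slide.

```python
# Sanity-check the GPT-3 training-energy comparison from the slide.
# Assumption: an average US household uses ~10,700 kWh of electricity
# per year (commonly cited estimate; not taken from the slide).

GPT3_TRAINING_MWH = 1287          # from the slide
HOUSEHOLD_KWH_PER_YEAR = 10_700   # assumed average

years = GPT3_TRAINING_MWH * 1000 / HOUSEHOLD_KWH_PER_YEAR
print(f"~{years:.0f} years of household electricity")  # ~120 years
```

The result lands close to the 120-year figure quoted on the slide, which suggests the comparison assumes a household of roughly this size.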
  5.

  6. Sustainable AI: Sustainable AI aims to optimize the resources used by AI models without, ideally, compromising the models' accuracy. Image credit: https://www.bbc.com/news/science-environment-55498657
  7. Sustainable AI: Model, Infrastructure, Data. Are we using the right model for a given task? Is our data quality adequate? Is our data diverse enough? Is our ML pipeline optimized?
  8. Directions towards Sustainable AI: 1. Software Optimization: optimizing AI models for energy efficiency while maintaining accuracy. 2. Energy-Efficient Hardware: designing next-generation energy-efficient hardware architectures. 3. Benchmarks & Metrics: developing efficiency metrics and benchmarks.
  9. Quantifying energy usage: Quantifying energy use, or related quantities such as battery usage and CO2 emissions, is a defining feature of sustainable software development.
  10. Energy measurement tools: hardware energy meters and software energy meters. Software meters include: • Perf • Scaphandre • PyRAPL • jRAPL • NVIDIA-SMI • NVML • AMD-SMI
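The tools above all follow the same measure-around-a-code-region pattern. The sketch below only illustrates that interface: real meters such as PyRAPL or NVML read hardware counters, whereas this toy version approximates energy as an assumed average power multiplied by elapsed wall-clock time (the power value is an invented placeholder).

```python
import time
from contextlib import contextmanager

# Illustrative sketch of a software energy meter's interface.
# Real tools (PyRAPL, Scaphandre, NVML) read hardware counters; here we
# approximate energy = assumed average power x elapsed time, purely to
# show the measure-around-a-code-region usage pattern.

ASSUMED_AVG_POWER_W = 65.0  # assumption: a typical CPU package power

@contextmanager
def energy_meter(readings):
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    readings.append(ASSUMED_AVG_POWER_W * elapsed)  # joules (approx.)

readings = []
with energy_meter(readings):
    sum(i * i for i in range(100_000))  # workload under measurement

print(f"~{readings[0]:.3f} J over the measured region")
```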
  11. Programming languages also exhibit different energy consumption profiles. "What are your programming language's energy-delay implications?", MSR 2018
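The metric behind the cited study's title, the energy-delay product (EDP), combines energy and runtime so that neither a slow-but-frugal nor a fast-but-power-hungry program automatically wins. A minimal sketch, with invented benchmark figures rather than the paper's data:

```python
# Energy-delay product (EDP): energy x time^w, where the weight w controls
# how heavily delay is penalized (w=1 is the plain EDP).

def energy_delay_product(energy_joules, time_seconds, w=1):
    return energy_joules * time_seconds ** w

# Hypothetical (energy in J, time in s) pairs; not the MSR 2018 numbers.
benchmarks = {
    "lang_a": (120.0, 1.5),
    "lang_b": (80.0, 3.0),
}
for lang, (e, t) in benchmarks.items():
    print(lang, energy_delay_product(e, t))  # lower EDP is better
```

Note that lang_b uses less energy but, once delay is factored in, scores a worse EDP, which is exactly the trade-off the metric is designed to expose.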
  12. Model size (#parameters) matters. From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference, HPEC 2023
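One intuition for why parameter count matters: a common rule of thumb estimates a dense transformer's forward pass at roughly 2N floating-point operations per generated token, where N is the parameter count, and energy scales roughly with compute. This rule of thumb is an assumption for illustration, not a figure from the HPEC 2023 paper.

```python
# Back-of-envelope compute estimate: ~2 * N FLOPs per token for a dense
# transformer forward pass (rule of thumb, not from the cited paper).

def flops_per_token(n_params):
    return 2 * n_params

for name, n in [("7B", 7e9), ("13B", 13e9), ("70B", 70e9)]:
    print(f"{name}: ~{flops_per_token(n):.1e} FLOPs/token")
```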
  13. An adaptive language-agnostic pruning method for greener language models for code (FSE 2025): • An effective pruning method that makes language models computationally efficient • Plug-and-play with any Transformer-based model • Maintains near-original accuracy with substantially less computation
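To make the general idea concrete, here is a minimal sketch of magnitude pruning, the simplest form of the technique: weights with the smallest absolute values are zeroed out so the model needs less computation. The FSE 2025 method itself is adaptive and language-agnostic; this sketch only shows the core notion.

```python
# Minimal magnitude-pruning sketch (illustrative only; not the FSE 2025
# method): zero out the fraction `sparsity` of weights that are smallest
# in absolute value.

def magnitude_prune(weights, sparsity):
    k = int(len(weights) * sparsity)
    threshold = sorted(abs(w) for w in weights)[k] if k else float("-inf")
    return [0.0 if abs(w) < threshold else w for w in weights]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.002]
print(magnitude_prune(w, 0.5))  # the three smallest-magnitude weights zeroed
```

Zeroed weights can then be skipped by sparse kernels, which is where the energy savings come from.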
  14. FlipFlop: A Static Analysis-based Energy Optimization Framework for GPU Kernels (ICSE 2026). FlipFlop predicts energy consumption and recommends Pareto-optimal thread block configurations using static analysis.
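A sketch of the Pareto-optimality idea behind such recommendations: a configuration is Pareto-optimal if no other configuration is at least as good on both energy and runtime and strictly better on one. The configurations and numbers below are invented for illustration and have nothing to do with FlipFlop's actual predictions.

```python
# Pareto-front selection over (energy, runtime) pairs; lower is better
# on both axes. Configurations here are invented placeholders.

def pareto_front(configs):
    front = {}
    for name, (e, t) in configs.items():
        dominated = any(
            (e2 <= e and t2 <= t) and (e2 < e or t2 < t)
            for n2, (e2, t2) in configs.items() if n2 != name
        )
        if not dominated:
            front[name] = (e, t)
    return front

configs = {
    "128x1": (10.0, 2.0),
    "256x1": (8.0, 2.5),
    "512x1": (12.0, 2.2),   # dominated by 128x1 on both axes
}
print(pareto_front(configs))
```

The front contains the configurations among which the user must make an explicit energy-versus-runtime trade-off; dominated ones can be discarded outright.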
  15. BRACE: Unified Benchmarking of Accuracy and Energy for Code Language Models. Rates LLMs on a 1-5 scale for energy efficiency and accuracy.
  16. TCA-AI: AI for the Ocean-Climate-People Nexus. Goals: • Robust: operate under noisy and missing data from ocean sensors; work in a wide range of situations and environments • Integrated: physics + ML models fill in ocean climate/carbon gaps • Explainable: visualize results and explain their reasoning and biases • Sustainable: low-power, less expensive ocean AI • Automated monitoring and forecasting: combine underwater video, acoustics, and text; regulatory monitoring and forecasting for fisheries, tidal/hydro power, and ocean-based climate action • Environmental DNA analysis • Generative AI for the Ocean-Climate-People nexus: provide information with auditable answers and sources from a curated "memory bank"
  17. WP4: Sustainable AI. • AI is power hungry: training GPT-3 consumed 1,287 MWh, roughly equal to the energy consumption of an average American household over 120 years; the computational resources required to train a best-in-class ML model are doubling every 3.4 months. • Goals: develop methods, tools, and techniques to enhance the energy efficiency of AI models; work with industry partners to apply sustainability techniques that reduce total energy consumption without compromising the models' accuracy. • Approach: energy profiling of various hardware devices, including sensors and edge devices; energy-efficient AI models using data pruning/enrichment, model quantization, distillation, and pruning strategies; identify and refactor energy code smells.
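Among the techniques listed above, quantization is easy to illustrate in a few lines: weights are stored as 8-bit integers plus a scale factor, cutting memory traffic and, typically, energy per inference. This is a generic sketch of symmetric int8 quantization in pure Python, not the project's actual pipeline (real systems use framework tooling).

```python
# Symmetric int8 quantization sketch (illustrative, pure Python):
# map the largest-magnitude weight to +/-127 and store a single scale.

def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero input
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.5, -1.27, 0.03]
q, s = quantize_int8(w)
print(q, [round(v, 3) for v in dequantize(q, s)])
```

Dequantizing recovers the original values up to rounding error, while the stored representation shrinks from 32-bit floats to 8-bit integers.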
  18. Sustainable AI – a long road ahead: • Standardized benchmarks and metrics • Carbon-metric leaderboards for open LLMs • A standardized practice of reporting model-training energy data, especially for large organizations training LLMs • Collaborative frameworks for sharing computational resources • Open-source tools for energy consumption monitoring • Community-driven best practices for sustainable AI development • Support for research initiatives in Green AI practices • Accountability and awareness