Slide 39
Data Strategy and Operation Center
References
1. Quantifying the Carbon Emissions of Machine Learning (https://arxiv.org/pdf/1910.09700.pdf)
2. Energy and Policy Considerations for Deep Learning in NLP (https://arxiv.org/pdf/1906.02243.pdf)
3. DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (https://arxiv.org/pdf/1910.01108.pdf)
4. Neural Network Architectures (https://towardsdatascience.com/neural-network-architectures-156e5bad51ba)
5. Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding (https://arxiv.org/pdf/1510.00149.pdf)
6. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size (https://arxiv.org/pdf/1602.07360.pdf)
7. Deep Learning Performance Documentation, NVIDIA (https://docs.nvidia.com/deeplearning/performance/mixed-precision-training/index.html#mptrain__fig1)
8. Mixed Precision Training (https://arxiv.org/pdf/1710.03740.pdf)
9. Distilling the Knowledge in a Neural Network (https://arxiv.org/pdf/1503.02531.pdf)
10. Distilling Task-Specific Knowledge from BERT into Simple Neural Networks (https://arxiv.org/pdf/1903.12136.pdf)
11. Knowledge Distillation: Simplified (https://towardsdatascience.com/knowledge-distillation-simplified-dd4973dbc764)
12. ML CO2 Impact (https://mlco2.github.io/impact/#home)
13. Rhonda Ascierto. 2018. Uptime Institute Global Data Center Survey. Technical report, Uptime Institute.
14. EPA. 2018. Emissions & Generation Resource Integrated Database (eGRID). Technical report, U.S. Environmental Protection Agency.
15. Learning both Weights and Connections for Efficient Neural Networks (https://papers.nips.cc/paper/5784-learning-both-weights-and-connections-for-efficient-neural-network.pdf)
16. GPT-3: The New Mighty Language Model from OpenAI (https://mc.ai/gpt-3-the-new-mighty-language-model-from-openai-2/)
17. AI and Compute (https://openai.com/blog/ai-and-compute/)
18. Performance Analysis (HPC Course, University of Bristol)
19. Reduce Your Carbon Footprint by Planting a Tree (https://co2living.com/reduce-your-carbon-footprint-by-planting-a-tree/)
20. EIE: Efficient Inference Engine on Compressed Deep Neural Network (https://arxiv.org/pdf/1602.01528.pdf)