Sources for our Instagram post on ChatGPT's resource consumption


  • “AI Index Report 2023 – Artificial Intelligence Index.” https://aiindex.stanford.edu/report/ (accessed May 05, 2023).
  • D. Patterson et al., “Carbon Emissions and Large Neural Network Training.” arXiv, Apr. 23, 2021. doi: 10.48550/arXiv.2104.10350.
  • “ChatGPT Is Consuming a Staggering Amount of Water,” Futurism. https://futurism.com/the-byte/chatgpt-ai-water-consumption (accessed Apr. 27, 2023).
  • J. An, W. Ding, and C. Lin, “ChatGPT: tackle the growing carbon footprint of generative AI,” Nature, vol. 615, no. 7953, p. 586, Mar. 2023, doi: 10.1038/d41586-023-00843-2.
  • E. Strubell, A. Ganesh, and A. McCallum, “Energy and Policy Considerations for Deep Learning in NLP.” arXiv, Jun. 05, 2019. Accessed: May 07, 2023. [Online]. Available: http://arxiv.org/abs/1906.02243
  • A. S. Luccioni, S. Viguier, and A.-L. Ligozat, “Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model.” arXiv, Nov. 03, 2022. Accessed: May 07, 2023. [Online]. Available: http://arxiv.org/abs/2211.02001
  • P. Li, J. Yang, M. A. Islam, and S. Ren, “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models.” arXiv, Apr. 06, 2023. Accessed: Apr. 27, 2023. [Online]. Available: http://arxiv.org/abs/2304.03271
  • M. C. Rillig, M. Ågerstrand, M. Bi, K. A. Gould, and U. Sauerland, “Risks and Benefits of Large Language Models for the Environment,” Environ. Sci. Technol., vol. 57, no. 9, pp. 3464–3466, Mar. 2023, doi: 10.1021/acs.est.3c01106.
  • AI, “The Power Requirements to Train Modern Large Language Models,” nnlabs.org, Mar. 05, 2023. https://www.nnlabs.org/power-requirements-of-large-language-models/ (accessed Apr. 27, 2023).
