A dancing robot dances at the 2024 World AI Conference, July 5, 2024. (PHOTO: XINHUA)
By CHEN Jie and BI Weizi
At a time when society stands in awe of the rapid advances in AI, the environmental footprint of these advances is often overlooked. However, the significant environmental impacts of AI development demand attention and action.
AI and energy consumption
Liu Yanjia, an engineer at the Institute of Computing Technology of the Chinese Academy of Sciences, told Science and Technology Daily that AI's electricity consumption is concentrated in two key phases: training and inference.
In the training phase, models learn and evolve by digesting large amounts of data. Generally speaking, the greater the number of parameters, the more computational power a large model requires, and consequently, the more electrical energy it consumes. Taking the GPT-3 model as an example, its total energy consumption for training was about 1.28 gigawatt-hours, comparable to the monthly electricity usage of 6,400 Chinese households.
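The household comparison is easy to sanity-check. The sketch below redoes the arithmetic, assuming the intended unit is gigawatt-hours (the figure that matches the 6,400-household comparison; a terawatt-hour figure would imply an implausible 200 MWh per household per month). All values are taken from the article, not independently measured.

```python
# Sanity check of the article's figures: GPT-3 training energy
# versus monthly household electricity use.

training_energy_gwh = 1.28            # GPT-3 training energy cited in the article
training_energy_kwh = training_energy_gwh * 1_000_000  # 1 GWh = 1,000,000 kWh

households = 6_400                    # households cited in the article
kwh_per_household = training_energy_kwh / households

print(f"Implied usage per household: {kwh_per_household:.0f} kWh/month")
# => Implied usage per household: 200 kWh/month
# A plausible monthly figure for a household, which supports
# gigawatt-hours (not terawatt-hours) as the intended unit.
```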
Once trained, the models enter the inference phase, where they're applied to solve real-world problems. "As AI models gain traction in various sectors, the need for inference and its electricity consumption will increase," said Liu.
As AI applications continue to expand, their demands on the global power system will become even more pronounced.
How to mitigate AI power usage?
"The most direct solution is to start from the supply side and continuously increase the power supply to solve the problem of AI power consumption," said Liu, adding that more progress should be made in wind power, photovoltaics and energy storage technologies.
In addition, there is energy-saving potential on the demand side, including algorithm optimization, hardware improvement and energy management. Algorithm optimization can reduce computation and energy consumption without significantly reducing AI performance.
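One common form of algorithm optimization is replacing an expensive computation with a cheaper approximation. The hypothetical sketch below compares the multiply-accumulate (MAC) counts of a dense neural-network layer against a low-rank factorized version; the layer sizes and rank are illustrative assumptions, not figures from the article, but fewer operations generally means less energy per inference.

```python
# Illustrative (hypothetical) example of algorithm optimization:
# factoring a dense weight matrix W (n_out x n_in) into two smaller
# matrices A (n_out x r) and B (r x n_in) cuts the operation count.

def dense_macs(n_in: int, n_out: int) -> int:
    """MACs for one dense layer: every input connects to every output."""
    return n_in * n_out

def low_rank_macs(n_in: int, n_out: int, rank: int) -> int:
    """MACs when W is approximated as A @ B with inner dimension `rank`."""
    return rank * (n_in + n_out)

n_in, n_out, rank = 4096, 4096, 256   # assumed sizes for illustration
full = dense_macs(n_in, n_out)
approx = low_rank_macs(n_in, n_out, rank)

print(f"dense: {full:,} MACs; low-rank: {approx:,} MACs")
# => dense: 16,777,216 MACs; low-rank: 2,097,152 MACs (an 8x reduction)
```

The accuracy cost of such an approximation depends on the model, which is why the article stresses optimizing "without significantly reducing AI performance."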
Liu also suggested that new computing technologies, such as quantum computing and photonic computing, could greatly improve computing efficiency and reduce energy consumption in the long term.
The large language model "Huashan," the first to be specifically applied in the aerospace field, was launched at the 2024 China Satellite Application Conference.