The task force proposed creating new standards, indicators, and definitions for energy use and efficiency metrics. Organizations need to track and predict data center power usage. He also said that the organizations that would benefit most from new infrastructure should bear the largest share of the increased costs. Additionally, power grids must be modernized and made secure, and AI itself can be used to improve energy infrastructure, production, and efficiency.
Mr O’Connor detailed some further steps that organizations could take to be “in line with or supported by government advice”. Beyond using energy-efficient hardware, organizations can also:

- Prune the model: remove redundant neurons from the neural network to reduce model size, computational load, and energy consumption.
- Quantize the model: reduce the numerical precision of AI calculations, a technique that can cut computational costs by up to 50% (see the code sketch after this list).
- Distill the model: train smaller models to replicate the behavior of larger models, reducing the need for large-scale computational resources.
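To make the first two items concrete, here is a minimal sketch of pruning and dynamic quantization using PyTorch's built-in utilities. The toy model, the 30% pruning amount, and the choice of 8-bit integers are illustrative assumptions, not figures from the report or from Mr O’Connor.

```python
# Minimal sketch (assumed setup): prune and quantize a small PyTorch model.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy feed-forward network standing in for a larger production model.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# Pruning: zero out the 30% of weights with the smallest L1 magnitude in
# each Linear layer, reducing the effective computational load.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# Dynamic quantization: run Linear layers in 8-bit integer arithmetic at
# inference time, trading a little precision for lower compute cost.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Both models accept the same input; the quantized one runs cheaper.
example_input = torch.randn(1, 128)
print(quantized_model(example_input).shape)  # torch.Size([1, 10])
```

In practice, pruned and quantized models are usually re-evaluated on a validation set afterwards, since both techniques trade some accuracy for the energy and cost savings described above.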
Is the report clear and practical, or is 273 pages too long?
The wide-ranging report touches on other areas as well, including financial services, health care, data privacy, and research and development, and it calls for federal preemption on AI, meaning federal law would take precedence over state law.