Summary:
MEP Sergey Lagodinsky has filed a formal complaint with the European Ombudsman concerning the implementation of the EU AI Act's transparency requirement on reporting energy consumption during model training, as reflected in the Code of Practice for general-purpose AI models. The complaint seeks to ensure that the reporting requirement is properly applied and consistent with the AI Act's original legislative intent. Key elements include concerns that the Code of Practice, adopted in August as the official compliance mechanism, allows model providers to withhold energy consumption data when critical information is not supplied by compute or hardware providers.
Original Link:
Generated Article:
MEP Sergey Lagodinsky’s formal complaint to the EU Ombudsman marks a significant moment in the implementation of the Artificial Intelligence Act (AI Act), specifically regarding its transparency mandates. At the core of the complaint is whether the Code of Practice for general-purpose AI models aligns with the AI Act’s requirement to report energy consumption during model training. Lagodinsky argues that the Code, adopted in August as an official compliance mechanism, deviates from the AI Act by allowing AI model providers to withhold essential energy consumption data on the grounds that critical information was not supplied by compute or hardware providers.
The legal context centers on the AI Act, a regulatory framework intended to ensure that AI systems used within the European Union are safe, ethical, and transparent. For general-purpose AI models, Article 53 and Annex XI of the AI Act require providers to document information including the known or estimated energy consumption of model training, a move aligned with the EU’s broader commitment to sustainability and the reduction of carbon emissions under the European Green Deal. By permitting exceptions where providers lack information from third parties such as hardware vendors or cloud service providers, the Code of Practice appears to undermine the actionable transparency that the Act mandates.
Ethically, this raises questions about the accountability of AI developers and related industries. Transparency around energy consumption is not merely a technical concern; it reflects the broader ethical imperative to combat climate change. When energy data is withheld, public and regulatory understanding of AI’s environmental footprint is obscured, potentially enabling providers to evade scrutiny and the responsibility to act sustainably. The lack of energy usage reporting raises ethical concerns around corporate responsibility and risks diminishing public trust in the AI sector.
The implications for the AI industry are substantial. If exceptions to energy reporting requirements become standard, it may undermine the EU’s ambitions to lead in ethical AI governance globally. Companies may have diminished incentives to optimize their systems for energy efficiency, stalling innovation aimed at reducing computational costs and environmental impact. Furthermore, this precedent could give rise to competitive inequities, where transparent companies that comply strictly with the AI Act might find themselves at a disadvantage to opaque competitors who exploit loopholes.
Concrete examples illustrate the stakes. Consider large-scale AI model providers, such as organizations building generative AI systems for natural language processing or image generation. Training a model like OpenAI’s GPT-4 reportedly requires extensive computational resources, translating into significant energy use. If providers can withhold such data, the true environmental cost of these systems remains opaque, making it harder for regulators and consumers to make informed decisions. This could lead to broader disillusionment with AI in public discourse.
To address these challenges, policymakers may need to revisit the provisions of the AI Act or demand revisions to the Code of Practice. Possible actions include requiring hardware and compute providers to cooperate with energy tracking or implementing sanctions for non-compliance with reporting requirements. The issue raised by MEP Lagodinsky deserves both immediate attention and a robust, transparent resolution if the EU aims to uphold its legislative and ethical commitments in the AI domain.