Human or Machine? Italy’s New AI Law Draws a Red Line in Public Decision-Making
Law 132/2025 is set to overhaul Italian public administration with AI - while fiercely defending human responsibility.
When artificial intelligence knocks on the doors of government, who stays in charge - the algorithm or the civil servant? Italy’s Law 132/2025, a landmark piece of legislation, answers with a resounding message: in the age of automation, human beings remain at the helm. But beneath the surface, the law reveals a struggle to balance technological innovation with democratic accountability, as bureaucrats, technologists, and lawmakers all eye the future of public power.
Inside the Law: Principles and Tensions
Italy’s Law 132/2025 arrives as the European Union’s AI Act begins to reshape the continent’s digital landscape. While the EU regulation is directly applicable, the Italian law carves out its own path - carefully threading the needle between harmonizing with Brussels and asserting national priorities.
The law’s backbone is its “anthropocentric principle”: artificial intelligence must serve human needs, not dictate them. AI systems in public administration are strictly tools, not masters. The legislation spells out that every significant administrative decision - especially those involving discretion or balancing public and private interests - must be made by a human, not by a machine. Any attempt to delegate final decision-making or the reasoning (“motivation”) behind a decision to AI is explicitly prohibited.
Transparency and accountability are non-negotiable. All AI models deployed in government must be traceable; officials must be able to explain how an AI’s output was produced, and how it aligns with the input provided by humans. Personal data processed by AI in public administration is tightly regulated, requiring clear consent, plain-language communication, and compliance with the GDPR and Italian data protection codes. For minors under 14, parental consent is mandatory.
What Can AI Actually Do?
Despite strict boundaries, the law opens the door to a range of AI-powered efficiencies - so long as they support, not supplant, human judgment. AI can manage and filter vast data flows, screen applications for formal compliance, assist with bookkeeping and reporting, and even help draft the factual background of administrative acts. Yet, whenever a decision demands discretion or interpretation of law, the human official must take the wheel - and be ready to justify their reasoning in court if challenged.
The law also signals a shift in public sector hiring and procurement: implementing AI will require new digital skills, specialized roles, and partnerships with private tech firms. But the transformation won’t happen overnight, and the risk remains that without strategic hiring and training, the promise of AI will remain just that - a promise.
Conclusion: The Human Firewall
Law 132/2025 sets Italy on a cautious but determined path toward digital transformation in the public sector. Its message is clear: AI is a powerful ally, but never the boss. As the bureaucracy braces for change, the real test will be whether public administrations can bridge the gap between principle and practice - harnessing AI’s potential while keeping human judgment, transparency, and accountability firmly in the driver’s seat.
WIKICROOK
- Anthropocentric Principle: The anthropocentric principle mandates human oversight of AI decisions, especially where individual rights and freedoms may be affected by automated processes.
- Administrative Discretion: Administrative discretion is the authority of officials to make decisions based on their own judgment within legal and policy boundaries - the very power the law reserves to humans, not AI.
- Traceability: Traceability is the ability to monitor and record actions or data flows, ensuring transparency and accountability in how an AI system's outputs are produced.
- GDPR: The General Data Protection Regulation is a strict EU law that protects personal data, requiring organizations to handle information responsibly or face heavy fines.
- Motivation (of an act): Motivation (of an act) is the legal requirement to explain the factual and legal reasons behind an administrative decision - a duty the law forbids delegating to AI.