Mountains of files instead of automation: everyday life at German authorities
Waiting times of up to 16 weeks for housing benefit, 70,000 unprocessed applications at the state aid office in Hesse - the German administration has its back to the wall. Overworked employees, outdated processes and a growing shortage of skilled workers make efficient work almost impossible. No wonder hopes are now pinned on artificial intelligence (AI).
AI can do exactly what many authorities urgently need: automate routine tasks, shorten processing times and relieve the burden on staff. However, a provision that was actually meant to ensure efficiency is proving to be a hurdle for the technology: Section 35a of the Administrative Procedure Act (VwVfG).
AI, yes - but please without any thinking?
Section 35a permits fully automated administrative acts only where there is neither discretion nor a margin of appreciation - in other words, only where a decision can be made strictly according to the scheme "if A, then B". As soon as human judgment is required, for example for exceptions or hardship cases, the machine is out.
The problem is that in practice, there are hardly any administrative decisions without some form of discretion. Even where everything has long been carried out according to internal guidelines, criteria catalogs and administrative regulations, there is still a formal margin of discretion - even if no one actually uses it.
This is precisely where Section 35a comes into play: it does not differentiate between genuine, open discretion and so-called structured discretion, which de facto follows fixed rules. As a result, the administration is not allowed to use AI even in cases where humans only make decisions "by the book".
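To make the distinction concrete, here is a minimal Python sketch. All criteria, thresholds and field names are invented for illustration and do not reflect actual housing-benefit rules: the first function is a fully bound "if A, then B" decision of the kind Section 35a permits; the second follows a fixed internal criteria catalogue just as mechanically, yet formally counts as discretion and would therefore be barred from automation.

```python
# Illustrative sketch only: hypothetical criteria, not real housing-benefit law.
from dataclasses import dataclass


@dataclass
class Application:
    household_income: float   # monthly net income in EUR
    household_size: int
    rent: float               # monthly rent in EUR
    hardship_claimed: bool    # applicant invokes a hardship clause


# Hypothetical income thresholds by household size.
INCOME_LIMITS = {1: 1_400, 2: 1_900, 3: 2_300, 4: 2_700}


def bound_decision(app: Application) -> bool:
    """Fully bound decision: every input maps deterministically
    to exactly one outcome ("if A, then B")."""
    limit = INCOME_LIMITS.get(app.household_size, 3_000)
    return app.household_income <= limit


def structured_discretion(app: Application) -> str:
    """Formally discretionary, but in practice decided by a fixed internal
    criteria catalogue - the 'structured discretion' described above.
    Only the hardship clause leaves room for genuine human judgment."""
    if app.hardship_claimed:
        return "refer to caseworker"   # open discretion: a human decides
    if bound_decision(app) and app.rent >= 0.3 * app.household_income:
        return "approve"
    return "reject"
```

Except for the hardship branch, the second function is exactly as rule-bound as the first - which is the point: the law treats both the same way.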
Bureaucracy meets digitalization - and loses
Section 35a dates from 2017 - in other words, from a time when AI was still regarded as a nice research topic rather than a productive tool. Since then, the technology has developed rapidly: modern AI can understand natural language, recognize complex patterns and make decisions based on probabilities. Yet the law still treats it as if a simple calculator were at work.
What's more, while Brussels is pursuing a risk-based approach with the EU AI Act - the higher the risk, the stricter the requirements - German legislators simply say: AI in discretionary decisions? Prohibited. Period.
Germany is therefore missing out on a huge opportunity. Particularly in areas with high case volumes, standardized procedures and manageable risks to fundamental rights, well-trained AI could noticeably reduce caseworkers' workload - for example in applications for housing benefit, parental allowance or subsidies.
What needs to happen now: Courage to reform!
Instead of a blanket ban, a differentiated approach is needed. Not all discretion is the same. In areas with structured discretion, the use of AI should be permitted under clear legal and technical conditions - not blindly, but with human oversight, transparency and the possibility of correction at any time.
Pilot projects could show how such a hybrid model works: AI pre-checks standard cases, while humans decide the complex ones. That saves time, money and nerves - and creates room for real administrative work instead of paperwork.
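One way such a hybrid pilot could be wired up is shown in the rough Python sketch below. The scoring function, queue names and thresholds are assumptions for illustration, not a reference to any existing system: the AI component pre-screens incoming cases, prepares a draft only for clearly standard ones (plus a random audit sample routed to humans anyway), and a caseworker can overturn any draft at any time.

```python
# Rough sketch of a human-in-the-loop triage pipeline; all names and
# thresholds are hypothetical and chosen for illustration only.
import random
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Case:
    case_id: str
    payload: dict
    route: str = "pending"                 # "auto_draft" or "human_review"
    draft_decision: Optional[str] = None
    final_decision: Optional[str] = None


def triage(case: Case,
           score_fn: Callable[[dict], float],
           draft_fn: Callable[[dict], str],
           threshold: float = 0.9,
           audit_rate: float = 0.05) -> Case:
    """AI pre-checks a case: only confident standard cases get an automatic
    draft, and even those can land in a random human audit sample."""
    confidence = score_fn(case.payload)     # e.g. model confidence in [0, 1]
    if confidence >= threshold and random.random() > audit_rate:
        case.route = "auto_draft"
        case.draft_decision = draft_fn(case.payload)
    else:
        case.route = "human_review"         # complex, uncertain or sampled case
    return case


def human_review(case: Case, caseworker_decision: str) -> Case:
    """A caseworker confirms or overturns any draft before it takes effect."""
    case.final_decision = caseworker_decision
    return case
```

The design point is that the machine never issues the administrative act itself; it only prepares a draft that remains correctable - exactly the kind of human control, transparency and possibility of correction the reform demand above calls for.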
The administration is making life hard for itself
We have known for years that the administration is overloaded. That it paralyzes itself with outdated provisions such as Section 35a VwVfG is a self-inflicted problem. The provision stems from analog-era thinking that has little to do with the reality of today's digital tools.
AI needs rules, yes - but not a ban on thinking. Anyone who treats artificial intelligence like a bean counter should not be surprised if the system collapses. It is high time to give the administration the tools it deserves: a modern, differentiated, future-proof AI law.