Kärt Pormeister: Chatbot answers incompatible with idea of legal counseling

Replacing free legal counseling with a ministry chatbot is likely the worst idea of the year. Such a plan might only seem like a sensible initiative to someone who has never dealt with chatbots or the provision of legal aid, writes Kärt Pormeister in a Vikerraadio daily commentary.
In October, as is often the case, everything seemed new again. Among other things, the Ministry of Justice announced that starting in 2026, a chatbot will replace free legal counseling. That's right — replace, not supplement — meaning that, for most people in need, free legal counseling will simply disappear.
While one might assume that this hasty move is driven by a naive attempt to cut costs, the minister of justice claims that saving money is not the goal. Instead, the aim is to "respond to recurring and simpler legal questions quickly and competently" (link in Estonian).
The idea of replacing free legal counseling with a chatbot might only seem reasonable to someone who has never dealt with chatbots or provided legal assistance, or to someone who stands to profit from creating and maintaining such a chatbot.
I've encountered both. Elisa's self-service chatbot Annika once sent me into a very dark mental place. And all I was trying to do was cancel a service contract. Now imagine asking Annika for help in a complex and stressful family dispute over child custody or alimony. I, for one, can't. Of course, the state will presumably spend much more developing and maintaining its legal chatbot than Elisa ever did on Annika.
Law students learn in their very first year that the law is abstract, because life is varied and lawmakers cannot anticipate every situation. On top of that, these abstract rules are often phrased in vague or convoluted ways. A single paragraph of law can keep opposing sides locked in litigation for years. The justice minister, however, optimistically told the media that the questions people ask free legal counselors tend to be repetitive.
That's all well and good, but the facts of each case are never exactly the same. More importantly, someone without legal training has no idea which facts might be relevant to resolving their legal issue. For example, if a management board member asked the chatbot about a non-compete clause in their "employment contract" without mentioning that, as a board member, their contract is legally a contract for services, the chatbot would give the wrong answer. (That exact scenario came up in a casual conversation with a good friend of mine; we didn't have time to get into all the details.)
A chatbot can only be as helpful as the training data it's built on. Unless the state has secretly spent years training its chatbot using digital conversations between lawyers and clients, its data will only cover a narrow range of typical cases.
And typical cases don't account for nuance. In fact, answers to standard questions are often already available on the public websites of government institutions. When giving legal advice to a specific person, though, the nuances of the facts are central. That means a legal chatbot's real-world usefulness is limited to very narrow questions. Never mind the growing number of cautionary tales from around the world about AI-generated legal advice gone wrong.
AI can be a powerful tool in legal work, but the golden rule must apply: trust, but verify. The problem is, a layperson might not be able to tell whether a chatbot's answer is correct or not.
Even the justice minister openly admits that the chatbot is only meant to answer recurring and simple legal questions. But that isn't legal counseling; it's merely the ministry fulfilling its obligation to provide legal explanations. Legal counseling, by its very nature, is a personalized service that focuses on the specifics of an individual's case, with all its nuance. Generic answers from a chatbot don't meet that standard.
Human interaction also plays a crucial role in legal counseling. Free counseling cases are often related to family law, which means the person seeking help may be vulnerable and highly anxious. They may need emotional support in addition to dry legal information. That human support is part of the lawyer-client relationship and something a chatbot can never replace. As Erki Pisuke, head of HUGO.legal, told Postimees (link in Estonian), the people most in need of free legal advice are often those with below-average computer skills.
One major issue that hasn't received enough attention in public debate is confidentiality. Lawyers operate in the client's interest under a contract for services, which requires them to keep information confidential, even from the state. But who receives a person's sensitive information when they turn to a legal chatbot? Obviously, the state — after all, we're talking about a "ministry chatbot."
In cases involving family or contract law, this may not immediately raise alarm bells. But when it comes to potential criminal matters, anyone should pause and ask: wait a minute, whom did I just confess a possible crime to? And the answer, again, would be: the state. Even though this legal chatbot service wouldn't apply to people who are already suspects or defendants in criminal proceedings, someone might need advice before that procedural status is formalized.
In response to confidentiality concerns, the government would likely promise "strict usage rules" for the chatbot's data. But as the license plate camera controversy showed, the state can quietly broaden such uses long before the public catches on. More fundamentally, how can the state itself provide independent legal advice when the legal issue might involve the state's own actions?
Don't get me wrong — of course there are lawyers out there who provide poor service and give bad advice. Human involvement isn't a guaranteed mark of quality. But we should not underestimate the value of being listened to and (at least seemingly) understood by another person.
And finally: how could we possibly teach a logically operating computer system to navigate our often illogical and ever-changing jungle of legal norms? While not all people use their reasoning capacity to the fullest, the human brain remains more effective at solving complex problems. More importantly, people have emotional intelligence and empathy — capacities that a robot can, at best, only mimic. Yet just like in medicine, these are essential and irreplaceable in legal counseling.
--
Editor: Marcus Turovski