Artificial intelligence has quickly become part of everyday life. People use it to draft emails, summarise documents, generate ideas and answer questions in seconds. It is not surprising, then, that some members of the public have started using AI tools for legal matters too.

At first glance, that may seem efficient. Legal processes can feel intimidating. Professional advice costs money. If an AI platform can produce a neat, confident answer in moments, many people may be tempted to treat it as a shortcut. That is where the real risk begins.

AI can be useful for general background information. It may help someone understand a legal term, organise questions for a consultation, or get a broad sense of how a process works. What it cannot do is stand in the place of a lawyer who understands the facts, the law, the procedure and the consequences of getting something wrong.

That distinction matters far more than many people realise.

A legal problem is rarely just a question of finding a rule and applying it. It often turns on timing, evidence, wording, procedural steps, local practice, strategy and the particular facts of the matter. An answer that looks polished on a screen may still be incomplete, misleading or simply wrong when applied in the real world.

We have already seen how damaging that can be. In one matter, a client attempted to manage her own legal issue with the help of AI-generated guidance. By the time legal assistance was sought, the situation had become significantly more complicated than it needed to be. Steps had been taken on the basis of information that sounded plausible but did not properly fit the matter. The result was not a saving of time or money. It created more work, more cost and more difficulty in trying to repair the position.

That is one of the most important points for the public to understand. When AI goes wrong in a legal context, the problem is not always obvious at the start. In fact, the answer may look convincing enough to encourage further mistakes. A person may draft the wrong document, take the wrong procedural step, miss a deadline, misunderstand their rights, or frame an argument in a way that weakens their position before the matter has properly begun.

The law is not only about what may be said. It is also about what must be done, when it must be done and how it must be done.

This is particularly dangerous in disputes, litigation, family matters, property issues, employment problems and other areas where people are already under pressure. A person dealing with stress, financial strain or urgency may understandably want quick answers. Unfortunately, those are precisely the circumstances in which overly general or inaccurate guidance can do the most damage.

In South Africa, this is not only a practical concern. Section 34 of the Constitution protects the right of everyone to have a legal dispute decided in a fair public hearing before a court or another independent and impartial tribunal or forum. The Legal Practice Act, in turn, regulates the legal profession in the public interest, sets norms and standards, and is intended to ensure accountable professional conduct.

That matters because legal advice in South Africa is given within a regulated professional framework. The Legal Practice Council’s Code of Conduct applies to legal practitioners and makes clear that it serves as an enforceable standard of conduct. The Code also requires practitioners to maintain legal professional privilege and confidentiality, preserve clients’ confidential information and avoid misleading a court or tribunal on any matter of fact or law.

AI does not operate within that same framework. It does not owe professional duties to a client. It does not exercise legal judgment in the way a practitioner must. It does not stand behind its output with ethical accountability. It does not appear in court, manage procedural compliance, or answer for the consequences when things go wrong. That is why people should be cautious of treating AI as though it were a substitute for a qualified lawyer.

There is also a wider concern for the legal system. When members of the public rely on AI to prepare documents, launch claims, respond to litigation or interpret legal obligations without proper advice, the consequences do not always stop with their own matter. Inaccurate or confused filings can create unnecessary delays, increase costs and place additional pressure on courts and legal practitioners. In a system built around fair hearings, proper procedure and accountable professional conduct, that is not a small issue.

None of this means AI has no place in legal matters. Used carefully, it can be a helpful tool. It can assist with administration, organise information and help a person prepare for a meeting with a lawyer. The danger arises when people mistake a tool for a substitute. Legal advice is not merely information arranged in sentences. It is analysis, judgment, experience and accountability applied to a specific set of facts. That remains true no matter how fluent the software may sound.

A practical rule of thumb

AI may be useful for:

  • Understanding basic legal terminology
  • Organising a timeline of events
  • Preparing a list of questions for a lawyer
  • Getting general background on a legal process

AI should not be relied on for:

  • Deciding whether you have a legal claim or defence
  • Drafting documents to be sent to court, an employer, a spouse, a business counterparty or a regulator
  • Interpreting deadlines, filing requirements or procedural rules
  • Assessing settlement offers or litigation strategy
  • Dealing with matters involving money, children, employment, property, compliance or reputational risk

A simple procedure before acting on AI-generated legal content

Start by asking whether the issue could affect your rights, your money, your family, your business or an important deadline. If the answer is yes, do not treat the AI output as legal advice. Use it, at most, to organise your thoughts or prepare questions. Before you sign, send, file or rely on anything important, have the position checked by a lawyer.

A quick answer can be attractive. Fixing the damage caused by the wrong one is usually far more expensive.

Practical implications

AI may be able to assist with general legal information, but it cannot replace case-specific legal advice given within South Africa’s constitutional and professional framework. Where legal rights, deadlines, court processes or commercial consequences are involved, relying on an automated answer can create avoidable and costly problems. The safer course is still to obtain advice from a qualified professional who can assess the facts, identify the risks and guide the matter properly from the outset.

Need guidance on a legal issue?

If you are unsure whether information you have received online can be relied on, or if a matter has become more complicated than expected, the Thomson Wilks team can help you assess your position and advise on the right next step.