Natascha Miller

Historically, conveyancers and estate agents have considered verbal communication with clients and standard identity verification as sufficient safeguards for authorising transactions.

The diminishing effectiveness of traditional identity verification methods requires a critical re-evaluation of security protocols within the conveyancing sector. Failing to update transactional safeguards in response to technological advancements, such as deepfake voice fraud, exposes firms to acute financial loss and reputational damage, underscoring the urgent need for a systematic overhaul of existing verification practices in real estate transactions.

The New Reality: Deepfake Voice Fraud Has Entered Property Transfers

Recently, several South African law firms have reported incidents where criminals used AI-powered voice cloning to impersonate sellers, buyers, or company directors during property transactions. The following real-world incident timeline illustrates the urgency and rapid progression of this type of fraud:

  1. Initial Contact: Criminals use a cloned voice to contact the conveyancer, appearing as the legitimate client and requesting a conversation regarding changes needed in the transaction.
  2. Altering Instructions: During the call, the impersonator issues fraudulent instructions, such as changing bank account details for the payout.
  3. Execution Phase: The conveyancer, believing the authenticity of the voice, initiates the requested changes without suspicion of fraud.
  4. Detection and Consequences: Realisation of the deceit often occurs too late, frequently only when alerted by a party involved or when funds have been irretrievably diverted. 

These incidents have led to significant consequences for the affected parties. In some cases, trust account funds were nearly diverted.

Example:

A Cape Town firm lost R1.2 million before the bank could freeze the transfer. (Matter currently under investigation, not yet in reported case law).

  • This form of fraud is distinct from traditional risks such as email interception or business email compromise schemes.
  • Deepfake voice cloning marks a significant evolution in synthetic identity fraud and is a discrete, rapidly proliferating legal threat within the conveyancing sector. Whereas established cyber threats such as email interception and phishing target the textual or visual side of communication, voice cloning attacks the auditory channel, defeating the trust-based verification on which verbal confirmation of mandate authenticity has traditionally rested. Because conventional safeguards cannot reliably detect sophisticated synthetic audio, legal standards for client identification and transaction authorisation must be reassessed, and multilayered security frameworks adopted to address both the legal and operational vulnerabilities introduced by advances in artificial intelligence.

How Deepfake Mandate Fraud Works

  1. Criminals obtain voice samples from:
    • A property viewing video
    • WhatsApp voice notes
    • Social media clips
    • Zoom/Teams meeting recordings
  2. They use generative-AI tools to clone tone, accent, and cadence.
  3. They call the conveyancer or agent, sounding exactly like the real client.
  4. They then issue fraudulent instructions, such as:
    • “We need to change the payout account.”
    • “Please update the sale agreement details.”
    • “I’m travelling; send documents to my assistant to sign.”
    • “Release guarantees early; we have a timing issue.”

Fraudulent transactions are frequently executed before automated banking alerts, POPIA-mandated data protection measures, or FICA compliance procedures identify anomalies. 

Notably, the lag between the initiation of unauthorised instructions and the activation of detection mechanisms can span several hours, during which illicit transfers are often completed without intervention. To address this vulnerability, firms should adopt a multifaceted verification framework. 

For example, requiring advanced electronic signatures, biometric liveness authentication, and independent confirmation through pre-verified contact methods can significantly reduce the window of opportunity for fraudsters. Integrating these countermeasures is a proactive approach that diminishes reliance on single-point safeguards and enhances overall transactional security in the conveyancing process.
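The "require every independent check" logic described above can be sketched in code. This is a hypothetical illustration only: the check names and the `may_execute_instruction` gate are assumptions made for the example, not a reference to any real firm system or vendor API. The point it demonstrates is that a single failed or omitted check blocks the instruction, so no single safeguard (such as a convincing voice) is ever sufficient on its own.

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    """Outcome of each independent check for a high-risk instruction
    (e.g. a request to change the payout account). In practice each
    boolean would come from a real verification step."""
    aes_signature: bool       # advanced electronic signature verified
    biometric_liveness: bool  # live facial-liveness video check passed
    callback_confirmed: bool  # call-back to a pre-verified number confirmed
    bank_certificate: bool    # bank-issued account verification certificate checked

def may_execute_instruction(result: VerificationResult) -> bool:
    """Proceed only if every independent check passes.
    One failure (or omission) blocks execution: no single-point reliance."""
    return all([
        result.aes_signature,
        result.biometric_liveness,
        result.callback_confirmed,
        result.bank_certificate,
    ])

# A convincing voice call alone satisfies none of these checks:
voice_only = VerificationResult(False, False, False, False)
print(may_execute_instruction(voice_only))      # False: do not release funds

fully_verified = VerificationResult(True, True, True, True)
print(may_execute_instruction(fully_verified))  # True
```

The design choice the sketch makes explicit is conjunction, not scoring: passing three of four checks is still a refusal, which is what removes the fraudster's ability to exploit any one compromised channel.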

The impact of this emerging threat on professional trust requires careful consideration.

Deepfake technology erodes the professional trust that is fundamental to the conveyancing process.

Conveyancers have always been trained to:

  • Confirm instructions verbally
  • Validate identity documents
  • Ensure that instructions are “authentic”

However, authenticity cannot be established solely based on voice, tone, or a convincing conversation.

A fraudster may imitate a client’s voice convincingly, sometimes sounding more like the client than the client does on a poor line.

Legal Risk Exposure

A conveyancer who releases funds based on a fraudulent instruction faces: 

  Risk                          Basis in Law                           Exposure
  Professional negligence       Lex Aquilia, delictual liability       Civil claims for full loss
  Breach of fiduciary duty      Attorney-client trust standard         Disciplinary sanction
  Trust account mismanagement   Legal Practice Act & Trust Rules       LPC audit referral & penalties
  POPIA security failure        POPIA s19 data safeguard obligation    Information Regulator investigation

For example, if a conveyancer, relying on a voice over the phone, mistakenly releases R500,000 to a fraudulent account, the consequences extend beyond financial loss. The client may bring a civil claim for the full amount on grounds of professional negligence, and the conveyancer may face disciplinary action for breaching fiduciary duty by failing to uphold required trust standards.

Moreover, following trust account mismanagement, a conveyancer may face referral for audit and disciplinary action by the Legal Practice Council (LPC), as established in precedent cases addressing breaches of the Legal Practice Act and the accompanying Trust Rules (see, for example, Law Society of the Northern Provinces v Maseka [2017] ZAGPPHC 1155). 

In addition, the error may prompt an investigation by the Information Regulator, which may scrutinise the practitioner’s adherence to Section 19 of the Protection of Personal Information Act (POPIA) regarding data safeguard obligations. This scenario mirrors past legal proceedings in which inadequate verification undermined practitioners’ defences, effectively translating the theoretical threat of deepfake mandate fraud into concrete legal liability for regulatory sanctions and civil claims. 

Thus, the risk shifts from an abstract possibility to an actionable, precedent-informed consequence, compelling a reassessment of verification protocols.

In short: exclusive reliance on voice confirmation exposes conveyancers to significant liability.

The New Standard: Multi-Factor Identity Confirmation

Verbal confirmation must no longer be treated as an adequate or exclusive method of identity verification. When processing critical instructions, such as requests to alter banking details, conveyancing professionals must apply a multi-factor authentication protocol that verifies client identity through multiple independent, technologically robust measures.

Require all of the following: 

  1. An AES-authenticated electronic signature (not a PDF-pasted signature). An AES (Advanced Electronic Signature) is a digital signature that legally verifies a person’s identity and is difficult to forge, in compliance with the Electronic Communications and Transactions Act (ECTA).
  2. Biometric facial-liveness video authentication, rather than a static selfie. This live video process confirms the person is real and present, preventing the use of static images or deepfakes, in accordance with POPIA s19’s data safeguard requirements.
  3. Call-back confirmation to a verified number already on file, one that has been previously confirmed and recorded.
  4. A bank account verification certificate (not a screenshot). This official document, supplied by the bank, proves account ownership.
  5. Written confirmation acknowledging the consequences of fraud.

To help agents quickly recall potential warning signs during transactions, the acronym “TRAVEL” may be useful:

 T – Tone Change: Any alterations in the client’s voice or way of speaking.

 R – Request to Change Account: Any requests to modify banking details.

 A – Abnormal Requests: Unusual instructions like releasing funds early.

 V – Verification Avoidance: Statements such as “I’m travelling” or “my mic is broken.”

 E – Emails from Unknown Sources: Instructions arriving from new or unconfirmed email addresses.

 L – Last-Minute Changes: Any sudden changes just before completion.
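The TRAVEL mnemonic above can be expressed as a simple triage checklist. The following sketch is purely illustrative: the flag codes mirror the mnemonic, but the escalation rule (any account-change request, or two or more flags together, triggers the full multi-factor protocol) is an assumption made for the example, not a prescribed professional standard.

```python
# TRAVEL red-flag checklist, coded as a triage helper.
TRAVEL_FLAGS = {
    "T": "Tone change in the client's voice or way of speaking",
    "R": "Request to change the payout account or banking details",
    "A": "Abnormal request, e.g. releasing funds or guarantees early",
    "V": "Verification avoidance ('I'm travelling', 'my mic is broken')",
    "E": "Email instructions from a new or unconfirmed address",
    "L": "Last-minute changes just before completion",
}

def assess_red_flags(observed: set[str]) -> str:
    """Return a triage outcome based on which TRAVEL flags were observed."""
    unknown = observed - TRAVEL_FLAGS.keys()
    if unknown:
        raise ValueError(f"Unknown flags: {unknown}")
    if "R" in observed or len(observed) >= 2:
        # An account-change request, or multiple flags together,
        # warrants the full multi-factor verification protocol.
        return "escalate: run full multi-factor verification before acting"
    if observed:
        return "caution: confirm via a pre-verified call-back number"
    return "proceed: no TRAVEL flags observed"

print(assess_red_flags({"R"}))   # escalate
print(assess_red_flags({"T"}))   # caution
print(assess_red_flags(set()))   # proceed
```

A real firm would tune the escalation rule to its own risk appetite; the value of coding the checklist is simply that the response to each combination of flags is decided in advance, not under pressure on a live call.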

Practical Red Flags for Estate Agents & Conveyancers

Red Flag                                              Likely Meaning
The client suddenly can’t video call                  Deepfake voice event in progress
Client wants account details changed urgently         Funds-redirection fraud
Client insists, “I can only speak by voice note”      Synthetic voice masking
Client refuses to confirm via AES                     eSignature identity spoofing attempt

The Message to the Profession

  • This represents an immediate risk: incidents of this nature are already occurring.
  • The legal sector is currently among the most vulnerable targets for these threats.

As generative AI advances, deepfakes will only become more convincing.

Immediate action is required. Once the first High Court matter is published, the central question will be: “Should the conveyancer have known?”

It is essential to acknowledge that deepfake technology may evolve to a level of sophistication at which even advanced forensic techniques cannot reliably distinguish manipulated audio from authentic human speech. In anticipation of such a scenario, firms must critically assess whether current verification protocols would be defensible if challenged in legal or regulatory proceedings, particularly once objective authentication methods no longer provide certainty.

By proactively envisioning this imminent threat landscape, practitioners are compelled to consider whether proposed solutions—such as multi-factor authentication or biometric verification—will remain effective or if additional procedural or technological innovations will be necessary. This forward-looking analysis encourages legal professionals to invest not only in immediate safeguards but also in the ongoing evaluation and adaptation of security frameworks, thereby fortifying their practice against both current and anticipated challenges.

Conclusion

  • The legal profession has traditionally evolved gradually.
  • However, the rapid evolution of cybercrime does not permit such delays.
  • The standard for safeguarding is shifting from reliance on individual trust to robust verification processes.
  • The adoption of robust verification processes must occur without delay; postponement is not a viable option. A client’s voice cannot be regarded as adequate proof of identity; secure verification methods must provide that proof.

As a final recommendation, each practitioner should identify one procedure to update this week. Direct action translates insight into concrete steps, strengthening security and trust within the practice.

By Natascha Miller, Attorney, Conveyancer & Forensic Consultant
Cell: 082 445 7003
E-mail: natascham@bnlaw.co.za
LinkedIn: https://www.linkedin.com/in/natascha-miller-ab837284/
