Chong Ke is a British Columbia lawyer representing a millionaire named Wei Chen who is fighting for child custody in divorce proceedings with his ex-wife, Nina Zhang.
Last December, Ke cited two cases in an application supporting Wei Chen's request to take his children to China. The two cases were:
One in which a mother took a 7-year-old child on a six-week trip to India.
One in which a mother was allowed to take her 9-year-old child to China to visit her parents and friends.
However, both cases were later found not to exist at all: they were fabrications generated by the chatbot ChatGPT.
Court ruling
In his ruling, Justice David Masuhara wrote: "As this case unfortunately demonstrates, generative AI is still not a substitute for the expertise that the legal system requires of lawyers. Competence is crucial when it comes to choosing and using any technological tool, including those powered by artificial intelligence."
Masuhara accepted Ke's apology, finding that she had not intended to deceive the court. He nonetheless ruled that Ke must cover the costs of the steps opposing counsel took to resolve the confusion caused by the fake cases.
He also ordered Ke to review documents from her other cases and promptly report any documents generated by ChatGPT or other generative AI tools to the opposing parties and the court.
Impact of the incident
The incident has raised concerns about lawyers' use of AI tools. Some believe such tools can help lawyers work more efficiently and accurately, while others worry that they can generate false information and mislead courts.
The incident also reminds lawyers of their responsibility to ensure that all materials submitted to the court are accurate and truthful.
Here are the key takeaways from the incident:
In court filings, Ke cited two fake cases generated by ChatGPT.
After the fabrications were discovered, Ke apologized to opposing counsel and admitted her mistake.
The judge accepted Ke's apology but ordered her to pay the costs opposing counsel incurred in addressing the fake cases.
The incident highlights the risks of using AI tools in legal practice.
The incident has also prompted reflection within the Canadian legal community. The Law Society of British Columbia has warned lawyers that it is their responsibility to ensure that all materials submitted to the court are accurate and truthful. The law society also advises lawyers to be cautious when using AI tools and to inform the court when they have used them.