We have all heard, again and again, about lawyers who use Gen AI and fail to check the citations the tools provide. The dangers of hallucinations and inaccuracies when using Gen AI tools are well known, and a court will likely have little sympathy for the lawyer who fails to check sources.
But what if an expert witness uses Gen AI to come up with nonexistent citations to support their declarations or testimony?

That very thing just happened in a case pending in Minnesota federal court, as reported by Luis Rijo in an article in PPC Land. Ironically, the expert in question, Professor Jeff Hancock, the Stanford Social Media Lab Director, offered a declaration in a case challenging the validity of a Minnesota statute regulating deepfake content in political campaigns. Hancock subsequently admitted using ChatGPT to help draft his declaration. The declaration included two citations to nonexistent academic articles and incorrectly attributed the authors in another citation.