Tyrone Blackburn, Generative AI, and Hallucinated Case Law
Civil attorney hit with sanctions for citing non-existent cases in legal briefs

Attorney Tyrone Blackburn, who represents plaintiffs in civil litigation against Sean “Diddy” Combs, Joseph “Fat Joe” Cartagena, and others, has been caught in several civil proceedings citing AI-hallucinated, non-existent case law in his briefs, affecting multiple clients’ cases. Opposing attorneys have taken notice, moved for sanctions, and escalated their findings to bar associations. To date, two judges have awarded sanctions and attorney fees, and more motions are pending.
Facey v. Fisher
In Monique Facey v. Liane Fisher et al., Justice Mary V. Rosado dismissed Facey’s suit against her former lawyers and sanctioned Blackburn on September 15, 2025. The court found the complaint “malicious” and aimed at harassing the defendants, and faulted Blackburn for failing to disclose that the defendants and a federal judge had already compelled him to accept the underlying settlement check. The court further held that Blackburn’s motion papers were “riddled with incorrect and false citations,” ordered him to submit an affirmation within five days stating whether he used “artificial intelligence applications and/or chatbots” and explaining why his papers cite cases that do not exist or do not stand for the propositions asserted, and awarded sanctions and fees based on that conduct. Monique Facey v. Liane Fisher et al., 152088/2025 (N.Y. Sup. Ct., N.Y. Cty.)
Jakes v. Youngblood
Judge William S. Stickman IV found that briefs signed by Blackburn on behalf of plaintiff Duane Youngblood contained “wholly fabricated” quotations from case law, including from the court’s own prior opinion, and repeatedly misrepresented the cited authorities, conduct the court linked to the use of generative AI. The court struck the AI‑tainted motion to dismiss and reply and ordered Blackburn to show cause under Rule 11 and ethics rules. In orders issued between June and October 2025, it imposed monetary sanctions, including a $5,000 sanction and an order to pay a portion of T.D. Jakes’ attorney’s fees tied to the AI‑generated filings. Jakes v. Youngblood et al., 2:24‑cv‑01608 (W.D. Pa.)
Whoever, or whatever, drafted the briefs Blackburn signed and filed, it is clear that he at best acted with culpable neglect of his professional obligations.
Cartagena v. Dixon
In Joseph “Fat Joe” Cartagena’s federal suit against former hype man Terrance Dixon and Blackburn, Fat Joe’s counsel filed a September 18, 2025, letter describing Blackburn’s original opening brief as citing cases that “simply do not exist” and mis‑citing others for propositions they do not support. The letter notes ten such inaccurate citations that were silently “corrected” in an unauthorized revised brief, asks Judge Jennifer Rochon to strike the revised brief and affidavit, impose sanctions, and award fees, and explicitly invokes Blackburn’s prior AI‑related misconduct and sanctions in Jakes v. Youngblood and Facey v. Fisher as evidence of a pattern of AI‑tainted or fabricated citations. Cartagena v. Dixon et al., 1:25‑cv‑03552 (S.D.N.Y.)
Gardner v. Combs
In her case, Liza Gardner alleges that Sean “Diddy” Combs and others trafficked and raped her when she was a minor, asserting Mann Act, child sex abuse, and related claims. Judge Leo M. Gordon issued an order to show cause after discovering that Gardner’s opposition brief, signed by Blackburn, cited a non‑existent case, “United States v. Masha, 990 F.3d 1005 (7th Cir. 2021).” Blackburn then filed a letter admitting the citation was inaccurate and disclosing the prior $5,000 Rule 11 sanction and loss of pro hac vice status in Jakes v. Youngblood for “similar AI‑related citation and quotation errors.” Gardner v. Combs et al., 2:24‑cv‑07729 (D.N.J.)
Blackburn Deflects Blame, LexisNexis Rejects His Claim
In his defense, Blackburn has tried to shift some blame for the bad citations to LexisNexis, telling courts that its “Lexis+ AI” and “Protégé” tools mangled his research and helped produce bogus case cites. But in letters filed in the Facey and Cartagena cases, LexisNexis’s head of legal for North America, Julie Chapman, told opposing counsel that Blackburn was never an authorized user of, or subscriber to, Lexis+ AI or Protégé, and that any suggestion that those products are responsible for his fabricated citations is “factually incorrect.” Chapman asked the courts to treat Blackburn’s attempts to pin responsibility on LexisNexis as misleading, underscoring that the citation failures sit with the lawyer who filed them, not with a system he was not even licensed to use.
Course Correction and Future Consequences
In court filings, Blackburn states that he began taking AI‑ethics CLE courses in mid‑July 2025; however, multiple non‑existent citations at issue in these cases were filed after that date, underscoring that the core problem has been his verification practices, not just his familiarity with the technology.
Judges in Pennsylvania and New York have already sanctioned Blackburn and ordered him to pay significant fees over fabricated quotations and non‑existent citations tied to generative‑AI use. In the Fat Joe and Gardner matters, sanctions motions and show‑cause orders remain pending, and filings in those cases note that his conduct has already been referred—or is suitable for referral—to disciplinary authorities, raising the prospect of formal bar complaints and longer‑term professional fallout beyond any one lawsuit.
At the same time, Blackburn is a vivid example rather than a lone outlier. Other federal and state judges have publicly acknowledged that AI‑assisted drafting has produced bogus citations and misquotes even inside chambers, and some have gone so far as to withdraw opinions or ban certain AI tools after discovering hallucinated authority in their own work product. The throughline in those cases is the same one that runs through Blackburn’s docket: courts are treating generative AI as a risky instrument, not a scapegoat, and they are making clear that lawyers—and judges—who sign their names to AI‑assisted writing remain fully responsible when the law in their filings turns out to be fiction.
