By Thorsten Meyer AI

TL;DR. A viral headline claimed a former Google AI leader warned students to “not even bother” with law or medical degrees because AI will “destroy” those careers before graduation. The underlying interview does urge caution about long, expensive degrees during a period of rapid change, but the “destroy” rhetoric is editorial spin. Data and regulation point to accelerating task automation and changing apprenticeship models, not wholesale professional collapse by 2030. Expect profound workflow shifts, new accountability duties, and broader access, provided we get the guardrails right. (Business Insider; Futurism)


What was actually said—and what was spun

In Business Insider, Jad Tarifi, who helped start Google’s first generative‑AI team and now runs Integral AI, argues it’s “too late” to begin a PhD purely to ride the AI wave and says degrees that take years, like medicine and law, are “in trouble” because what you learn may be outdated by the time you finish. His core message is about speed and opportunity cost: pursue early‑stage niches (he cites “AI for biology”) or reconsider the long‑degree path. (Business Insider)

Futurism converted that into a headline claiming AI will destroy medicine and law before you graduate. That is the outlet’s framing, not Tarifi’s verbatim claim. (Futurism)


The labor market reality (2025→2030)

  • Law isn’t evaporating. The U.S. Bureau of Labor Statistics (BLS) projects lawyer employment to grow ~5% (2023–2033), roughly average, with ~35,600 openings per year, much of it from retirements and churn. That’s not a collapse profile. (Bureau of Labor Statistics)
  • Medicine still faces scarcity. BLS projects physician and surgeon employment to grow ~4% (2023–2033), and the AAMC continues to forecast a shortage of up to ~86,000 physicians by 2036. Demand doesn’t vanish because tools get better. (Bureau of Labor Statistics; AAMC)
  • Adjacent roles surge. Shorter‑path clinical roles such as physician assistants (+28%) and advanced practice registered nurses (+40%) are among the fastest‑growing jobs, underscoring a shift in who provides care, not whether care is needed. (Bureau of Labor Statistics)

Implication: AI will compress certain tasks and entry‑level workloads, but overall service demand in law and health remains positive through the decade.


How AI is changing the work (not eliminating the professions)

Law: from drudge work to judgment work

  • Automating routine: Research, cite‑checking, first drafting, contract analysis, and discovery are speeding up with gen‑AI embedded in major legal platforms. Surveys from Thomson Reuters and LexisNexis report rapid experimentation and budgeted adoption across large firms and corporate legal departments. (Thomson Reuters; LexisNexis)
  • New guardrails: The ABA’s Formal Opinion 512 directs lawyers to protect confidentiality, verify AI output, supervise non‑lawyers and technology, and preserve candor to the tribunal. Multiple judges require AI‑use certifications in filings, and courts have sanctioned lawyers for hallucinated citations (e.g., Mata v. Avianca). The net effect: AI as a copilot, with human accountability. (American Bar Association; Texas ENRLS; Justia)
  • Training challenge: If gen‑AI reduces “grunt work,” firms must redesign apprenticeship so juniors still acquire judgment: more structured supervision, simulations, and explicit verification duties. (Courts’ disclosure rules make this non‑optional.) (Texas ENRLS)

Medicine: augmentation first, autonomy only in narrow niches

  • Narrow autonomy already exists: The FDA‑authorized IDx‑DR system screens for diabetic retinopathy without a clinician interpreting the image; it is tightly scoped, with defined referral pathways. (FDA Access Data)
  • Triage and documentation scale: AI helps flag urgent imaging (e.g., stroke/hemorrhage triage) and is spreading as ambient “AI scribes,” which early studies associate with lower documentation time and burden, while still requiring physician review. (U.S. Food and Drug Administration; FDA Access Data; JAMA Network)
  • Careful national adoption: The U.K.’s NICE issued early‑value guidance allowing fracture‑detection AI tools in urgent care only with evidence generation and human review, reflecting real‑world caution. (NICE)

Bottom line: Expect productivity lifts and scope shifts (especially for routine analysis and paperwork), not the disappearance of doctors or lawyers by 2030.


Why “full replacement in five years” is unrealistic

  1. Regulators are building brakes and enabling safe iteration.
    The U.S. FDA finalized its Predetermined Change Control Plan (PCCP) pathway so AI medical devices can update while preserving safety and effectiveness; the EU AI Act classifies medical AI as “high‑risk,” mandating risk management, human oversight, and transparency. These frameworks hard‑wire human accountability and slow any rush to autonomy. (U.S. Food and Drug Administration; European Commission; Digital Strategy)
  2. Demand fundamentals point the other way.
    Aging populations, chronic disease, regulatory complexity, and persistent access gaps in both health and justice systems ensure continued demand for human expertise, even as the mix of tasks evolves. (See the BLS and AAMC projections above.)
  3. Liability and ethics shift work upward.
    Sanctions and ethics rules make verification and professional judgment central. This raises the premium on the human skills machines lack: client counseling, ethics, risk trade‑offs, bedside manner, and persuasion. (American Bar Association; Justia)

Societal impacts to expect by 2030

  • Access & affordability:
    Law. Routine services (basic contracts, forms, compliance) get cheaper and faster, potentially expanding access, if courts and bars enforce disclosure and verification to keep quality high. (Texas ENRLS)
    Health. AI triage and scribes can shorten queues and reduce burnout; NICE’s phased approach shows how to scale benefits while generating real‑world evidence. (NICE)
  • Training & equity:
    Reduced low‑level work risks starving juniors of learning. Expect new apprenticeship models (supervised AI workflows, structured case simulations) and a premium on soft‑skill development. Without such investment, elite institutions with data and integration advantages may widen the gap. (Thomson Reuters)
  • Standards & accountability:
    In law, ABA ethics guidance plus courtroom standing orders now shape “AI‑enabled practice.” In medicine, the FDA’s PCCP and the EU AI Act create update pathways and oversight duties. These institutional frictions deliberately trade speed for safety. (American Bar Association; U.S. Food and Drug Administration; European Commission)

Guidance for students, professionals, and builders

  1. Choose “AI‑complementary,” not “AI‑competitive,” edges.
    • Law: client strategy, negotiation, courtroom advocacy, regulatory navigation, complex investigations, cross‑border risk.
    • Health: patient communication, multidisciplinary care plans, procedures, ethics and safety cases, complex differential reasoning.
  2. Invest in verification literacy.
    Learn how to design and audit AI‑assisted workflows (prompting, retrieval constraints, red‑teaming, citation checking), and document your checks; a minimal sketch of what that can look like follows this list. Courts and clinical regulators increasingly expect it. (Texas ENRLS; U.S. Food and Drug Administration)
  3. Target “moated” niches.
    Tarifi’s instinct to pursue AI for biology has merit: data/regulatory moats (validated assays, trial pipelines, safety cases) support durable value, provided you bring domain depth. (Business Insider)
  4. Rethink apprenticeship.
    Firms, health systems, and schools should move beyond “see one, do one” toward structured, AI‑aware training: simulated cases, graduated autonomy with explicit sign‑offs, and outcome tracking.
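
To make “verification literacy” (point 2 above) concrete, here is a minimal, hypothetical sketch: an AI‑drafted legal memo is not filed until every citation it contains has been matched against a trusted authority and the check itself is logged. The function names, the regular expression, and the hard‑coded authority list are illustrative assumptions, not any vendor’s API; a real workflow would query a citator such as Westlaw or Lexis.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import re

# Illustrative stand-in for a trusted authority source; a real workflow would
# query a citator (e.g., Westlaw, Lexis) rather than a hard-coded set.
VERIFIED_CITATIONS = {
    "Mata v. Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023)",
}

@dataclass
class CitationCheck:
    citation: str    # the citation string found in the draft
    verified: bool   # True if it matched a trusted authority
    checked_at: str  # UTC timestamp, kept for the audit trail
    reviewer: str    # the human accountable for the filing

def extract_citations(draft: str) -> list[str]:
    """Naively pull 'X v. Y' case references out of an AI-generated draft."""
    return re.findall(r"[A-Z][\w.'’-]+ v\. [^.,;\n]+", draft)

def audit_draft(draft: str, reviewer: str) -> list[CitationCheck]:
    """Check every extracted citation and record the result for documentation."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        CitationCheck(c, any(c in known for known in VERIFIED_CITATIONS), now, reviewer)
        for c in extract_citations(draft)
    ]

if __name__ == "__main__":
    draft = "As the court warned in Mata v. Avianca, unverified AI output invites sanctions."
    for check in audit_draft(draft, reviewer="associate@example.com"):
        status = "verified" if check.verified else "NEEDS HUMAN REVIEW"
        print(f"{check.citation!r}: {status} (logged {check.checked_at})")
```

The point is not the specific code but the discipline it encodes: extraction, independent verification against a source you trust, and a timestamped record of who checked what.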

A five‑year outlook (2025 → 2030)

  • Law: Most filings, contracts, and research will be AI‑assisted by default; disclosure and verification become boilerplate; associate cohorts shrink modestly while mentorship and QA roles grow. Litigation strategy, negotiation, and bespoke advisory remain human‑led. (American Bar Association; Texas ENRLS)
  • Medicine: Ambient AI scribes become standard; imaging triage expands; a few more narrow autonomous tools (like IDx‑DR) enter primary care; physicians retain final judgment and liability under PCCP/EU AI Act regimes; PA/NP teams expand to meet demand. (JAMA Network; U.S. Food and Drug Administration; European Commission; Bureau of Labor Statistics)

Verdict: The next five years will be wild—but for how doctors and lawyers work, not whether they exist.

