About me
Shazia Kazi
SPECULATIVE BRIEF · 2026

Heidi Health — Head of Product Design

Product Strategy · Design systems · AI UX · Human first

A speculative design brief. What I’d build if I joined.

This page is grounded in people first, friction second: the joy clinicians find in caring for people, and the erosion they feel from admin and cognitive load. My instinct is to strip pain so doctors can do what they trained for—by mapping where friction lives, asking why it exists, then designing it out with clarity and humility.

Built from Heidi Health’s public positioning and metrics only—no insider knowledge. A portfolio exercise in how I’d frame product design leadership for a clinical AI care partner.

Context

Where Heidi is right now

In 18 months, Heidi returned over 18 million hours to frontline clinicians. 73 million consults. 200+ specialties. 116 countries. That’s not traction — that’s a clinician-led movement.

Product-market fit came through a freemium AI scribe that just works. Now the mission is expanding: from documentation into Tasks, Evidence, Comms, Ask Heidi, and a brand-new 21g wearable — Heidi Remote. From tool to care partner.

That expansion is where the design challenge gets genuinely hard. Because each new surface, each new capability, each new workflow is another moment where the product either deepens clinical trust — or quietly erodes it.

Heidi Health product context — clinician-facing experience
Photo credit: Heidi Health public website.

  • $65M Series B — Oct 2025 · $465M valuation
  • 18M+ hours returned to clinicians in 18 months
  • 2M+ consults supported every week
  • 110+ languages across 116 countries

Human in the room

Clinical AI only matters if it fits the rhythm of real care: interruptions, uncertainty, empathy under pressure. I’d keep research and design rituals close to those rhythms—shadowing, co-design, and blunt feedback from people who don’t owe us optimism.

North Star

Principles I’d design around

Heidi CEO Tom Kelly’s bar for clinical software is unforgiving: if it adds cognitive load, it fails. These principles keep the work honest when the roadmap gets loud.

Invisible when it’s working.

The best clinical tools disappear into the flow—documentation, suggestions, and safety nets feel like part of the conversation, not a billboard on top of it.

Trusted by design.

Trust in clinical AI is earned in the room, not declared on a slide. I’d design so clinicians can always see what Heidi is confident about, where it’s less certain, and what each output is grounded in—provenance and humility made visible, not buried. Transparency belongs in the flow of care, not in the fine print.

Consistent enough to be safe.

A disciplined system is a safety strategy: predictable patterns, audited components, and a single language for risk-critical states across surfaces and specialties.

My approach

Three pillars

How I’d organise design work at Heidi: trust in the UI, truth in the field, and a system that keeps both honest at scale.

Clinician-first research programme

  • Embedded research with GPs, specialists, and nurses — not just usability testing
  • Workflow shadowing to understand where Heidi fits in the actual consultation
  • Feedback loops from clinician power users into the design system

Trustworthy AI UX patterns

  • Design language that signals when Heidi is confident vs uncertain
  • Visual hierarchy that keeps the clinician in control, not the AI
  • Audit of current suggestion surfaces for overconfidence signals

Design system for scale

  • Component library built for clinical contexts — not generic SaaS patterns
  • Accessibility and legibility as non-negotiables (clinical environments are high-stress, high-speed)
  • Contribution model so engineering and design ship faster together
Plan

First 90 days — where I’d focus

Sequencing matters: earn context before you earn roadmap. Three horizons, one thread—reduce cognitive load for clinicians and teams shipping alongside them.

Weeks 1–2

Map reality; align on what “trust” means in the product

  • Shadow real consults and admin workflows; document friction with language clinicians already use.
  • Interview designers, PMs, and clinical partners on decision-making and risk—where does the system help vs. where does it posture?
  • Publish a short “trust manifesto” for the org: what we show, what we never hide, and how we measure success in the room.

Weeks 3–6

Tighten the design system around clinical risk states

  • Audit components for disclosure, provenance, uncertainty, and handoff—then close gaps with engineering and clinical safety partners.
  • Prioritise accessibility and internationalisation in the system foundations (not as a late pass).
  • Establish a weekly design–clinical review with explicit “safe to ship” criteria.

Months 2–3

Ship learning loops: measure trust where it’s earned

  • Pair qualitative research with telemetry on abandonment, correction, and time-to-trust signals in live workflows.
  • Run structured experiments on disclosure placement—not to add noise, but to prove what clinicians actually use.
  • Share a transparent roadmap narrative for customers: what we’re learning, what we’re changing, and why.
Design system

The design system opportunity

Most SaaS design systems are built for speed and consistency. Heidi's needs something more: it needs to encode clinical confidence.

When a clinician glances at an AI-generated note, they're not just reading text — they're making a rapid trust assessment. Is this right? Is this mine? Would I sign this? The typography, density, hierarchy, and micro-interactions in that moment either support that assessment or create friction.

Confidence signalling

Visual language that communicates AI certainty levels without breaking clinical flow — not warnings, not alerts. Quiet, precise, contextual.
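As a concrete sketch of what "quiet, precise, contextual" could mean in the design system itself: confidence levels might be encoded as semantic tokens, so every surface renders the same uncertainty the same way. All names and values below are hypothetical illustrations, not Heidi's actual design system.

```typescript
// Hypothetical sketch: semantic tokens for AI confidence states.
// Names and values are illustrative, not Heidi's actual system.
type Confidence = "grounded" | "inferred" | "uncertain";

interface ConfidenceToken {
  border: string;      // subtle border treatment, not an alert colour
  annotation: string;  // quiet inline label, shown in context
  editAffordance: "inline" | "review-first"; // how editing is surfaced
}

const confidenceTokens: Record<Confidence, ConfidenceToken> = {
  // Directly grounded in the transcript: no extra signal needed.
  grounded:  { border: "none",            annotation: "",               editAffordance: "inline" },
  // Inferred from context: a quiet cue, still one tap to accept.
  inferred:  { border: "1px dotted #999", annotation: "inferred",       editAffordance: "inline" },
  // Low confidence: ask the clinician to look before signing.
  uncertain: { border: "1px dashed #b36", annotation: "please verify",  editAffordance: "review-first" },
};

// Every surface resolves presentation through the same tokens,
// so "uncertain" looks identical in the note, tasks, and comms.
function tokenFor(level: Confidence): ConfidenceToken {
  return confidenceTokens[level];
}
```

The point of the sketch is the single lookup: components never invent their own warning styles, which is what makes risk-critical states consistent across surfaces.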

Editability by default

Every AI output must feel immediately, obviously editable. The clinician is always the author. The system should make that felt, not just stated.

Ambient density

Clinical environments are cognitively loaded. The system should give information the space to be read, not optimise for feature density. Less on screen. More in mind.

Cross-surface coherence

From web to mobile to Heidi Remote — the experience should feel continuous. A clinician moving from desktop to ward round shouldn't relearn anything.

Experience sketch

One care moment I’d be deeply curious about

Post-consultation handoff pattern

This pattern comes from my NSW Aged Care work with Microsoft, where the handoff after each consult was the highest-cognitive-load moment. The clinician is transitioning out of presence with a patient and into action mode. What the system surfaces in that 30-second window shapes whether AI feels like a partner or just another task list.

Pattern observed in prior aged care project

Generated note + tasks listed sequentially. Cognitive load transferred, not reduced.

Explorations

  • A single 'ready to review' state — note, tasks, and comms collapsed into one confident summary, not three separate flows
  • Friction-right editing — one tap to confirm, deliberate action to change. Defaults that are right 99% of the time
  • A confidence signal — not a percentage, but a quiet visual cue on anything the system is less certain about
  • A closing moment that feels like a handoff between partners, not a form submission
Care workflow sketch from NSW aged care project — post-consultation handoff and calmer clinician workspace
Team culture

The team I'd build

High craft standards and psychological safety aren't opposites. The best design teams I've led held both at once — and the quality of the work was the proof.

I'd set the bar through my own work first. Show up in the Figma file. Give honest critique. Be wrong sometimes in public. Make it normal to say "this isn't good enough yet" — including about your own work.

The thing I've learned about designers in clinical contexts: they care more than in almost any other domain. They know what's at stake. The job is to give them the conditions to act on that care — clear principles, fast feedback cycles, and the autonomy to own their work properly.

Team collaboration — workshops and critique as culture
Closing

Why this brief exists

Heidi is operating at the intersection of care, software, and societal trust. If I were leading product design, my job would be to keep the work grounded in real clinicians, real time pressure, and real accountability—while building a system that scales without turning cold.

Portfolio contact: shaziakazi.com

Case study

Building trust in clinical AI?

If you’re hiring for healthtech product design—or want to pressure-test a brief like this—I'd welcome the conversation.

  • Focused on practical outcomes

  • Based in Sydney, working globally