Tech Talk: 90-day playbook to change your practice

Actionable items that take your team from “That’s How We’ve Always Done It” to “Let’s Test It”

 

Mindset shift: from prove it to improve it

As the team starts to work toward comfort and safety with change, it helps to reframe changes as clinical quality improvement, not culture disruption. Most importantly, the team should be involved in identifying, prioritizing, and troubleshooting change. Support the conversation by leaning into the clinician's logic:

  1. Assess (What's the presenting problem? Evidence?)
  2. Diagnose (What's driving it—workflow, training, tools?)
  3. Intervene (Run a small, time-boxed test.)
  4. Evaluate (Did outcomes improve? Keep, tweak, or stop.)

This is the Plan-Do-Study-Act (PDSA) cycle in plain language. When leaders model PDSA openly, staff learn that change is about curiosity and improvement, not criticism. Just as importantly, it provides a structure for assessing needs in which the whole team thinks critically and openly, creating a safe space for ideas and creative thinking.

 

A four-part framework clinics can use to build a culture that embraces change

Think of this as an upgrade path you can complete in 90 days.

1) Start looking for the problems (Weeks 1–2)

  • Set the norm. For one to two weeks, each employee writes down a problem they encounter during the week: empty supplies, double-booked appointments, expired medications, etc. At your next meeting, everyone shares what they found. As a group, agree that the default questions will be, "What problem are we solving, and what small test could we run?" Create excitement around problem-solving and idea-sharing.
  • Install debriefs. After shifts, noteworthy cases, or unusual outcomes, run a 10-minute after-action review:
    • What was supposed to happen?
    • What happened?
    • What helped/hurt?
    • What will we repeat or try next time?
  • Create a "parking lot" for ideas. Physical board or digital form; commit to reviewing weekly and responding to every item.

2) Aim at one meaningful, measurable problem (Weeks 2–3)

Pick a pain point that staff note consistently, that everyone can agree on, and that is reasonably easy to fix with a new process or system. Examples might include:

  • Unpredictable discharge times are ruining the client's experience
  • Inventory waste (expired meds, low turns)
  • Dental protocols are inconsistently followed

Define the success metric up front. Once the team has agreed on the exact issue, it defines specific, attainable success metrics. Examples: door-to-discharge time, anesthetic complication rate, inventory turns, revisit/complication rates, and client callbacks completed within 24 hours.

Everyone should agree that each metric is safe, attainable, and measurable, and there should be a set time point for checking in on progress.

3) Run micro-experiments (Weeks 3–7)

Keep tests "small, reversible, and observable." As teams become more flexible with change, it can work well to run two micro-experiments in parallel to compare potential solutions. This also helps teams assess what works as critically as what does not.

4) Lock in what works; sunset what doesn't (Weeks 7–12)

Keep the agreed check-in time frames and discuss successes and struggles. Determine whether more research into solutions is needed, whether the proposed solution is ready to be written into clinic SOPs, training, and policy, or whether more time is needed to decide.

Do not just set it and forget it. Discuss recent changes in regular team huddles and check-ins to confirm that new policies or procedures are still working and being followed, with early intervention for corrections or realignment as necessary. As new policies become embedded in practice culture, check-ins and discussions can become less frequent but should still follow the projected evaluation timeline.

 

Metrics that matter (and feel fair)

Metrics are essential to ensuring that expectations are clear and that the desired outcome is defined in a way that all involved understand. Everyone involved should have input on the metrics for success and agree on the timelines for assessment.

Pick one or two per experiment; examples include:

  • Quality and safety: revisit rate, surgical site infections
  • Access and flow: door-to-discharge time, on-time starts, call-back completion
  • Financial: missed charges per doctor, inventory turns, average charge per visit
  • People: RVT task mix ratio, stay-intent pulse, schedule predictability, overtime hours
  • Client: NPS/CSAT post-visit, adherence to discharge plan
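Of the financial metrics above, inventory turns is the one that needs a formula: it is commonly computed as cost of goods sold (COGS) divided by average inventory value over the same period. A minimal sketch, using hypothetical figures:

```python
# Inventory turns: how many times stock is sold and replaced in a period.
# Commonly computed as COGS / average inventory value. All figures below
# are hypothetical, for illustration only.

def inventory_turns(cogs, beginning_inventory, ending_inventory):
    """Cost of goods sold divided by the average of starting and ending inventory."""
    average_inventory = (beginning_inventory + ending_inventory) / 2
    return cogs / average_inventory

# Hypothetical quarter: $60,000 in COGS; inventory grew from $18,000 to $22,000.
print(inventory_turns(60_000, 18_000, 22_000))  # → 3.0
```

Higher turns generally mean less capital tied up on the shelf and fewer products expiring before use.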

Share accomplishments visually. A simple run chart with today's data point, a green arrow, and a shout-out goes a long way in keeping teams motivated to keep up with change. It also helps them feel optimism, pride, and ownership over their role in positive change in the clinic, in patient outcomes, and in the overall success of the team. Additionally, it demonstrates management's commitment to upholding change rather than reverting to what is easy or convenient when things get busy and overwhelming. Tracking progress keeps everyone accountable and motivated.
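A run chart does not require special software; a spreadsheet or a few lines of code can produce the weekly numbers and the arrow. A minimal sketch, assuming hypothetical weekly door-to-discharge times:

```python
# Minimal run-chart sketch for a weekly metric (hypothetical data).
# The arrow shows the direction of the most recent change, which is the
# "today's data point plus arrow" summary described above.

def trend_arrow(values):
    """Return an up/down/flat arrow comparing the last two data points."""
    if len(values) < 2 or values[-1] == values[-2]:
        return "→"
    return "↓" if values[-1] < values[-2] else "↑"

# Hypothetical weekly door-to-discharge times, in minutes (lower is better).
door_to_discharge = [52, 49, 47, 44]

for week in range(1, len(door_to_discharge) + 1):
    minutes = door_to_discharge[week - 1]
    print(f"Week {week}: {minutes} min {trend_arrow(door_to_discharge[:week])}")
```

The same pattern works for any of the metrics listed above; only the data series and its "better" direction change.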

 

Obstacles and challenges are guaranteed, so be ready with positive, purpose-driven responses

Replace "we've always done it" with practical scripts. Even in the most flexible work cultures, you will occasionally hear the dreaded phrase, "This is how we have always done it." Leadership should be prepared to have conversations about change when it arises and to discuss the benefits and possible disadvantages of changing what has always been.

So, if someone says, "This is how we've always done vaccines," prepare a response like, "What does our current process do well? What problem is it creating, or what could be better?" This creates a mindset of curiosity and system improvement. Another phrase we often hear is, "We tried that once; it didn't work." When faced with the been-there, done-that mindset, you might ask, "What was different in context? Can we identify a new variable to test?" If the concern expressed is, "Clients won't like it," try a response such as, "Let's try it, then ask five clients this week and measure satisfaction scores and compliance. If it dips, we stop."

Responses like these avoid a right-or-wrong debate; instead, they offer a constructive, forward-looking direction that helps conversations move toward solutions and embrace change, including the potential for failure in trial and error.

 

Common blockers and practical responses

  • "We don't have time."

    Response: "Totally fair. Let's pick a change that saves five minutes per visit if it works—and time-box the test to two weeks."

  • "This threatens my role."

    Response: "Let's clarify roles and commit to skills cross-training so no one loses scope—everyone gains mastery."

  • "What if it goes wrong?"

    Response: "That's why we keep tests small, reversible, and observable. If the metric dips, we stop."
