It can be difficult to focus on strategy when your organization has to comply with standards and regulations. Tracking and auditing can be tedious! If you’re a medical device manufacturer, you may need to maintain ISO 13485 compliance to participate in the supply chain. At the same time, you’ve got to meet all the requirements of 21 CFR 820. You’ve also got to remember the other regulations that govern production and postmarket activities. (To read more about the challenges, check out Wienholt’s 2016 post.) There’s a lot to keep track of!
I have not shared all the commonalities of or differences between ISO 9001:2015 and the Baldrige Excellence Framework. Instead, I have tried to show the organizational possibilities of building on conformity assessment to establish a holistic approach for achieving excellence in every dimension of organizational performance today, with a look to the strategic imperatives and opportunities for the future. Baldrige helps an organization take this journey with a focus on process (55% of the scoring rubric) and results (45% of the rubric), recognizing that great processes are only valuable if they yield the complete set of results that lead to organizational sustainability… I encourage organizations that have not gone beyond conformity to take the next step in securing your future.
My Favorite (#10, Firing Line), from http://www.telegraph.co.uk/sport/horseracing/11574821/Kentucky-Derby-Simon-Callaghan-has-Firing-Line-primed.html. Apr 29, 2015; Louisville, KY, USA; Exercise rider Humberto Gomez works out Kentucky Derby hopeful Firing Line trained by Simon Callaghan at Churchill Downs. Mandatory Credit: Jamie Rhodes-USA TODAY Sports
I love horse racing. More specifically, I love betting on the horses. Why? Because it’s a complex exercise in data science, requiring you to integrate (what feels like) hundreds of different kinds of performance measures — and environmental factors (like weather) — to predict which horse will come in first, second, third, and maybe even fourth (if you’re betting a superfecta). And, you can win actual money!
I spent most of the day yesterday handicapping for Kentucky Derby 2015, before stopping at the track to place my bets for today. As I was going through the handicapping process, I realized that I’m essentially following the analysis process that we use as Examiners when we review applications for the Malcolm Baldrige National Quality Award (MBNQA). We apply “LeTCI” — pronounced like “let’s see” — to determine whether an organization has constructed a robust, reliable, and relevant assessment program to evaluate their business and their results. (And if they haven’t, LeTCI can provide some guidance on how to continuously improve to get there).
LeTCI stands for “Levels, Trends, Comparisons, and Integration”. In Baldrige parlance, here’s what we mean by each of those:
Levels: This refers to categorical or quantitative values that “place or position an organization’s results and performance on a meaningful measurement scale. Performance levels permit evaluation relative to past performance, projections, goals, and appropriate comparisons.”  Your measured levels refer to where you’re at now — your current performance.
Trends: These describe the direction and/or rate of your performance improvements, including the slope of the trend data (if appropriate) and the breadth of your performance results.  “A minimum of three data points is generally needed to begin to ascertain a trend.” 
Comparisons: This “refers to establishing the value of results by their relationship to similar or equivalent measures. Comparisons can be made to results of competitors, industry averages, or best-in-class organizations. The maturity of the organization should help determine what comparisons are most relevant.”  This also includes performance relative to benchmarks.
Integration: This refers to “the extent to which your results measures address important customer, product, market, process, and action plan performance requirements” and “whether your results are harmonized across processes and work units to support organization-wide goals.” 
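As a toy illustration of the first three dimensions (this is my own sketch, not part of the Baldrige materials), you can think of a LeTCI-style check on a single performance metric: the level is where you are now, the trend needs at least three data points to ascertain, and the comparison positions your level against a benchmark. The function name, data, and benchmark below are all invented for illustration.

```python
# Toy sketch of a LeTCI-style check on one performance metric.
# All names, data, and thresholds here are illustrative -- not
# Baldrige definitions or scoring rules.

def letci_snapshot(values, benchmark):
    """Summarize a metric series: level, trend, and comparison."""
    level = values[-1]  # Levels: current performance

    # Trends: a minimum of three data points is generally needed to
    # begin to ascertain a trend; here, a simple least-squares slope.
    if len(values) >= 3:
        n = len(values)
        xs = range(n)
        x_mean = sum(xs) / n
        y_mean = sum(values) / n
        slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
                 / sum((x - x_mean) ** 2 for x in xs))
    else:
        slope = None  # not enough history to call it a trend

    # Comparisons: establish the value of the result relative to a
    # similar measure (competitor, industry average, best-in-class).
    vs_benchmark = level - benchmark

    return {"level": level, "trend_slope": slope, "vs_benchmark": vs_benchmark}

# Example: four quarters of on-time delivery (%), vs. a 90% industry average.
print(letci_snapshot([88, 90, 92, 93], benchmark=90))
```

Integration, the fourth dimension, doesn’t reduce to a formula this neatly: it asks whether the full set of metrics, taken together, addresses what matters to the organization.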
Here’s a snapshot of my Kentucky Derby handicapping process, using LeTCI. (I also do it for other horse races, but the Derby has got to be one of the most challenging prediction tasks of the year.) Derby prediction is fascinating because all of the horses are excellent, for the most part — and what you’re trying to do is determine, on this particular day, against these particular competitors, how likely a horse is to win. Although my handicapping process is much more complex than what I lay out below, this should give you a sense of the process that I use, and how it relates to the Baldrige LeTCI approach:
Levels: First, I have to check out the current performance levels of each contender in the Derby. What’s the horse’s current Beyer speed score or Bris score (that is, are they fast enough to win this race)? What are the recent exercise times? If a horse isn’t running 5 furlongs in under a minute, then I wonder (for example) if they can handle the Derby pace. Has this horse raced on this particular track, or with this particular jockey? I can also check out the racing pedigree of the horse through metrics like “dosage”.
Trends: Next, I look at a few key trends. Have the horse’s past races been preparing him for the longer distance of the Derby? Ideally, I want to see that the two prior races were a mile and a sixteenth, and a mile and an eighth. Is their Beyer speed score increasing, at least over the past three races? Depending on the weather for Louisville, has this horse shown a liking for either fast or muddy tracks? Has the horse won a race recently?
Comparisons: Is the horse paired with a jockey he has been successful with in the past? I spend a lot of time comparing the horses to each other as well. A horse doesn’t have to beat track records to win… he just has to beat the other horses. Even a slow horse will win if the other horses are slower. Additionally, you have to compare the horse’s performance to baselines provided by the other horses throughout the duration of the race. Does your horse tend to get out in front, and then burn out? Or does he stalk the other horses and then launch an attack in the end, pulling out in front as a closer? You have to compare the performance of the horse to the performance of the other horses longitudinally — because the relative performance will change as the race progresses.
Integration: What kind of story do all of these metrics tell together? That’s the real trick of handicapping horse races… the part where you have to bring everything together in to a cohesive, coherent way. This is also the part where you have to apply intuition. Do I really think this horse is ready to pull off a victory today, at this particular track, against these contenders and embedded in the wild and festive Derby environment (which a horse may not have experienced yet)?
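The integration step above can be caricatured in a few lines of code: fold each horse’s levels, trends, and comparisons into one score and rank the field. To be clear, the weights, the second horse, and the scores below are entirely invented for illustration (only Firing Line is a real entrant) — this is a sketch of the idea, not a betting system, and the real “integration” work is the intuition that no weighted sum captures.

```python
# Hypothetical sketch of the "Integration" step: fold each horse's
# normalized (0-1) levels, trends, and comparisons into one score.
# Weights and scores are invented for illustration.

WEIGHTS = {"speed_level": 0.4, "speed_trend": 0.3, "vs_field": 0.3}

def integrate(horse):
    """Weighted sum of the dimension scores for one horse."""
    return sum(WEIGHTS[k] * horse[k] for k in WEIGHTS)

field = {
    "Firing Line":       {"speed_level": 0.90, "speed_trend": 0.80, "vs_field": 0.85},
    "Hypothetical Colt": {"speed_level": 0.95, "speed_trend": 0.60, "vs_field": 0.70},
}

# Rank the field by integrated score, best first.
ranked = sorted(field, key=lambda name: integrate(field[name]), reverse=True)
for name in ranked:
    print(f"{name}: {integrate(field[name]):.2f}")
```

Notice that the fastest raw speed figure (the hypothetical colt’s level) doesn’t win the ranking — the horse with the stronger trend and better showing against the field does, which is exactly the point of looking at all the dimensions together.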
And what does this mean for organizational metrics? To me, it means that when I’m formulating and evaluating business metrics I should take a perspective that’s much more like handicapping a major horse race — because assessing performance is intricately tied to capabilities, context, the environment, and what’s bound to happen now, in the near future.
I had a great experience in 2006 using the Baldrige Criteria to develop a Workforce Management Plan for the National Radio Astronomy Observatory (NRAO). We were tasked by the National Science Foundation (NSF) to prepare this report, which was definitely going to require us to dig deep and reflect on how we were managing our workforce, both at the operational level and in service of our strategic priorities. Unfortunately, none of us had ever done this before, so we were pretty much clueless as to what elements such a report would require, and what sorts of questions we might have to answer to ensure that we were approaching the question of workforce management strategically. The NSF wasn’t really able to provide guidance to us other than “you should use best practices from business and industry.” Fortunately, because I had been involved in the quality community for several years, I knew that the Baldrige Criteria might help us accomplish our goal. And it did!
“…share with us some of the ways the practice of quality is changing to meet the needs of faster, faster, faster.”
Fortunately, there has been a ton of research on this over the past decade or so, on the topics of dynamic capabilities and environmental dynamism. The notion of environmental dynamism just means things are changing pretty fast out there, and we have to respond to it. So I’ll focus on what dynamic capabilities are, one way to get them, and some resources and references where you can find out more.
The cornerstone for developing dynamic capabilities seems to be a culture of intentional learning with a dual focus on gaining tacit knowledge (learning by doing) as well as explicit knowledge (learning by reading, conceptualizing, and categorizing information) – just like what was recommended by Nonaka & Takeuchi (1995). Using system dynamics modeling, however, Romme et al. (2010) figured out that “there is no linear relationship between tacit knowledge, deliberate learning and dynamic capability” – so really understanding how to leverage your learning capabilities to become more agile needs a few more years of research, it seems.
What this means is – don’t just rush out and start a giant initiative involving deliberate learning in your organization. Although this research uncovered a relationship between a learning orientation and dynamic capabilities, the investigators also found that positive outcomes are very sensitive to the level of environmental dynamism and the initial conditions of the organization (i.e., its culture).
However, we have several models for quality systems that honor deliberate learning as a core value, such as the Baldrige Criteria! To keep up with the changing pace of the external environment, the best course of action is to commit to a proven system for continuous improvement. There are also resources like Senge’s classic (1990) book The Fifth Discipline that illuminate the characteristics of a learning organization. While researchers are exploring these links to help us understand how to meet the pace of change more effectively, tried and true systems for continuous improvement with learning as a key component can provide a useful foundation for dealing with these challenges.
In this February 2012 column, author Bob Kennedy reflects on a heated discussion at a gathering of senior-level quality practitioners regarding the merit of various tools, methodologies, and themes in the context of the quality body of knowledge – what I refer to as “quality soup.” These paragraphs sum up the dilemma captured at that meeting:
Next came the bombshell from a very senior quality consultant: “No one is interested in ISO 9000 anymore; they all want lean.” In hindsight, I think he was speaking from a consultant’s perspective. In other words, there’s no money to be made peddling ISO 9000, but there is with lean and LSS.
I was appalled at this blatant undermining of a fundamental bedrock of quality that is employed by more than 1 million organizations representing nearly every country in the world. The ISO 9000 series is Quality 101, and as quality practitioners, we should never forget it.
If we don’t believe this and promote it, we undermine the impact and importance of ISO 9000. We must ask ourselves, “Am I interested in ISO 9000 anymore?”
When I see articles like this, or other articles and books that question whether a tool or technique is just a passing fad (e.g., there’s a whole history of them presented in Cole’s 1999 book), my visceral reaction is always the same. How can so many quality professionals not see that each of these “things we do” satisfies a well-defined and very distinct purpose? (I quickly and compassionately recall that it only took me 6 years to figure this out, 4 of which were spent in a PhD program focusing on quality systems – so don’t feel bad if I just pointed a finger at you, because I’d actually be pointing it at past-me as well, and I’m still in the process of figuring all of this stuff out.)
In a successful and high-performing organization, I would expect to see SEVERAL of these philosophies, methodologies and techniques applied. For example:
The Baldrige Criteria provide a general framework to align an organization’s strategy with its operations in a way that promotes continuous improvement, organizational learning, and social responsibility. (In addition to the Criteria booklet itself, Latham & Vinyard’s users guide is also pretty comprehensive and accessible in case you want to learn more.)
Thus you could very easily have ISO 9000 compliant processes and operations in an organization whose strategy, structure, and results orientation are guided by the Baldrige Criteria.
Six Sigma helps us reduce defects in any of those processes that we may or may not be managing via an ISO 9000 compliant system. (It also provides us with a couple of nifty methodologies, DMAIC and DMADV, that can help us structure improvement projects that might focus on improving another parameter that describes system performance OR design processes that tend not to yield defectives.)
The Six Sigma “movement” also provides a management philosophy that centers around the tools and technologies of Six Sigma, but really emphasizes the need for data-driven decision making that stimulates robust conclusions and recommendations.
Lean helps us continuously improve processes to obtain greater margins of value. It won’t help you reduce defects like Six Sigma will (unless your waste WAS those defects, or you’re consciously mashing the two up and applying Lean Six Sigma). It won’t help you explore alternative designs or policies like Design of Experiments, part of the Six Sigma DMAIC “Improve” phase, might do. It won’t help you identify which processes are active in your organization, or the interactions and interdependencies between those processes, like an ISO 9000 system will (certified or not).
ISO 9000 only guarantees that you know your processes, and you’re reliably doing what you say you’re supposed to be doing. It doesn’t help you do the right thing – you could be doing lots of wrong things VERY reliably and consistently, while keeping perfect records, and still be honorably ISO certified. The Baldrige process is much better for designing the right processes to support your overall strategy.
Baldrige, ISO 9000, and lean will not help you do structured problem-solving of the kind that’s needed for continuous improvement to occur. PDSA, and possibly Six Sigma methodologies, will help you accomplish this.
Are you starting to see how they all fit together?
So yeah, let’s GET LEAN and stop wasting our energy on the debate about whether one approach is better than another, or whether one should be put out to pasture. We don’t dry our clothes in the microwave, and we don’t typically take baths in our kitchen sink – yet it is very easy to apply one quality philosophy, methodology, or set of practices and expect a result that another one is much better suited to deliver.
Bob Kennedy comes to the same conclusion at the end of his column, one which I fully support:
All quality approaches have a place in our society. Their place is in the supportive environment of an ISO 9000-based QMS, regardless of whether it’s accredited. Otherwise, these approaches will operate in a vacuum and fail to deliver the improvements they promise.
I spent last week at the 2011 International Conference on Software Quality in San Diego. On Wednesday, I hosted a session of lightning talks, where anyone in the audience can volunteer to “have the stage” for 5 minutes. Some people give presentations with slides, some give presentations without slides, others present ideas or comment on other conference sessions, and yet others lead a short discussion or ask the audience a question they would like to have answered. The best part about lightning talks – and what makes them so fun – is that when the timer (that everyone can see) reaches 0:00 the audience is required to loudly and aggressively clap the speaker off the stage! It’s a great way to get (usually) introverted scientists, engineers and techno-geeks actively involved in discussion.
One of the guys who presented a lightning talk (I unfortunately can’t remember his name) was an ISO 9000 auditor. He shared a little nugget with us that really stuck with me, and I’d like to share it with the rest of the world. His insight came from many of the quality systems he’s reviewed and audited, and he said he noticed this with some of the conference presentations as well. I paraphrase:
Lessons learned must be actionable. So many times I see people present their lessons learned, but they’re not doing anything as a result, or they haven’t figured out how to do something in response to the lesson – they’re not changing their processes, attitudes, or strategies in response to what they’ve uncovered. If it’s merely an insight, it’s just a lesson… if it’s an insight that results in changing or adapting behavior, then it’s a lesson learned!
It struck me that since lessons learned are such an important component of Baldrige assessments as well (via ADLI), organizations that are currently working on an application might also benefit from this perspective.
If anyone remembers this guy’s name, please post it. He was tall and had lots of fluffy white hair.
If you have at least 5 years of broad experience in quality, please help us validate the “Quality Systems Development Roadmap” originally published in Quality Progress in 2008 (http://asq.org/quality-progress/2008/09/basic-quality/starting-from-scratch.html). This is part of an expert systems project developed by Doug Jin, a student at James Madison University, under the guidance of Nicole Radziwill, JMU faculty member and ASQ member leader.
The 5-page, 13-question survey will be available until January 15, 2011, or until 1500 responses are received, whichever comes first – so contribute now at http://www.surveymonkey.com/s/W2LYFYP!