Tag Archives: Baldrige

How the Baldrige Process Can Enrich Any Management System

The “Baldrige Crystal” in a hall at NIST (Gaithersburg, MD). Image Credit: me.

Another wave of reviewing applications for the Malcolm Baldrige National Quality Award (MBNQA) is complete, and I am exhausted — and completely fulfilled and enriched!

That’s the way this process works. As a National Examiner, you will be frustrated, you may cry, and you may think your team of examiners will never come to consensus on the right words to say to the applicant! But because the process is structured and disciplined, consensus always comes, and everyone learns.

I’ve been working with the Baldrige Excellence Framework (BEF) for almost 20 years. In the beginning, I used it as a template. Need to develop a Workforce Management Plan that’s solid and integrates well with leadership, governance, and operations? There’s a framework for that (Category 5). Need to beef up your strategic planning process so you do the right thing and get it done right? There’s a framework for that (Category 2).

Need to develop Standard Work in any area of your organization, and don’t know where to start (or, want to make sure you covered all the bases)? There’s a framework for that.

Every year, 300 National Examiners are competitively selected from industry experts and senior leaders who care about performance and improvement, and want to share their expertise with others. The stakes are high… after all, this is the only award of its kind sponsored by the highest levels of government!

Once you become a National Examiner (my first year was 2009), you get to look at the Criteria Questions through a completely different lens. You start to see the rich layers of its structure. You begin to appreciate that this guidebook was carefully and iteratively crafted over three decades, drawing from the experiences of executives and senior leaders across a wide swath of industries, faced with both common and unique challenges.

The benefits to companies that are assessed for the award are clear and actionable, but helping others helps examiners, too. Yes, we put in a lot of volunteer hours on evenings and weekends (56 total, for me, this year) — but I got to go deep with one more organization. I got to see how they think of themselves, how they designed their organization to meet their strategic goals, and how they act on that design. Our team of examiners discussed the strengths we noticed individually and the gaps that concerned us, then worked together to reach consensus on the most useful and actionable recommendations for the applicant, so they can advance to the next stage of quality maturity.

One of the things I learned this year was how well Baldrige complements other frameworks like ISO 9001 and lean. You may have a solid process in place for managing operations, leading continuous improvement events, and sustaining the improvements. You may have a robust strategic planning process, with clear connections between overall objectives and individual actions.

What Baldrige can help you do, even if you’re already a high-performing organization, is:

  • tighten the gaps
  • call out places where standard work should be defined
  • identify new breakthrough opportunities for improvement
  • help everyone in your workforce see and understand the connections between people, processes, and technologies

The whitespace — those connections and seams — is where the greatest opportunities for improvement and innovation are hiding. The Criteria Questions in the Baldrige Excellence Framework can help you illuminate them.

Make Strategic Alignment Actionable with Baldrige

It can be difficult to focus on strategy when your organization has to comply with standards and regulations. Tracking and auditing can be tedious! If you’re a medical device manufacturer, you may need to maintain ISO 13485 compliance to participate in the supply chain. At the same time, you’ve got to meet all the requirements of 21 CFR 820. You’ve also got to keep up with the other regulations that govern production and postmarket activities. (To read more about the challenges, check out Wienholt’s 2016 post.) There’s a lot to keep track of!

But strategy is important. Alignment is even more important! And in my opinion, the easiest way to improve alignment and get “Big Q” quality is to use the Baldrige Excellence Framework. It was developed by the Baldrige Performance Excellence Program, and is administered by NIST.

In Is Good, Good Enough for You? Taking the Next Step After ISO 9001:2015, former Baldrige Program Executive Director Harry Hertz outlines the similarities and differences between ISO 9001:2015 and Baldrige. After examining how the two complement each other, Hertz shows how Baldrige helps organizations grow beyond a conformance mindset:

I have not shared all the commonalities of or differences between ISO 9001:2015 and the Baldrige Excellence Framework. Instead, I have tried to show the organizational possibilities of building on conformity assessment to establish a holistic approach for achieving excellence in every dimension of organizational performance today, with a look to the strategic imperatives and opportunities for the future. Baldrige helps an organization take this journey with a focus on process (55% of the scoring rubric) and results (45% of the rubric), recognizing that great processes are only valuable if they yield the complete set of results that lead to organizational sustainability… I encourage organizations that have not gone beyond conformity to take the next step in securing your future.


What Kentucky Derby Handicapping Can Teach Us About Organizational Metrics

My Favorite (#10, Firing Line), from http://www.telegraph.co.uk/sport/horseracing/11574821/Kentucky-Derby-Simon-Callaghan-has-Firing-Line-primed.html. Apr 29, 2015; Louisville, KY, USA; Exercise rider Humberto Gomez works out Kentucky Derby hopeful Firing Line, trained by Simon Callaghan, at Churchill Downs. Mandatory Credit: Jamie Rhodes-USA TODAY Sports

I love horse racing. More specifically, I love betting on the horses. Why? Because it’s a complex exercise in data science, requiring you to integrate (what feels like) hundreds of different kinds of performance measures — and environmental factors (like weather) — to predict which horse will come in first, second, third, and maybe even fourth (if you’re betting a superfecta). And, you can win actual money!

I spent most of the day yesterday handicapping for Kentucky Derby 2015, before stopping at the track to place my bets for today. As I was going through the handicapping process, I realized that I’m essentially following the analysis process that we use as Examiners when we review applications for the Malcolm Baldrige National Quality Award (MBNQA). We apply “LeTCI” — pronounced like “let’s see” — to determine whether an organization has constructed a robust, reliable, and relevant assessment program to evaluate their business and their results. (And if they haven’t, LeTCI can provide some guidance on how to continuously improve to get there).

LeTCI stands for “Levels, Trends, Comparisons, and Integration”. In Baldrige parlance, here’s what we mean by each of those:

  • Levels: This refers to categorical or quantitative values that “place or position an organization’s results and performance on a meaningful measurement scale. Performance levels permit evaluation relative to past performance, projections, goals, and appropriate comparisons.” [1] Your measured levels refer to where you’re at now — your current performance. 
  • Trends: These describe the direction and/or rate of your performance improvements, including the slope of the trend data (if appropriate) and the breadth of your performance results. [2] “A minimum of three data points is generally needed to begin to ascertain a trend.” [1]
  • Comparisons: This “refers to establishing the value of results by their relationship to similar or equivalent measures. Comparisons can be made to results of competitors, industry averages, or best-in-class organizations. The maturity of the organization should help determine what comparisons are most relevant.” [1] This also includes performance relative to benchmarks.
  • Integration: This refers to “the extent to which your results measures address important customer, product, market, process, and action plan performance requirements” and “whether your results are harmonized across processes and work units to support organization-wide goals.” [2]

(Quoted sections above come from http://www.dtic.mil/ndia/2008cmmi/Track7/TuesdayPM/7059olson.pdf, Slide 31 [1] and http://www.baldrige21.com/Baldrige%20Scoring%20System.html. [2])

Here’s a snapshot of my Kentucky Derby handicapping process, using LeTCI. (I also do it for other horse races, but the Derby has got to be one of the most challenging prediction tasks of the year.) Derby prediction is fascinating because all of the horses are excellent, for the most part — so what you’re really trying to determine is how likely each horse is to win on this particular day, against these particular competitors. Although my handicapping process is much more complex than what I lay out below, this should give you a sense of the process that I use, and how it relates to the Baldrige LeTCI approach (there’s also a small code sketch after the list):

  • Levels: First, I have to check out the current performance levels of each contender in the Derby. What’s the horse’s current Beyer speed score or Bris score (that is, are they fast enough to win this race)? What are the recent exercise times? If a horse isn’t running 5 furlongs in under a minute, then I wonder (for example) if they can handle the Derby pace. Has this horse raced on this particular track, or with this particular jockey? I can also check out the racing pedigree of the horse through metrics like “dosage”. 
  • Trends: Next, I look at a few key trends. Have the horse’s past races been preparing him for the longer distance of the Derby? Ideally, I want to see that the two prior races were a mile and a sixteenth, and a mile and an eighth. Is their Beyer speed score increasing, at least over the past three races? Depending on the weather for Louisville, has this horse shown a liking for either fast or muddy tracks? Has the horse won a race recently? 
  • Comparisons: Is the horse paired with a jockey he has been successful with in the past? I spend a lot of time comparing the horses to each other as well. A horse doesn’t have to beat track records to win… he just has to beat the other horses. Even a slow horse will win if the other horses are slower. Additionally, you have to compare the horse’s performance to baselines provided by the other horses throughout the duration of the race. Does your horse tend to get out in front, and then burn out? Or does he stalk the other horses and then launch an attack in the end, pulling out in front as a closer? You have to compare the performance of the horse to the performance of the other horses longitudinally — because the relative performance will change as the race progresses.
  • Integration: What kind of story do all of these metrics tell together? That’s the real trick of handicapping horse races… the part where you have to bring everything together in a cohesive, coherent way. This is also the part where you have to apply intuition. Do I really think this horse is ready to pull off a victory today, at this particular track, against these contenders, embedded in the wild and festive Derby environment (which a horse may not have experienced yet)?
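
Just for fun, here’s a minimal sketch (in Python) of how those four dimensions could be rolled into one quick score. Every weight, number, and helper name below is a hypothetical illustration for this post, not my actual handicapping model:

# A toy LeTCI pass over a single horse (all weights are hypothetical).
def letci_score(speed_figures, field_best, worked_fast, distance_prepped):
    level = speed_figures[-1]  # Levels: current performance
    trend = speed_figures[-1] - speed_figures[0]  # Trends: needs >= 3 data points
    comparison = level - field_best  # Comparisons: beat the field, not the clock
    # Integration: do the qualitative signals tell one coherent story?
    integration = 5 if (worked_fast and distance_prepped) else 0
    return level + 0.5 * trend + 2.0 * comparison + integration

# Example: last three Beyer-style figures of 92 -> 95 -> 99, against a field best of 97.
print(letci_score([92, 95, 99], field_best=97, worked_fast=True, distance_prepped=True))

The point isn’t the weights; it’s that each of the four dimensions contributes information the others can’t.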

And what does this mean for organizational metrics? To me, it means that when I’m formulating and evaluating business metrics, I should take a perspective that’s much more like handicapping a major horse race — because assessing performance is intricately tied to capabilities, context, the environment, and what’s likely to happen in the near future.

Baldrige as a Micro-Framework for Organizational Planning

In his June post, ASQ CEO Bill Troy shares the news that ASQ was recognized at the Excellence level of the 2014 Wisconsin Forward Award, the state quality program based on the values of the Malcolm Baldrige National Quality Award (MBNQA). He asks what experiences others have had with using quality award programs as frameworks for reflection and continuous improvement.

I had a great experience in 2006 using the Baldrige Criteria to develop a Workforce Management Plan for the National Radio Astronomy Observatory (NRAO). We were tasked by the National Science Foundation (NSF) to prepare this report, which was definitely going to require us to dig deep and reflect on how we were managing our workforce, both at the operational level and in service of our strategic priorities. Unfortunately, none of us had ever done this before, so we were pretty much clueless as to what elements such a report would require, and what sorts of questions we might have to answer to ensure that we were approaching workforce management strategically. The NSF wasn’t really able to provide guidance other than “you should use best practices from business and industry.” Fortunately, because I had been involved in the quality community for several years, I knew that the Baldrige Criteria might help us accomplish our goal. And it did!

In addition to using the questions in Category 5, Workforce Focus, we also integrated some of the elements of the “P” (Organizational Profile) section of the Criteria to develop our plan. This helped us construct the initial draft in one intense week, rather than the weeks or months it might have taken without the Criteria to guide us. We captured our experience in a paper published in the proceedings of a 2006 Observatory Operations conference, which you can read here for additional background if you need to construct a Workforce Management Plan. We also included the outline for our report (even though the content itself was confidential). The main point is that you don’t need to implement all sections of the Baldrige Criteria for them to yield immediate, tangible value for your organization… consider applying the sections when you need them in your continuous improvement journey. I hope you find it useful!

Fast Quality via Dynamic Capabilities

(Image Credit: Doug Buckley of http://hyperactive.to)

In his November post, ASQ CEO Paul Borawski comments on the rapid pace of change and asks readers to:

“…share with us some of the ways the practice of quality is changing to meet the needs of faster, faster, faster.”

Fortunately, there has been a ton of research on this over the past decade or so, on the topics of dynamic capabilities and environmental dynamism. The notion of environmental dynamism just means things are changing pretty fast out there, and we have to respond to it. So I’ll focus on what dynamic capabilities are, one way to get them, and some resources and references where you can find out more.

Dynamic capabilities are defined as the skills, attitudes, and capacities within an organization to adapt existing operations to new conditions in the (competitive) environment.  An organization that has developed its dynamic capabilities is agile and adaptive, and it knows how to quickly and effectively adjust its operations to meet the needs of the market. For an overview, you might want to read Eisenhardt & Martin’s (2000) article in the Strategic Management Journal, titled “Dynamic capabilities: what are they?” or check out Teece’s new (2011) book, “Dynamic Capabilities”.

The cornerstone for developing dynamic capabilities seems to be a culture of intentional learning with a dual focus on gaining tacit knowledge (learning by doing) as well as explicit knowledge (learning by reading, conceptualizing, and categorizing information) – just as Nonaka & Takeuchi (1995) recommended. Using system dynamics modeling, however, Romme et al. (2010) found that “there is no linear relationship between tacit knowledge, deliberate learning and dynamic capability,” so really understanding how to leverage your learning capabilities to become more agile seems to need a few more years of research.

What this means is: don’t just rush out and start a giant deliberate-learning initiative in your organization. Although this research uncovered a relationship between a learning orientation and dynamic capabilities, the investigators also found that positive outcomes are very sensitive to the level of environmental dynamism and the initial conditions of the organization (i.e., its culture).

However, we have several models for quality systems that honor deliberate learning as a core value, such as the Baldrige Criteria! To keep up with the changing pace of the external environment, the best course of action is to commit to a proven system for continuous improvement. There are also resources like Senge’s classic book The Fifth Discipline that illuminate the characteristics of a learning organization. While researchers are exploring these links to help us understand how to meet the pace of change more effectively, tried and true systems for continuous improvement with learning as a key component can provide a useful foundation for dealing with these challenges.

Quality Soup: Too Many Quality Improvement Acronyms

Note: This post is NOT about soup.

This post is, in contrast, about something that @ASQ tweeted earlier today: “QP Perspectives Column: Is the quality profession undermining ISO 9000?”

In this February 2012 column, author Bob Kennedy reflects on a heated discussion at a gathering of senior-level quality practitioners regarding the merit of various tools, methodologies, and themes in the context of the quality body of knowledge – what I refer to as “quality soup”. These paragraphs sum up the dilemma captured at that meeting:

Next came the bombshell from a very senior quality consultant: “No one is interested in ISO 9000 anymore; they all want lean.” In hindsight, I think he was speaking from a consultant’s perspective. In other words, there’s no money to be made peddling ISO 9000, but there is with lean and LSS.

I was appalled at this blatant undermining of a fundamental bedrock of quality that is employed by more than 1 million organizations representing nearly every country in the world. The ISO 9000 series is Quality 101, and as quality practitioners, we should never forget it.

If we don’t believe this and promote it, we undermine the impact and importance of ISO 9000. We must ask ourselves, “Am I interested in ISO 9000 anymore?”

When I see articles like this, and other articles or books that question whether a tool or technique is just a passing fad (e.g., there’s a whole history of them presented in Cole’s 1999 book), my visceral reaction is always the same. How can so many quality professionals not see that each of these “things we do” satisfies a well-defined and very distinct purpose? (I quickly and compassionately recall that it took me six years to figure this out, four of which were spent in a PhD program focusing on quality systems – so don’t feel bad if I just pointed a finger at you, because I’d actually be pointing it at past-me as well, and I’m still in the process of figuring all of this stuff out.)

In a successful and high-performing organization, I would expect to see SEVERAL of these philosophies, methodologies and techniques applied. For example:

  • The Baldrige Criteria provide a general framework to align an organization’s strategy with its operations in a way that promotes continuous improvement, organizational learning, and social responsibility. (In addition to the Criteria booklet itself, Latham & Vinyard’s user’s guide is also pretty comprehensive and accessible in case you want to learn more.)
  • ISO 9000 provides eight categories of quality standards to make sure we’re setting up the framework for a process-driven quality management system. (Cianfrani, Tsiakals & West are my heroes of this system, because it wasn’t until I read their book that I realized what ISO 9001:2000, specifically, was all about.)
  • Thus you could very easily have ISO 9000-compliant processes and operations in an organization whose strategy, structure, and results orientation are guided by the Baldrige Criteria.
  • Six Sigma helps us reduce defects in any of those processes, whether or not we’re managing them via an ISO 9000-compliant system. (It also provides a couple of nifty methodologies, DMAIC and DMADV, that can help us structure improvement projects focused on other parameters of system performance, or design processes that tend not to yield defects in the first place. A quick sketch of the defect arithmetic behind Six Sigma follows this list.)
  • The Six Sigma “movement” also provides a management philosophy that centers on the tools and technologies of Six Sigma, but really emphasizes the need for data-driven decision making that supports robust conclusions and recommendations.
  • Lean helps us continuously improve processes to obtain greater margins of value. It won’t help you reduce defects like Six Sigma will (unless your waste WAS those defects, or you’re consciously mashing the two up and applying Lean Six Sigma). It won’t help you explore alternative designs or policies like Design of Experiments, part of the Six Sigma DMAIC “Improve” phase, might do. It won’t help you identify which processes are active in your organization, or the interactions and interdependencies between those processes, like an ISO 9000 system will (certified or not).
  • ISO 9000 only guarantees that you know your processes, and you’re reliably doing what you say you’re supposed to be doing. It doesn’t help you do the right thing – you could be doing lots of wrong things VERY reliably and consistently, while keeping perfect records, and still be honorably ISO certified. The Baldrige process is much better for designing the right processes to support your overall strategy.
  • Baldrige, ISO 9000, and lean will not help you do structured problem-solving of the kind that’s needed for continuous improvement to occur. PDSA, and possibly Six Sigma methodologies, will help you accomplish this.
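
As promised above, here’s a minimal sketch (in Python, with made-up numbers) of the defect arithmetic Six Sigma is built on: defects per million opportunities (DPMO), and the corresponding short-term sigma level using the conventional 1.5-sigma shift.

from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    # Defects per million opportunities
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value, shift=1.5):
    # Short-term sigma level, applying the conventional 1.5-sigma shift
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

# Hypothetical example: 387 defects across 10,000 units, 5 opportunities per unit
d = dpmo(387, 10_000, 5)
print(f"DPMO: {d:.0f}, sigma level: {sigma_level(d):.2f}")  # DPMO: 7740, sigma level: 3.92

A “Six Sigma” process, in these terms, is one running at about 3.4 DPMO.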

Are you starting to see how they all fit together?

So yeah, let’s GET LEAN and stop wasting our energy on the debate about whether one approach is better than another, or whether one should be put out to pasture. We don’t dry our clothes in the microwave, and we don’t typically take baths in our kitchen sink, but it is all too easy to apply one quality philosophy, methodology, or set of practices and expect a result that another is much better suited to deliver.

Bob Kennedy comes to the same conclusion at the end of his column, one which I fully support:

All quality approaches have a place in our society. Their place is in the supportive environment of an ISO 9000-based QMS, regardless of whether it’s accredited. Otherwise, these approaches will operate in a vacuum and fail to deliver the improvements they promise.

Lessons Learned Must Be Actionable

I spent last week at the 2011 International Conference on Software Quality in San Diego. On Wednesday, I hosted a session of lightning talks, where anyone in the audience could volunteer to “have the stage” for 5 minutes. Some people give presentations with slides, some give presentations without slides, others present ideas or comment on other conference sessions, and yet others lead a short discussion or ask the audience a question they would like to have answered. The best part about lightning talks – and what makes them so fun – is that when the timer (which everyone can see) reaches 0:00, the audience is required to loudly and aggressively clap the speaker off the stage! It’s a great way to get (usually) introverted scientists, engineers, and techno-geeks actively involved in discussion.

One of the guys who presented a lightning talk (I unfortunately can’t remember his name) was an ISO 9000 auditor. He shared a little nugget with us that really stuck with me, and I’d like to share it with the rest of the world. His insight came from many of the quality systems he’s reviewed and audited, and he said he noticed this with some of the conference presentations as well. I paraphrase:

Lessons learned must be actionable. So many times I see people present their lessons learned, but they’re not doing anything as a result, or they haven’t figured out how to do something in response to the lesson – they’re not changing their processes, attitudes, or strategies in response to what they’ve uncovered. If it’s merely an insight, it’s just a lesson… if it’s an insight that results in changing or adapting behavior, then it’s a lesson learned!

It struck me that since lessons learned are such an important component of Baldrige assessments as well (via ADLI: Approach, Deployment, Learning, Integration), organizations that are currently working on an application might also benefit from this perspective.

If anyone remembers this guy’s name, please post it. He was tall and had lots of fluffy white hair.
