
Adding a Little STEAM: On Risk, Failure, and the Quality of Higher Education

(Image Credit: Doug Buckley of http://hyperactive.to)

On Thursday, Morgan and I attended the first meeting of the Congressional STEAM Caucus on Capitol Hill… “a briefing on changing the vocabulary of education to include both art and science – and their intersections – to prepare our next generation of innovators to lead the 21st century economy.” STEAM seeks to promote creativity and innovation as key elements of Science-Technology-Engineering-Math (STEM) education. The “A” in STEAM reflects the growing awareness that art and design can be effective enablers, catalyzing the kind of creative thinking and openness to risk-taking that is critical for success in STEM. Initially conceived by John Maeda of the Rhode Island School of Design (RISD), the idea is catching on, and there are now many supporters scattered across the country.

Why is STEAM gaining steam? As expressed by the panelists at the Caucus, many now recognize that students just aren’t being prepared by our educational system to be creative, independent thinkers who are willing to take risks and experiment. On View from the Q this month, ASQ CEO Paul Borawski raised the same issue, citing the recent ASQ STEM careers survey of young adults: students know that you have to experiment (and sometimes fail) to be successful in STEM, and yet they admit that they’re afraid to take those risks.

Paul asks:

I want to know how you — the quality professional — handle failure in the workplace. Do you try again until you find a solution? Are you penalized for failure? Or do you avoid it altogether? How much risk are you willing to take to find solutions to quality challenges?

One of the reasons Morgan and I started the Burning Mind Project is that we wanted our students to feel comfortable taking risks and to accept full personal responsibility for the evolution of their own learning process. We use techniques like “choose your own grade” and “grading by accumulation” to encourage risk taking, eliminate penalties for “traditional failure,” and shift the focus to understanding and embracing quality standards on a personal and visceral level. We like what STEAM represents because the approach embraces divergent thinking, and thus innately supports the development of positivity and emotional alignment in an educational setting, which (a la Fredrickson) broadens the ability of students to see new opportunities and possibilities.

That is, to invent (and ultimately – by understanding how to create value for others – innovate).

Your weaknesses may actually be the keys that reveal your secret strengths. As educators, it’s up to us to help facilitate this process of discovery, not to fail our students for engaging in it. As business leaders, this can be more difficult because many of us have convinced ourselves that we should only have to pay for those things that “pay off.” However, the lessons learned from traditional failure are often the most empowering, even though our ability to honor them may be weak.

Quality in Government: Maintenance vs. New Development

This month, ASQ CEO Paul Borawski explores the issue of quality within government:

If you have ideas of what it would take to make quality in government the rule rather than the far-too-seldom exception, please tell us. If your view on the prevalence of quality in government differs, please share your view as well.

Although it’s been a busy month, and I just recently returned from the ASQ World Conference, I admit I’ve been putting off my response to this topic. I feel like I should have something to say, because I’ve worked in the government – as part of a national laboratory – for more than a decade. But what insight could I bring to this story? I’ve struggled to pin down exactly what it is.

Finally, in a conversation with a friend who’s dealing with a philosophical shift in thinking about quality at his company, I remembered one of the biggest strategic challenges that we had encountered in this environment: Maintenance vs. New Development.

Funding is always a driving factor in planning for activities that fall within the federal domain. And there’s never enough of it. As a result, there is always tension between allocating funding to maintaining and continuing the quality of existing functionality vs. innovating and creating new functionality. If you want to push forward and create new value, you’re going to have to shave resources from doing what you do now. As a result, relaxing standards for quality is always a consideration. It’s a modern-day Devil’s Pact.

I believe that employees in the government sector are not unaware of the requirements for achieving quality; indeed, most of the people I worked with were VERY aware of what was required to achieve high quality in products and processes. But the continual external pressure to move forward and innovate, which is (of course) required to remain modern and relevant, regularly led to unconscionable compromises.

So to me, the question is… when are we, as constituents, going to pressure the funding agencies to allocate the appropriate resources to achieve consistently high quality?

How to Achieve Transparency: One Approach

Point 1: Transparency in business and in government means that you know what’s going on (or can find out). You have access to information about the organization’s processes and results, and that information is clearly presented and understandable. It is difficult, if not impossible, to establish accountability when transparency does not exist. The emerging ISO 26000 standard for social responsibility treats both transparency and accountability as key principles.

Point 2: In data management, we struggle with the concept of provenance: how to track what happened to your data at every step of its journey – from being collected, to being operated upon by a host of processes and algorithms, to being evaluated, analyzed and visualized.
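
To make the idea concrete, here is a minimal sketch of what provenance capture can look like (the class, field names, and sample pipeline are my own illustrative assumptions, not any particular data-management system): every operation appends a timestamped entry recording which process ran and a checksum of its input and output, so the full chain of custody can be reported later.

```python
import hashlib
import json
from datetime import datetime, timezone


def fingerprint(data: bytes) -> str:
    """Content hash so each version of the dataset can be identified later."""
    return hashlib.sha256(data).hexdigest()


class ProvenanceLog:
    """Append-only record of every step applied to a dataset."""

    def __init__(self, source: str, raw: bytes):
        self.entries = [{
            "step": "collected",
            "process": source,
            "output_checksum": fingerprint(raw),
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }]

    def record(self, step: str, process: str, output: bytes, notes: str = "") -> None:
        """Log one operation, chaining it to the previous step's output."""
        self.entries.append({
            "step": step,
            "process": process,
            "input_checksum": self.entries[-1]["output_checksum"],
            "output_checksum": fingerprint(output),
            "notes": notes,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def report(self) -> str:
        """Human-readable trail -- the raw material of transparency."""
        return json.dumps(self.entries, indent=2)


# Hypothetical pipeline: collect, clean, then aggregate a tiny dataset.
raw = b"42, 17, , 23"
log = ProvenanceLog(source="survey_2009.csv", raw=raw)

cleaned = b"42, 17, 23"
log.record("cleaned", process="drop_missing_values", output=cleaned)

mean_value = str((42 + 17 + 23) / 3).encode()
log.record("aggregated", process="mean", output=mean_value,
           notes="mean of remaining observations")

print(log.report())
```

With a trail like this in hand, Point 1 follows almost for free: presenting the log clearly is a much smaller problem than reconstructing the history after the fact.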

McClatchy reports today that the U.S. government is having problems with both. In “Where did that bank bailout go? Watchdogs aren’t entirely sure”, Chris Adams describes the murkiness of the issue:

Although hundreds of well-trained eyes are watching over the $700 billion that Congress last year decided to spend bailing out the nation’s financial sector, it’s still difficult to answer some of the most basic questions about where the money went.

Despite a new oversight panel, a new special inspector general, the existing Government Accountability Office and eight other inspectors general, those charged with minding the store say they don’t have all the weapons they need. Ten months into the Troubled Asset Relief Program, some members of Congress say that some oversight of bailout dollars has been so lacking that it’s essentially worthless.

Bottom line: achieving transparency requires successfully managing provenance. But in the case of the bailout, are transparency problems an information technology issue, or a policy issue?

Baldrige-Based Health Care Reform?

Today’s Washington Post has an article by Minnesota governor Tim Pawlenty on the effective design of national health care reform, entitled “To Fix Health Care, Follow the States”. He argues that the federal government should model its initiatives after successful state-based systems that link outcomes to value:

In Minnesota, our state employee health-care plan has demonstrated incredible results by linking outcomes to value. State employees in Minnesota can choose any clinic available to them in the health-care network they’ve selected. However, individuals who use more costly and less-efficient clinics are required to pay more out-of-pocket.

Not surprisingly, informed health-care consumers vote wisely with their feet and their wallets. Employees overwhelmingly selected providers who deliver higher quality and lower costs as a result of getting things right the first time. The payoff is straightforward: For two of the past five years, we’ve had zero percent premium increases in the state employee insurance plan.

Minnesota has also implemented an innovative program called QCARE, for Quality Care and Rewarding Excellence. QCARE identifies quality measures, sets aggressive outcome targets for providers, makes comparable measures transparent to the public and changes the payment system to reward quality rather than quantity. We must stop paying based on the number of procedures and start paying based on results.

Pawlenty also notes that healthcare reform should not focus solely on access to health care, but also the cost and quality of the service – that is, the value that is delivered. The Malcolm Baldrige National Quality Award (MBNQA) Criteria for Performance Excellence provides a framework that has been tailored over 20 years by a huge collaboration of experts to help business, industry and the government better solve this kind of “wicked problem”. The Minnesota solution sounds like it has applied concepts very similar – if not identical – to those presented by the Baldrige Criteria.

When will the government employ the successful problem-solving frameworks it has developed itself (e.g. MBNQA) to solve its most pressing problems?

New Quality Manager for Obama: Zients replaces Killefer

In his weekly radio address today, President Obama announced a renewed intent to cut wasteful spending and promised even more decisive cuts to come. He also noted the appointment of Jeffrey Zients, a former executive and a director of Sirius XM, as the Obama Administration’s Chief Performance Officer. His official title will be deputy director for management of the Office of Management and Budget. Zients replaces Nancy Killefer, who rescinded her nomination in February.

There have been criticisms of Obama’s handling of the budget so far. For example, critics bristle at the thought that Obama approved the fiscal year 2009 budget with earmarks (this is covered in an article by George Stephanopoulos on March 1, “Obama Will Sign Omnibus Despite Earmark Pledge”). But the fiscal year 2009 budget – signed in March 2009 – is retroactive: it covers operations of the government and all government-funded agencies (including research facilities, and university-driven research and development) from October 1, 2008 through September 30, 2009. Failure to pass that budget would have meant a swift and immediate crisis, catalyzing a domino effect of layoffs in highly specialized industries. That could have had a nontrivial and long-reaching impact on national competitiveness, not only by depressing technological innovation but also by cutting off practical opportunities for university students and researchers to contribute to innovation as they receive mentorship and training.

More about Zients from the White House Blog:

Zients has twenty years of business experience as a CEO, management consultant and entrepreneur with a deep understanding of business strategy, process reengineering and financial management. He served as CEO and Chairman of the Advisory Board Company and Chairman of the Corporate Executive Board. These firms are leading providers of performance benchmarks and best practices across a wide range of industries.  Currently, he is the Founder and Managing Partner of Portfolio Logic, an investment firm focused primarily on business and healthcare service companies.

Systems Thinking Predicts Economic Collapse in 21st Century

According to some researchers, it’s the end of the world as we know it – sometime this century, in fact. Economists and policy researchers have actually seen it coming for about three centuries, though.

The most recent take on this subject came on March 7, 2009, when journalist and Hot, Flat, and Crowded author Thomas L. Friedman published an op-ed in the New York Times entitled “The Inflection Is Near?” He describes how the economic, financial and political systems that we have established in the world – particularly in the West – are inherently unsustainable, and that in order to achieve a truly green world, our fundamental systems for living must shift:

Let’s today step out of the normal boundaries of analysis of our economic crisis and ask a radical question: What if the crisis of 2008 represents something much more fundamental than a deep recession? What if it’s telling us that the whole growth model we created over the last 50 years is simply unsustainable economically and ecologically and that 2008 was when we hit the wall — when Mother Nature and the market both said: “No more.”

We have created a system for growth that depended on our building more and more stores to sell more and more stuff made in more and more factories in China, powered by more and more coal that would cause more and more climate change but earn China more and more dollars to buy more and more U.S. T-bills so America would have more and more money to build more and more stores and sell more and more stuff that would employ more and more Chinese …

We can’t do this anymore.


What would you think if I told you that this is actually not a new idea, and that the notions Friedman presents were anticipated by a simulation done over thirty-five years ago? Furthermore, what if I let you in on the fact that people have been thinking about this conundrum since the late 1700s? It may sound outlandish, but in this case, truth is stranger than fiction.

The simulation I refer to was done in 1972, with a model called World3, originally implemented in the DYNAMO simulation language (and since re-implemented in environments such as the object-oriented Modelica). It is the basis of the Club of Rome-commissioned study “The Limits to Growth” (full text is here). Although the model has received criticism for some of its assumptions, a revised analysis in 2002 upheld many of its outcomes. In 2009, Dr. Dennis L. Meadows (who directed this research) was awarded the 25th Japan Prize from The Science and Technology Foundation of Japan. Recall that the Japanese were the first to recognize Dr. W. Edwards Deming for his contributions to revitalizing their economy – decades before Americans embraced Deming’s teachings – and, in doing so, they helped spawn the quality revolution in U.S. business of the late 1970s and 1980s that has shaped how we do business today. From the Japan Prize announcement:

Dr. Dennis L. Meadows served as Research Director for the project on “The Limits to Growth,” for the Club of Rome in 1972. Employing a system simulation model called “World3,” his report demonstrated that if certain limiting factors of the earth’s physical capacity – such as resources, the environment, and land – are not recognized, mankind will soon find itself in a dangerous situation. The conflict between the limited capacity of the earth and the expansion of the population accompanied by economic growth could lead to general societal collapse. The report said that to avert this outcome, it is necessary that the goals of zero population growth and zero expansion in use of materials be attained as soon as possible. The report had an enormous impact on a world that had continued to grow both economically and in population since World War II.
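
To give a feel for the kind of feedback dynamics a model like World3 captures, here is a deliberately tiny stock-and-flow sketch (my own toy with made-up parameters, not World3 and not Meadows’ work): population grows by drawing down a finite, nonrenewable resource, and once the resource thins out, the growth that depended on it reverses.

```python
# Toy overshoot-and-collapse dynamics (illustrative only; parameters are
# arbitrary and not calibrated to World3): population grows while the
# resource is abundant, then declines as the resource is drawn down.

population = 1.0       # relative population (arbitrary units)
resource = 100.0       # finite, nonrenewable resource stock

birth_rate = 0.04      # per-period growth when resources are fully abundant
death_rate = 0.02      # baseline per-period decline
use_per_capita = 0.5   # resource consumed per unit of population per period

history = []
for year in range(300):
    availability = resource / 100.0              # 1.0 = abundant, 0.0 = gone
    growth = population * (birth_rate * availability - death_rate)

    consumption = min(resource, population * use_per_capita)
    resource -= consumption
    population = max(population + growth, 0.0)

    history.append((year, population, resource))

# Snapshots show the characteristic rise, peak, and decline.
for year, pop, res in history[::50]:
    print(f"year {year:3d}: population {pop:6.2f}, resource {res:6.1f}")
```

Even this toy reproduces the qualitative shape the report warned about: the peak arrives not when the resource is exhausted, but well before, once depletion starts to bite.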

We also have a rich literature, dating back centuries, that has studied the relationships between population, environment and technology. In the late 1700s, English economist Thomas Robert Malthus studied these relationships in terms of the projected effects of uncontrolled population growth. “Before Malthus, populations were considered to be an asset. After Malthus, the concept of land acquisition to support ‘future large populations’ became a motivating factor for war.” (citation) The 20th-century Boserupian theory of Ester Boserup, in contrast, suggests that advances in technology will expand the capacity of the world to support population. Researchers like Steinmann & Komlos (1988) have simulated the interplay between the two paradigms over time and suggest that there is a cyclical dominance. (I note that references to Malthus and Boserup, let alone Meadows’ World3 model, are rarely on the lips of policymakers.)

In my opinion, it is not climate change we should be worried about per se, but the social, economic and global political system that drives human interactions with each other and with the environment. Climate change may be a symptom, but it is just a tracer for the attitudes of unbounded material growth that are contributing to the effects (if you want to learn about climate change and policy, Prometheus is a good place to start – my point is not to argue the merits of “is it” or “isn’t it” happening because others including Pielke, Jr. do that very well). Regarding climate change, we need to decode what the data is trying to tell us about how we’ve structured our large-scale systems of interaction with one another – rather than merely trying to control our personal “carbon footprints” or recycle more (though these may be important ingredients in the solution).

There is nothing new under the sun. Only today, the forces of production, consumption and population have metamorphosed into a crisis of sustainability – a “perfect storm” to test our ability to live and work in the limit case.


Steinmann, Gunter & Komlos, John (1988). Population growth and economic development in the very long run: a simulation model of three revolutions. Mathematical Social Sciences, 16(1), 49–63.

Quality Metrics for Policy Evaluation?

The Center for Environmental Journalism (CEJ) recently posted an interview with Roger Pielke, Jr., an authority on (as CEJ calls it) “the nexus of science and technology in decision making”. The interview seeks to provide a perspective on how journalists can more accurately address climate change in the context of public policy over the next several years.

I was really intrigued by this part:

Reporters could help clarify understandings by asking climate scientists: “What behavior of the climate system over the next 5-10 years would cause you to question the IPCC consensus?” This would give people some metrics against which to evaluate future behavior as it evolves.

Similarly, you could ask partisans in the political debate “What science would cause you to change your political position on the issue?” This would allow people to judge how much dependence partisans put on science and what science would change their views. I would be surprised if many people would give a concrete answer to this!!

For the first question, what Pielke is recommending is that we take an approach conceptually resembling statistical process control to help us figure out how to evaluate the magnitude and potential impacts of climate change. (Could we actually apply such techniques? It would be an interesting research question. It makes me think of studies like Khoo & Ariffin (2006), for example, who propose a method based on Shewhart x-bar charts to detect process shifts with a higher level of sensitivity – only here it would be tuned for a particular policy problem.) For the second question, I’m reminded of “willingness to pay,” “willingness to recommend,” and other related marketing metrics. I’m sure that one of these established approaches could be extended to the policy domain (if it hasn’t been done already).
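
As a rough sketch of what such a monitoring scheme could look like, here is a generic Shewhart x-bar chart in Python (a textbook-style construction with made-up numbers, not the Khoo & Ariffin method and not tuned to any climate series): control limits are estimated from a baseline period that everyone agrees was “in control,” and later subgroup means are flagged when they fall outside those limits.

```python
import statistics


def xbar_limits(baseline_subgroups):
    """Center line and 3-sigma limits for subgroup means, estimated from a
    baseline period. (Simplified: a full x-bar/S chart would also apply the
    c4 bias correction to the average within-subgroup standard deviation.)"""
    n = len(baseline_subgroups[0])
    means = [statistics.mean(g) for g in baseline_subgroups]
    within_sd = statistics.mean(statistics.stdev(g) for g in baseline_subgroups)
    sigma_xbar = within_sd / n ** 0.5
    center = statistics.mean(means)
    return center - 3 * sigma_xbar, center, center + 3 * sigma_xbar


def flag_shifts(subgroups, lcl, ucl):
    """Indices of subgroups whose mean falls outside the control limits."""
    return [i for i, g in enumerate(subgroups)
            if not (lcl <= statistics.mean(g) <= ucl)]


# Made-up data: three baseline subgroups of five observations each, then two
# later subgroups, the second of which has shifted upward.
baseline = [[14.1, 13.9, 14.2, 14.0, 13.8],
            [14.0, 14.3, 13.7, 14.1, 14.0],
            [13.9, 14.0, 14.2, 13.8, 14.1]]
later = [[14.0, 14.1, 13.9, 14.2, 14.0],
         [14.6, 14.8, 14.7, 14.9, 14.5]]

lcl, center, ucl = xbar_limits(baseline)
print(f"center {center:.2f}, control limits ({lcl:.2f}, {ucl:.2f})")
print("out-of-control subgroups:", flag_shifts(later, lcl, ucl))
```

The hard part in the policy setting is not the arithmetic but agreeing in advance on the baseline and the subgrouping; that is exactly the commitment Pielke’s question tries to extract.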
