An environmental analysis (or environmental assessment) is a decision-making tool, often applied in technology management to characterize the forces impacting an emerging technology or a new or existing product. An environmental analysis can help you determine the effects of a proposed project or policy, and proactively assess the impacts of a developing or emerging product or discipline. It also provides a useful structure for learning about an area or theme that is new to you or your company, and for identifying the “state of the art” (e.g. petascale computing, nanotechnology, innovative composite materials).
To conduct an environmental analysis, you should investigate and outline:
CONTEXT. The technology of interest and the context in which it is/to be used
CHALLENGES. The challenges that are presently identifiable; what you know, and how it compares and contrasts with the unknowns
EXTERNAL ENVIRONMENT. How the competitive environment impacts the scenario. This can be done via SWOT analysis (strengths, weaknesses, opportunities, threats) and/or by examining Porter’s (1980) Five Forces (supplier power, barriers to entry, threat of substitutes, buyer power, degree of rivalry)
MACRO ENVIRONMENT. How broader political, economic, socio-cultural and technological themes influence and affect the scenario (e.g. via PEST analysis)
ALTERNATIVES. Examine alternatives to the scenario being evaluated, and investigate what criteria (e.g. values, beliefs, project constraints, technical constraints) might be used if you will choose between competing alternatives in the future
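The five steps above can be captured as a simple checklist. Here's a minimal sketch in Python; the section keys and sample entries are purely illustrative, not part of any standard framework:

```python
# A minimal checklist structure for an environmental analysis.
# The five sections mirror the steps above; all names are illustrative.

SECTIONS = [
    "context",       # the technology and the setting in which it is (to be) used
    "challenges",    # known issues vs. open unknowns
    "external",      # competitive environment (SWOT, Porter's Five Forces)
    "themes",        # broader themes (PEST: political, economic, socio-cultural, technological)
    "alternatives",  # competing options and the criteria for choosing among them
]

def missing_sections(analysis: dict) -> list:
    """Return the checklist sections that are still empty or absent."""
    return [s for s in SECTIONS if not analysis.get(s)]

# A draft analysis with only the first two sections filled in:
draft = {
    "context": "petascale computing for climate simulation",
    "challenges": ["power consumption", "fault tolerance at scale"],
}
print(missing_sections(draft))  # → ['external', 'themes', 'alternatives']
```

A structure like this makes it easy to see at a glance which parts of the analysis still need investigation before you start comparing alternatives.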
Where can you get data for an environmental analysis? In addition to searching through resources from newspapers, magazines and trade journals, check the following:
In the Growth Competitiveness Index (GCI) issued by the World Economic Forum, which is measured for more than a hundred countries every year, there are four dimensions of global competitiveness routinely assessed: institutions, infrastructure, the macroeconomic environment, and health and education.
Because technology has the potential to impact productivity at many levels, and because it is embedded in each of these areas, the effects of technological change are implicit in macroeconomic measures of competitiveness.
National Science Foundation Solicitations for Research Proposals – The NSF solicitations are an excellent place to learn about the state of the art in various fields. The solicitations explain what topics are the most interesting to the experts today, and what they are willing to pay to know more about. Often, the solicitations will explain the most recent trends that may be difficult to ascertain from the industry and academic literature.
The reason this interests me is that Google is using a tracer – examining search patterns in terms of where searches originate geographically to infer how diseases might be spreading. They are not tracking diagnosis information or other “hard” data that would affirm the presence of disease, only recognizing that people tend to be more interested in the flu when they’re trying to figure out whether they have it! (The most useful aspect of the search data is that it appears to serve as a leading indicator for the CDC data, which has a two-week lag.)
Are any companies out there using the patterns in Google searches that reach their websites to infer what consumers or constituents are most interested in at any given time? It would be interesting to see what other “real” things Google search data can serve as a leading indicator for. I could see this as a useful technique for capturing the “voice of the customer” in a novel way.
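The leading-indicator idea can be illustrated with a toy lagged-correlation check: slide one series forward in time and see which offset best lines it up with the other. This is a sketch of the general technique, not Google's actual method, and the weekly numbers below are made up:

```python
# Find the lag at which an "indicator" series best predicts a "reference" series.
# Illustrative sketch only; all data is fabricated.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def best_lead(indicator, reference, max_lag=4):
    """Return the lag (in periods) maximizing correlation, plus all scores."""
    scores = {}
    for lag in range(max_lag + 1):
        # Compare indicator[t] with reference[t + lag].
        xs = indicator[: len(indicator) - lag]
        ys = reference[lag:]
        scores[lag] = pearson(xs, ys)
    return max(scores, key=scores.get), scores

# Made-up weekly data: searches spike two weeks before reported cases.
searches = [10, 12, 30, 55, 40, 22, 15, 12]
cases    = [ 5,  6,  9, 11, 28, 52, 41, 20]
lag, scores = best_lead(searches, cases)
print(lag)  # → 2 (the toy search series leads the toy case series by two weeks)
```

With real data you'd want far more history and a proper significance test, but the core question – “at what offset do these two series line up best?” – is exactly this simple.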
Today is Monday, November 3rd. Election Day, when the U.S. picks its 44th President, is less than 24 hours away. And as of Saturday night, just 72 hours before the polls close, 27 MILLION early votes and absentee ballots had already been cast. This represents almost 13% of the total population that’s eligible to vote this year, and 22% of all the people who voted in 2004. (The numbers are from Michael McDonald’s dataset; he is an associate professor specializing in voting behavior. The VEP column in his table represents the total number of eligible voters over 18 and not in prison, on probation, or on parole.)
Remember, long ago (or maybe more recently) in statistics class, when you learned that you could learn a lot about the properties of a population by taking a random sample? Having approximately 20% of the vote already in, from an electorate expected to total between 120 and 150 million, is extremely significant – remember, these are actual votes, not someone’s report of what they may or may not vote “for real”. Assuming that systematic errors have not played a large part in early voting behavior, the winner is already determined – we just don’t know it yet.
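To put a number on that intuition, here's a minimal sketch of the 95% margin of error for a sample proportion. The 0.52 candidate share is hypothetical, and this only covers sampling error – as noted above, early voters are not a random sample, so systematic error is the real caveat:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative numbers from the post: ~27 million early votes out of an
# expected electorate of 120-150 million. The 0.52 share is made up.
n = 27_000_000
p = 0.52
moe = margin_of_error(p, n)
print(f"{moe:.6f}")  # on the order of 0.0002, i.e. a few hundredths of a percent
```

Compare that to a typical poll of ~1,000 respondents, where the same formula gives a margin of error around three percentage points. With 27 million actual votes, random sampling error is essentially zero; only a systematic bias in *who* votes early could move the answer.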
“We go around in a circle and suppose, but the answer lies in the middle and knows.” –Robert Frost
However, ignoring systematics is indeed a significant assumption – one that’s discussed by Peter Norvig, Director of Research at Google, in his excellent explanation of the accuracy of polls. That’s why the campaigns are rightly pushing EVERYONE to get out there and vote – to mitigate the impact of systematic errors. (After all, you don’t want to stop voting if the other side keeps voting.) So if you are reading this and you haven’t voted yet, DO IT! Go vote!
I see three potential scenarios:
Breakthrough: the decision has already been made, is accurately reflected in the actual sample of early votes, and the votes placed on Tuesday won’t change the pattern at all. The additional votes amount to nothing (other than beating down or insuring against systematic error).
Breakdown: a flood of voters overwhelms the capacity of the voting stations, the voting machines just can’t handle it, and the polls close before everyone can get through the door and get an error-free ballot submitted. I think there might be social unrest if this is the case.
Breakout: a single demographic (or two) comes out in droves to vote on Tuesday, breaking out of wherever they’ve been hiding, and shifting the balance of the race in a huge upset. Certainly a possibility.
Whatever happens, the 2008 Election reflects a mythical struggle between structure, order, hierarchy, stability, and tradition on one side; revolution, dynamism, community, collaboration, and exploration on the other. Each potential leader clearly has more experience on one side of the coin than on the other. The difference will be how voters determine which standard a candidate’s experience should be measured against!
Why am I interested in all this? First, because polling is measurement, and quality assurance requires effective measurement. But more importantly, because the themes of this election parallel the struggle that many organizations face with quality and innovation – getting the job done reliably is paramount, and experience is important, but we cannot lose sight of the way we need to reinvent ourselves and our companies to continue being competitive. Accepting the wilder side, where structures are not sacrosanct and community is more productive than hierarchy, is hard to swallow.
There is not one person I know who doesn’t have to deal with complexity in their work or personal lives. Either the subject matter they work with is complex or specialized, the politics are stressful, or there’s just too much information to process and the information overload becomes oppressive.
Fortunately, dealing with complexity has been the subject of recent research and there are some lessons to report. These lessons revolve around the importance of “sensemaking” – a term coined by Karl Weick to reflect concerted effort made to understand the relationships between people, technologies, places and events, how they behave, and how to respond to their current and future behaviors.
Weick and Sutcliffe (2001) studied the environments of aircraft carriers, nuclear power plants, and fire fighting teams – situations where the stakes are high, and on-the-job errors can be costly and dangerous. These are the workplaces “that require acute mindfulness if they are to avoid situations in which small errors build upon one another to precipitate a catastrophic breakdown.” Snowden (2003), who worked at IBM, said that most environments just weren’t like that – so it would be difficult to generalize how those workers dealt with complexity to the typical office.
Browning & Boudes (2005) compared and contrasted these two studies to try to define some guiding principles for how people make sense of complex scenarios. Here’s a synopsis of what they found:
1: “Acknowledging and accepting complexity is better than placating it with planning models. There are simply too many situations where the standard tools and techniques of policy-making and decisionmaking do not apply.” The lesson: move away from a “training culture of obedience” and towards a “learning culture of understanding and action”. (This will be a challenge, because it requires humility and trust.)
2: “It is important to acknowledge failure and learn from instances of it.” Self-explanatory, but as above, this requires humility and ego-transcendence. For people to learn from failure, they must first confront it. I can think of some “death march” software projects where failure “never” occurred!
3: “Self-organization is an order that has no hierarchical designer or director.” Browning & Boudes cite Peter Drucker’s idea that “in the Knowledge Economy everyone is a volunteer.” Volunteerism means that everyone is fundamentally qualified to do some part of a project, that roles shift to accommodate existing talents and the development of new talents, and that if a person isn’t working out in one role, they can move to another. In volunteer contexts, leadership is often dynamic, where everyone serves for a time as the leader and then moves on.
4: “Narratives are valuable for showing role differentiation and polyvocality.” There are many voices, and many perspectives, and these can be effectively communicated when people relate to one another through stories and examples. Diversity of opinion and distance from a problem can help raise solutions – but the people who know the situation best, and who are closest to it, must be open to the possibilities. (Easier said than done.)
5: “Conformity carries risks, and thus we need diverse inputs when responding to complexity.” The authors suggest that learning should be valued over order and structure. This does not mean that order is unnecessary, but that any order that is established should be viewed as temporary – a framework to serve as the basis for new learning.
6: “Action is valuable under conditions of complexity.” Acting, learning, and adjusting is more effective and more productive than trying to identify the right solution, plan it, and then do it. Action builds identity and creates meaning; “the most useful information is contextual and need-driven.”
7: “The focus is properly on small forces and how they affect complex systems.” The authors suggest that focusing on small wins and keeping things simple is a strategy for success. I love the following story that they describe, so I have to include it here:
Snowden relates a story of two software development groups – one expert, the other a lesser group – whose experience in programming was limited to the fairly routine requirements of payroll systems. In a competitive exercise between these two groups for learning purposes, the experts created a plan for an elegant piece of code that would take two months to develop. The payroll programmers, meanwhile, downloaded a “good enough” list from the Internet that cost five dollars (Snowden, 1999). Thus one feature for smallness for Snowden is the decisions that can be made that allow the group to move on – to accept “good enough,” implement it, and then see what that action means.
8: “It is important to understand the irony of bureaucratic control.” Producing data and information can be overwhelming; innovative achievements can be suffocated by measurement, evaluation and control. We assume that organizations are deterministic and behavior can be programmed, using carrots and sticks. But people are neither rational nor linear, and this can be both the strength of the organization and its Achilles heel.
New organizational models are needed to be able to follow this advice. So many of the projects I see today are like solving mysteries: you know what needs to happen (because you have requirements documents), there are motivated people all around you who really want the mystery solved, someone is footing the bill (at least for a while), and everyone wants to make progress. But because the process is full of learning new information, finding clues, and relating to people’s needs – it’s impossible to put a timeline on it. This frustrates project managers (and their bosses) immensely, because their jobs revolve around bringing some order to complexity and setting expectations in terms of time and budget.
Can you imagine a crime show like Law & Order starting out with a murder – and then you cut to the scene where the police chief says: “We need to solve this murder immediately… all the clues need to be found by next Friday, all the suspects interviewed by the following Wednesday, and then I want a report on my desk with the solution three weeks from today!”
It sounds funny, but this is exactly what plenty of managers are trying to do right now in their own organizations. The “corporate consciousness” will support this kind of behavior until a rational alternative is found.
Is your body programmed to get cancer? How about Parkinson’s, or multiple sclerosis, or diabetes? With 23andme.com, you could find out if you are predisposed to these or 19 other diseases simply by spitting in a cup. In addition, you can easily donate your data to further medical research. These are the same kinds of tests used to diagnose cystic fibrosis and Down’s syndrome prenatally, and have earned a measure of trust within the medical community. Now, genetic testing is a service offered to just about anyone – even outside the hospital or doctor’s office. (It’s not free, but what’s $399 for a little peace of mind – assuming that’s what you end up with?)
That’s not where the fun ends, though. If you’d like to meet other people who share similar genetic characteristics, no problem – once someone writes a mashup between a social networking site like Facebook and your 23andme genome data, it will be as easy as adding a new tab to “My Networks”.
Diagnosing genetic profiles is a high-stakes game for two reasons: quality and ethics. The quality of the genetic testing procedures, the calibration of the lab instruments, and the handling of the samples are all aspects of the process that must be carefully controlled. Failures can result in minor consequences (e.g. a sample is lost and the subject has to spit in a second cup, then wait longer to get the results) or major consequences (someone is incorrectly told that they are predisposed to a life-threatening condition and, because they can’t bear the prospect of getting it, they commit suicide). Fortunately, quality control is taken very seriously in this domain, as the QC process from Kalman et al. (2005) indicates.
The ethical issues can be a direct result of the capability of the process to produce accurate and reliable results. But is it appropriate to offer this service on such a large scale? What are the consequences of knowing versus not knowing, on both a personal scale and on the scale of society? Without probing these issues in more depth, and understanding how laws and regulations could counterbalance potentially severe social consequences, it’s a risky service to offer. Some people might be able to handle the information; some might not. (Imagine being in high school and getting dumped by someone who checked your genome, found out you’re predisposed to psoriasis, and thinks that’s gross.)
Standards represent “accepted ways of doing things” and can be defined as sets of characteristics or features that describe products, processes, services or concepts. (National Research Council, 1995) Alternatively, standards can be described as “rules, regulations, and frameworks intended to enable individuals, groups, organizations, countries and nations to achieve goals.” (Spivak & Brenner, 2001) According to these authors, there are five types of standards: 1) physical or measurement standards; 2) standards for symbols, meanings and language; 3) standards for products and processes, including test validation, quality assurance, and operating procedures; 4) standards for systems (including management systems and regulatory systems); and 5) standards for health, safety, the environment and for ensuring other consumer interests. Standards may be compulsory (as in the case of legal and regulatory standards, such as the Sarbanes-Oxley requirements for financial reporting) or voluntarily adopted (as in the case of product interface standards such as USB for computer peripherals).
Because standards are a mechanism used by social groups to promote norms, and to facilitate the creation of artifacts that advance human capabilities, standards are technologies. (Recall that Random House defines technology as “The sum of the ways in which social groups provide themselves with the material objects of their civilizations.”)
What can standards do for you and your projects? The importance of standards can be explained in terms of the eight goals that standards can be used to achieve (Blind, 2004):
Compatibility – facilitates cooperation between people, processes, systems or technologies
Communication – facilitates information flow between people, processes, systems or technologies
Conformance – provides a common basis to establish competency or define excellence
Continuous Improvement – helps organizations leverage the “lessons learned” that are imbued within the standards
Establish Order – promotes consistency, reliability and effective communications
Prescribe Behavior – enables organizations to reduce variability between people and systems
Promote Social Order – establishes a basis for legal and ethical behavior
Establish Shared Meaning – provides a context for communication, compatibility and conformance
A standard is “working” if it accomplishes one or more of these goals (depending, of course, on how relevant and pertinent the goals are to the project that is being pursued). For example, it’s probably not very important for two computer devices to “promote social order” if they need to communicate. But it’s definitely important for people.
Blind, K. (2004). The economics of standards: theory, evidence, policy. Edward Elgar.
National Research Council. (1995). Standards, conformity assessment and trade. National Academy Press.
Spivak, L. & Brenner, F.C. (2001). Standardization essentials: principles and practice. Marcel Dekker.
“You can only manage what you measure.” Yeah, yeah, yeah. I’ve heard this statement so often that I’m not sure whether it’s a valuable heuristic or just a cliché. What I do know is that I (personally) tend to like measuring for two reasons: 1) I’m not good at remembering things that happened in the past, and so measurement provides me with an objective line in the sand that I can use to gauge improvement, and 2) for some unknown visceral reason, it just feels wrong not to do it.
But then I read the story of Ari Derfel, who I call the “Trash Guy” – and this completely shifted the way I look at measurement. Motivated by a dinner conversation with friends, he decided that he was going to “keep” his trash for one year to see what he was throwing away. Starting in December 2006, he stashed his trash in the basement (cleaned, of course) just to see what refuse he (personally) was producing. After some dedicated time measuring and observing his trash-producing habits, he found that his consumption habits and spending habits changed!
Here’s what he says about his “lessons learned”:
Some of the things that I learned are interesting. I learned what I spent most of my money on because by watching a pile of trash grow over a year, I really began to see, “Wow! I spent it on that food, on this electronic, on that item,” and my consumption habits and spending habits became really clear.
The second thing that I learned really powerfully, in addition to what I spend my money on, is what I put in my body. I started to see things pile up. The most commonly talked about are little stacks of pints of non-dairy ice cream that I would eat: pint one, pint two, pint three, pint 12, pint 15. I started to see what lives in my body, and what kind of fuel I’m choosing to put in my body.
Then, I learned where most trash seems to be made: food packaging. Of all the different things that could be making trash, that was really profound to me because I realized that it’s not that big of a problem. I mean, we’ve only been packaging food for 50, 60 or 75 years. So if that’s the small amount of time in which the problem was created, we should be able to undo the problem. Those are three of the primary things that I learned.
Then I learned that if I composted everything organic, which I did, trash doesn’t smell. That’s an awesome thing to learn, because most folks think of the dump, or trash, and they think it smells really disgusting. I realized, “Wow! That’s not the case.” If we properly treat all of our organic matter, that’s not going to be a problem.
Measurement can motivate people – in a profound way – to change their behavior. After reading about Ari’s trash escapades, I started thinking about the trash I produce. I’m looking at all those plastic bottles I drink water from with a little disdain now. I’m saving a lot more of those empty Country Crock containers. I’m far more conscious of the proliferation of plastic bags in modern food packaging. I’m really looking fondly at that burlap bag with the 15 lbs of rice in it… I can use that for something else later, when the rice is gone. I’m actually cooking dinner more – from scratch. And I haven’t even started seriously measuring yet.