
Translating Leadership Principles to Operational Performance

The following is not precisely in the Career Transition column. I include it because it is the answer one successful grad gave when I asked him, "How have you applied your experience as a Navy Division Officer to the management of a highly complex production facility?" He replied, in effect, that you can trust the intent and skills of the production crew if they understand the expected principles of engineering management and related problem solving. He referred to these principles as The Eight. I have not heavily copy-edited his reply, as I do not want to dull the immediacy of its intent.

Q 1 - Before you attack a problem, you really need to understand how the 'system' works. If you don't understand it, how can you fix it? This question actually saved us this past week - we had an oil leak on a very important piece of equipment. Conventional thought would be to tighten it up! But the way this connection was made, you could easily cock the internals and turn the leak into a gap you couldn't recover from if you treated it as a normal bolted connection. You have to understand the design!

Q 2 - More often than not, people take hearsay as fact. Oftentimes people present information as fact when it is not. If it can't be validated, it is not a fact and should be treated differently. What people think, or heard, or heard someone else say is a good place to start, but it will lead to a bad result if you make a decision based solely on it. You need facts to get to the right answer.

Q 3 - There is always an expert, and just because you don't know that person doesn't mean you shouldn't reach out to find them and get their input. Expert judgment is still judgment, but experts can lead you to understanding the design and the facts. Also, this sounds nuts, but you have to talk to the people who are actually there working on the issue. Imagine a detective who doesn't interview firsthand witnesses. You'd be surprised how often this gets skipped, because it is hard to find that person and people are hesitant to talk to people they don't know.

Q 4 - If you assume you are going to get it wrong and are responsible for the consequences, then you get the right pucker factor and don't treat the problem lightly. Too often people don't realize that their input can cause something really big to go wrong. The risk model that the question refers to essentially says, set the probability to one - i.e., only think about the consequence when you first approach a problem, rather than saying 'oh, that will never happen.' Assume it will.

Q 5 - Goes with Q4: if you are dealing with something that is a big deal, use every method under the sun to make sure you get it right. If it isn't a big deal, then you don't have to spend as much time on it. Our 'tools' refers to the level of technical rigor we put in: checks, peer checks, challenge meetings, third-party input, additional analysis, oversight, etc. All things that earn a good result. Use them all if you need a good result.

Q 6 - Normally people's vision is too narrow and they only see one aspect of the problem. How many times do you hear, "we actually made that problem worse by trying to fix it"? Usually it's because they didn't consider the bigger picture. There are always unintended consequences. Deal with them up front.

Q 7 - People are also blind to data that doesn't support conclusions they spent a lot of time coming up with. The Challenger tragedy was a perfect example where contradictory data was not reconciled. Q7 is also written assuming contradictions exist, because most often they do. When you explain your thought process behind contradictory information, it opens your logic to challenge, which is a good thing. If you don't ask the question, you can bury data that may be key to the whole problem.

Q 8 - Similar to Q7: often we like the answers we come up with and don't look at other alternatives. In organizations, people sometimes don't want to oppose what a room full of people came up with. Devil's advocate or 'red team' is nothing new, but unless you make it a key part of your thinking, it often doesn't happen. Stay humble, realize you could be wrong, and challenge others to find the flaws that you can't see because you are too close to the problem.

BZ to Nemo.
