Session Notes: Workshop 2: The ISO 56007 Paradigm Shift – Addressing the Blind Spot in Project Risk Management
Executive Summary
Prof. Dr. Raphael Cohen presented the ISO 56007 paradigm for addressing critical blind spots in project risk management, focusing on the distinction between unknowns and risks and the need for standardized decision criteria. With industry statistics showing that only 12-24% of projects are delivered on time and on budget, Cohen argued that better upfront calibration through structured frameworks such as his IPOP model and the 11-question ISO standard could dramatically improve project success rates.
Full Notes
The Project Failure Crisis and Root Causes
Cohen opened with sobering industry statistics: 30% of projects are stopped before completion, 30% run behind schedule, and 50% exceed budget. This leaves only 12-24% of projects delivering on time and on budget across all industries. When he polled the audience, most indicated their organizations performed similarly to or worse than these benchmarks. Cohen's analysis traced this failure rate to poor project calibration at the decision stage, where organizations launch projects without proper evaluation frameworks. His core thesis is that 'every time you launch a project that should not have been launched, you're actually slowing down because it's a burden for everybody to carry.'
Decision Criteria Standardization Gap
A central problem Cohen identified is the lack of standardized decision criteria across organizations. In 15 years of training, he has never encountered two people from the same organization who used identical project selection criteria, nor has anyone provided a complete framework. His research with European senior decision makers revealed they average only 6 criteria (ranging from 3-9), when 11 are actually needed. This gap means organizations are missing approximately 5 critical questions that could prevent project failures. Cohen advocates for organizations to publish their complete criteria list to create transparency and prevent the introduction of hidden criteria during evaluation.
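Cohen's transparency rule (publish the full criteria list, require an answer to every question, allow no hidden criteria) can be sketched as a simple completeness check. This is an illustrative sketch only: the actual 11 ISO 56007 questions are not listed in these notes, so the criterion names below are hypothetical placeholders.

```python
# Sketch of a transparent decision checklist. The criterion names are
# placeholders -- the actual 11 ISO 56007 questions are not given in
# these notes -- but the mechanism Cohen describes is: publish the full
# list, require an answer to every item, and reject hidden criteria.
PUBLISHED_CRITERIA = [
    "customer_adoption",   # will customers adopt the product?
    "resource_fit",        # do we have the people/money/skills?
    "strategic_fit",       # does it serve the organization's strategy?
    # ... the published list would contain all 11 questions
]

def evaluate(proposal_answers: dict[str, str]) -> list[str]:
    """Return a list of problems; an empty list means the proposal is complete."""
    problems = []
    # Every published criterion must receive a non-empty answer.
    for criterion in PUBLISHED_CRITERIA:
        if not proposal_answers.get(criterion):
            problems.append(f"missing answer: {criterion}")
    # No criteria beyond the published list may be smuggled in.
    for criterion in proposal_answers:
        if criterion not in PUBLISHED_CRITERIA:
            problems.append(f"hidden criterion used: {criterion}")
    return problems

print(evaluate({"customer_adoption": "verified via observation",
                "resource_fit": "2 FTE available",
                "gut_feeling": "CEO likes it"}))
# -> ['missing answer: strategic_fit', 'hidden criterion used: gut_feeling']
```

The point of the second loop is Cohen's rule that the organization may not introduce criteria that were never published: an idea's sponsor can only prepare for an evaluation they can see.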
Distinguishing Unknowns from Risks
Cohen made a crucial distinction between unknowns and risks that fundamentally changes how project challenges should be managed. Unknowns are simply missing information that can be resolved through research and verification. Risks, by contrast, have probabilities and require protection strategies. 'You cannot make a risk disappear. The only thing you can do about it is protect yourself from the risk,' he explained. This distinction is critical because unknowns can be systematically cataloged and addressed through structured analysis, while risks require different mitigation approaches. His IPOP model provides a framework for identifying unknowns across all project dimensions.
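The unknown/risk distinction can be made concrete in a small data model. This is a sketch in my own terms, not Cohen's or the ISO text's: an unknown is missing information that disappears once answered, while a risk keeps its probability and can only be mitigated.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Unknown:
    """Missing information: resolvable, and gone once answered."""
    question: str
    answer: Optional[str] = None  # filling this in makes the unknown disappear

    @property
    def resolved(self) -> bool:
        return self.answer is not None

@dataclass
class Risk:
    """An event with a probability: it never disappears, only gets mitigated."""
    event: str
    probability: float                # 0.0-1.0, estimated
    mitigation: Optional[str] = None  # protection, e.g. "buy an umbrella"

# An unknown is cleared by research...
patent = Unknown("Is there a patent on this product?")
patent.answer = "No blocking patent found"   # verification work done
assert patent.resolved

# ...but a risk can only be protected against, never removed.
rain = Risk("It rains on launch day", probability=0.3)
rain.mitigation = "Buy an umbrella"
assert rain.probability > 0  # the risk itself is still there
```

The asymmetry in the two classes mirrors Cohen's point: unknowns get a research budget and vanish; risks get a protection strategy and stay on the register.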
Customer Decision Criteria Blind Spots
Cohen illustrated the danger of misunderstanding customer decision criteria through compelling examples. Nokia, despite being the world leader in mobile phones, was killed by the iPhone because they ignored customers' desire for multi-functionality beyond calling. Nestlé, despite dominating coffee globally, missed the experience dimension that Starbucks capitalized on. These failures occurred because organizations made assumptions about customer priorities without proper verification. Cohen emphasized that focus groups often fail to capture real behavior - citing pharmaceutical research where two-thirds said they'd buy generics but only one-third actually did. He advocates for anthropological observation of actual customer behavior rather than relying solely on stated preferences.
Implementation Framework and Organizational Benefits
The implementation process Cohen described involves creating a business case with identified critical unknowns, allocating resources to verify these unknowns, and then making go/no-go decisions based on the findings. A key element is defining 'minimum viable results' - the threshold below which projects should be killed - before launch to prevent 'zombie projects' that continue despite poor performance. Cohen shared a hospital case study where publishing decision criteria and requiring complete answers upfront dramatically reduced project volume while improving quality. The transparency also reduced middle management's incentive to reflexively say no to ideas, since they had to justify decisions against published criteria rather than personal preference.
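The gate Cohen describes (fund verification of the critical unknowns, then decide go/no-go, with a pre-agreed minimum viable result as a kill threshold) could be sketched roughly as follows. The function names and the unknown-record structure are assumptions for illustration, not part of the ISO text.

```python
# Sketch of the decision gate Cohen describes: define the minimum viable
# result *before* launch, fund verification of critical unknowns, then
# decide. Names and record fields here are illustrative.

def go_no_go(critical_unknowns: list[dict], budget: float) -> str:
    """Decide whether to launch after the verification phase.

    Each unknown: {"question": str, "verified": bool,
                   "satisfactory": bool, "cost": float}
    """
    # Total cost of verifying the critical unknowns, per-unknown estimates.
    verification_cost = sum(u["cost"] for u in critical_unknowns)
    if verification_cost > budget:
        return "no-go: cannot afford to verify critical unknowns"
    for u in critical_unknowns:
        if not u["verified"]:
            return f"hold: still unverified -> {u['question']}"
        if not u["satisfactory"]:
            return f"no-go: verification failed -> {u['question']}"
    return "go"

def kill_switch(actual_result: float, minimum_viable_result: float) -> bool:
    """True means kill the project: the pre-agreed floor was not met.
    Fixing the floor up front is what prevents zombie projects."""
    return actual_result < minimum_viable_result
```

Separating the gate from the kill switch reflects the two decisions in the notes: one made before launch on verified information, and one made during execution against a threshold nobody can quietly renegotiate.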
Action Items
- → Participants (interested attendees) — Conduct organizational decision criteria alignment survey using Cohen's free assessment tool
Key Insights (16)
- Project failure rates remain critically high
- Decision criteria lack standardization
- Unknowns versus risks require different treatment
- Customer decision criteria often misunderstood
- ISO 56007 provides 11 essential project questions
- IPOP model systematically identifies unknowns
- Known unknowns versus unknown unknowns distinction
- Customer behavior differs from stated preferences
- Minimum viable results prevent zombie projects
- Conduct organizational decision criteria survey
- Einstein's problem-solving approach
- The danger of mixed information
- Common sense is not common
- ISO 56007 standard for innovation management
- IPOP model for unknown identification
- Free organizational assessment survey
Full Transcript
Apr 22, 2026 Raphael's session - Transcript 00:00:00 : Hey, are you all good with the transcriber? Yeah, I'm just getting it up right now. Are you using Gemini? Yeah. Okay, great. Um, it should in the upper right corner it should say uh start taking notes. Yeah, it's probably being perfect. Yeah. And then um I'll give you my um email address and then we'll you'll just share with Y. Great. Perfect. Perfect. Perfect. Uh there's going to be space for you here to have your uh computer. Maybe maybe we can we still need to check with the organizer, but maybe we can still move the tables a bit or this table a bit. Yeah, just so you're possibly connected. Yeah, I'm just going to Yeah. Okay. Yeah, very I grew up in Yes. Look at me. Are you I I'll also add my own phone here just so we have a backup. You're going to you're going to be here, right? I think so. 00:03:44 : So, at the moment, there's nobody besides you. This is your Okay. Uh I'll just leave my phone there and Yeah. I'm going to use uh the earpiece. Okay. Hopefully, it will give you better sound, better quality recording. Okay, great. Yeah, that should that sounds good. Yeah. Okay, great. I'm going to put this I put it here. So you're going to send the uh before you uh automatically like uh I can check if my bomb Just Just rejuvenating now. Yes, I do. Find the right place. I think so. Yeah. Okay. to share. Yes. It doesn't Where is that? Thanks. Yeah, that's over. I think the time which works with Folks, you have to do it. Hello everyone. Um, now we have three different workshops and it's nearly there again. Perfect. 00:09:05 : So workshop either one is Michael Collins showing something on on AI assisted project control workshop one table is over there. Then we have the workshop two from Raphael. It's over there with Raphael is standing and waving. And workshop three is here. Stefan is already there also waving. 
Please choose your table the workshop you want to join and it's half an hour and then we have additional workshops go from is it would you recommend it in that between one and two. So no really okay. Yeah, we have I don't know. I'm I'm not I'm afraid to touch it. I'm afraid to touch it. I'm not sure. I think we shuttle. Okay. Good afternoon. Good afternoon. Good afternoon everybody. Thank you for coming. Always nice to know that there is somebody who listens to me. So maybe I'll introduce myself first. Um I'm a serial entrepreneur. I've been in business for many years. 00:11:59 : Uh and one day I went on the dark side and I became a professor in business school. So I teach people how to FAIL, where FAIL means Financially Abundant and Independent Leaders. So uh I give them tools. I had prepared a wonderful presentation with lots of slides but I don't have a beamer, so I have to do without. So I teach two and a half topics. The first one is how to analyze projects, and I've written two books. You will receive my presentation in PDF afterwards so you have everything. Um, I'll explain the reason for the first book; that's the reason why I'm here. The second uh topic I teach is leadership, because I've observed that many projects fail simply because of leadership skills and human issues. Uh so my second book is coming out in English; in French it came out several years ago, but in English it's coming out in a couple of weeks. It's called Driving Employee Engagement and it offers concrete tools on how to obtain engagement. And the third topic, the half topic, is artificial intelligence. Since I'm teaching innovation and project management, I had to look at uh artificial intelligence. I teach people how to use it, not the big principles behind it. So um today, and I forgot to mention, I'm also a founding member of the International Federation of Humor-Driven Educators.
00:13:52 : I believe that if we teach with humor and fun, people learn much better. Uh unfortunately you won't see the punchline, but that one doesn't have a funny acronym. I would expect that one. Not yet, I founded it last week. Working. Maybe we can help you. Yeah, is that what this workshop is for? You are most welcome to suggest the idea. I'm very happy to get some uh support, some contribution. Um, so most of my life I'm teaching. Sorry, there is a sound. Great. So, basically, as I told you, I've been a serial entrepreneur. I've been involved in all sorts of businesses, from fashion to cosmetics, uh software, construction, amusement parks, many things. Um, some projects worked very well and some of them failed, and I've been obsessed with one question. The question was: what is the question I should have asked that could have prevented the failure, if I had asked this question earlier? So that has been my obsession over the years, and uh since I've been lucky to coach more than 300 entrepreneurial projects in large organizations, from Nestlé to Airbus, um the French post, hospitals, banks, etc., uh I've seen a lot of problems. Little by little I've managed to work out the model of what are the fundamental questions that should be asked at an early stage. And I want to share with you some statistics. I didn't produce them, 00:16:12 : but 30% of the projects that are launched are stopped before completion. And I'm not talking only pharma, I'm not talking only science, I'm talking any project, including a new software, an ERP, a new marketing strategy, etc. So 30% are stopped before completion, 30% are uh behind schedule, and 50% cost more than anticipated. So I'm going to ask you to vote. Where I come from we vote on everything. I'm going to ask you to vote with your hand. When I say three, you're going to tell me if you think that this is a realistic assessment of what's happening in organizations. If you think you're doing better, you put your hand here.
If you think you're doing worse, you put your hand at table level. And if you agree more or less with the statistics I gave you, you put your hand here. And there's absolutely no political connotation. Okay. Uh no pictures, no picture during this. So uh one, two, three. Okay. Most people are not so bad. 00:17:27 : No, there was one person. You put your hand. Okay. Okay. Sorry, I misinterpreted that. So you're probably doing better. So ours are under the table. You were under the table as well. Okay. So that means it's a real problem, because the implication of this is that there's only 12 or at the best 24% of the projects that are delivered on time and on budget. And if you think about it, it is a lot of what we've been discussing already today. And my analysis is that many of the projects that have been launched um were not calibrated properly when the decision was made to launch them. And then people discover all sorts of things. And please feel free to interrupt me to say I don't agree. So my take on this is that we should calibrate projects much better and do our homework early, because every time you launch a project that should not have been launched, you're actually slowing down, because it's a burden for everybody to carry. 00:18:38 : Now, when I run a training session I always ask the same question at the beginning of the seminar. I'm going to actually ask you this question. Assuming you're the decision maker and that people are coming to you with a problem, what are the criteria you should use to decide to allocate resources to this project? And I would like a list that does not exceed 12 questions, because coming with a list of 50 questions is a piece of cake. But the real key questions that should imperatively be asked: this is a much more interesting question. So usually we don't have the time to do the exercise.
You can relax, but I can tell you that statistically speaking, um, I've been asking this question for about 15 years now. There's not a single person who has come up with a complete list, and there were never two people in the same organization who had the same criteria. Yeah. So if you start with this, it means that there's so much fuzziness about the decision making process that it cannot work. 00:20:04 : So what I tell people who attend my sessions, I said, at the end of this seminar I'm going to present you with a list of 11 questions, and I'm going to ask you two questions at that moment. First one is: is there any of these questions that you can live without? The second one is: is there a question missing? Because I want you to leave this room after the seminar with a list that you can reliably use to select projects in a robust and rigorous manner. And I can tell you that in the last five, six years, nobody has changed anything in the list. Now, we have run a study to ask senior decision makers in Europe what criteria they were using. Same question. And on average they had six criteria; the minimum was three, the maximum was nine. What does it mean? If you need 11 and you only have six, it means that you're missing five questions that can make or break your project. So, uh, that's why I think it's very important to have a clear list that organizations can rely on. 00:21:25 : So there is a decision tree that I have developed over the years that is now part of the ISO 56007 um standard that suggests the 11 questions.
You can change the order, but at the end of the day these 11 questions you have to ask them, and you can add some more that are industry specific, but this is really the minimum uh required. So I cannot uh present them to you now, because I don't have time, but my recommendation is that every organization should have such a list and should publish it, because when you publish the list, then you make the process transparent. An additional rule is that the organization is not allowed to use any additional criteria, because the temptation is to come up with a hidden or new criterion. But if you want people to take initiative, they should have a transparent process, because if you come up with an idea and you don't know how you're going to be evaluated, you cannot prepare yourself. 00:22:49 : But if the list is clear and transparent, then everybody can prepare themselves and everybody is judged on the same uh basis. Okay. So uh I told you the standard is called 56007, and you cannot see it, but here I have a license to kill. Uh so now another thing that I think is important is that every project has unknowns. What is an unknown? It's the information that is missing. Uh earlier one lady said that there are uncertainties or assumptions. If you need to make an assumption, it means that you don't know. Okay, by definition. So what are the unknowns in a project? There could be a technological unknown. There could be a demand or marketing unknown: how many people will be interested? There are resource unknowns; there are all sorts of unknowns, I'm not going to list them now. But if you're not aware of all the information that you're missing, then you have a serious problem, because you're navigating without knowing where you're going. Assumptions is just another word for risk? No, no, no, no. 00:24:46 : Very good question, and thank you for asking it.
What is the difference between an unknown and a risk? Who can answer this question? No, but the unknown can also be known. I know I don't know if I need uh a license, or if there's a patent on this product. Okay, I don't know. Is there a risk there? No, there's an uncertainty. But risk and uncertainty are not the same thing. Uncertainty is lack of knowledge, and in fact the ISO standard uses the term uncertainty. But the problem with uncertainty is that it's confused with risk. Now what is the difference? So if I don't know if there's a patent, uh, it's just an unknown and I'm aware of it. So now what is a risk? What is the characteristic of a risk? It's an event that may or may not happen, and it has a probability. Okay. The unknown does not have a probability. 00:25:56 : It is just a fact, something that I don't know. And once I have the information that I'm missing, the unknown disappears. You cannot make a risk disappear. The risk is always there. The only thing you can do about it is protect yourself from the risk. You think it might rain, that's a risk. Then you buy an umbrella and you're protected. Okay? So the way you treat risks is not the same as the way you treat unknowns. Um, in any project you should catalog all the unknowns at the beginning of your project, because if you catalog them, you have a much clearer view of all the challenges that you're going to find on your road. And again, you stop me if you don't agree with that. So the real challenge of cataloging is to be aware of the unknowns and find as many as possible. So the model I have developed, that's called the IPOP model, is the model that helps people think about their project on all the aspects that the project will uh need to address, and by asking all sorts of questions about all these dimensions, people realize that there are things they don't know, and then they can catalog them.
00:27:26 : So uh let me share with you a quote that I think is very interesting, from Paul Valéry. He said: there is something worse than the false, the mix of true and false. So what happens if you look at a business plan or a business case loaded with a mix of true and false? There are certain things that have been verified, and there are all sorts of assumptions that are presented as factual when they're not. Like people say the customer needs that. Has this been verified? And if you start to challenge things and ask, is this reliable information or is this information that needs to be verified, then you start to have a list of all the critical unknowns that you need to address before you launch a product. So the process at the decision stage for any project is to do a complete pre-project analysis, have a business case, and with the business case you have a list of all the unknowns. So you have the critical unknowns that can dramatically affect the project, and those that are more marginal. But when you have your list of critical unknowns, then there's another question: which ones are you prepared to accept? Because there's a tolerance to uncertainty that is personal; not everybody has the same. 00:29:14 : So you have the business case that demonstrates the merits of the project and you have a list of unknowns, and management or decision makers should say: do we accept to proceed without knowing this information, or should we verify a few things before deciding to go ahead full speed? And usually, to reduce the unknowns that are critical, you need certain resources. And for each unknown, you can evaluate the resources that you need: people, money, skills, etc. And you have a total at the end. And you say: this is the project with its merits and what it's going to bring us. And this is the list of unknowns and the cost I have to put on the table to verify, so that I don't go blindly.
Um this is a decision, and assuming the decision is yes, it's worth it, then let's say you put a million dollars on the table to address all the critical unknowns, and you give time and money resources to the people in charge of the project. They will analyze all this, and then there's a new update of the business case where they say: this is what we have learned. 00:30:37 : This is the new status of our project and we feel comfortable that we have reduced the unknowns, and there's a decision: should we launch or not launch? So this process, which is also recommended by the ISO standard, allows people to um have a more rigorous approach, because then you can follow each unknown, how it is um being handled and what information you obtain at the end of the day. Yeah. What do you measure? Sorry, how do you measure this? How do I measure this? Yeah. Well, you have a question, which is the unknown. You have a time to obtain the information for that unknown, and at that time you have to check whether you have the answer or not, and if the answer is satisfactory, because if the answer is not satisfactory you should stop at the beginning. Sorry, how do you identify the unknowns? Okay. That's where the IPOP model is the tool that helps you identify all this. 00:31:59 : Some research has shown, uh, I'll give you some uh references, has shown that if you use it, you identify more unknowns than if you go without it. How do you differentiate? Because I'm just thinking this through, right? If I were to implement this, you mentioned the 11 questions, and if you have a published framework of those 11 questions on which a decision is made on a project go versus no-go, right? The unknowns feel like a back door. So I can't add more than 11 questions, but what I can do is document the hell out of the unknowns. Document 10 or 100. So how do you differentiate the unknowns that have an impact? This is a question of judgment.
You're absolutely right. Very important. But you have questions whose answers are about judgment, and the same thing applies to the unknowns. You have your list of unknowns and then you have to decide whether uh they're important or not important. So you make an assessment, like you do for risk. 00:32:59 : Okay, the same underlying logic, but for a risk you cannot do much besides protecting yourself; for an unknown you can try to obtain as much information as possible to reduce that uh unknown. Okay. What do you mean by feasibility, or what would be the difference between the answers and a feasibility study? Feasibility. I heard visibility. Okay. It is a form of feasibility. But this addresses the feasibility of your question marks. A feasibility study can go way beyond the questions you have. It's very close, but this is more structured, because when people talk about a feasibility study it's a very general concept; here it's very uh structured and rigorous. So that's the big difference. Okay. Yeah, the 11 questions. You want the 11 questions? No way. I'll tell you why I don't want to give you the 11 questions: because the questions are relatively simple and common sense. The real important thing is, common sense is very individual, my friends. 00:34:28 : No, common sense is not common. No, um, the real thing is how do you uh answer, what should be the real answer to that question, okay? So I'm going to give you an example of one of the 11 questions: will the customers adopt that product? Okay, how do you answer this question? So the IPOP model gives you a tool which is ideal for answering this question. What the tool says is that you must first identify all the decision criteria of your customer. How do they decide to buy uh that product? And for each of those criteria, you're going to have a benchmarking table comparing your product with all the alternative options that the customer has. Okay.
Now, do you really know all the decision criteria of your customer? Most of the time, not. Because our customers are the patients, or maybe the doctors, but that's it. Exactly. But even so, how do you know how the doctors are making their decision to prescribe? Are you sure? Because I've seen so many people, including, saying it out loud. 00:35:52 : These are our customers, actually. Okay. But what I'm just trying to say is that um very often people think they know and they don't. I'm going to give you an example that is not far. Okay. Uh, you know what is the iPhone, or uh have you heard of a company called Nokia? Nokia was number one in the world. They were dominating the market. How come they got killed by the iPhone? Had they asked the right questions? What is the decision criterion that they did not take into acco ... [transcript truncated]