
Sunday, September 11, 2011

Is This a Valid Performance Model?

Is a reliability growth model considered to be a valid PPM in the CMMI?


Asking this question out of context with what you do in the organization does not have a lot of meaning.  The correct answer is both yes and no.  Please remember what High Maturity is all about.  You begin by setting your business goals and objectives and use them to derive your Quality and Process Performance Objectives (QPPOs).  These QPPOs in turn will lead you to the proper measures, process performance baselines (PPBs), and process performance models (PPMs) that your organization needs to quantitatively and statistically manage your work.

So, if Reliability Growth is a critical process or sub-process and you have sufficient data to analyze to determine that you have a stable and capable process, then a reliability growth model might be considered a valid PPM.

But simply selecting models without performing the analysis I just sketched out is incorrect, and you will not be able to demonstrate that your organization is a High Maturity organization.


Thanks for the detail. I just happened to see that in CMMI v1.3 High Maturity the "reliability growth model", which was given as an example in OPP SP 1.5 (CMMI v1.2), has been deleted. Does this mean that a reliability growth model will not be accepted in CMMI v1.3? Or that the reliability growth model is not acceptable to the experts? Or is it only good if you use CMMI v1.2 and not CMMI v1.3?

As CMMI v1.3 is an improvement and the practices were carefully analyzed by the SEI and experts, is it advisable to use the reliability growth model given in CMMI v1.2? Or is there any chance that CMMI v1.3 will include the reliability growth model as an example?



Apparently there is some misunderstanding of my answer above.  Whether the CMMI contains the reliability growth model as an example or not is irrelevant to whether or not it is a good model.  Your organization has to mathematically analyze its data, business objectives, QPPOs, PPBs, and PPMs to determine if there is a need for using a reliability growth model.  Do the following analysis:
  1. Describe the reliability growth model in probabilistic terms.
  2. Define the critical sub-processes (those that must be consistently and correctly followed every time) that can be managed using the reliability growth model.
  3. Define how a project manager uses the reliability growth model in the context of his or her projects to predict performance, perform "what-if" analysis, and predict QPPO achievement.
  4. Provide an equation or show by other means how the stable sub-processes that you have identified in your processes contribute to the reliability growth model.
  5. List the other models that are used in conjunction with the reliability growth model and explain why each has statistical relevance.
Once you have performed this analysis you will have enough information to answer this question yourself.
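For step 1 above, here is a minimal sketch of one way a reliability growth model can be described in probabilistic terms and fit to data. It uses the Goel-Okumoto form, which assumes the expected cumulative defects found by test week t is mu(t) = a(1 - e^(-bt)); the data, the choice of model, and the week-10 "what-if" question are purely hypothetical illustrations, not a recommendation for your organization.

    # Sketch only: fit a Goel-Okumoto reliability growth model to
    # hypothetical cumulative-defect data and make a "what-if" prediction.
    import numpy as np
    from scipy.optimize import curve_fit

    def goel_okumoto(t, a, b):
        """Expected cumulative defects found by test week t."""
        return a * (1.0 - np.exp(-b * t))

    # Illustrative data only: test week vs. cumulative defects found.
    weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
    cum_defects = np.array([12, 21, 28, 33, 37, 40, 42, 43], dtype=float)

    (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_defects, p0=[50.0, 0.3])

    print(f"Estimated total defect content a = {a_hat:.1f}, detection rate b = {b_hat:.2f}")
    latent_after_wk10 = a_hat - goel_okumoto(10.0, a_hat, b_hat)
    print(f"Predicted latent defects if testing stops after week 10: {latent_after_wk10:.1f}")

Whether a fit like this is meaningful for you still depends on the analysis described in steps 2 through 5: stable contributing sub-processes, a connection to your QPPOs, and a project manager who can actually use the prediction.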

Wednesday, April 15, 2009

Applicability of the Informative Material

I have been told that if an organization is going for CMMI ML 5, it has to address all of the sub-practices even though they are informative material, and that if the company is only going for ML 3, it can apply the appropriate sub-practices. Is this true?

Please allow me to try to explain the model. There are at least two ways to look at the CMMI: 1) implementing the model and 2) appraising the organization against the model.

In addition, there are three CMMI components: Required, Expected, and Informative. These components only have meaning when you are talking about appraisals. The Required components are the Specific and Generic Goals, the Expected components are the Specific and Generic Practices, and everything else is an Informative Component.

When you are implementing the model, you should not be concerned about differentiating between the different types of components. From Chapter Two of the CMMI-ACQ (and this statement applies to ALL CMMI constellations): “All model components are important because the informative material helps you to understand the expected and required material. It is best to take these model components as a whole. If you understand all three types of material, you can then understand all the pieces and how they fit together to form a framework that can benefit your organization.”

When the organization is being appraised against the CMMI in a formal SCAMPI A appraisal, the organization will only be appraised against the Required and Expected components, regardless of the Maturity Level. However, the appraisal team may be evaluating the evidence and perhaps asking questions in the interview sessions at the sub-practice level just to gain a better understanding of how the organization is addressing each of the Required and Expected components. The organization will not be penalized if it is not performing one or more sub-practices. The appraisal team will be identifying and documenting weaknesses with the organization’s implementation of the goals and practices.

Because there has been a misunderstanding of what High Maturity means (ML 4 and ML 5), the SEI has been emphasizing that the proper implementation of the goals and practices for OPP, QPM, OID, and CAR means reading, understanding, and implementing the types of activities described in the Informative material. So, for an ML 4 or ML 5 SCAMPI, the organization will not be evaluated against the OPP, QPM, OID, and CAR sub-practices, but weaknesses will be noted at the goal and practice level if the organization has not properly implemented these Process Areas to meet the intent, which is gained by understanding the Informative material.

Sunday, September 7, 2008

Difference between measurement objectives and process performance objectives

CMMI uses the following terms when talking about measurements:
  1. Identified information needs and objectives at organization level (MA)
  2. Measurement objectives derived from such information needs and objectives (MA/SP 1.1)
  3. Quality and Process Performance Objectives (OPP/SG1)
I want to investigate the finer differences between the latter two. Are they more or less the same? If not, what are the finer differences?

Your first two questions deal with Measurement and Analysis (MA), which is a Maturity Level 2 Process Area (PA). Your third question is about OPP, which is a Maturity Level 4 PA.
MA is a support PA that applies to ALL PAs in the CMMI. Its focus is defining measures and indicators that help answer or address a specific information need. An information need can be articulated using the following format:

<someone> needs to know <what information> in order to make <what decision>.

For example, an issue facing the Project Manager may be that Project Plans are inaccurate causing project teams to routinely miss milestones and deliveries.
Therefore the information need would be: The Project Manager needs to evaluate the schedule, progress, and dependencies of key development activities and events in order to determine necessary corrective actions.
Then the measurement concept or strategy for addressing this information need would be: At the end of each week the Project Manager collects estimates to complete the work for each current task and enters the data into the schedule. The Project Manager and team review progress to determine the necessary corrective actions and modify the Project Plan as needed each Monday morning.
Now it becomes very easy to determine the base and derived measures that support the measurement concept.

So these concepts (information need, measurement concept, and measures) are interrelated concepts that help define what data to collect and analyze to help answer a question.
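To make that concrete, here is a minimal sketch using hypothetical base measures drawn from the schedule example above: per-task planned effort, effort spent so far, and estimate to complete (ETC). The derived measures (estimate at completion, percent complete, projected variance) fall out directly. The task names and numbers are my illustration, not anything prescribed by MA.

    # Sketch only: base measures per task, derived measures computed from them.
    tasks = [
        # name,            planned_hours, actual_hours, etc_hours
        ("Design review",  40,            35,           10),
        ("Module A code",  80,            60,           30),
        ("Module A test",  60,            10,           55),
    ]

    for name, planned, actual, etc in tasks:
        estimate_at_completion = actual + etc          # derived measure
        percent_complete = 100.0 * actual / estimate_at_completion
        variance = estimate_at_completion - planned    # positive = projected overrun
        print(f"{name:15s} {percent_complete:5.1f}% complete, "
              f"projected variance {variance:+d} hours")

A weekly report built from derived measures like these is exactly the kind of indicator the Project Manager would review each Monday morning to decide on corrective actions.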

The Quality and Process Performance Objectives (QPPOs) are entirely different from the measurement concept. The QPPOs are objectives set by the organization based on the organization’s business objectives and the past performance of projects, and they are constrained by the inherent variability, or natural bounds, of the process. Now, in order to determine the measures that the organization will use to support the QPPOs, the organization should follow the MA process of articulating the information need, defining the measurement concept, and determining the measures.

Tuesday, June 24, 2008

Definition of Complexity

While going through some SEI presentations, I came across the term Design Complexity, which was one of the contributing factors for Delivered Defect Density. As we have chosen this as one of the PPMs in our organization, I would like to know how to define "Design Complexity". I have been searching Google for the definition, but have been unable to find it.

I see a problem right from the start in your approach. You have elected to use a term and build a Process Performance Model that has no definition within your organization. Don’t do a literature search to determine what Process Performance Baselines (PPBs) and Process Performance Models (PPMs) to build for your organization. You need to start from the basics. What are your organization’s Quality and Process Performance Objectives (QPPOs)? You need to determine these first. Then you analyze the data in your measurement repository to build PPBs and PPMs that support the QPPOs.

The next step is to use Measurement and Analysis (MA) to take the QPPOs, derive the information needs for the QPPOs, and then define the measurement specifications that support the information needs and QPPOs. Once you have performed the MA process for your QPPOs, then you will be able to determine if you in fact need to look at Delivered Defect Density and/or Design Complexity. And keep in mind these two terms are just labels. The data you collect and analyze for Delivered Defect Density in one organization may be different from the data for Delivered Defect Density that are collected and analyzed in a different organization. The concepts are important, but going through the MA process will help you define what you need for your organization. Then you can apply whatever label makes sense for you. The MA process will also focus you on what data to analyze to build the PPBs and PPMs that will support your QPPOs.

Obviously Delivered Defect Density and Design Complexity are interesting things to look at, but do they support your organization’s QPPOs? If not, then there is no point in investigating them any further.

Now having said all that, I am surprised that you haven’t been able to locate any information on Design Complexity on the internet. In fact here is an SEI link on this subject
http://www.sei.cmu.edu/str/descriptions/cyclomatic_body.html and a Wikipedia definition http://en.wikipedia.org/wiki/Cyclomatic_complexity

Tom McCabe has developed a whole collection of complexity measures, which you can find out more about here
http://www.mccabe.com/
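For illustration only, here is a minimal sketch of McCabe's cyclomatic complexity counted directly from Python source: one plus the number of decision points. The set of node types counted here is my simplification; McCabe's tools and the references above define the measure more rigorously, and whatever you adopt must be defined within your organization.

    # Sketch only: a rough cyclomatic complexity count for Python source.
    import ast

    # Node types treated as decision points in this simplified count.
    DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                      ast.BoolOp, ast.IfExp)

    def cyclomatic_complexity(source: str) -> int:
        """Return 1 + the number of decision points found in the source."""
        tree = ast.parse(source)
        return 1 + sum(isinstance(node, DECISION_NODES)
                       for node in ast.walk(tree))

    sample = """
    def classify(x):
        if x < 0:
            return "negative"
        for i in range(x):
            if i % 2 == 0 and i > 2:
                return "even-ish"
        return "other"
    """
    print(cyclomatic_complexity(sample))   # prints 5 for this sample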

Bottom line, work through the OPP and MA processes first, THEN look for help with definitions and building PPBs and PPMs. Don’t adopt someone else’s PPBs and PPMs just because these worked for them.

Thursday, May 22, 2008

Selection of Processes\Subprocesses

Kindly give your comments on the following queries:
1. Say, in OPP an organization has selected some process/subprocess (SP 1.1) for performance analysis. A critical criterion for the selection of a process or subprocess is its historical stability. To ensure this, does the organization need to use statistical techniques right from the beginning of the data collection for that process/subprocess?
2. In QPM, do the projects have to select the subprocess from the processes/subprocesses identified in OPP, or do they need to select based on their project-specific objectives?

Let us first consider the OPP SP 1.1 question. How is it possible to know that a process, or sub-process for that matter, is stable without having already performed some sort of statistical analysis on the historical data? What I think you are actually asking is at what point it makes sense to begin using statistical techniques. Obviously, OPP SP 1.1 is where you want to be when you reach Maturity Level 4. However, it does take some time and thought before arriving here. So, as an organization begins its ML 4 journey implementing OPP SP 1.1, it may NOT have enough historical data to select the processes or sub-processes. Therefore, the organization must spend some time collecting and analyzing the data before it is smart enough to know if it has a stable process or sub-process, or one that can be stabilized. As the organization's processes become more and more stable, the application of statistical techniques may prove insightful. The stabilization of process execution, as reflected in the resulting process data, provides the opportunity to apply more sophisticated analytical techniques, including statistical methods. So it is not necessary to "use statistical techniques right from the beginning of the data collection;" rather, statistical techniques can be applied once the process is consistently executed, as reflected by the resulting data stability. Therefore, the organization has to devote the effort to data analysis to determine if OPP SP 1.1 can be implemented.

QPM works in conjunction with OPP. The project’s quality and process performance objectives are derived from the organization’s quality and process performance objectives. And the processes and sub-processes being quantitatively and statistically managed by the project are selected from those specified in OPP. If not, what would be the value of OPP SP 1.1? OPP sets up the infrastructure to allow the projects to perform QPM. However, that being said, a project may find that it has unique characteristics or customer objectives that warrant the statistical management of one or more subprocesses for which no process performance baseline has yet been established. In such a case, the project would select that subprocess for statistical management in QPM SG2, establish spec limits for the expected process performance, and use that as its foundation for quantitative management. Once sufficient process data have been accumulated by the project, the actual Upper Control Limit (UCL) and Lower Control Limit (LCL) can be established and used (and process stability and process capability can be verified or denied). Note that this information should be fed back to OPP for potential use on future projects.
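Once the project has accumulated enough data, the natural process limits can be computed directly. Here is a minimal sketch for an individuals (XmR) chart, where UCL/LCL are the mean plus or minus 2.66 times the average moving range; the measure (peer-review preparation rate) and the numbers are hypothetical examples only.

    # Sketch only: natural process limits (UCL/LCL) for an XmR chart.
    import numpy as np

    # Illustrative observations, one value per subprocess execution.
    x = np.array([4.1, 3.8, 4.4, 4.0, 3.6, 4.2, 4.5, 3.9, 4.3, 4.1])

    moving_ranges = np.abs(np.diff(x))
    mr_bar = moving_ranges.mean()
    center = x.mean()
    ucl = center + 2.66 * mr_bar   # 2.66 is the standard XmR chart constant
    lcl = center - 2.66 * mr_bar

    print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")

    # Points outside [LCL, UCL] signal special causes of variation and
    # would trigger the corrective-action analysis described above.
    out_of_control = [(i, v) for i, v in enumerate(x) if v > ucl or v < lcl]
    print("Out-of-control points:", out_of_control)

With limits like these in hand, the project can verify (or refute) stability and capability against its spec limits, and feed the results back to OPP for future projects.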

Thursday, April 10, 2008

Help! I am Totally Lost When Interpreting the QPM Specific Practices

QPM SP 1.1 states “Establish and maintain the project’s quality and process-performance objectives.” When you are starting to implement the Maturity Level 4 PAs you won’t have a Process Capability Baseline; that is ultimately the goal of being at ML 4, so the organization can then move to ML 5. One of the main thrusts of ML 4 is to analyze the process data looking for Special Causes of Variation. Once those have been analyzed, then it is possible to determine the process capability. For SP 1.1 the project’s quality and process-performance objectives (QPPOs) are set by management and are based, in part, on the organization’s objectives (OPP SP 1.3). The Process Performance Models (PPMs) are then used to determine if the QPPOs can be met. If not, the QPPOs should be appropriately adjusted.
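As an illustration of using a PPM to predict QPPO achievement and run "what-if" analysis, here is a minimal sketch. The model form, coefficients, residual variation, and QPPO threshold are all hypothetical; a real PPM would come from regression on your own baselines.

    # Sketch only: Monte Carlo "what-if" analysis with a hypothetical PPM.
    # Assumed (made-up) model from historical data:
    #   delivered defect density = 7.0 - 5.0*review_effectiveness - 2.5*test_coverage + noise
    # Assumed QPPO: delivered defect density <= 1.5 defects/KLOC.
    import numpy as np

    rng = np.random.default_rng(0)

    def ppm_simulate(review_effectiveness, test_coverage, n=10_000):
        noise = rng.normal(0.0, 0.6, size=n)   # residual variation from the baseline
        return 7.0 - 5.0 * review_effectiveness - 2.5 * test_coverage + noise

    qppo_limit = 1.5
    for eff, cov in [(0.70, 0.80), (0.80, 0.85), (0.90, 0.90)]:
        predicted = ppm_simulate(eff, cov)
        p_meet = np.mean(predicted <= qppo_limit)
        print(f"review eff {eff:.2f}, coverage {cov:.2f}: P(meet QPPO) = {p_meet:.0%}")

This is the kind of analysis a Project Manager would repeat periodically: if the predicted probability of meeting the QPPO is too low, either take corrective action on the controllable factors or adjust the QPPO appropriately.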

QPM SP 1.2 states "Select the sub-processes that compose the project's defined process based on historical stability and capability data." One of the key words in this practice statement is "historical". Sources of historical stability and capability data include the organization’s process performance baselines (PPBs) and PPMs (OPP SP 1.4 and SP 1.5). And the intent of this practice is to tailor the organization’s set of standard processes (OSSP) so the project’s processes will support the project’s QPPOs defined in SP 1.1.

QPM SP 1.3 states “Select the sub-processes of the project’s defined process that will be statistically managed.” The model does not say that these sub-processes MUST be statistically managed, but that these WILL be statistically managed. And this practice focuses on more than just selecting sub-processes. It also focuses on identifying the attributes of each sub-process that will be used to statistically manage the sub-process.

People appear to get confused with Maturity Level 4 (ML 4) when trying to understand QPM independently from the rest of the model. You cannot do that. You have to consider OPP and QPM together when looking at ML 4. OPP provides the analysis of the historical data to build the PPBs and PPMs, which are then used by the projects to help them appropriately tailor the OSSP to what will meet the project’s QPPOs. QPM SP 1.2 uses the word "compose", which may contribute to some of the confusion. Since compose is not in the CMMI Glossary, the dictionary definition is applicable. The Webster definition of compose is “To form by putting together two or more things or parts; to put together; to make up; to fashion.” So for this practice, compose means going to the OSSP and selecting the processes and sub-processes that will meet the QPPOs and then applying the necessary tailoring criteria. What this practice implies is that the OSSP may have several different processes for each Process Area (PA) so the project manager can choose the most appropriate one when composing the project's defined process.

Another ML 4 concept that may be the cause of confusion is the notion of Quantitative Management vs. Statistical Management. QPM SG 1 is all about quantitatively managing the project, which means the Project Manager must periodically review progress, performance, and risks using the PPMs to determine/predict if the project will meet its QPPOs. If not, then appropriate corrective actions must be taken to address the deficiencies in achieving the project’s QPPOs (QPM SP 1.4). Statistical Management is covered by QPM SG 2. The intent is for the project to statistically manage those sub-processes that are critical to achieving the project’s QPPOs. There is only a small set of sub-processes that are critical to achieving the project’s QPPOs. If the organization were to statistically manage all processes, that would be insane. This approach would mean that the organization would need PPBs and PPMs for EVERY process, regardless of their importance to the QPPOs. And then the sheer overhead of collecting, analyzing, and reporting data on every process would most likely bring the organization to its knees. Work would come to a standstill because it was taking far too much time to statistically manage each process and take corrective actions that may not even be necessary. That is why, unless you have an infinite budget and dedicated staff to perform these analyses, the model states to statistically manage SELECTED sub-processes.