
Saturday, September 10, 2011

PI SP 3.1 Confirm Readiness of Product Components for Integration

I need help understanding this practice. Here is the situation: in our organization we have implemented MS Team System. This tool allows us to analyze the code from different perspectives. We have implemented peer reviews, and the code reviews allow us to verify that the code complies with the design specification. We have also implemented CM audits to check the identification of every configuration item. Nevertheless, I'm not certain we are fully aligned with this practice.

The purpose of PI SP 3.1 is to ensure that all of the components that you will be assembling are ready for assembly. For a purely software project, this practice is fairly easy and straightforward. At a minimum, you want to be certain that every module has been properly checked into your CM system, that every configuration unit has been unit tested, and that the external and internal interfaces have been examined to verify that they comply with the documented interface descriptions. It sounds like you might have most of these activities covered by MS Team System and your peer reviews. What I don't see in your description is any activity associated with checking the interfaces against their descriptions. When you are integrating hardware and software, or have a large and complex software project with many different systems, this practice becomes more complicated.
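To make those three checks a bit more concrete, here is a minimal sketch of what an automated readiness check might look like. Everything in it is hypothetical (the component names, file paths, and result formats are invented for illustration); your CM tool, build system, and test framework would supply the real data.

```python
# Purely illustrative sketch of an automated "readiness for integration" check.
# Component names, paths, and file formats are hypothetical; in practice these
# checks would be driven by your CM tool and build system.

import json
from pathlib import Path

COMPONENTS = ["billing", "reporting", "auth"]   # hypothetical configuration units

def is_checked_in(component: str) -> bool:
    """Stand-in for a query against your CM system (e.g., a status export)."""
    return Path(f"cm_status/{component}.checked_in").exists()

def unit_tests_passed(component: str) -> bool:
    """Stand-in for reading the latest unit-test result recorded for the unit."""
    result_file = Path(f"test_results/{component}.json")
    if not result_file.exists():
        return False
    return json.loads(result_file.read_text()).get("passed", False)

def interface_matches_description(component: str) -> bool:
    """Compare the as-built interface signature against the documented one."""
    built = Path(f"interfaces/built/{component}.json")
    documented = Path(f"interfaces/documented/{component}.json")
    if not (built.exists() and documented.exists()):
        return False
    return json.loads(built.read_text()) == json.loads(documented.read_text())

def confirm_readiness() -> bool:
    ready = True
    for c in COMPONENTS:
        checks = {
            "checked into CM": is_checked_in(c),
            "unit tests passed": unit_tests_passed(c),
            "interface matches description": interface_matches_description(c),
        }
        for name, ok in checks.items():
            if not ok:
                print(f"NOT READY: {c} - {name}")
                ready = False
    return ready

if __name__ == "__main__":
    print("Ready for integration" if confirm_readiness() else "Integration blocked")
```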
Hope this short explanation helps.

I have one more question. Should we run unit tests for every configuration unit? Is it possible to implement actions other than unit testing to comply with this best practice? I think that the Static Code Analysis in VS Team System checks the interfaces between components. In the peer reviews of code we check the interfaces against their descriptions documented in design specifications.

You are actually focusing on the wrong topic. Instead, you should be seeking answers to these types of questions:
  1. What do your business goals and objectives tell you about the required quality level of products?
  2. What is the reason for performing unit tests? Or what are you trying to achieve by unit testing the code?
  3. Do your customer requirements and your business goals and objectives require a quality level that demands that you perform unit tests before creating a product build?
  4. What are your requirements for each configuration item before creating a build?
Answers to these questions will provide the answers to your questions.
Basically, your configuration audits are there to help you determine whether all of the configuration items are ready to be assembled. Perhaps the Static Code Analysis in VS Team System is satisfactory, perhaps it is not. That is for you to decide based on the quality requirements for your product.

Hope this explanation helps, but there is no clear answer to your question without being able to spend some time with you and your organization to perform an in-depth analysis of your processes and procedures.

Saturday, April 10, 2010

Configuration Management - SP 1.3-1, SP 2.2-1

I would like your opinion in relation to situation below.

Once a change has been verbally approved, is there any problem with respect to CM SP 1.3 (Create or Release Baselines), sub-practice 1, and CM SP 2.2 (Control Configuration Items), sub-practice 2, if recording the modification in the schedule and in the change control tool is done later, as long as it happens within the project's defined monitoring period?

Example:

Verbal approval of the change - Monday
Beginning of the work - Tuesday, with storage of the configuration items in the configuration system (without a baseline)
Recording of the change in the change control tool and of the activities in the project schedule - Friday
Collection of progress and effort - the following Monday (weekly monitoring, every Monday)

Thank you very much in advance.

The first problem I see is the verbal approval. Verbal approvals are difficult to document or provide as evidence, and over time they can be forgotten or misremembered, not to mention the risks associated with not documenting decisions.
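To illustrate what "documenting the decision" can look like in practice, here is a small, purely hypothetical sketch of a change-approval record; the field names and values are invented, and any real change control tool will have its own equivalent fields. The point is simply that the approval is written down when it is given, not reconstructed days later.

```python
# Minimal, hypothetical change-approval record. Real change control tools
# capture the same kinds of fields; this is only an illustration.

from dataclasses import dataclass
from datetime import date

@dataclass
class ChangeRequest:
    change_id: str
    description: str
    affected_configuration_items: list[str]
    requested_by: str
    approved_by: str          # who gave the approval
    approval_date: date       # when it was given
    status: str = "approved"  # e.g., proposed / approved / implemented / verified

cr = ChangeRequest(
    change_id="CR-042",
    description="Update report layout per customer request",
    affected_configuration_items=["report_module v1.3"],
    requested_by="project manager",
    approved_by="CCB chair",
    approval_date=date(2010, 4, 5),
)
print(cr)
```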

Secondly, since you are asking about the sub-practices, that makes me wonder if you have taken the SEI’s Introduction to CMMI class. If you have taken the class, your instructor should have made clear to you the distinction between the required, expected, and informative components of the model. The sub-practices are informative components and are therefore provided only as information to help you understand the intent of the Specific and Generic Practices and Goals. To be clear, you are neither required nor expected to implement the sub-practices.

If the weekly schedule you described works for your projects, then you should be able to map it to the Configuration Management (CM) Specific Practices (SPs). However, the described weekly schedule sounds like it may have some gaps.

I strongly suggest that you have an SEI-certified Lead Appraiser conduct a Gap Analysis (Class C appraisal) of your organization, especially CM, to determine the gaps between your implementation and the CMMI.

Tuesday, December 1, 2009

CMMI Practices for Documentation Teams

Our organization is certified at CMMI Maturity Level 3 (Version 1.2). Our delivery teams are going to implement CMMI practices soon. I reviewed the organization's processes, and all of the roles appeared, from project leader to developer to manager, except for the documentation teams. In the same context, I am very curious to know whether the CMMI has any set of practices to be followed by the documentation department, as all of the process areas at CMMI ML 2 and ML 3 are specific to the project management, engineering, support, and organizational areas.


Your question actually raises some other questions:

  1. What are the roles and responsibilities of your documentation teams and documentation department?
  2. If this group of people is responsible for the technical documentation (e.g. requirements, design, etc.) and the user documentation, how and why were they excluded from your ML 3 SCAMPI A?
  3. How are the delivery teams different from the organization that achieved Maturity Level 3?

What is puzzling about your question is that REQM, RD, TS, PI, and CM cover the different aspects of writing and controlling the various documents associated with designing, developing, maintaining, operating, using, and deploying products. VER then covers the inspection/review of the documents before they are placed in the baseline and controlled with CM.


It does appear from your description that your organization omitted the documentation people as a process role from your process documentation. In my opinion, at a minimum, your PPQA audits should have identified this omission long before your SCAMPI A appraisal. Then your Lead Appraiser should have identified this gap during the appraisal planning process before the SCAMPI A and should have taken steps to address the gap or postponed your appraisal until the documentation group was included in the scope. Since your documentation group was apparently not included in the scope of your appraisal, this oversight also calls into question your Lead Appraiser’s credentials and quite possibly the validity of your SCAMPI A results.


The bottom line, in my opinion and based only on what you stated, is that your documentation group should have been included in the scope of your ML 3 SCAMPI A, even if all they do is Document Configuration Control (which would be covered under CM) or Document Quality Assurance (which would be covered under PPQA).


The answers to my above questions could provide additional information that would change my opinion.

Monday, April 20, 2009

Requirements Development/Management Question

I was slightly confused after reading a statement in the Requirements Development (RD) Process Area (PA). Specific Practice (SP) 2.1 states, "Establish and maintain product and product component requirements, which are based on the customer requirements."

The informative material goes on to say "The modification of requirements due to approved requirement changes is covered by the "maintain" function of this specific practice; whereas, the administration of requirement changes is covered by the Requirements Management process area."

I do not understand the difference between maintenance of requirements in RD and administration of requirement changes in REQM. How are they different?


The Requirements Management (REQM) Process Area (PA) is there to help an organization manage changes to an agreed-to set of requirements (the baseline). So no one makes any changes to the requirements (add/modify/delete) without following the documented requirements change control process, which should include review and approval by the relevant stakeholders of the requirements.

And actually, REQM is a specific instance of Configuration Management (CM). But REQM is such an important and vital practice that it is its own Process Area. So it is entirely possible for an organization to have one Change Management process that it uses to manage changes to requirements, documents, plans, test scripts, code, etc.

In contrast, the intent of Requirements Development (RD) is to actively elicit and further refine the requirements that are then agreed to and incorporated into the requirements baseline. Specific Practice 2.1 contains the phrase "establish and maintain," which has a special meaning within the context of the CMMI. If you have taken the Introduction to CMMI class, you will have learned what this phrase means. The phrase is also defined in the CMMI Glossary, and it means formulate, document, and use. For SP 2.1, this means that the product and product component requirements are formulated (elicited, discussed, refined, reviewed, etc.), documented (written down in a requirements specification, requirements management tool, etc.), and used (to develop the project plan, design, code, tests, etc.).


So the connection between REQM and RD is that proposed requirements changes are reviewed, analyzed, approved, and processed by REQM and the authorized and approved changes are propagated through the baselined requirements by RD.

I would not get too focused on trying to understand the separation between REQM and RD. You have identified one of the many areas in the CMMI where there is a tight coupling between PAs, if not overlap. What is important is that you

  1. create a requirements baseline that does not change unless there is an approved change request,
  2. investigate, assess the impact of, review, and disposition (approve or reject) all proposed requirements changes with the relevant stakeholders, and
  3. only make approved changes to the baselined requirements and appropriately update the traceability (a small sketch of such a traceability check follows this list).
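As a purely illustrative sketch of the traceability check mentioned in point 3 (the requirement and test case identifiers are hypothetical, and a requirements management tool would normally do this for you):

```python
# Hypothetical requirement-to-test-case traceability data; in practice this
# would come from your requirements management tool or traceability matrix.
baselined_requirements = {"REQ-001", "REQ-002", "REQ-003"}

test_case_coverage = {
    "TC-01": {"REQ-001"},
    "TC-02": {"REQ-002", "REQ-003"},
}

# Which baselined requirements are not traced to any test case?
traced = set().union(*test_case_coverage.values())
untraced = baselined_requirements - traced
if untraced:
    print("Requirements with no test coverage:", sorted(untraced))

# Which test cases reference requirements that are no longer in the baseline?
orphaned = {tc for tc, reqs in test_case_coverage.items()
            if not reqs <= baselined_requirements}
if orphaned:
    print("Test cases tracing to non-baselined requirements:", sorted(orphaned))
```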

Thursday, February 12, 2009

CMMI Implementation

I recently joined a company where there is no process and management recruited me to implement the CMMI. The organization has different business units. Though everyone sits together, they work very differently. I conducted a Gap analysis based on the CMMI Level 2 processes and here are the findings:

  1. Project Planning & PMC -- they create project plans, they have regular project team meetings, and they share the minutes. Each team has its own formats and templates. The projects don't really do estimations. Can we satisfy the PP & PMC PAs without doing any estimation? I know it can't be that way, but can it be tailored?
  2. Requirements Management: Some of the business units have CCBs to discuss change requests, and other business units discuss requirements changes in their project team meetings. The most significant gap I found is in requirements traceability: traceability of customer requirements to the functional requirements and traceability of use cases to the test cases are missing. Is this reason enough to fail the RM PA? One of the engineering directors asked me how much traceability is needed to satisfy this condition. At the time I said 100% of all the requirements. Then I also read somewhere that it is OK to define that we maintain traceability for at least the MUST BE customer requirements, and that traceability of other requirements can be made optional. Is that acceptable?
  3. Configuration Management: The projects think they have a CM plan (CMP), which is embedded in the project plan. What I found missing are the configuration audits (PCA & FCA). Is it possible to satisfy the Configuration Management PA without doing configuration audits to check the document status, builds, and backup strategy?
  4. PPQA: One of the business units has a Software Quality Assurance plan, but it is executed by one of the testers from another project. As part of PPQA, the SQA person does some spot checks based on a pre-defined checklist, which includes Project Planning, Risk Management, Project Monitoring and Control, and Integration and Releases. But I guess this can be improved through my role as an independent software quality engineer.
  5. M&A: I am very worried about this process area. As of now the organization's status is nil with respect to metrics collection: they don't have a metrics database, and no metrics have been defined yet. Is it OK to start now by forming a team to do some research, come up with metrics definitions, deploy them, and start collecting data? My question is, how much data do we need to collect to satisfy this PA? And do we need to provide evidence of analyzing the collected data and show some improvement steps taken at the time of SCAMPI appraisals? How long will it take in general to satisfy the Measurement & Analysis PA?
  6. SAM: Can we tailor this PA if we are not dealing with suppliers? If yes, how can that be done?

The directive from the Leadership team is to achieve CMMI Level 3 by the end of 2009. I was baffled to hear this. Under these circumstances, what are the chances of reaching CMMI Level 3, or should my target be at least CMMI Level 2? That is my initial target. Can I achieve CMMI Level 2 by the end of 2009? If so, what are the things I need to address?

Here are some of the things I have already started:

  1. CMMI Overview training to all the teams
  2. Dialogue sessions on metrics identification
  3. Looking into some Requirements tools which provide Traceability
  4. Need to push the project team to have configuration audits.

In addition, we have already established a process database, the processes are defined, and the templates being used come from the parent organization. Since ours is a multi-site company, one of our counterparts has already achieved CMMI Level 3. We will be using the same process database and their templates. I am thinking of providing their training on each Process Area as well. Is this a correct way to use the processes and templates of our parent organization? If not, do we need to establish our own local process database?


First of all, I sympathize with you and the challenges you face. I applaud the fact that you had the foresight to conduct a Gap Analysis of the organization. You have highlighted a number of key weaknesses within the organization. However, to provide you meaningful feedback on all of your points would require working directly with you and your company.
  1. What is your CMMI experience? Have you taken the SEI's Introduction to CMMI class? If not, I strongly recommend that you, and possibly the others in your company who are responsible for your processes, take the three-day class. The class should provide you with a more thorough understanding of the CMMI, its interpretations, and material for constructing an internal Overview class.
  2. You have identified some serious deficiencies within the organization in all of the ML 2 Process Areas. These need to be analyzed, addressed, corrected, solutions implemented, and then re-evaluated some months into the future before you can consider a formal SCAMPI at any Maturity Level. The length of time before the next evaluation is a function of a number of factors: number of people in the organization, number of projects, typical project duration, how much time and other resources are dedicated to process improvement, etc.
  3. The organization needs to first implement Maturity Level 2 to form a firm foundation before considering moving to Maturity Level 3. The issues you have identified are fairly typical. Basically, it sounds like your organization does not perform PPQA or MA and is challenged with Project Management, Requirements Management, and Configuration Management. The first steps here should be to identify the necessary skills-based training classes you need to bring in-house and train your staff on these concepts. If you just purchase tools and push for audits, you most likely will not achieve the desired effect. You have to understand your process first and the reasons why it is important to perform each of the steps.
  4. Since another division has already achieved ML 3, it is a good idea to learn from their mistakes. But be very careful of the temptation to “clone” their processes and procedures. You have to implement the processes and procedures that match the way you conduct business today.
  5. Based on what you have outlined, and given how much time and effort it could take just to address the ML 2 issues, I would say that ML 3 is out of the question for 2009. You could conduct a ML 3 SCAMPI A by the end of this year, but in all likelihood it would not be successful.
  6. The best suggestion I have for you is to hire a CMMI consultant and Lead Appraiser to provide you with the proper advice and guidance. Otherwise, you could be spending a lot more time and effort than originally anticipated.

Thursday, October 9, 2008

CM SP 3.2 Evidence Guidance

The release of CMMI v1.2 included a change to the examples for CM SP 3.2, Perform configuration audits to maintain integrity of the configuration baselines.
Examples of audit types include the following:

  • Functional Configuration Audits (FCA) – Audits conducted to verify that the as-tested functional characteristics of a configuration item have achieved the requirements specified in its functional baseline documentation and that the operational and support documentation is complete and satisfactory.
  • Physical Configuration Audit (PCA) – Audits conducted to verify that the as-built configuration item conforms to the technical documentation that defines it.
  • Configuration management audits – Audits conducted to confirm that configuration management records and configuration items are complete, consistent, and accurate.
Are projects now required to provide evidence for each of these examples?

As the Configuration Management (CM) Specific Practice (SP) states, these are examples of Configuration Audits and are part of the informative material of the model. Therefore these examples are neither required nor expected. As long as the organization and projects are performing some kind of configuration audit(s) that assesses the integrity of the baseline(s), that meets the intent of the practice. For a SCAMPI appraisal, projects are not required to provide evidence for each type listed in the examples.

Tuesday, June 24, 2008

Project Artifact Repository

I have a concern about our documentation repository as it relates to the CMMI.

We develop projects locally, but we store deliveries/documents in a customer online system and use the customer's ClearCase to store source code. As part of improving our processes for the CMMI, our consultants said that we must have full control over documents, requirements, etc., because if one day our customer says that we don't have access to the remote network, we will lose all project information, historical data, etc. Is this a correct interpretation of the CMMI?

But it's very difficult for us to keep all artifacts in a local repository: it would mean duplicated information and rework, it is not as secure to keep confidential information locally, and we would require an infrastructure that we do not otherwise need.

What do you think about this solution: to have a contract in which the customer affirms that they will permit our access to the documents/sources of our projects for X years even if the business relationship ends? X = ? Who should be the signer?

Either your CMMI consultants are giving you some incorrect CMMI compliance advice or you are misunderstanding their message. The CMMI says nothing about the location or ownership of the CM repository. What your consultants have told you may be good advice, but it is not necessary for complying with the CMMI. When considering implementing the CMMI, you have to think about what is best for your organization and how you conduct business.
  1. It sounds like you are custom-building a system for your customer and they own the repository. It all depends upon the contractual stipulations on ownership of the source code, etc. In all likelihood, the customer owns everything, and therefore it doesn't matter if the customer denies you access to the repository in the future. You don't own the source code.
  2. Now if you are also using the customer's repository to store project artifacts, etc., for different customers, then you do have a problem. In this situation, you have made a bad business decision that poses a huge risk to the company's future viability, and you do need a locally owned and managed repository in case your customer elects to deny you access to their repository.
I would NOT try to modify your existing contract as you suggest above. Instead, I would "bite the bullet" and purchase a CM tool and create your own repository if #2 applies to your organization. At a minimum, you may just want to mirror the contents of your customer's repository in the event of network outages, so that you can continue to perform work without interruption.
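Purely as an illustration of that mirroring idea, here is a minimal sketch that assumes, hypothetically, that the customer repository is reachable over Git; the post mentions ClearCase, which has its own replication mechanisms, so treat this only as a picture of the approach, not a recommendation of a specific tool.

```python
# Illustrative only: a tiny mirroring job to keep a local, read-only copy of a
# (hypothetical) Git-accessible customer repository, so work can continue
# through a network outage. URL and path below are made up.

import subprocess
from pathlib import Path

CUSTOMER_REPO_URL = "ssh://customer.example.com/project.git"  # hypothetical
LOCAL_MIRROR = Path("/backup/customer-project.git")           # hypothetical

def refresh_mirror() -> None:
    if not LOCAL_MIRROR.exists():
        # First run: create a bare mirror of the customer repository.
        subprocess.run(
            ["git", "clone", "--mirror", CUSTOMER_REPO_URL, str(LOCAL_MIRROR)],
            check=True,
        )
    else:
        # Subsequent runs (e.g., from a nightly scheduled task): update it.
        subprocess.run(
            ["git", "remote", "update", "--prune"],
            cwd=LOCAL_MIRROR,
            check=True,
        )

if __name__ == "__main__":
    refresh_mirror()
```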

Thank you for your reply.

I work in an area that has a lot of projects for one big customer, so all of the artifacts and source code are maintained in the customer's repository. There are some "internal" artifacts that we do not deliver and that we maintain in a local repository (metrics, audit reports), but the source code, project plan, requirements spec, etc., are all in the customer's repository.

So must we also store the project plan, requirements spec, etc., in the local repository?

Given your explanation, it sounds like you have the proper CM structure established. The customer’s project artifacts are stored in a customer repository. I can only foresee one circumstance where the customer would deny you access to their repository and that would be if they gave the work to a different company. And I am assuming that your contract states that the customer owns all of the project artifacts. Therefore you don’t have the rights to the information if they deny you access to the repository.

Now, considering all of the information you are placing in your customer's repository, there may be some information that you want to retain for your own records. Depending on how your contract is written, you may have to inform your customer that you are retaining copies for internal purposes. This subset of information you would place in your local repository. Please keep in mind that the CMMI does not require you to also keep a copy in a local repository, and I am not telling you to do so either. You need to evaluate your business need for maintaining copies in a local repository.

Perhaps it may make sense for you to run through the DAR process to determine which customer artifacts, if any, need to be placed in a local repository.

Wednesday, April 23, 2008

What is the Difference Between a Physical Configuration Audit and a Functional Configuration Audit?

A Physical Configuration Audit (PCA) is a Configuration Management (CM) audit that verifies that whatever you have built conforms to the technical documentation that defines the build. What that means is that CM must compare the configuration of the as-built product to the design documentation of the product. For example, CM looks at the list of all the configuration items (versions, etc.) in the as-built product and compares that list to the list of what is supposed to be in the product.
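As a purely illustrative sketch (the configuration item names and versions are hypothetical), the heart of that comparison can be as simple as checking the as-built list against the baselined list:

```python
# Purely illustrative PCA-style check: compare the as-built list of
# configuration items (with versions) against what the baseline documentation
# says should be in the product. Item names and versions are hypothetical.

as_built = {
    "billing_module": "2.1",
    "reporting_module": "1.4",
    "user_guide": "1.0",
}

baseline = {
    "billing_module": "2.1",
    "reporting_module": "1.5",   # design documentation calls for 1.5
    "install_guide": "1.0",      # expected, but missing from the build
}

missing = set(baseline) - set(as_built)
unexpected = set(as_built) - set(baseline)
version_mismatches = {ci: (as_built[ci], baseline[ci])
                      for ci in set(as_built) & set(baseline)
                      if as_built[ci] != baseline[ci]}

print("Missing from the as-built product:", sorted(missing))
print("In the build but not in the baseline:", sorted(unexpected))
print("Version mismatches (built vs. baseline):", version_mismatches)
```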

A Functional Configuration Audit (FCA) is a CM audit that verifies that the functions of the as-built product match the baselined functional requirements and that the operational and support documentation is complete and satisfactory.