“Common Features,” a concept that was initially a cornerstone of the CMMI in version 1.1, was a set of minimalistic principles dedicated to continuous process improvement. Although abandoned since CMMI v1.2, the spirit of the Common Features is still valid and deserves a second look.
Once it has become evident that a new or substantially improved development process is required, the question remains how to go about putting it in place. A naïve attempt, such as copying the “Automotive SPICE” description and adding the missing details, is proof that the development team has not understood the intention of the standard. A process (for lack of a better term; the term “process” itself tends to be misleading, see LINK) is a means of ensuring that the development team is aligned with the purpose and goals of the organization. The Common Features in the CMMI v1.1 standard were an attempt to address this aspect. CMMI defined four Common Features:
- Commitment to Perform
- Ability to Perform
- Directing Implementation
- Verifying Implementation
My informal interpretation of the Common Features is as follows:
- Are you determined to make it work?
- Are you able to make it work?
- Are you effectively doing it?
- Have you done it correctly?
These questions are quite inspiring because they often lead to the right answers. Let us take a closer look at each of them.
1. “Commitment to Perform” – or “Are you determined to make it work?”, for example:
- What are the strategic goals of the organization?
- Are the short-term and long-term plans consistent?
- What is the objective of each activity?
- What are the optimal responsibilities and authorities for each activity?
- How thorough should the risk management be for each project?
In the context of assessment standards like Automotive SPICE, the answers to the above questions take the shape of plans and documents such as “project plans,” “configuration management plans,” “quality management plans,” and “test plans.” These concepts have been known and used for decades in the form of the famous IEEE documents, such as the “IEEE Standard for Software Test Documentation” (LINK) or the “IEEE Standard for Software Project Management Plans” (LINK).
2. Ability to Perform – or “Are you able to make it work?”, for instance:
- What kind of human resources are required to achieve the project or organizational goals?
- Is the proper funding secured and sufficient?
- What tools and training are required?
- What organizational structure is effective?
- Are all resources committed and available for each planned project?
- Are all of the governmental and social obligations and approvals in place?
- Are all plans peer-reviewed and adequately approved?
Certain concrete outcomes could include approvals of plans and schedules, formal management sign-offs, hired engineers, organizational team charts properly structured and approved with the project sponsors, and software licenses. The latter can be crucial and quite costly, including software for HIL/SIL (LINK), specialized compilers, libraries such as AUTOSAR (LINK) stacks, debuggers, MISRA-checking tools, issue tracking tools, and test management tools. Hardware tools include hardware debuggers, HIL tools, layout tools, 3D printing prototyping tools, and measurement instruments.
Also, signed contracts are typically required, including from internal and outsourced team members, supplier contracts, and other formal organization-specific approvals.
3. “Directing Implementation” – or in other words: “Are you effectively doing it?”, for example:
- Do you know the current progress, quantitatively as well as qualitatively?
- Is the current status for each stage and outcome adequately communicated to the stakeholders?
- Are any deviations from the original schedule or plans adequately accounted for?
- Do you know the current risk levels for each planned vital outcome?
- Is your effort estimation still accurate?
The old military wisdom, “No plan survives the first contact with the enemy,” may hold some truth, but it does not mean that planning is useless. Planning is everything: the plan itself is not the goal but a means to an end. That also implies that planning must be an iterative activity; the evidence of proper planning is updated schedules and plans. During the recurring planning activity, many essential aspects become apparent, such as the analysis and mitigation of risks, long-term milestone planning, and stakeholder management. Quick and, more importantly, timely risk analysis often makes or breaks a project.
The above is only possible if a proper reporting structure and tools are in place and actively used for stakeholder communication. The best reporting structure is automatically updated and visualized in real time, for example in the form of issue management tools used by the entire development team.
4. “Verifying Implementation” – or in other words: “Have you done it correctly?”, for example:
- Have all project outcomes been validated and verified?
- Have all formal approvals been stored correctly and securely?
- Has the progress been adequately reported to the stakeholders?
- Have all outcomes been accurately traced across the entire V-model?
- Are all back-up measures in place?
- Have all open points been closed and confirmed?
The idea is to make sure that all activities have been performed as planned, including all course corrections. Some outcomes may include signed-off test runs, milestone sign-offs, baseline review signatures, and contract approvals. Any result should have a clearly defined status and date of completion. Other outcomes may include the trainer’s feedback sheet and other evidence of correct process execution for assessment, contractual, and legal purposes.
Simplicity is hard
The four Common Features may look like a simple, linear, straightforward laundry list, but naturally they are not as trivial. For each Common Feature, further detailed activities and aspects must be defined. Also, the Common Features do not suggest any specific life cycle or quality model, such as the “V-model” or the “Scrum” methodology, nor specific work products such as “software design” or “test plan.” Automotive SPICE, similar to CMMI, offers a more detailed “checklist” of things that need to be followed up on.
The Common Features are a general approach, similar to the “agility” paradigm, but more purposeful and quality-oriented as a route to process improvement. A practical way to use them is to ask, for each process, work instruction, checklist, and template, whether these process assets are consistent with the four Common Features. If they are not, those assets should be improved or removed. In my opinion, the Common Features are quite useful when the goal is to reduce waste and to design lean processes.
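As a small illustration, the asset-by-asset audit could be recorded in a script. Everything here is my own sketch: the asset names, the yes/no answers, and the question wording are hypothetical examples, not part of CMMI or Automotive SPICE.

```python
# Hypothetical sketch of auditing process assets against the four
# Common Features; names and structure are illustrative, not standard.

COMMON_FEATURES = [
    "Commitment to Perform: are you determined to make it work?",
    "Ability to Perform: are you able to make it work?",
    "Directing Implementation: are you effectively doing it?",
    "Verifying Implementation: have you done it correctly?",
]

def audit_asset(asset_name, answers):
    """Return the Common Features questions the asset fails to satisfy.

    `answers` maps each question to True/False, as judged by the review
    team. An asset that fails any question is a candidate for
    improvement or removal.
    """
    return [q for q in COMMON_FEATURES if not answers.get(q, False)]

# Example review of a hypothetical "code review checklist" asset:
answers = {q: True for q in COMMON_FEATURES}
answers[COMMON_FEATURES[3]] = False  # no evidence the checklist results are verified
gaps = audit_asset("code review checklist", answers)
print(gaps)  # the unmet Common Features, if any
```

Running such an audit over every process, work instruction, checklist, and template gives a compact picture of which assets pull their weight and which only add waste.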
Following the Common Features approach helps design both good and straightforward processes.
I greatly miss the concept of Common Features from the old CMMI specification. They were such a beautiful mental shortcut, and it is a pity they were abandoned. Using them as a starting point for process improvement helps keep processes simple and purposeful. Automotive SPICE is a good assessment model, but not a very helpful process improvement method. Unfortunately, there is no substitute in Automotive SPICE for the original CMMI Common Features.
Using process assessment standards for process improvement has resulted in a large number of failed process improvement initiatives. Maybe it is time to establish a process improvement standard instead. Common Features – or their equivalent – may be a starting point for that. They offer a good policy for effective process improvement and good process design. Common Features deserve a second chance because your process improvement team needs them.