CMMi - Decision Analysis and Resolution (DAR)





The CMMi easy button concept and disclaimer

Disclaimer: The opinions expressed here are the author's and do not represent the position of the Software Engineering Institute (SEI) or of any organization or SEI Partner affiliated with the SEI.

The concept of the CMMi easy button is to jump-start SQA software professionals in establishing an effective Software Process Improvement (SPI) framework based on CMMi theories and best practices.

CMMI, CMM, and Capability Maturity Model are registered
in the U.S. Patent and Trademark Office.
CMM Integration, SCAMPI, and IDEAL are service marks of
Carnegie Mellon University.
The CMMI process areas: Causal Analysis and Resolution (CAR), Configuration Management (CM), Decision Analysis and Resolution (DAR), Integrated Project Management +IPPD (IPM+IPPD), Measurement and Analysis (MA), Organizational Innovation and Deployment (OID), Organizational Process Definition +IPPD (OPD+IPPD), Organizational Process Focus (OPF), Organizational Process Performance (OPP), Organizational Training (OT), Product Integration (PI), Project Monitoring and Control (PMC), Project Planning (PP), Process and Product Quality Assurance (PPQA), Quantitative Project Management (QPM), Requirements Development (RD), Requirements Management (REQM), Risk Management (RSKM), Supplier Agreement Management (SAM), Technical Solution (TS), Validation (VAL), Verification (VER).
The CMMi Easy button notes on Decision Analysis and Resolution (DAR)
Decision Analysis and Resolution (DAR) purpose and introductory notes

Specific Goals and Practices

Specific Goal 1 (SG 1) Evaluate Alternatives (SP 1.*)
  • SP 1.1 Establish Guidelines for Decision Analysis
  • SP 1.2 Establish Evaluation Criteria
  • SP 1.3 Identify Alternative Solutions
  • SP 1.4 Select Evaluation Methods
  • SP 1.5 Evaluate Alternatives
  • SP 1.6 Select Solutions

Generic Goals and Practices

Generic Goal 1 (GG 1) Achieve Specific Goals, Generic Practices (GP 1.*)
  • GP 1.1 Perform Specific Practices

Generic Goal 2 (GG 2) Institutionalize a Managed Process, Generic Practices (GP 2.*)
  • GP 2.1 Establish an Organizational Policy
  • GP 2.2 Plan the Process
  • GP 2.3 Provide Resources
  • GP 2.4 Assign Responsibility
  • GP 2.5 Train People
  • GP 2.6 Manage Configurations
  • GP 2.7 Identify and Involve Relevant Stakeholders
  • GP 2.8 Monitor and Control the Process
  • GP 2.9 Objectively Evaluate Adherence
  • GP 2.10 Review Status with Higher Level Management

Generic Goal 3 (GG 3) Institutionalize a Defined Process, Generic Practices (GP 3.*)
  • GP 3.1 Establish a Defined Process
  • GP 3.2 Collect Improvement Information

Generic Goal 4 (GG 4) Institutionalize a Quantitatively Managed Process, Generic Practices (GP 4.*)
  • GP 4.1 Establish Quantitative Objectives for the Process
  • GP 4.2 Stabilize Subprocess Performance

Generic Goal 5 (GG 5) Institutionalize an Optimizing Process, Generic Practices (GP 5.*)
  • GP 5.1 Ensure Continuous Process Improvement
  • GP 5.2 Correct Root Causes of Problems

The CMMi Easy button notes on Decision Analysis and Resolution (DAR)

For most people, the idea of identifying and implementing decision-making processes and procedures seems redundant. It is either seen as something managers should already be doing or viewed as self-evident in the particular situation. However, consider a User Interface design alternatives meeting: it may start (and continue) with people drawing their ideas on a whiteboard and then discussing or voting on them. To arrive at a useful conclusion, it is better to agree on and establish the criteria for evaluating alternatives, or at least basic exit criteria, at the beginning of the meeting. It is also useful to establish how the proposed alternatives will be evaluated against the agreed criteria in order to arrive at a decision (the user interface design). Decision Analysis and Resolution (DAR) is really about formalizing the process that ultimately arrives at a decision. Clearly, not all software decisions should be subjected to a formal DAR process, so only the critical decisions are selected. These typically include:
  • Decisions that may move the project timeline
  • Design decisions that could have a major impact on system performance
  • Decisions that have legal implications (e.g., the company being sued over product performance)
The formal decision-making process will involve some use of evaluation techniques, such as modeling and simulation or user review and comment.

Although each decision type requires its own evaluation technique, the following Internet resources provide a good first step in implementing a formal DAR process:


Decision Analysis and Resolution (DAR)



A Support Process Area at Maturity Level 3



Purpose

The purpose of Decision Analysis and Resolution (DAR) is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

Introductory Notes
The Decision Analysis and Resolution process area involves establishing guidelines to determine which issues should be subjected to a formal evaluation process and then applying formal evaluation processes to these issues.

A formal evaluation process is a structured approach to evaluating alternative solutions against established criteria to determine a recommended solution to address an issue. A formal evaluation process involves the following actions:
  • Establishing the criteria for evaluating alternatives
  • Identifying alternative solutions
  • Selecting methods for evaluating alternatives
  • Evaluating the alternative solutions using the established criteria and methods
  • Selecting recommended solutions from the alternatives based on the evaluation criteria
Rather than using the phrase "alternative solutions to address issues" each time it is needed, we will use one of two shorter phrases: "alternative solutions" or "alternatives."

A formal evaluation process reduces the subjective nature of the decision and has a higher probability of selecting a solution that meets the multiple demands of relevant stakeholders.

While the primary application of this process area is to technical concerns, formal evaluation processes can also be applied to many nontechnical issues, particularly when a project is being planned. Issues that have multiple alternative solutions and evaluation criteria lend themselves to a formal evaluation process.

Trade studies of equipment or software are typical examples of formal evaluation processes.

During planning, specific issues requiring a formal evaluation process are identified. Typical issues include selection among architectural or design alternatives, use of reusable or commercial off-the-shelf (COTS) components, supplier selection, engineering support environments or associated tools, test environments, delivery alternatives, and logistics and production. A formal evaluation process can also be used to address a make-or-buy decision, the development of manufacturing processes, the selection of distribution locations, and other decisions.

Guidelines are created for deciding when to use formal evaluation processes to address unplanned issues. Guidelines often suggest using formal evaluation processes when issues are associated with medium to high risks or when issues affect the ability to achieve project objectives.

Formal evaluation processes can vary in formality, type of criteria, and methods employed. Less formal decisions can be analyzed in a few hours, use only a few criteria (e.g., effectiveness and cost to implement), and result in a one- or two-page report. More formal decisions may require separate plans, months of effort, meetings to develop and approve criteria, simulations, prototypes, piloting, and extensive documentation.

Both numeric and non-numeric criteria can be used in a formal evaluation process. Numeric criteria use weights to reflect the relative importance of the criteria. Non-numeric criteria use a more subjective ranking scale (e.g., high, medium, or low). More formal decisions may require a full trade study.
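As a concrete illustration of numeric weights combined with a non-numeric ranking scale, the sketch below scores two hypothetical alternatives. The criteria, weights, alternatives, and ratings are invented for illustration and are not taken from the CMMI model:

```python
# Minimal weighted-scoring sketch for a formal evaluation.
# Criteria, weights, alternatives, and ratings are illustrative only.

criteria_weights = {      # numeric weights reflecting relative importance
    "performance": 0.5,
    "cost": 0.3,
    "risk": 0.2,
}

# Non-numeric ratings are mapped onto a subjective numeric scale. A rating
# says how well the alternative satisfies the criterion (higher is better,
# so "cost: high" means the alternative scores well on cost, i.e., cheap).
rating_scale = {"high": 3, "medium": 2, "low": 1}

alternatives = {
    "COTS component": {"performance": "medium", "cost": "high", "risk": "medium"},
    "In-house build": {"performance": "high", "cost": "low", "risk": "low"},
}

def weighted_score(ratings):
    """Sum of (criterion weight x numeric rating) over all criteria."""
    return sum(criteria_weights[c] * rating_scale[r] for c, r in ratings.items())

scores = {name: weighted_score(r) for name, r in alternatives.items()}
recommended = max(scores, key=scores.get)   # highest weighted score wins
```

Here the COTS component scores 2.3 against 2.0 for the in-house build, so it becomes the recommended (not automatically the final) solution; more formal decisions would back such a matrix with a full trade study.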

A formal evaluation process identifies and evaluates alternative solutions. The eventual selection of a solution may involve iterative activities of identification and evaluation. Portions of identified alternatives may be combined, emerging technologies may change alternatives, and the business situation of vendors may change during the evaluation period.

A recommended alternative is accompanied by documentation of the selected methods, criteria, alternatives, and rationale for the recommendation. The documentation is distributed to relevant stakeholders; it provides a record of the formal evaluation process and rationale that are useful to other projects that encounter a similar issue.

While some of the decisions made throughout the life of the project involve the use of a formal evaluation process, others do not. As mentioned earlier, guidelines should be established to determine which issues should be subjected to a formal evaluation process.

Related Process Areas

Refer to the Project Planning process area for more information about general planning for projects.

Refer to the Integrated Project Management process area for more information about establishing the project's defined process. The project's defined process includes a formal evaluation process for each selected issue and incorporates the use of guidelines for applying a formal evaluation process to unforeseen issues.

Refer to the Risk Management process area for more information about identifying and mitigating risks. A formal evaluation process is often used to address issues with identified medium or high risks. Selected solutions typically affect risk mitigation plans.

Specific Practices by Goal

SG 1 Evaluate Alternatives
Decisions are based on an evaluation of alternatives using established criteria.
Issues requiring a formal evaluation process may be identified at any time. The objective should be to identify issues as early as possible to maximize the time available to resolve them.

SP 1.1 Establish Guidelines for Decision Analysis

Establish and maintain guidelines to determine which issues are subject to a formal evaluation process.

Not every decision is significant enough to require a formal evaluation process. The choice between the trivial and the truly important will be unclear without explicit guidance. Whether a decision is significant or not is dependent on the project and circumstances, and is determined by the established guidelines.

Typical guidelines for determining when to require a formal evaluation process include the following:
  • When a decision is directly related to topics assessed as being of medium or high risk
  • When a decision is related to changing work products under configuration management
  • When a decision would cause schedule delays over a certain percentage or specific amount of time
  • When a decision affects the ability to achieve project objectives
  • When the costs of the formal evaluation process are reasonable when compared to the decision's impact
  • When a legal obligation exists during a solicitation
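Guidelines like those above can be made explicit and mechanically checkable. The sketch below encodes a few of the triggers as predicates; the field names and the 10-day schedule threshold are assumptions for illustration, not values prescribed by the model:

```python
# Sketch of DAR guidelines encoded as explicit checks. The field names
# and the 10-day schedule threshold are assumptions, not CMMI-mandated.

def needs_formal_evaluation(decision):
    """Return True if any guideline triggers a formal evaluation process."""
    return any([
        decision.get("risk_level") in ("medium", "high"),
        decision.get("schedule_delay_days", 0) > 10,      # assumed threshold
        decision.get("changes_cm_work_products", False),
        decision.get("affects_project_objectives", False),
        decision.get("legal_obligation", False),
    ])

# A low-risk decision that would slip the schedule by 15 days still triggers
# a formal evaluation because of the schedule guideline.
print(needs_formal_evaluation({"risk_level": "low", "schedule_delay_days": 15}))  # True
```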
Refer to the Risk Management process area for more information about determining which issues are medium or high risk.

Examples of when to use a formal evaluation process include the following:
  • On decisions involving the procurement of material when 20 percent of the material parts constitute 80 percent of the total material costs
  • On design-implementation decisions when technical performance failure may cause a catastrophic failure (e.g., safety of flight item)
  • On decisions with the potential to significantly reduce design risk, engineering changes, cycle time, response time, and production costs (e.g., to use lithography models to assess form and fit capability before releasing engineering drawings and production builds)
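The first example above is a Pareto (80/20) condition on material costs that can be checked directly. In this sketch the part costs and the helper name are invented:

```python
# Sketch of the 80/20 procurement check: do the top 20% of part types
# (by cost) account for at least 80% of total material cost?
# All part costs below are invented.
import math

part_costs = [5000, 3000, 400, 300, 150, 100, 30, 15, 3, 2]

def pareto_8020(costs, top_fraction=0.2, cost_fraction=0.8):
    costs = sorted(costs, reverse=True)
    top_n = max(1, math.ceil(len(costs) * top_fraction))
    return sum(costs[:top_n]) >= cost_fraction * sum(costs)

# Here the two most expensive part types (8000 of 9000 total) carry about 89%
# of the cost, so a formal evaluation of the procurement would be triggered.
print(pareto_8020(part_costs))  # True
```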
Typical Work Products

Guidelines for when to apply a formal evaluation process

Subpractice 1 Establish guidelines.

Subpractice 2 Incorporate the use of the guidelines into the defined process where appropriate.

Refer to the Integrated Project Management process area for more information about establishing the project's defined process.

SP 1.2 Establish Evaluation Criteria

Establish and maintain the criteria for evaluating alternatives, and the relative ranking of these criteria.

The evaluation criteria provide the basis for evaluating alternative solutions. The criteria are ranked so that the highest ranked criteria exert the most influence on the evaluation.

This process area is referenced by many other process areas in the model, and there are many contexts in which a formal evaluation process can be used. Therefore, in some situations you may find that criteria have already been defined as part of another process. This specific practice does not suggest that a second development of criteria be conducted.

Document the evaluation criteria to minimize the possibility that decisions will be second-guessed, or that the reason for making the decision will be forgotten. Decisions based on criteria that are explicitly defined and established remove barriers to stakeholder buy-in.

Typical Work Products
  • Documented evaluation criteria
  • Rankings of criteria importance
Subpractice 1 Define the criteria for evaluating alternative solutions.

Criteria should be traceable to requirements, scenarios, business case assumptions, business objectives, or other documented sources. Types of criteria to consider include the following:
  • Technology limitations
  • Environmental impact
  • Risks
  • Total ownership and lifecycle costs
Subpractice 2 Define the range and scale for ranking the evaluation criteria.

Scales of relative importance for evaluation criteria can be established with non-numeric values or with formulas that relate the evaluation parameter to a numeric weight.

Subpractice 3 Rank the criteria.

The criteria are ranked according to the defined range and scale to reflect the needs, objectives, and priorities of the relevant stakeholders.

Subpractice 4 Assess the criteria and their relative importance.

Subpractice 5 Evolve the evaluation criteria to improve their validity.

Subpractice 6 Document the rationale for the selection and rejection of evaluation criteria.

Documentation of selection criteria and rationale may be needed to justify solutions or for future reference and use.

SP 1.3 Identify Alternative Solutions

Identify alternative solutions to address issues.

A wider range of alternatives can surface by soliciting as many stakeholders as practical for input. Input from stakeholders with diverse skills and backgrounds can help teams identify and address assumptions, constraints, and biases. Brainstorming sessions may stimulate innovative alternatives through rapid interaction and feedback. The initial set of candidate solutions may not be sufficient for analysis; as the analysis proceeds, other alternatives should be added to the list of potential candidates. Generating and considering multiple alternatives early in a decision analysis and resolution process increases the likelihood that an acceptable decision will be made, and that the consequences of the decision will be understood.

Typical Work Products
  • Identified alternatives
Subpractice 1 Perform a literature search.

A literature search can uncover what others have done both inside and outside the organization. It may provide a deeper understanding of the problem, alternatives to consider, barriers to implementation, existing trade studies, and lessons learned from similar decisions.

Subpractice 2 Identify alternatives for consideration in addition to those that may be provided with the issue.

Evaluation criteria are an effective starting point for identifying alternatives. The evaluation criteria identify the priorities of the relevant stakeholders and the importance of technical, logistical, or other challenges.

Combining key attributes of existing alternatives can generate additional and sometimes stronger alternatives.

Solicit alternatives from relevant stakeholders. Brainstorming sessions, interviews, and working groups can be used effectively to uncover alternatives.

Subpractice 3 Document the proposed alternatives.

SP 1.4 Select Evaluation Methods

Select the evaluation methods.

Methods for evaluating alternative solutions against established criteria can range from simulations to the use of probabilistic models and decision theory. These methods need to be carefully selected. The level of detail of a method should be commensurate with cost, schedule, performance, and risk impacts.

While many problems may need only one evaluation method, some problems may require multiple methods. For instance, simulations may augment a trade study to determine which design alternative best meets a given criterion.

Typical Work Products
  • Selected evaluation methods
Subpractice 1 Select the methods based on the purpose for analyzing a decision and on the availability of the information used to support the method.

For example, the methods used for evaluating a solution when requirements are weakly defined may be different from the methods used when the requirements are well defined.

Typical evaluation methods include the following:
  • Modeling and simulation
  • Engineering studies
  • Manufacturing studies
  • Cost studies
  • Business opportunity studies
  • Surveys
  • Extrapolations based on field experience and prototypes
  • User review and comment
  • Testing
  • Judgment provided by an expert or group of experts (e.g., Delphi Method)
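As a toy illustration of expert-judgment aggregation in the spirit of the Delphi Method, the sketch below has each expert revise an estimate toward the group median over several facilitated rounds. The damping factor, round count, and figures are simplifying assumptions, not a prescribed procedure:

```python
# Toy Delphi-style aggregation: in each round, every expert revises an
# estimate partway toward the group median after seeing it. The damping
# factor and the figures below are simplifying assumptions.

def delphi_round(estimates, pull=0.5):
    """Move each estimate halfway toward the group median."""
    median = sorted(estimates)[len(estimates) // 2]
    return [e + pull * (median - e) for e in estimates]

estimates = [10.0, 14.0, 30.0]   # initial expert estimates (e.g., effort in days)
for _ in range(5):               # a few facilitated rounds
    estimates = delphi_round(estimates)

# After several rounds the estimates cluster near the original median (14),
# giving a group judgment the evaluation can use as one input among others.
print(estimates)
```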
Subpractice 2 Select evaluation methods based on their ability to focus on the issues at hand without being overly influenced by side issues.

Results of simulations can be skewed by random activities in the solution that are not directly related to the issues at hand.

Subpractice 3 Determine the measures needed to support the evaluation method.

Consider the impact on cost, schedule, performance, and risks.

SP 1.5 Evaluate Alternatives

Evaluate alternative solutions using the established criteria and methods.

Evaluating alternative solutions involves analysis, discussion, and review. Iterative cycles of analysis are sometimes necessary. Supporting analyses, experimentation, prototyping, piloting, or simulations may be needed to substantiate scoring and conclusions.

Often, the relative importance of criteria is imprecise and the total effect on a solution is not apparent until after the analysis is performed. In cases where the resulting scores differ by relatively small amounts, the best selection among alternative solutions may not be clear cut. Challenges to criteria and assumptions should be encouraged.

Typical Work Products
  • Evaluation results
Subpractice 1 Evaluate the proposed alternative solutions using the established evaluation criteria and selected methods.

Subpractice 2 Evaluate the assumptions related to the evaluation criteria and the evidence that supports the assumptions.

Subpractice 3 Evaluate whether uncertainty in the values for alternative solutions affects the evaluation, and address it as appropriate.

For instance, if a score can vary between two values, is the difference significant enough to make a difference in the final solution set? Does the variation in score represent a high risk? To address these concerns, simulations may be run, further studies may be performed, or evaluation criteria may be modified, among other things.
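The question posed in this subpractice (if a score can vary between two values, does the variation change the selected solution?) can be probed with a brute-force sensitivity check. In this hypothetical sketch, one alternative's score on one criterion is only known to lie in a range:

```python
# Brute-force sensitivity sketch: alternative B's score on one criterion is
# only known to lie in a range. Evaluate at both ends of the range and see
# whether the recommended solution changes. Weights and scores are invented.

weights = {"effectiveness": 0.6, "cost": 0.4}
scores = {
    "A": {"effectiveness": 8, "cost": 6},
    "B": {"effectiveness": (6, 9), "cost": 7},   # uncertain: between 6 and 9
}

def total(alt, resolve):
    """Weighted total, with `resolve` picking a value from uncertain ranges."""
    return sum(weights[c] * (resolve(v) if isinstance(v, tuple) else v)
               for c, v in scores[alt].items())

winners = set()
for pick in (min, max):                      # low end, then high end
    totals = {alt: total(alt, pick) for alt in scores}
    winners.add(max(totals, key=totals.get))

# Two different winners means the uncertainty is decision-relevant: the score
# spread is significant enough to change the final solution set, so further
# study or refined criteria are warranted.
print(winners)
```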

Subpractice 4 Perform simulations, modeling, prototypes, and pilots as necessary to exercise the evaluation criteria, methods, and alternative solutions.

Untested criteria, their relative importance, and supporting data or functions may cause the validity of solutions to be questioned. Criteria and their relative priorities and scales can be tested with trial runs against a set of alternatives. These trial runs of a select set of criteria allow for the evaluation of the cumulative impact of the criteria on a solution. If the trials reveal problems, different criteria or alternatives might be considered to avoid biases.

Subpractice 5 Consider new alternative solutions, criteria, or methods if the proposed alternatives do not test well; repeat the evaluations until alternatives do test well.

Subpractice 6 Document the results of the evaluation.

Document the rationale for the addition of new alternatives or methods and changes to criteria, as well as the results of interim evaluations.

SP 1.6 Select Solutions

Select solutions from the alternatives based on the evaluation criteria.

Selecting solutions involves weighing the results from the evaluation of alternatives. Risks associated with implementation of the solutions must be assessed.

Typical Work Products
  • Recommended solutions to address significant issues
Subpractice 1 Assess the risks associated with implementing the recommended solution.

Refer to the Risk Management process area for more information about identifying and managing risks.

Decisions must often be made with incomplete information, and substantial risk can accompany a decision made on that basis.

When decisions must be made according to a specific schedule, time and resources may not be available for gathering complete information. Consequently, risky decisions made with incomplete information may require re-analysis later. Identified risks should be monitored.

Subpractice 2 Document the results and rationale for the recommended solution.

It is important to record both why a solution is selected and why another solution was rejected.

Generic Practices by Goal

GG 1 Achieve Specific Goals

The process supports and enables achievement of the specific goals of the process area by transforming identifiable input work products to produce identifiable output work products.

GP 1.1 Perform Specific Practices

Perform the specific practices of the decision analysis and resolution process to develop work products and provide services to achieve the specific goals of the process area.

GG 2 Institutionalize a Managed Process

The process is institutionalized as a managed process.

GP 2.1 Establish an Organizational Policy

Establish and maintain an organizational policy for planning and performing the decision analysis and resolution process.



Elaboration:

This policy establishes organizational expectations for selectively analyzing possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

GP 2.2 Plan the Process

Establish and maintain the plan for performing the decision analysis and resolution process.

Elaboration:

This plan for performing the decision analysis and resolution process can be included in (or referenced by) the project plan, which is described in the Project Planning process area.

GP 2.3 Provide Resources

Provide adequate resources for performing the decision analysis and resolution process, developing the work products, and providing the services of the process.

Elaboration:

Examples of resources provided include the following tools:
  • Simulators and modeling tools
  • Prototyping tools
  • Tools for conducting surveys
GP 2.4 Assign Responsibility

Assign responsibility and authority for performing the process, developing the work products, and providing the services of the decision analysis and resolution process.

GP 2.5 Train People

Train the people performing or supporting the decision analysis and resolution process as needed.

Elaboration:

Examples of training topics include the following:
  • Formal decision analysis
  • Methods for evaluating alternative solutions against criteria
GP 2.6 Manage Configurations

Place designated work products of the decision analysis and resolution process under appropriate levels of control.

Elaboration:

Examples of work products placed under control include the following:
  • Guidelines for when to apply a formal evaluation process
  • Evaluation reports containing recommended solutions
GP 2.7 Identify and Involve Relevant Stakeholders

Identify and involve the relevant stakeholders of the decision analysis and resolution process as planned.

Elaboration:

Examples of activities for stakeholder involvement include the following:
  • Establishing guidelines for which issues are subject to a formal evaluation process
  • Establishing evaluation criteria
  • Identifying and evaluating alternatives
  • Selecting evaluation methods
  • Selecting solutions
GP 2.8 Monitor and Control the Process

Monitor and control the decision analysis and resolution process against the plan for performing the process and take appropriate corrective action.

Elaboration:

Examples of measures and work products used in monitoring and controlling include the following:
  • Cost-to-benefit ratio of using formal evaluation processes
  • Schedule for the execution of a trade study
GP 2.9 Objectively Evaluate Adherence

Objectively evaluate adherence of the decision analysis and resolution process against its process description, standards, and procedures, and address noncompliance.

Elaboration:

Examples of activities reviewed include the following:
  • Evaluating alternatives using established criteria and methods
Examples of work products reviewed include the following:
  • Guidelines for when to apply a formal evaluation process
  • Evaluation reports containing recommended solutions
GP 2.10 Review Status with Higher Level Management

Review the activities, status, and results of the decision analysis and resolution process with higher level management and resolve issues.

GG 3 Institutionalize a Defined Process

The process is institutionalized as a defined process.
This generic goal's appearance here reflects its location in the continuous representation.

GP 3.1 Establish a Defined Process

Establish and maintain the description of a defined decision analysis and resolution process.

GP 3.2 Collect Improvement Information

Collect work products, measures, measurement results, and improvement information derived from planning and performing the decision analysis and resolution process to support the future use and improvement of the organization's processes and process assets.

Elaboration:

Examples of work products, measures, measurement results, and improvement information include the following:
  • Number of alternatives considered
  • Evaluation results
  • Recommended solutions to address significant issues
GG 4 Institutionalize a Quantitatively Managed Process

The process is institutionalized as a quantitatively managed process.

GP 4.1 Establish Quantitative Objectives for the Process

Establish and maintain quantitative objectives for the decision analysis and resolution process, which address quality and process performance, based on customer needs and business objectives.

GP 4.2 Stabilize Subprocess Performance

Stabilize the performance of one or more subprocesses to determine the ability of the decision analysis and resolution process to achieve the established quantitative quality and process-performance objectives.

GG 5 Institutionalize an Optimizing Process

The process is institutionalized as an optimizing process.

GP 5.1 Ensure Continuous Process Improvement

Ensure continuous improvement of the decision analysis and resolution process in fulfilling the relevant business objectives of the organization.

GP 5.2 Correct Root Causes of Problems

Identify and correct the root causes of defects and other problems in the decision analysis and resolution process.


Software-Quality-Assurance.org is an independent Web site that presents information about CMMI and Software Quality Assurance. No guarantee (or claim) is made regarding the accuracy of this information. Any questions or comments should be sent to: