DOE Releases Compliance Assessment Governance (CAG) Version 2.0

by Humphreys & Associates | June 3, 2022 5:14 pm

Significant Overhaul

In June 2022, the Department of Energy (DOE) Office of Project Management (PM) released Version 2.0 of its Compliance Assessment Governance (CAG), formerly known as the Compliance Assessment Guidance. The change in name, however slight, reflects the significance DOE PM places on the EVMS governance process, where good governance provides a well-defined structure to continuously sustain and improve the practice of integrated project management. DOE PM believes that throughout the life cycle of a project, EVMS governance is everyone’s responsibility, customer and contractor alike. While this has always been DOE PM’s position, its importance in influencing project outcomes was further highlighted by the results of a recent research study conducted by Arizona State University (ASU) and sponsored by DOE PM. DOE PM considers CAG 2.0 a significant overhaul of the prior guidance, and it now aligns with the methodology and structure resulting from the ASU Study.

The primary goal of the ASU Study was to design and produce an evaluation methodology and toolset that can be used across agencies and departments to consistently assess the maturity and environment of projects or programs of various types and sizes that use an EVMS. The resulting toolset assesses a spectrum of EVMS maturity attributes centered on the EIA-748 Standard for EVMS Guidelines, while also referencing the Project Management Institute (PMI) American National Standards Institute (ANSI) standard for EVM (2019) and the International Organization for Standardization (ISO) 21508:2018 guidance. It also takes a novel approach by assessing the environment within which an EVMS is employed. This methodology and toolset are known as the Integrated Project/Program (IP2M) Maturity and Environment Total Risk Rating (METRR), pronounced “IP2M Meter.” By using IP2M METRR to assess both the maturity and environment of an EVMS, or of less sophisticated management control systems where EIA-748 compliance is not required, project/program leaders and practitioners can understand the efficacy of the EVMS (or similar management control system) in support of integrated project/program management.

Environmental Factors

The new methodology has two primary components. The first assesses the environmental factors that influence the implementation of an EVMS. The ASU Study identified four primary environmental categories: Culture, People, Practices, and Resources. These four categories are further broken down into a total of 27 Environmental Factors (EFs), each defined by Effectiveness Criteria.

An example of the hierarchy for an EVMS Environmental Assessment is: 

  • Category 1 – Culture
    • Environmental Factor:  1A – The contractor organization supports and is committed to EVMS implementation, including making the necessary investments for regular maintenance and self-governance.
      • Factor Check Point:  1A(a) – The contractor integrated project team (IPT)—including corporate leadership, execution, and operations personnel, oversight personnel, and support staff—is in place, and it has a demonstrated belief in the intrinsic value of the EVMS to position the project for success.

Each of the 27 EFs is assessed using a 5-point grading scale (i.e., 1. Not Acceptable, 2. Needs Improvement, 3. Meets Some, 4. Meets Most, 5. High Performing), with each EF carrying a predetermined weighting. When summarized, the EF scores add up to 1,000 possible points. The final weightings were determined through a rigorous statistical analysis of inputs from various professional sources, including a survey of professionals experienced in the practice of project management and EVMS implementation. The results of the ASU Study show that the higher the environment score, the better the chances a project or program will achieve favorable schedule and budget outcomes. A simplified scoring sketch follows the category weightings below.

Below are the summary level weightings of the four EVMS Environmental Categories:

  1. Culture: Total Possible Points @ L5 is 313 (31%)
  2. People: Total Possible Points @ L5 is 238 (24%)
  3. Practices: Total Possible Points @ L5 is 235 (24%)
  4. Resources: Total Possible Points @ L5 is 214 (21%)
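To make the roll-up concrete, here is a minimal sketch of how environment ratings could be combined into the 1,000-point score. The category maximums are taken from CAG 2.0 as listed above, but the even split of points across factors and the linear credit per rating level are simplifying assumptions for illustration, not the official IP2M METRR scoring rules.

```python
# Illustrative sketch only: the category maximums come from CAG 2.0, but the
# even split of points across factors and the linear level-to-points credit
# are simplifying assumptions, not the official IP2M METRR scoring rules.

# Maximum points per environmental category at Level 5 (per CAG 2.0).
CATEGORY_MAX_POINTS = {
    "Culture": 313,
    "People": 238,
    "Practices": 235,
    "Resources": 214,
}  # sums to 1,000 possible points

def environment_score(ratings):
    """Roll up 1-5 Environmental Factor ratings into a 1,000-point score.

    `ratings` maps category -> {factor_id: level}, where level runs from
    1 (Not Acceptable) to 5 (High Performing).
    """
    total = 0.0
    for category, factor_levels in ratings.items():
        # Assumed: each category's maximum is split evenly across its factors.
        per_factor_max = CATEGORY_MAX_POINTS[category] / len(factor_levels)
        for level in factor_levels.values():
            total += per_factor_max * (level / 5)  # assumed linear credit
    return total

# Example: two Culture factors rated "Meets Most" (4) and "Meets Some" (3).
print(environment_score({"Culture": {"1A": 4, "1B": 3}}))
```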

EVMS Maturity

The second component of the ASU Study and the Compliance Assessment Governance (CAG 2.0) addresses EVMS Maturity. The purpose of the maturity assessment is to assist in evaluating compliance with the guidelines in EIA-748. The maturity assessment consists of ten subprocesses, which are further divided into 56 attributes. The ten subprocesses will look familiar to practitioners with EVMS experience, with the notable addition of “Risk Management” to the traditional nine guideline process groupings. The ASU Study mapped the ten subprocess groups to the associated EIA-748 guidelines using the following graphic from the National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD) Earned Value Management Scalability Guide. The primary purpose was to place greater focus on the management subprocesses recognized by most practicing project/program managers rather than focusing entirely on particular guidelines.

Figure 1: Ten Subprocess Groups and Associated EIA-748 Guidelines
Source: DOE PM CAG 2.0

EVMS Maturity Subprocesses

As shown in Figure 2, each of the ten subprocesses is further broken down into attributes. Each attribute is assessed for maturity on a 5-point grading scale, with an additional option of “Not Applicable” for attributes that are not relevant to the project/program. For example, some project/program teams use resource-loaded schedules while others do not. The 5-point grading scale is as follows: 1-Not Yet Started, 2-Major Gaps, 3-Minor Gaps, 4-No Gaps, 5-Best in Class. An attribute rated at maturity level 4 (“No Gaps”) is considered mature enough to support an EIA-748 compliant EVMS, while the IP2M METRR methodology allows a grade of 5 to acknowledge management practices that optimize the EVMS. There are two primary sources for making this assessment: the Effectiveness Criteria established for each maturity level, and the DOE Attribute Metrics defined in 188 EVMS Testing Specification Sheets.

Below are the summary level weightings of the ten EVMS Maturity Subprocesses:

  1. Organizing: Total Possible Points @ L5 is 96 (10%)
  2. Planning and Scheduling: Total Possible Points @ L5 is 202 (20%)
  3. Budgeting and Work Authorization: Total Possible Points @ L5 is 178 (18%)
  4. Accounting Considerations: Total Possible Points @ L5 is 65 (7%)
  5. Indirect Budget and Cost Management: Total Possible Points @ L5 is 55 (6%)
  6. Analysis and Management Reporting: Total Possible Points @ L5 is 109 (11%)
  7. Change Control: Total Possible Points @ L5 is 116 (12%)
  8. Material Management: Total Possible Points @ L5 is 59 (6%)
  9. Subcontract Management: Total Possible Points @ L5 is 60 (6%)
  10. Risk Management: Total Possible Points @ L5 is 60 (6%)
Figure 2: Ten Subprocess Groups Broken Down into Attributes
Source: DOE PM CAG 2.0

Of the ten subprocesses that constitute the EVMS, subprocesses B (Planning and Scheduling) and C (Budgeting and Work Authorization) account for 380 points, or 38%, of the maximum score of 1,000 points (Figure 2). When combined with subprocesses F (Analysis and Management Reporting) and G (Change Control), these four subprocesses account for 605 points, or 61%, of the maximum score. Thus, emphasizing credible plans, schedules, and budgets with adequate controls and rigorous reporting best positions the EVMS to help the project/program team achieve its objectives.
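The grading and weighting scheme described above can be illustrated with a short sketch. The subprocess maximums below are taken from CAG 2.0, but the even split of points across applicable attributes and the treatment of “Not Applicable” attributes are simplifying assumptions for illustration, not the official IP2M METRR rules.

```python
# Illustrative sketch only: the subprocess maximums come from CAG 2.0, but the
# even split of points across attributes and the treatment of "Not Applicable"
# attributes are simplifying assumptions, not the official IP2M METRR rules.

SUBPROCESS_MAX_POINTS = {
    "A. Organizing": 96,
    "B. Planning and Scheduling": 202,
    "C. Budgeting and Work Authorization": 178,
    "D. Accounting Considerations": 65,
    "E. Indirect Budget and Cost Management": 55,
    "F. Analysis and Management Reporting": 109,
    "G. Change Control": 116,
    "H. Material Management": 59,
    "I. Subcontract Management": 60,
    "J. Risk Management": 60,
}  # sums to 1,000 possible points

NOT_APPLICABLE = None  # attributes marked "Not Applicable" carry no rating

def maturity_score(ratings):
    """Return (score out of 1,000, all-attributes-at-level-4-or-better flag).

    `ratings` maps subprocess -> list of attribute levels, each level being
    1 (Not Yet Started) through 5 (Best in Class) or NOT_APPLICABLE.
    """
    score = 0.0
    compliant = True
    for subprocess, levels in ratings.items():
        applicable = [lvl for lvl in levels if lvl is not NOT_APPLICABLE]
        if not applicable:
            continue
        # Assumed: subprocess maximum split evenly across applicable attributes.
        per_attr_max = SUBPROCESS_MAX_POINTS[subprocess] / len(applicable)
        for lvl in applicable:
            score += per_attr_max * (lvl / 5)      # assumed linear credit
            compliant = compliant and lvl >= 4      # level 4 ("No Gaps") or better
    return score, compliant

# Example: one subprocess fully rated, another with an N/A attribute.
print(maturity_score({
    "A. Organizing": [4, 5, 4],
    "J. Risk Management": [4, NOT_APPLICABLE],
}))
```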

An example of the evaluation hierarchy for an EVMS Maturity Assessment is shown below (a simple data-model sketch follows the example): 

  • Sub-Process A:  Organizing
    • Attribute A.1:  A single product-oriented Work Breakdown Structure (WBS) encompasses all authorized work and is decomposed to the appropriate levels for effective management and reporting.
      • Effectiveness Criteria A.1.1:  The process to establish a singular, product-oriented WBS that accurately defines the products, services, and deliverables required to complete the project/program has been developed, documented, and approved. 
        • Metric ID A.01.05:  This metric confirms that the WBS includes all authorized project work and any revisions resulting from authorized changes and modifications.  This metric ensures that the WBS identifiers collectively provide a complete definition of work scope requirements.
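The layered structure above (subprocess, attribute, effectiveness criterion, metric) lends itself to a simple data model. The sketch below uses hypothetical class and field names to organize the Organizing example from the text; it is an illustration of how assessment data might be structured, not part of the CAG 2.0 or IP2M METRR toolset.

```python
# A minimal, hypothetical data model for the maturity evaluation hierarchy
# described above; the class and field names are illustrative and are not
# part of the CAG 2.0 or IP2M METRR toolset.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Metric:
    metric_id: str           # e.g., "A.01.05", one of the 188 testing sheets
    description: str

@dataclass
class EffectivenessCriterion:
    criterion_id: str        # e.g., "A.1.1"
    description: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Attribute:
    attribute_id: str        # e.g., "A.1"
    description: str
    maturity_level: Optional[int] = None   # 1-5, or None if Not Applicable
    criteria: list[EffectivenessCriterion] = field(default_factory=list)

@dataclass
class Subprocess:
    subprocess_id: str       # "A" through "J"
    name: str
    attributes: list[Attribute] = field(default_factory=list)

# The Organizing example from the hierarchy above, abbreviated.
organizing = Subprocess(
    subprocess_id="A",
    name="Organizing",
    attributes=[Attribute(
        attribute_id="A.1",
        description="A single product-oriented WBS encompasses all authorized work.",
        maturity_level=4,
        criteria=[EffectivenessCriterion(
            criterion_id="A.1.1",
            description="A singular, product-oriented WBS process is documented and approved.",
            metrics=[Metric("A.01.05",
                            "The WBS includes all authorized work and approved changes.")],
        )],
    )],
)
```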

DOE PM’s goal for the new guidance is to provide a synthesized and uniform approach to assessing an EVMS as a means to ensure fairness and consistency in its operations. The current version of the DOE EVMS Implementation Guidance and IP2M METRR documentation can be found here: 

  • https://www.energy.gov/projectmanagement/evms-implementation-guidance
  • https://www.energy.gov/projectmanagement/ecrsop-appendices-materials
  • https://ip2m.engineering.asu.edu/
