Earned Value Management System (EVMS)

Clarification on the New Department of Defense Earned Value Management System (EVMS) Thresholds | DoD & DPAP


New Department of Defense Earned Value Management System (EVMS) Thresholds

On September 28, 2015, the Defense Procurement and Acquisition Policy Directorate (DPAP) released a memorandum entitled “Class Deviation – Earned Value Management System Threshold”. In this memo the DoD raised the threshold for EVMS application, i.e., compliance with EIA-748, to $100 million for cost or incentive contracts and subcontracts. The same memorandum stated that the Defense Contract Management Agency (DCMA) will no longer routinely conduct EVMS surveillance activities on contracts or subcontracts between $20 million and $100 million. Attached to the memorandum were reissued versions of the Notice of Earned Value Management System Defense Federal Acquisition Regulation Supplement (DFARS) clause (252.234-7001) and the Earned Value Management Systems DFARS clause (252.234-7002), both reflecting the new $100 million threshold. In response to this guidance, contractors and other government personnel submitted a series of questions to Shane Olsen of the DCMA EVM Implementation Division (EVMID). Below are the salient points from this communication:

  • There will be no EVMS surveillance of DFARS contracts under $100 million. Contracts without the DFARS clause, such as those under other agencies using the FAR EVM clause, will continue to be surveilled under their current thresholds.
  • The $100 million threshold is determined by the larger of the contract’s Ceiling Price or Target Price, as reported on the Integrated Program Management Report (IPMR) or Contract Performance Report (CPR) Format 1.
  • The threshold is based on the Contract Value including fee (at price) as noted above. An approved Over Target Baseline (OTB) that increases the Total Allocated Budget (TAB) cannot push a contract over the threshold (see the sketch after this list).
  • The new thresholds apply not only to subcontracts, but also to inter-organizational work orders with an EVMS flow-down.
  • Regardless of the circumstances, the DCMA will not conduct surveillance on contracts less than $100 million. However, if there are Earned Value issues that the buying command or other parties believe need to be reviewed, then the DCMA may conduct a Review for Cause (RFC) of the system against potentially affected guidelines.
  • The DCMA Operations EVM Implementation Division (EVMID) will not be conducting Compliance Reviews in FY-2016 unless there is an “emergent need”.
  • If a site is selected for a Compliance Review, only contracts greater than $100 million would be in the initial scope of the Implementation Review (IR). However, if an issue is discovered that requires the team to “open the aperture”, other contracts are not precluded.
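The points above reduce to a simple decision rule based on contract value at price. Below is a minimal sketch of that logic in Python; the function name, the strict $100 million/$20 million banding, and the return strings are illustrative assumptions, not an official DCMA tool.

```python
def evms_surveillance_category(ceiling_price: float, target_price: float) -> str:
    """Classify a DFARS contract against the 2015 EVMS thresholds (illustrative).

    The threshold is based on the larger of Ceiling Price or Target Price as
    reported on the IPMR/CPR Format 1. An approved Over Target Baseline (OTB)
    that increases the Total Allocated Budget is excluded, so it is simply
    never added here.
    """
    contract_value = max(ceiling_price, target_price)  # value at price, including fee

    if contract_value >= 100_000_000:
        return "EIA-748 compliance applies; subject to DCMA EVMS surveillance"
    if contract_value >= 20_000_000:
        return "No routine DCMA EVMS surveillance per the class deviation"
    return "Below the ranges addressed in the memorandum"


# A $95M ceiling contract stays under the threshold even with an approved OTB
print(evms_surveillance_category(ceiling_price=95_000_000, target_price=90_000_000))
```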

The DCMA is still working on a response to the following questions:

  • How do I handle a contract that is currently below $100 million but has options that, in aggregate, would exceed $100 million?
  • How is the contract value determined on:
    • Indefinite Delivery/Indefinite Quantity (ID/IQ) Contracts
    • Non-ID/IQ contracts with multiple CLIN-level or Task Order reports?

This post will be updated and reposted as answers to these questions become available.


EVMS Variance Analysis — EVMS Analysis and Management Reports


A Variance Analysis Report (VAR) that includes specific information about the cause, impact, and corrective action “provides management with early insight into the extent of problems and allows corrective actions to be implemented in time to affect the future course of the program” [reference: NDIA, IPMD EIA-748 (Revision D) EVMS Intent Guide]. Unfortunately, variance analysis is an easy target for criticism during EVMS reviews. There are many examples of inadequate variance analysis to choose from, but what they all have in common is the lack of specific information on the “why, what, how, when, and who” of any variance. The variance analysis reporting requirements are found in the EIA-748 (Revision D) Guidelines in Section IV, Analysis and Management Reports, Guidelines 22-27.

EIA-748 Guidelines, Section IV. Analysis and Management Reports

  • Guideline 22 (2-4a): Control Account Monthly Summary, Identification of CV and SV
  • Guideline 23* (2-4b): Explain Significant Variances
  • Guideline 24 (2-4c): Identify and Explain Indirect Cost Variances
  • Guideline 25 (2-4d): Summarize Data Elements and Variances through WBS/OBS for Management
  • Guideline 26* (2-4e): Implement Management Actions as Result of EVM Analysis
  • Guideline 27* (2-4f): Revise EAC Based on Performance Data; Calculate VAC
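Guideline 22 is where every VAR starts: identify cost and schedule variances at the control account level each month. A minimal sketch of the standard calculations is below; the control account values are hypothetical.

```python
# Standard cumulative EVM variance formulas (control account values are hypothetical)
bcws = 520_000   # Budgeted Cost for Work Scheduled (planned value)
bcwp = 480_000   # Budgeted Cost for Work Performed (earned value)
acwp = 545_000   # Actual Cost of Work Performed

cv = bcwp - acwp            # Cost Variance: negative means an overrun
sv = bcwp - bcws            # Schedule Variance: negative means behind plan
cv_pct = cv / bcwp * 100    # CV% relative to earned value
sv_pct = sv / bcws * 100    # SV% relative to planned value

print(f"CV = {cv:,} ({cv_pct:.1f}%), SV = {sv:,} ({sv_pct:.1f}%)")
```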


A VAR that includes specific information and data about a problem will allow management to make informed decisions and mitigate project risk. Getting specific about variance analysis reporting includes the following elements.

Overall:

  • Emphasis on the quantitative, not the qualitative
  • Emphasis on the specific, not the general
  • Emphasis on significant problems, not all problems
  • Define abbreviations and acronyms at first use
  • The Control Account Manager (CAM) is the most knowledgeable person to write the variance analysis report but will need information from the business support team

Cause:

  • Isolate significant variances
  • Discuss cost and schedule variances separately
  • Clearly identify the reason (root cause) for the variance (ties to the corrective action plan)
  • Clear, concise explanation of the technical reason for the variance
  • Provide cost element analysis (see the sketch after this list):
    • Labor – hours, direct rates, skill mix, overtime (rate & volume)
    • Material – unplanned requirements, excess quantities, unfavorable prices (price & usage)
    • Subcontracts – changing requirements, additional in-scope work, schedule changes
    • Other Direct Costs – unanticipated usage, in-house vendor
    • Overhead (indirect) – direct base, rate changes
  • Identify what tasks are behind schedule and why
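The labor and material bullets above point to the standard rate/volume (price/usage) decomposition of a cost variance. The sketch below shows that arithmetic for labor with hypothetical hours and rates; the two components always sum back to the total cost variance. The same structure applies to material, with quantities and unit prices in place of hours and rates.

```python
# Labor cost variance split into rate and volume (efficiency) components.
# All figures are hypothetical.
earned_hours  = 1_000    # hours earned (BCWP divided by the budgeted rate)
actual_hours  = 1_150    # hours actually charged
budgeted_rate = 100.0    # planned $/hour
actual_rate   = 108.0    # actual $/hour paid

bcwp = earned_hours * budgeted_rate
acwp = actual_hours * actual_rate
cost_variance = bcwp - acwp                                      # total labor CV

rate_variance   = (budgeted_rate - actual_rate) * actual_hours   # rate (price) component
volume_variance = (earned_hours - actual_hours) * budgeted_rate  # volume (usage) component

assert round(rate_variance + volume_variance, 2) == round(cost_variance, 2)
print(f"CV {cost_variance:,.0f} = rate {rate_variance:,.0f} + volume {volume_variance:,.0f}")
```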

Impact:

  • Describe specific cost, schedule, and technical impact on the project
  • Project future control account performance (continuing problem)
  • Address effect on immediate tasks, intermediate schedules, critical path, driving paths, risk mitigation tasks
  • Describe erosion of schedule margin, impacts to contractual milestones or delivery dates, and when the schedule variance will return to zero (this only means the work was completed late, i.e., BCWPcum = BCWScum; it does not necessarily mean the program is “back on schedule”)
  • Describe any impact to other control accounts
  • Assess the need to revise the Estimate at Completion and provide the supporting rationale (justify ETC realism – CPI to TCPI comparison, impacts of the corrective action plan, risk mitigation, open commitments, staffing changes, etc.; see the sketch after this list)
  • Note: If there is a root cause, there will be an impact. It could be related to cost, schedule, lessons learned to be applied to future activity, an update required to a process to support the corrective action, or a re-prioritization of resources to meet a schedule.
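One of the quickest ETC realism checks noted above is comparing the cumulative CPI with the To-Complete Performance Index implied by the current EAC. A minimal sketch follows; the values are hypothetical, and the 0.10 gap used as a flag is a common rule of thumb rather than a contractual requirement.

```python
# Hypothetical cumulative data for a control account or contract
bac  = 10_000_000   # Budget at Completion
bcwp =  4_000_000   # cumulative earned value
acwp =  4_600_000   # cumulative actual cost
eac  = 10_400_000   # currently reported Estimate at Completion

cpi  = bcwp / acwp                  # cost efficiency demonstrated to date
tcpi = (bac - bcwp) / (eac - acwp)  # efficiency required to achieve the EAC

print(f"CPI = {cpi:.2f}, TCPI(EAC) = {tcpi:.2f}")
if tcpi - cpi > 0.10:  # rule-of-thumb threshold (assumption)
    print("EAC realism is questionable: required efficiency far exceeds demonstrated efficiency")
```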

Corrective Action Planning:

  • Describe specific actions being taken, or to be taken, to alleviate or minimize the impact of the problem
  • Include the individual or organization responsible for the required action
  • Include schedules for the actions and estimated completion dates (ECD)
  • If no corrective action is possible, explain why
  • Include the results of corrective action plans reported in previous VARs.

Ask yourself: is the analysis presented in a manner that is understandable? Does the data support the narrative? Does the variance explanation provide specifics on:

  • “why” the problem occurred,
  • “what” is impacted now or in the future,
  • “how” the corrective action is being taken,
  • “when” the corrective actions will occur,
  • “when” the schedule variance will become zero and/or the work gets “back on schedule”, and
  • “who” is responsible for implementing the corrections?

Remember, a well-developed Variance Analysis Report can reduce the risk of a Corrective Action Request (CAR) during an EVMS review.


DoD Earned Value Management System Interpretation Guide | EVMSIG


The updated DoD Earned Value Management System Interpretation Guide (EVMSIG), dated February 18, 2015, was released in March 2015.

This DoD update, per the GAO, focuses on “(1) problems facing the cost/schedule control system (CS2) process; (2) progress DOD has made with reforms; and (3) challenges DOD faces in fostering and managing potentially significant changes”.

The update commences with:

EVMSIG INTRODUCTION

1.1 Purpose of Guide

Earned Value Management (EVM) is a widely accepted industry best practice for program management that is used across the Department of Defense (DoD), the Federal government, and the commercial sector. Government and industry program managers use EVM as a program management tool to provide joint situational awareness of program status and to assess the cost, schedule, and technical performance of programs for proactive course correction. An EVM System (EVMS) is the management control system that integrates a program’s work scope, schedule, and cost parameters for optimum program planning and control. To be useful as a program management tool, program managers must incorporate EVM into their acquisition decision-making processes; the EVM performance data generated by the EVMS must be timely, accurate, reliable, and auditable; and the EVMS must be implemented in a disciplined manner consistent with the 32 EVMS Guidelines prescribed in Section 2 of the Electronic Industries Alliance Standard-748 EVMS (EIA-748) (Reference (a)), hereafter referred to as “the 32 Guidelines.”

The DoD EVMS Interpretation Guide (EVMSIG), hereafter referred to as “the Guide”, provides the overarching DoD interpretation of the 32 Guidelines where an EVMS requirement is applied. It serves as the authoritative source for EVMS interpretive guidance and is used as the basis for the DoD to assess EVMS compliance to the 32 Guidelines in accordance with Defense Federal Acquisition Regulation Supplement (DFARS) Subpart 234.2 and 234.201 (References (b) and (c)). The Guide provides the DoD Strategic Intent behind each guideline as well as the specific attributes required in a compliant EVMS. Those attributes are the general qualities of effective implementation that are tested in support of determining EVMS compliance as it relates to the 32 Guidelines. As applicable, the DoD Strategic Intent section may clarify where differences in guideline interpretation exist for development and production type work. DoD agencies and organizations charged with conducting initial and continuing EVMS compliance activities will establish amplifying agency procedures and/or guidance to clarify how they are implementing this Guide to include the development of evaluation methods for the attributes associated with each of the 32 Guidelines.

1.2 EVM Policy

The Office of Management and Budget Circular No. A-11 (Reference (d)), the Federal Acquisition Regulation (FAR) Subpart 34.2 and Part 52 (References (e) through (h)) require federal government agency contractors to establish, maintain, and use an EVMS that is compliant with the 32 Guidelines on all major capital asset acquisitions. Based on these federal regulations and the DoD Instruction 5000.02 (DoDI 5000.02) (Reference (i)), the DoD established the Defense Federal Acquisition Regulation Supplement (DFARS) 234.201 (Reference (c)), which prescribes application of an EVMS, via the DFARS 252.234-7002 EVMS clause (Reference (j)). When EVM reporting is contractually required, the contractor must submit to the government an Integrated Program Management Report (IPMR) (DI-MGMT-81861) (Reference (k)) to report program cost and schedule performance data. The IPMR is being phased in to replace the Contract Performance Report (CPR) (DI-MGMT-81466) and the Integrated Master Schedule (IMS) (DI-MGMT-81650). Hereafter, for simplicity purposes, the term “IPMR” is used to reference legacy or current CPR/IMS DIDs. There are times in this Guide when the IMS reference is to an output of the contractor’s internal management system, i.e., a work product, which may not be referred to in the same context as the IPMR. [The full EVMSIG update is found here.]

Also in March 2015, the GAO released its “Report to the Committee on Armed Services, House of Representatives: Defense Acquisition | Better Approach Needed to Account for Number, Cost, and Performance of Non-Major Programs”.

An overview:

The Department of Defense (DOD) could not provide sufficiently reliable data for GAO to determine the number, total cost, or performance of DOD’s current acquisition category (ACAT) II and III programs (GAO-15-188, Better Approach Needed to Account for Number, Cost, and Performance of Non-Major Programs). These non-major programs range from a multibillion dollar aircraft radar modernization program to soldier clothing and protective equipment programs in the tens of millions of dollars. GAO found that the accuracy, completeness, and consistency of DOD’s data on these programs were undermined by widespread data entry issues, missing data, and inconsistent identification of current ACAT II and III programs. See the figure below for selected data reliability issues GAO identified. [The full GAO-15-188 document is found here.]


Corrective Action Response: Planning and Closure – Part 2 of 2


Review Part 1 of Corrective Action Responses, which addresses the sources of Corrective Action Requests; this Part 2 addresses planning and closure.


Responding to a Corrective Action Request (CAR) – Planning and Closure

It is important that the contractor develop a disciplined, standardized approach for responding to a corrective action request.  This not only helps ensure that the responses are complete and contain compliant corrective actions, but also that they represent the position of the entire contractor team.  Below are nine suggested steps for successful Corrective Action Plan (CAP) development.

1)    Review the Discrepancy Reports (DRs) and CARs with the customer

Prior to developing a corrective action in response to a Corrective Action Request (CAR), the first step is to ensure that both parties, the contractor and the review team, have a mutual understanding of the finding.  This also serves to screen out findings that may have been the result of a misunderstanding of the data or an incorrect statement from a member of the contractor’s team.  It is also recommended that DRs/CARs with similar or duplicative findings be grouped together so that a single Corrective Action Plan (CAP) can be used to address the issue.  When doing this, it is imperative that the approach be communicated to the review team lead and the grouping strategy approved before beginning corrective actions.  This is generally an acceptable approach provided the CAP closures can be traced to the original findings.

2)     Organize for successful CAP management

Once a mutual understanding has been reached on the corrective actions, the contractor must then begin the process of correcting or mitigating the identified issues.  It is critical that the corrective action process have the participation of key management and organizations that can effect change.  When there are a significant number of findings to be corrected, the establishment of a senior management Review Board is a recommended method for managing the process.  The roles of the board are:

    • Ensure a CAP is developed and supported by a structured CAR/DR resolution process;
    • Assign an individual from the responsible organization to lead the corrective action efforts;
    • Review the proposed schedule for the CAP, and monitor progress towards CAP closure;
    • Review and approve all CAR/DR root cause assessments and proposed corrective action including the closure criteria;
    • Serve as the primary point of contact with the Customer for CAR/DR resolution and closure.

3)     Begin a thorough Root Cause Analysis

A tempting direction at this stage is to allow for a quick fix of the identified issue.  This may be acceptable for “just fix it” types of findings such as typos, formula errors, incorrect data runs, etc.; but most findings require a more in-depth approach to ensure that the underlying drivers of the issue are being addressed.  Most organizations have employees who specialize in root cause analysis, such as Six Sigma or Lean process improvement advisers, and this would be a good time to employ their skills.  Tools such as “The 5 Whys” and the Ishikawa Fishbone Diagram are extremely effective in uncovering the sources of the problem.

A customer review team often samples a subset of CAMs, processes, or data in its review because of limited time or resources.  It is often the case that a more thorough root cause analysis conducted by the contractor team will uncover additional issues that need to be addressed and corrected.  The contractor’s obligation to the customer is to provide full visibility regarding the corrective actions associated with the findings identified by the customer.  While it is important that all issues are corrected or mitigated, it is the contractor’s choice whether to allow visibility into issues that were not discovered by the customer review team.

4)     Develop and evaluate Corrective Action Plans

A single DR or CAR issued by a customer team may have numerous corrective actions identified in the solution process.  Often a single problem may have corrective actions that entail changes in processes, training, tools, or management approach, or any combination of these.  Regardless, it is important to identify corrective actions that will prevent recurrence of similar outcomes and will not introduce new or additional problems.  One important benefit of including senior management in the CAP Review Board process is the capability to reach beyond the owners of a particular CAP to influence other stakeholders in the organization who have the responsibility to incorporate corrective actions or who may be impacted by the solutions being identified.

5)     Develop verification closure steps

It is critical that verification methods, objective measures, metrics, artifacts, and evidential products are identified to verify that the corrective actions are effective.  This includes any exit criteria for activities in the CAP Integrated Master Schedule (IMS), which is a schedule network that contains all the detailed work packages (including activities and milestones) and planning packages to support the events, accomplishments, and criteria of the Integrated Master Plan (if applicable).  It is directly traceable to the Contract Work Breakdown Structure (CWBS) and the contract statement of work.  The IMS is critical to CAP success.  For data-driven findings, the criteria for verification often involve producing several accounting periods of results as evidence that the corrective actions were effective.  The CAP Review Board is responsible for reviewing the status of the exit criteria and verifying that the required objective measures have been satisfied.

6)     Develop a detailed Integrated Master Schedule for CAP implementation

A critical component of any project, including corrective action development and implementation, is a detailed IMS containing the project scope and the required dates of completion.  There should be a unique IMS for each CAP that includes:

    1. Root Cause Analysis
    2. Changes to processes, tools, training, and other required system adjustments
    3. Management Review and regular team meetings
    4. Responsibility assignment for each activity
    5. Development of products and artifacts which will demonstrate effectiveness
    6. Validation and Verification steps with Closure Criteria

Resource loading the IMS is an important process, as it communicates to the management team the personnel required to implement the Corrective Action Plans, and it can serve as a commitment on management’s part to support the process until closure.  If there is a lack of resources available to support the process, this may impact the completion dates established for the corrective actions.  All tasks should be logically networked (with predecessors and successors) without any constraints.  Progress should be based on a 0 to 100% scale without subjective interpretation.  As mentioned above, data validation normally requires several months of data submittals, and these deliveries should be milestones in the IMS.  Completion milestones should include notifying the customer of corrective action implementation and confirmation by the customer that the implementation is complete.  Each activity should also have fields that identify the CAR or DR number, the EV Process Area and Guideline, the responsible manager for the CAP, and a unique ID number for each task (a minimal sketch of such a record follows below).
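To make those fields concrete, here is a minimal sketch of a CAP IMS task record; the class and field names are hypothetical, and in practice these would simply be custom fields in the scheduling tool.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CapImsTask:
    """One activity in a Corrective Action Plan IMS (illustrative only)."""
    task_id: str                  # unique ID for the task
    description: str
    car_dr_number: str            # CAR or DR the task traces to
    ev_process_area: str          # e.g., Analysis and Management Reports
    guideline: int                # EIA-748 guideline number
    responsible_manager: str      # CAP lead accountable for this task
    predecessors: List[str] = field(default_factory=list)  # logic ties, no constraints
    percent_complete: int = 0     # objective 0-100% status, no subjective values

task = CapImsTask(
    task_id="CAP-023-010",
    description="Revise the variance analysis desktop procedure",
    car_dr_number="CAR-2015-07",
    ev_process_area="Analysis and Management Reports",
    guideline=23,
    responsible_manager="Program Controls Manager",
    predecessors=["CAP-023-005"],
)
```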

Reviewing the CAP IMS and the accomplishment status is a critical role of the CAP Review Board.

7)     Submit CAP and CAP IMS to the customer for approval prior to implementing the Corrective Actions

While some corrective actions may be straightforward responses to simple findings, it is important to reach a mutual agreement on the CAP approach prior to implementation.  Often the customer’s approval of the CAP is a required step before proceeding.  Important in this agreement is consensus on the artifacts and data sets that will be delivered, along with the timing of the deliveries.

One topic that may need to be addressed with the customer review team is a cutoff date for data corrections.  For example, it is important to reach agreement on the “as of” date for clean data, because changing historical data is usually an unnecessary step.  Occasionally a corrective action must wait until a new contract modification is implemented or a new contract baseline is established before the correction can be implemented and verified.  These conditions need to be agreed upon with the customer prior to proceeding.

8)     Implement Corrective Action Plans and track progress to successful completion

One path to the escalation of a CAR to Level IV*, and possibly the introduction of Business System payment withholds, is the failure to successfully implement an agreed upon Corrective Action Plan.  Many organizations discover that the actual implementation of the approved corrective actions is the most difficult part of the process.  Sometimes a successful plan will include interim modifications or fixes in the short term, with long-term changes identified as well.  If, for example, the issue were with the integration between the Material Requirements Planning (MRP) and EVM systems, an interim solution may involve a change in the interface or translation of data between the systems, while in the long term a replacement of the MRP system is required.  It is important to have CAP solutions that not only mitigate the findings but can also be implemented in an acceptable period of time.

It is also important to meet interim commitments of data, processes, or any agreed-to delivery of an artifact.  If the execution of a CAP will be delayed for any reason, this should be communicated quickly to the customer.

9)     CAR closure and follow-up

When the issuer of the CAR is satisfied that the contractor’s corrective actions are appropriate to prevent recurrence of the noncompliance, and the solutions have been verified to be effective, the contractor will be notified that the CAR is considered closed.  Even after closure, the areas identified as needing improvement are often targeted for periodic follow-on reviews, so it is important that management attention is maintained to sustain the corrective action.  A well-organized and disciplined internal surveillance program is often the best safeguard against future discrepancy reports.

For more information about responding to Corrective Action Requests, contact our consultants at Humphreys & Associates.

*Link to part 1 of Corrective Action Response: Sources


EVM Systems – The 16 Foundational Guidelines

The Earned Value Management System (EVMS) – Standard Surveillance Instruction (SSI) (latest revision February 2012) defines the Defense Contract Management Agency’s (DCMA) standardized methodology for conducting contractor surveillance of EVM Systems. This includes assessment of contractor processes and procedures to ensure the 32 EIA-748 Guidelines are being followed when contractually required.

Of the 32 Guidelines, sixteen are considered high-risk, or foundational, for EVM Systems.  This means that if the requirements of those Guidelines are not met (i.e., the system is considered noncompliant), the Earned Value Management System may not produce the accurate, reliable, and auditable data the customer needs to reliably manage a program.

The 16 foundational guidelines are highlighted in the figure below and listed later in this post.

[Figure: EVM Systems – 16 Foundational Guidelines]

Each year, the DCMA prepares a surveillance schedule that includes the five EVMS Areas and associated Guidelines to be reviewed and the programs/contracts involved.  The 16 high-risk Guidelines are evaluated every year, and all 32 Guidelines are evaluated within a three-year period.  By concentrating on these 16 high-risk Guidelines, resources for both the Government and the contractor can be used more efficiently.  Concerns with non-high-risk Guidelines may surface during these reviews, and those Guidelines can then be scheduled for additional surveillance.  Generally, a minimum of four surveillance events covering the five EVMS Areas is planned in a given year.

If noncompliance is found in any of the high-risk guidelines, it signifies that there are shortcomings in the system and that the information produced by the system is not reliable for management purposes.

Although the Standard Surveillance Instruction only requires that the non-foundational Guidelines be reviewed by the DCMA at least every three years, it is still incumbent on the contractor to ensure that those Guidelines remain compliant.

The 16 foundational guidelines are:

ORGANIZATION

Guideline 1: Define the authorized work elements for the program.

Guideline 3: Provide for the integration of the company’s planning, scheduling, budgeting, work authorization and cost accumulation processes with each other, and as appropriate, the program Work Breakdown Structure (WBS) and the program organizational structure.

PLANNING AND BUDGETING

Guideline 6: Schedule the authorized work in a manner which describes the sequence of work and identifies significant task interdependencies required to meet the requirements of the program.

Guideline 7: Identify physical products, milestones, technical performance goals, or other indicators that will be used to measure progress.

Guideline 8: Establish and maintain a time-phased budget baseline, at the Control Account level, against which program performance can be measured.

Guideline 9: Establish budgets for authorized work with identification of significant cost elements (labor, material, etc.) as needed for internal management and for control of subcontractors.

Guideline 10: To the extent it is practical to identify the authorized work in discrete work packages, establish budgets for this work in terms of dollars, hours, or other measurable units.

Guideline 12: Identify and control level of effort activity by time-phased budgets established for this purpose.  Only that effort which is immeasurable or for which measurement is impractical may be classified as level of effort. 

ACCOUNTING CONSIDERATIONS

Guideline 16: Record direct costs in a manner consistent with the budgets in a formal system controlled by the general books of account.

Guideline 21: For EVMS, the material accounting system will provide for:

      1. Accurate cost accumulation and assignment of costs to Control Accounts in a manner consistent with the budgets using recognized, acceptable, costing techniques.
      2. Cost performance measurement at the point in time most suitable for the category of material involved, but no earlier than the time of progress payments or actual receipt of material.
      3. Full accountability of all material purchased for the program including the residual inventory.

ANALYSIS AND MANAGEMENT REPORTS

Guideline 23: Identify, at least monthly, the significant differences between both planned and actual schedule performance and planned and actual cost performance, and provide the reasons for the variances in detail needed by program management.

Guideline 26: Implement managerial actions taken as a result of earned value information.

Guideline 27: Develop revised estimates of cost at completion based on performance to date, commitment values for material, and estimate of future conditions. Compare this information with the performance measurement baseline to identify variances at completion important to company management and any applicable customer reporting requirements including statements of funding requirements.
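The arithmetic behind Guideline 27 is straightforward once performance data are in hand. The sketch below uses one common performance-factor EAC formula with hypothetical values; as the guideline itself requires, a credible EAC must also consider material commitment values and estimates of future conditions, which a single index cannot capture.

```python
# Hypothetical cumulative contract-level data
bac  = 50_000_000   # Budget at Completion
bcwp = 20_000_000   # cumulative earned value
acwp = 23_000_000   # cumulative actual cost

cpi = bcwp / acwp
eac = acwp + (bac - bcwp) / cpi   # assumes cost efficiency to date continues
vac = bac - eac                   # Variance at Completion: negative = projected overrun

print(f"CPI = {cpi:.2f}, EAC = {eac:,.0f}, VAC = {vac:,.0f}")
```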

REVISIONS AND DATA MAINTENANCE

Guideline 28: Incorporate authorized changes in a timely manner, recording the effects of such changes in budgets and schedules.  In the directed effort prior to negotiation of a change, base such revisions on the amount estimated and budgeted to the program organizations.

Guideline 30: Control retroactive changes to records pertaining to work performed that would change previously reported amounts for actual costs, earned value, or budgets.

Guideline 32: Document changes to the performance measurement baseline.

For more information on these guidelines or to inquire about EVMS implementation and remediation, contact Humphreys & Associates.

