Management Analysis

Variance Analysis, Corrective Action Plans, Root Cause Analysis


Variance Analysis “provides EVMS contract management with early insight into the extent of problems and allows corrective actions to be implemented in time to affect the future course of the program.” [NDIA ANSI EIA 748 Intent Guide] Department of Defense Data Item Descriptions DI-MGMT-81861, Integrated Program Management Report (IPMR), paragraph 3.6.10; DI-MGMT-81466A, Contract Performance Report, paragraph 2.6.3; and DI-MGMT-81650, Integrated Master Schedule (IMS), paragraph 2.5, all require analysis of significant variances, including cause, impact, and corrective action plans.  By comparing performance against the plan, it is possible to make mid-course corrections that help complete the project on time and within the approved budget. The Variance Analysis Report (VAR) is a “living, working document to communicate cause, impact and corrective action”. [See: Chapter 35 Variance Analysis and Corrective Action, Project Management Using Earned Value, Humphreys & Associates, page 707.] Well-written variance analyses should answer the basic questions of why, what, and how.

Cause is also known as root cause, nature of the problem, problem statement, issue, or problem definition. Root cause is the fundamental reason for the problem and must be identified in order to take preventive corrective action. The explanation of the variance should be broken down into its components: discuss schedule variances separately from cost variances; discuss labor separately from non-labor; identify how much of a labor variance was caused by efficiency (hours) and how much by rates (dollars); and, if the variance was driven by material, identify how much was due to price and how much was due to usage. For more information, refer to the Humphreys & Associates blog Variance Analysis-Getting Specific.
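
As a minimal illustration of that breakdown, the sketch below (plain Python; the hours, rates, quantities, and prices are invented, and the decomposition shown is one common convention rather than a prescribed method) splits a labor cost variance into its efficiency and rate portions and a material cost variance into its usage and price portions.

```python
# Illustrative only: one common way to split cost variances into their
# components (efficiency vs. rate for labor, usage vs. price for material).
# All hours, rates, quantities, and prices below are invented.

# Labor
earned_hours, actual_hours = 1000.0, 1150.0   # hours earned (BCWP) vs. hours worked
planned_rate, actual_rate = 100.0, 104.0      # $/hour

labor_cv = earned_hours * planned_rate - actual_hours * actual_rate
efficiency_variance = (earned_hours - actual_hours) * planned_rate   # hours-driven portion
rate_variance = (planned_rate - actual_rate) * actual_hours          # rate-driven portion
print(f"Labor CV {labor_cv:,.0f} = efficiency {efficiency_variance:,.0f} "
      f"+ rate {rate_variance:,.0f}")

# Material
earned_qty, actual_qty = 500, 530             # quantity earned vs. quantity used
budget_price, actual_price = 40.0, 43.0       # $/unit

material_cv = earned_qty * budget_price - actual_qty * actual_price
usage_variance = (earned_qty - actual_qty) * budget_price            # quantity-driven portion
price_variance = (budget_price - actual_price) * actual_qty          # price-driven portion
print(f"Material CV {material_cv:,.0f} = usage {usage_variance:,.0f} "
      f"+ price {price_variance:,.0f}")
```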

Once the root cause of the problem has been identified and described, the impact(s) on the project should be addressed. Identify impacts to customers, technical capability, cost, schedule (including when the schedule variance will become zero), other control accounts, program milestones, subcontractors, and the Estimate at Completion, including rationale.

A corrective action (CA) plan should be developed that describes the specific actions being taken, or to be taken, and identifies the individual or organization responsible for each action. The corrective actions should be derived directly from the root cause analysis and related to each identified root cause.  Results from previous corrective action plans should be included.  Occasionally, a successful plan will include interim modifications or fixes in the short term, with long term changes identified as well. When no corrective action for an overrun is possible, an explanation and EAC rationale should be included.  A corrective action log should be used to track the actions taken and the status of the corrective action plan for each variance analysis cycle.  As was stated in the Humphreys & Associates article Corrective Action Response: Planning and Closure – Part 2 of 2, “It is critical that verification methods, objective measures, metrics, artifacts, and evidential products are identified that will verify that the corrective actions are effective.”  Corrective action plans based on a clearly defined root cause facilitate timely management action and avoid the recurrence of problems.
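
One possible shape for such a log entry, sketched below with hypothetical field names and data; it simply captures the action, owner, estimated completion date, verification method, and a status history per variance analysis cycle.

```python
# A minimal sketch of a corrective action log entry; the field names are
# assumptions chosen to mirror the elements discussed above, not a required format.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CorrectiveAction:
    control_account: str
    root_cause: str
    action: str
    owner: str                      # responsible individual or organization
    estimated_completion: date      # ECD
    verification_method: str        # metric or artifact that proves effectiveness
    status: str = "Open"            # Open / In Work / Closed
    history: list = field(default_factory=list)

    def update(self, as_of: date, note: str, status: str = ""):
        """Record status each variance analysis cycle."""
        if status:
            self.status = status
        self.history.append((as_of, self.status, note))

ca = CorrectiveAction(
    control_account="1.2.3",
    root_cause="Late vendor data delayed drawing release",
    action="Add a second checker; expedite remaining vendor data",
    owner="Design Engineering",
    estimated_completion=date(2025, 9, 30),
    verification_method="Drawing release rate back to plan for two consecutive cycles",
)
ca.update(date(2025, 7, 31), "Second checker assigned; release rate improving", "In Work")
```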


EVMS Variance Analysis — EVMS Analysis and Management Reports


A Variance Analysis Report (VAR) that includes specific information about the cause, impact, and corrective action “provides management with early insight into the extent of problems and allows corrective actions to be implemented in time to affect the future course of the program” [reference: NDIA, IPMD EIA-748 (Revision D) EVMS Intent Guide]. Unfortunately, variance analysis is an easy target for criticism during EVMS reviews. There are many examples of inadequate variance analysis to choose from, but what they all have in common is the lack of specific information on the “why, what, how, when, and who” of any variance. The variance analysis reporting requirements are found in the EIA-748 (Revision D) Guidelines in Section IV., Analysis and Management Reports, Guidelines 22-27.

EIA-748 Guidelines
Section IV. Analysis and Management Reports

  • Guideline 22 (2-4a): Control Account Monthly Summary; Identification of CV and SV
  • Guideline 23* (2-4b): Explain Significant Variances
  • Guideline 24 (2-4c): Identify and Explain Indirect Cost Variances
  • Guideline 25 (2-4d): Summarize Data Elements and Variances through WBS/OBS for Management
  • Guideline 26* (2-4e): Implement Management Actions as a Result of EVM Analysis
  • Guideline 27* (2-4f): Revise EAC Based on Performance Data; Calculate VAC


A VAR that includes specific information and data about a problem will allow management to make informed decisions and mitigate project risk. Getting specific about variance analysis reporting includes the following elements.

Overall:

  • Emphasis on the quantitative, not qualitative
  • Emphasis on the specific, not the general
  • Emphasis on significant problems, not all problems
  • Define abbreviations and acronyms at first use
  • The Control Account Manager (CAM) is the most knowledgeable person to write the variance analysis report but will need information from the business support team

Cause:

  • Isolate significant variances
  • Discuss cost and schedule variances separately
  • Clearly identify the reason (root cause) for the variance (ties to the corrective action plan)
  • Clear, concise explanation of the technical reason for the variance
  • Provide cost element analysis
    • Labor – hours, direct rates, skill mix, overtime (rate & volume)
    • Material – unplanned requirements, excess quantities, unfavorable prices (price & usage)
    • Subcontracts – changing requirements, additional in-scope work, schedule changes
    • Other Direct Costs – unanticipated usage, in-house vendor
    • Overhead (indirect) – direct base, rate changes
  • Identify what tasks are behind schedule and why

Impact:

  • Describe specific cost, schedule, and technical impact on the project
  • Project future control account performance (continuing problem)
  • Address effect on immediate tasks, intermediate schedules, critical path, driving paths, risk mitigation tasks
  • Describe erosion of schedule margin, impacts to contractual milestones or delivery dates, and when the schedule variance will become zero (this may only mean the work was completed late, i.e., BCWPcum = BCWScum, and does not necessarily mean getting “back on schedule”)
  • Describe any impact to other control accounts
  • Assess the need to revise and provide rationale for the Estimate at Completion (justify ETC realism – CPI to TCPI comparison, impacts of corrective action plan, risk mitigation, open commitments, staffing changes, etc.); a simple realism screen is sketched after this list
  • Note: If there is a root cause, there will be an impact. It could be related to cost, schedule, lessons learned to be applied to future activity, an update required to a process to support the corrective action or a re-prioritization of resources to meet a schedule.
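
The ETC realism check noted in the list above can be reduced to a few lines. The sketch below uses invented values and treats a 0.10 spread between CPI and TCPI as an example screening threshold (thresholds and methods vary by organization).

```python
# Illustrative realism screen with invented values: compare past efficiency (CPI)
# to the efficiency required by the reported EAC (TCPI) and to an independent EAC.
bac, bcwp, acwp, reported_eac = 500.0, 200.0, 250.0, 560.0   # $K, cumulative

cpi = bcwp / acwp                               # past cost efficiency (0.80)
tcpi = (bac - bcwp) / (reported_eac - acwp)     # efficiency needed to achieve the EAC
ieac = acwp + (bac - bcwp) / cpi                # independent EAC at the current CPI

print(f"CPI {cpi:.2f}  TCPI {tcpi:.2f}  IEAC {ieac:,.0f}  reported EAC {reported_eac:,.0f}")
if abs(tcpi - cpi) > 0.10:                      # an example screening threshold
    print("Provide rationale: future efficiency differs materially from past performance.")
```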

Corrective Action Planning:

  • Describe specific actions being taken, or to be taken, to alleviate or minimize the impact of the problem
  • Include the individual or organization responsible for the required action
  • Include schedules for the actions and estimated completion dates (ECD)
  • If no corrective action is possible, explain why
  • Include the results of corrective action plans reported in previous VARs.

Ask yourself: Is the analysis presented in a manner that is understandable? Does the data support the narrative? Does the variance explanation provide specifics of:

“why” the problem occurred,
“what” is impacted now or in the future,
“how” the corrective action is being taken,
“when” the corrective actions will occur,
“when” the schedule variance will become zero and/or the work gets “back on schedule”, and
“who” is responsible for implementing the corrections?

Remember, a well-developed Variance Analysis Report can reduce the risk of a Corrective Action Request (CAR) during an EVMS review.


Ensuring CPI to TCPI Comparisons are Valid at the Total Contract Level

Have you been in a meeting when presenters show differing To-Complete Performance Index (TCPI) values at the total contract level for the same contract? In these situations, the presenters have made different assumptions about the inclusion of Undistributed Budget (UB) and Management Reserve (MR) in the TCPI calculations. So let’s use some sample values and show different ways the TCPI can be calculated at the total contract level.

As a reminder, this is the formula for TCPI:

TCPI = Work Remaining / Cost Remaining = (BAC – BCWP) / (EAC – ACWP)

Consider the following extract from the lower right portion of Format 1 of the Integrated Program Management Report (IPMR) (Contract Performance Report (CPR)).

[Sample data table omitted: an IPMR Format 1 extract showing cumulative BCWP, ACWP, BAC, and EAC for the Distributed Budgets by WBS, Undistributed Budget, Subtotal, Management Reserve, and Total rows]

When comparing the TCPI to the CPI at the total contract level, the most realistic approach is to calculate the TCPI at the level of the Distributed Budgets. Stated differently, the TCPI should be calculated without Undistributed Budget and Management Reserve. The Cost Performance Index (CPI), BCWP divided by ACWP, represents the cost efficiency for the work performed to date. Notice in the above table that the BCWP and ACWP values in the rows for “Distributed Budgets by WBS”, “Subtotal”, and “Total” are the same; therefore, the CPI calculation will be the same for any of these data levels. The TCPI represents the cost efficiency necessary to achieve the reported EAC. The “Distributed Budgets by WBS” contain approved budgets as well as performance data against those budgets. The CPI and TCPI compared at this level of data certainly provide a valid comparison of past performance to projected performance. The CPI for the above data is 0.73 while the TCPI is 0.92.

Since the difference between the CPI and TCPI is greater than 0.10, the control account managers (CAMs) and the analysts should research the reasons that the future performance indicates improvement and provide EAC rationale.

Calculating the TCPI at the Performance Measurement Baseline level (i.e. including Undistributed Budget in the BAC and EAC) yields a different TCPI than at the Distributed Budget level. Mathematically, the TCPI will be the same for the Distributed Budgets and PMB only if the value of the Estimate to Complete (EAC – ACWP) equals the budgeted value of the remaining work (BAC – BCWP). In that case, the TCPI will be 1.0. If the contract has an unfavorable cost variance and projects an overrun on future work, the TCPI at the PMB level (includes UB) will be higher than the TCPI calculated at the Distributed Budget level (does not include UB).

For the data in the above table, the Distributed Budget TCPI = 0.92 but increases to 0.94 if Undistributed Budget is included in the calculation. The Undistributed Budget, with the same value added to both BAC and EAC, represents a portion of the Estimate to Complete (ETC) that will be performed at an efficiency of 1.0. In an overrun situation at the distributed budget level, the disparity between the CPI and TCPI increases when Undistributed Budget is included in the TCPI because more work must be accomplished at a better efficiency to achieve the EAC. In the above data, the disparity between CPI and TCPI increased from 0.19 to 0.21.

Calculating the TCPI at the total contract level with Undistributed Budget and Management Reserve in both the BAC and EAC yields TCPI values very close to TCPI values calculated at the distributed PMB level. The UB and MR values included in the BAC and EAC increase the proportion of the remaining work that is forecast to be completed at an efficiency of 1.0 and push the TCPI toward the 1.0 value. The larger the values of UB and MR, the more the TCPI will diverge from the TCPI calculated at the Distributed Budgets level. Using this approach for the sample data above, the CPI is 0.73 and the TCPI is 0.94.

Calculating the TCPI at the total contract level, but not including Management Reserve in the EAC, creates a significant disparity between the CPI and TCPI. This situation represents the classic “apples to oranges” comparison: the work remaining in the formula includes MR, but the funds estimated do not. Obviously, with a higher numerator, the TCPI would be higher than any of the other approaches discussed above. Using this approach for the sample data above, the CPI is 0.73 and the TCPI is 1.06. While situations arise where exclusion of MR from the EAC makes sense, it is still important to review the project manager’s rationale with respect to MR application. Most situations assume that MR will be depleted during contract performance; consequently, it should be added to the EAC at the PMB level.
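
For readers who want to experiment with these comparisons, the sketch below computes the TCPI at each of the levels discussed. The dollar values are invented, chosen only so the resulting indices land near those in the article; they are not the sample data from the table above.

```python
# Illustrative TCPI comparisons; all dollar values are invented.
def tcpi(bac, bcwp, eac, acwp):
    """To-Complete Performance Index = work remaining / cost remaining."""
    return (bac - bcwp) / (eac - acwp)

bcwp, acwp = 730.0, 1000.0            # cumulative, $K
bac_dist, eac_dist = 2000.0, 2380.0   # distributed budgets only
ub, mr = 300.0, 150.0                 # Undistributed Budget and Management Reserve

cpi = bcwp / acwp
print(f"CPI                                  {cpi:.2f}")
print(f"TCPI, distributed budgets            {tcpi(bac_dist, bcwp, eac_dist, acwp):.2f}")
print(f"TCPI, PMB (adds UB)                  {tcpi(bac_dist + ub, bcwp, eac_dist + ub, acwp):.2f}")
print(f"TCPI, total (adds UB and MR)         {tcpi(bac_dist + ub + mr, bcwp, eac_dist + ub + mr, acwp):.2f}")
print(f"TCPI, MR in BAC but not in the EAC   {tcpi(bac_dist + ub + mr, bcwp, eac_dist + ub, acwp):.2f}")
```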

In summary, be sure you understand what is included in the TCPI calculation before you make comparisons to the CPI at the total contract level. The following table summarizes the CPI and TCPI for the sample data in this article and highlights the differences in the TCPI when calculated at the various data summary levels.

[Summary table omitted: CPI and TCPI calculated at each data summary level]

To ask about this topic or if you have questions, feel free to contact Humphreys & Associates.


Common Problems Found in EVMS and Recommended Corrective Actions – Part 5

This is the last of a five part series regarding common findings discovered in contractors’ Earned Value Management Systems (EVMS), and the recommended corrective actions to mitigate those findings.

The previous articles discussed: 

  • Part 1: EAC Alignment Issues; Poor Variance Analysis; Lack of Effective Subcontract Management
  • Part 2: Poor Use of Percent Complete; Data Integrity Issues; Poor Scope Language
  • Part 3: IMS Health Problems; Data Item Non-Compliance; Planning Package Misuse
  • Part 4: Misalignment between BCWP and ACWP; Freeze Period Violations; Failed Data Traces

Part 5 of this series includes:  Inappropriate use of PERT and LOE; Misuse of Management Reserve; Administrative Control Account Managers.

1)  Inappropriate use of PERT and LOE

The Program Evaluation and Review Technique (PERT) earned value method is a simple method for calculating the BCWP, where:  BCWP = (ACWP/EAC) X BAC.  In this method, the earned value is completely contingent upon cumulative expenditures (ACWP) divided by an estimate of total expenditures.  Because the results of this formula often have little to do with actual progress, its use is limited to non-critical work, and generally is applied only to high volume, low dollar fixed price material.  The PERT method should never be used for any critical path task, labor, or high dollar value material.  Guideline 7 of the EIA-748-C Standard requires that an EVMS “Identify physical products, milestones, technical performance goals, or other indicators that will be used to measure performance”. The primary condition that must be satisfied in a review of earned value techniques is the application of “meaningful indicators” for use in measuring the status of cost and schedule performance.
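
To make the concern concrete, here is a tiny sketch (invented numbers) of the formula quoted above; notice that the earned value it produces tracks spending against the estimate rather than physical accomplishment.

```python
# Illustrative: the PERT earned value formula ties BCWP to spending, not progress.
def pert_bcwp(acwp, eac, bac):
    return (acwp / eac) * bac

# Spending half of a 100-unit estimate claims half of the budget as earned value,
# regardless of how much work has actually been accomplished.
print(pert_bcwp(acwp=50.0, eac=100.0, bac=100.0))   # 50.0
```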

Level of effort (LOE) tasks consist of management or sustaining type activities that have no identifiable end products or an established relationship to other measurable effort.  The standard for the control of LOE is documented in Guideline 12, which requires that “Only that effort which is immeasurable or for which measurement is impractical may be classified as level of effort”.  There is no standard threshold for a contract or WBS level that would signify “too much” LOE. However, a common practice during review discussions with the control account managers is to challenge any LOE to assess its appropriateness.  There is always pressure on a contractor to minimize the LOE as the nature of LOE can easily mask or distort the performance of discrete work.

Most Common Corrective Action Plans

The most common response to findings regarding both PERT and LOE is to establish a screening/approval process, with thresholds, during the budgeting process.  For PERT, most Earned Value Management System Description Documents (EVM SDD) will specify the limited use of the technique for high volume, low dollar fixed price material.  Many also take the next step and create a threshold for what is considered “low dollar” and short duration.  This is dependent on the nature of the work, but it is not unusual for an SDD to require that any part number with a material extended value (quantity of parts times budgeted unit value) greater than $10,000 (or some other threshold) be tracked discretely in the IMS rather than use the PERT method.  Some also establish a duration threshold, such as no greater than three months, when there are many parts of low value. One of the signs that PERT is being used inappropriately is when variance analysis included in the Integrated Program Management Report (IPMR) or Contract Performance Report (CPR) Format 5 consistently refers to PERT accounts as drivers, or when a schedule variance explanation refers to material not being tracked in the IMS.

Level of Effort should be justified on a case-by-case basis.  A common strategy for setting the appropriate level of LOE is to require the Program Manager’s approval on all LOE accounts.  While control accounts may contain a mixture of LOE and Discrete Effort, many organizations establish rules concerning the maximum allowable LOE in a control account to prevent the distortion of status; often a threshold of 20% is established.  Above that threshold, a separate control account would be required for LOE work.  While it is a goal to have no more LOE than is required, care must be taken not to apply discrete measurement to work that is truly LOE.  CAMs have been known to say that they were required to establish a discrete account even though the nature of the work is impractical to measure.  This type of discovery by a review team can also result in a finding for use of an inappropriate earned value technique.
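
A hedged sketch of the screening described above, assuming a 20 percent threshold and invented control account budgets:

```python
# Illustrative LOE screen; the 20% threshold and the budgets are examples only.
LOE_MAX_SHARE = 0.20

control_accounts = {
    "CA-100": {"loe_bac": 15_000.0, "discrete_bac": 185_000.0},
    "CA-200": {"loe_bac": 60_000.0, "discrete_bac": 140_000.0},
}

for ca, b in control_accounts.items():
    share = b["loe_bac"] / (b["loe_bac"] + b["discrete_bac"])
    if share > LOE_MAX_SHARE:
        print(f"{ca}: LOE is {share:.0%} of BAC; consider a separate LOE control account")
```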

2)  Misuse of Management Reserve

Management Reserve (MR) is a portion of the overall contract budget held for management control purposes and unplanned events that are within the scope of the contract.  H&A has often heard in the course of our consulting or training that there are few rules regarding MR, and the restrictions are unclear.  This is not quite accurate.  The Integrated Program Management Report Data Item Description (IPMR DID, DI-MGMT-81861) lists four restrictions on the use of MR:

  • MR shall not be used to offset cost variances.
  • MR shall never be a negative value.
  • If MR includes the contractor and subcontractor amounts together, the breakout shall be discussed in Format 5.
  • Amounts from MR applied to WBS elements during the reporting period shall be listed in Block 6.b of Format 3 and explained in Format 5.
    • Format 5:  Identify the sources and uses of MR changes during the reporting period.  Identify the WBS elements to which MR was applied and the reasons for its application.

The EIA-748-C Standard adds a few more caveats for MR:

  • Held for unexpected growth within the currently authorized work scope, rate changes, risk and opportunity handling, and other program unknowns.
  • May be held at the total program level or distributed and controlled at lower management levels.
  • Held for current and future needs and is not used to offset accumulated overruns or under runs.
  • Is not a contingency that can be eliminated from prices during subsequent negotiations or used to absorb the cost of program changes.
  • Must not be viewed by a customer as a source of funding for added work scope. It is important to understand that MR is a budget item, not a source of funding, and is not for added work scope.

In addition, the Defense Acquisition University Evaluation Guide (EIA 748 Guideline Attributes & Verification Data Traces) requires that the internal MR Log be reconciled with the IPMR (or CPR).

The specific nature of the above requirements reflects the types of abuse experienced with the MR budgets.  The EVM SDD should establish specific organizational guidelines for the application of MR; however, those rules and the organization’s practices must fall within the bounds established by the EIA-748-C and the appropriate Data Item Description (DID).

The discrepancies found in recent reviews take two primary forms: inappropriate application of MR in the current accounting period and poor reporting and discussion involving MR use in the IPMR/CPR Formats 3 and 5.  With the exception of rate or process changes, all applications of MR must be made in association with additional work scope authorized to control accounts.  Because of the prohibition in the requirements regarding the offset of overruns or underruns, applications in the current period can be far more suspicious than in future periods.  Care must be taken to fully justify the timing and use of MR in terms of additional scope.  All applications of MR must have a full accounting in Formats 3 and 5 of the IPMR (or CPR).
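
A minimal sketch of that kind of reconciliation, with an invented MR log and Format 3 balance; it checks that the log nets to the reported MR value, that MR never goes negative, and that MR applications identify a receiving WBS element.

```python
# Illustrative MR reconciliation; transactions and balances are invented.
mr_log = [
    {"period": "2025-07", "wbs": "1.4.2", "amount": -25_000.0,
     "reason": "Added qualification test scope"},
    {"period": "2025-07", "wbs": None, "amount": 10_000.0,
     "reason": "Risk retired; remaining budget returned to MR"},
]
beginning_mr = 500_000.0
format3_ending_mr = 485_000.0   # ending MR reported on IPMR Format 3

ending_mr = beginning_mr + sum(t["amount"] for t in mr_log)
assert ending_mr == format3_ending_mr, "MR log does not reconcile to Format 3"
assert ending_mr >= 0, "MR shall never be a negative value"

for t in mr_log:
    if t["amount"] < 0 and not t["wbs"]:
        print(f"Flag: MR applied without a receiving WBS element ({t['reason']})")
```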

Most Common Corrective Action Plans

Response to these discrepancies is usually a matter of policy, training, and discipline.  The EVM SDD must contain a policy that enforces the rules in the guidance documents mentioned above, as well as document who has the authority for approval of MR application (generally the Program Manager).  Those responsible for authorization of MR must be familiar with the approval policies, and their support staff must ensure there is complete visibility in the customer reporting and reconciliation to the program logs.  One of the more common corrective actions is to structure the IPMR (or CPR) Format 5 so that reporting of MR transactions meets the intent of the guidance cited above.

3)  Administrative Control Account Managers (CAMs)

The role of the CAM can change across organizations, and there is no standard set of criteria that defines the CAM’s duties.  The National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD) Earned Value Management Systems Intent Guide states that “The control account manager is responsible for ensuring the accomplishment of work in his or her control account and is the focal point for management control”.  The requirements for management control can be defined as having three essential attributes for the role of the CAM: responsibility, authority, and accountability.  These attributes assume ownership of the technical, schedule and cost aspects of the scope authorized to a Control Account Manager (CAM).  It is not always the case that the CAM must be the technical expert over the scope of the control account; sometimes that is the role of the “performing organization” rather than the “responsible organization”.

Primarily through discussion with the CAMs, it is easy to assess whether they fall short on any or all of the three essential attributes above.  Because being a CAM brings a set of duties over and above those of a technical manager or engineer, the CAM role is often not a welcome addition, and some organizations hand it over to employees who have little knowledge of, or responsibility for, the effort.  A comment on a Corrective Action Request (CAR) for an organization that was employing Administrative CAMs was “CAM does not stand for ‘Control Account Monitor’”.  The role of the CAM does include necessary administrative responsibilities, such as reporting status, maintaining records, and developing analysis for the control accounts.  However, these cannot be the only functions the CAM performs.

A CAM should be active in the development of the control account plans in the IMS and EVM Systems, including being the primary architect for defining the tasks, logic for the schedule and the adequacy of the budget.  The CAM should be the primary contact for the Program Manager regarding the control account, including risk management and corrective action planning.  The CAM should also have the authority to assign and coordinate work performed by other organizations.  The CAM should have enough knowledge of the scope and the executing environment to develop a realistic forecast of costs beyond that of mathematical extrapolation.  If control accounts contain subcontracted work, the CAM is also responsible for management of that subcontractor effort.

Most Common Corrective Action Plans

Choosing the right personnel to fulfill the requirements of the CAM role can be difficult.  One of the first considerations is the appropriateness of the organization that is given the responsibility for management.  An example is for material control accounts where the organization responsible during a program’s development phase may not be appropriate for the production phase.  It is very important for the contractor when submitting the corrective action plan (CAP) to treat the job of the CAM as being critical to project success, and not one relegated to people in the organization who do not have responsibility, authority, or accountability.

The Corrective Action Plan should also include a list of essential CAM attributes, and be very clear on the responsibilities and authority of the CAM role.  Companies should make a commitment to ensure that the position is considered critical and not just created to fulfill the requirements of earned value.  This can be demonstrated by not only choosing the right individuals to perform the functions, but also providing the necessary resources, training, and support to function successfully.

This completes our 5 part series. Thank you for your readership.

If you have any questions or would like to inquire about our services, please feel free to contact us.


Common Problems Found in EVM Systems and Recommended Corrective Actions – Part 4


This is the fourth part of a five part series regarding common problems found in EVM Systems and the recommended corrective actions to help mitigate those findings.  The previous three articles discussed:

  • Part 1: EAC Alignment Issues; Poor Variance Analysis; Lack of Effective Subcontract Management
  • Part 2: Poor Use of Percent Complete; Data Integrity Issues; Poor Scope Language
  • Part 3: IMS Health Problems; Data Item Non-Compliance; Planning Package Misuse

The topics anticipated for part five are: Inappropriate use of PERT and LOE; Misuse of Management Reserve; Administrative CAMs.

1)  Misalignment between BCWP and ACWP

The Earned Value Management System Description Document (EVM SDD) should include a statement that requires Actual Cost of Work Performed (ACWP) to be reported within the same accounting period as the Budgeted Cost for Work Performed (BCWP) is earned; this is most applicable to material.  Both ACWP and BCWP contain the term “Work Performed”.  The ACWP is not a measure of how much has been spent but rather reflects how much it cost to accomplish the scope of work reflected in the BCWP.

Accounting systems generally record actual costs for material when invoices are paid; this may or may not align with when earned value is claimed for that material.  If material earned value is claimed at point of usage, it may be necessary to collect actual costs in a holding account and then delay recording ACWP in the earned value system until the material is used.

When material earned value is taken at the point of receipt, invoice payments may be delayed for 45 days (or more). The actual costs associated with this material will be recorded in the accounting system after the earned value credit is taken.  In this case, recording ACWP in the earned value system must be accelerated.  The process of delaying or accelerating the recording of ACWP in the earned value system is often called using “Estimated Actuals” or, more appropriately, “Estimated ACWP”.

There are two obvious examples of this process being done incorrectly.  The first is in the data where BCWP is claimed without corresponding ACWP in the current period, or vice versa.  This may be below the threshold level for variance explanation and is often attributable to Level of Effort (LOE) control accounts, but it creates a situation that attentive customers will need to understand.  The second example is more direct, and occurs when contractors simply explain the situation in Variance Analysis Reports that are subsequently summarized in the Contract Performance Report (CPR) or Integrated Program Management Report (IPMR) Format 5.  The Control Account Manager (CAM) will use words such as “billing lag,” “accrual delay,” or “late invoicing” in the explanation of a cost variance.  Consequently, any time that financial billing terms are used to explain a cost variance, it raises a flag regarding a potential misalignment between BCWP and ACWP.

One issue with ACWP and BCWP misalignment is that it invalidates the use of the earned value data for predictive purposes.  Unless both data elements are recorded within the same accounting period, using indices such as the CPI, TCPI, or IEAC  (Independent Estimate at Completion) will deliver erroneous results.  The time and effort of the CAMs in the variance analysis process should be spent on managing the physical progress and efficiencies of the work, not having to explain payment or accounting system irregularities.

Most Common Corrective Action Plans

When this issue is reported, the best response is to develop a disciplined Estimated ACWP process, including logs and a monthly trace from the Accounting General Ledger to the EVM ACWP.  It is also important to train the CAMs and support staff on how to record and subsequently retire those entries in an Estimated ACWP log book.  Reviewers of the Variance Analysis Reports should be trained to screen for entries that indicate an inappropriate alignment between BCWP and ACWP.  In addition, as indicated in the blog discussion on Data Integrity (Part 2 of this series), situations where there is BCWP without corresponding ACWP, or vice versa, at the control account level, should be flagged and justified by the CAM prior to submittal of the CPR/IPMR to the customer.
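
A minimal sketch of that pre-submittal screen, using invented control accounts and current period values:

```python
# Illustrative screen: flag current-period BCWP without ACWP, or ACWP without BCWP.
current_period = {
    # control account: (BCWP, ACWP) for the current period, $
    "CA-310": (42_000.0, 39_500.0),
    "CA-320": (18_000.0, 0.0),   # earned value claimed, no actuals recorded
    "CA-330": (0.0, 7_200.0),    # actuals recorded, no earned value claimed
}

for ca, (bcwp, acwp) in current_period.items():
    if bool(bcwp) != bool(acwp):
        print(f"{ca}: BCWP {bcwp:,.0f} vs. ACWP {acwp:,.0f}; "
              "CAM justification needed before CPR/IPMR submittal")
```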

2)  Freeze Period Violations

“Freeze Period” refers to future accounting periods, including the current accounting period, in which baseline changes should be strictly controlled.  This is also sometimes called the “Change Control Period”.  The definition of this period should be in the company’s EVM SDD, but will usually have a time-frame such as “current accounting period plus the next accounting period”.  The SDD should specify what kinds of changes are allowed within this period, how they are to be documented in the CPR/IPMR, and any necessary customer notification or approval requirements when these changes are incorporated.  The SDD should require that customer approval is necessary for changes to open work packages that affect BCWS or BCWP in the current or prior accounting periods, and any changes to LOE data in prior periods or in the current period if the LOE account has incurred charges (ACWP).

There is an additional requirement specific to retroactive adjustments which includes the current period.  The EIA-748-C Guideline 30 specifically stipulates the requirement that these types of changes be controlled, and that adjustments should be made only for “correction of errors, routine account adjustments, effects of customer or management directed changes, or to improve the baseline integrity and accuracy of performance measurement data”.  Again, the reasons allowed for the changes should be specified in the EVM SDD.  However, regardless of the reason, it is a requirement that all retroactive changes be reflected in the current period data in the CPR/IPMR Formats 1 and 3, and that Format 5 include the related explanations (National Defense Industrial Association (NDIA), Integrated Program Management Division (IPMD), Earned Value Management Systems Intent Guide, August 2012).

Some projects have a great deal of volatility.  The incorporation of subcontractor data (especially if that data lags the prime contractor reporting period) and accounting system adjustments often create retroactive (including current period) adjustments.  The operation of change boards may also result in changes, both internal and external, which require immediate implementation.  EVM compliance in this environment is a matter of disciplined incorporation of changes, including visibility and communication to the customer (and sometimes prior approval) of any impacts to the baseline.
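
A minimal sketch of a freeze period screen, assuming the “current accounting period plus the next accounting period” window described above and invented baseline change records:

```python
# Illustrative freeze period screen; dates, amounts, and field names are invented.
from datetime import date

freeze_start = date(2025, 7, 1)    # start of the current accounting period
freeze_end = date(2025, 8, 31)     # end of the next accounting period

baseline_changes = [
    {"wp": "WP-1010", "period": date(2025, 7, 1), "delta_bcws": 12_000.0,
     "reason": "Customer-directed change"},
    {"wp": "WP-2040", "period": date(2025, 10, 1), "delta_bcws": -5_000.0,
     "reason": "Rolling wave replanning"},
]

for chg in baseline_changes:
    if freeze_start <= chg["period"] <= freeze_end:
        print(f"{chg['wp']}: freeze period change of {chg['delta_bcws']:,.0f} "
              f"({chg['reason']}); confirm it is an allowed change type and is "
              "reflected in CPR/IPMR Formats 3 and 5")
```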

Most Common Corrective Action Plans

When discrepancies are found with freeze period noncompliances, the first action should be to ensure that procedures are in place that are compliant with the EIA-748.  The discipline required by these procedures must be communicated to the program team so that a consistent change control process is maintained.  Key to compliance is visibility and communication of freeze period changes via CPR/IPMR Formats 3 and 5.

H&A has seen a loose interpretation of the guideline allowance for adjustments to “improve the baseline integrity and accuracy of performance measurement data”.  Care must be taken that adjustments falling under this category are not made to avoid variances.

3)  Failed Data Traces

The reviews associated with EVM surveillance and compliance have become increasingly data centric over the past several years.  One of the first steps in a review is the submittal of a complete set of EVM data to the customer so analysis can be conducted against predefined success criteria prior to conducting an on-site review.  When there is an on-site review, the data trace portion of that review can be a major component at the company, project, and Control Account Manager levels.

The primary purpose of the data traces is to evaluate the Earned Value Management System.  Is the EVMS operating as a single integrated system that can be counted on for reliable and valid information?  The data traces performed generally follow three separate threads: Scope, Schedule, and Budget.  There are a variety of documents and reports that contain this information, but the reviewers will look for a single thread of data to flow and be traceable throughout the system.

All systems are different, but a common strategy for data traces might be as follows:

  • Scope:  WAD → WBS Dictionary → Contract Statement of Work.
  • Schedule:  WAD → IMS → CAP.
  • Cost (Budget):  RAM → WAD → IMS → CAP → CPR/IPMR Format 1 → CPR/IPMR Format 5.
  • Cost (ACWP):  CAP → Internal Reports → CPR/IPMR (Formats 1 & 2) → General Ledger.

If there are also supplemental sources of data that flow into the EVMS, such as subcontractor, manufacturing, or engineering reports, then these should also be a part of the data trace.

The key to this process is the concept of “traceability”.  The easiest path to prove traceability is if the data are an exact match; however, this is not always possible.  Prime contractors often have to make adjustments to subcontractor data, use of estimated ACWP often will not allow a match with the accounting ledger, and supplemental schedules often “support” the IMS while not matching exactly.  These are normal and explainable disconnects in the data.  When submitting data for review, it is important to know where the data does not match and to pass that information on to the reviewers.  If preparing for an on-site review, the CAMs and others who may be scheduled for discussions should perform a thorough scrub of the data and have quick explanations available when a trace is not evident in that data.

Most Common Corrective Action Plans

It is important that any special circumstances that cause traceability issues be relayed to the review team with the data submittal.  The people who conduct the analysis often operate independently until they are on-site for the review, and it is possible to avoid misunderstandings by identifying any issues with the submitted data set.  This type of communication has the potential to eliminate unnecessary findings.

A short term response to a data trace issue is to establish a process to screen the EVM data before submission to the customer.  Starting with the accounting month end, the statusing and close-out process requires a comparative analysis of the various databases containing the same information.  Because of the volume of data contained in most systems, this should be automated.  There should be time in the monthly business rhythm to allow for corrections and data reloads to improve the accuracy across the various data locations.
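
One possible shape for that automated comparison, sketched with invented WBS elements, dollar values, and tolerance:

```python
# Illustrative ACWP-to-general-ledger trace; data and tolerance are examples only.
TOLERANCE = 0.005   # 0.5% of the ledger value

evm_acwp = {"1.2.1": 410_250.0, "1.2.2": 98_300.0, "1.3.1": 250_000.0}
gl_actuals = {"1.2.1": 410_250.0, "1.2.2": 96_100.0, "1.3.1": 250_000.0}

for wbs in sorted(set(evm_acwp) | set(gl_actuals)):
    evm, gl = evm_acwp.get(wbs, 0.0), gl_actuals.get(wbs, 0.0)
    if abs(evm - gl) > TOLERANCE * max(abs(gl), 1.0):
        print(f"{wbs}: EVM ACWP {evm:,.0f} vs. general ledger {gl:,.0f}; "
              "document the estimated ACWP entry for the review team")
```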

The best approach to improved data traces is to design a system that minimizes the number of entries for a single set of data.  For example, H&A found one contractor with over 10 different databases where the CAM’s name was hand entered, which resulted in a configuration control nightmare for that data element.  The process of system design should include a complete listing of common data elements that are included in the storyboarding of the process flow.

The topics anticipated for Part 5 are: Inappropriate use of PERT and LOE; Misuse of Management Reserve; Administrative CAMs.

To read previous installments:

  • Part 1 – EAC Alignment Issues, Poor Variance Analysis, Lack of Effective Subcontract Management
  • Part 2 – Poor Use of Percent Complete; Data Integrity Issues; Poor Scope Language
  • Part 3 – IMS Health Problems; Data Item Non-Compliance; Planning Package Misuse


Common Problems Found in EVM Systems and Recommended Corrective Actions – Part 3

Part 3: IMS Health Problems; Data Item Non-Compliance; Planning Package Misuse

This is the third part of a five part series regarding common findings discovered in Earned Value Management Systems (EVMS) reviews and the recommended corrective actions to help mitigate those findings.  The previous two articles discussed:

  • Part 1: EAC Alignment Issues; Poor Variance Analysis; Lack of Effective Subcontract Management
  • Part 2: Poor Use of Percent Complete; Data Integrity Issues; Poor Scope Language

The topics anticipated for parts four and five are:

Part 4: Misalignment between BCWP and ACWP; Freeze Period Violations; Failed Data Traces.

Part 5: Inappropriate use of PERT and LOE; Misuse of Management Reserve; Administrative CAMs.

1.  IMS Health Problems 

Several years ago the Defense Contract Management Agency (DCMA) issued the 14-Point Assessment Metrics. Twelve of the metrics are related to the “health” of the Integrated Master Schedule (IMS) while the remaining two (critical path length index and baseline execution index) are “tripwire” metrics for schedule performance.  Of the 12 health metrics, H&A has found that most discrepancy reports (DRs) are associated with missing logic, high total float, and high duration.

Missing Logic:  The DCMA uses logic checks to identify any incomplete tasks that are missing a successor or predecessor, or both.  As a rule of thumb, all activities should be tied to at least one predecessor and one successor, with the exception of the first and last activities (respectively) in the project.  By the DCMA’s standards, there is an allowance of 5% for activities not having these types of relationships, but some believe that may be too loose.  The Planning and Scheduling Excellence Guide, or PASEG (National Defense Industrial Association, June 2012, version 2.0), states that all discrete tasks (excluding receipts/deliveries, LOE, and summary tasks) should have at least one predecessor and one successor, as even one missing logical tie could adversely affect the program’s ability to successfully execute the contract.

High Total Float:  Total float (or “total slack” for Microsoft Project users) is the amount of time an activity can be delayed or expanded before the finish date of the project is affected.  In the DCMA Program Analysis Pamphlet (DCMA-EA PAM 200.1, April 2012), any incomplete tasks with total float greater than 44 working days are considered as having high total float; an allowance of 5% is also given before this metric trips a red flag.  The primary drivers for inappropriately high total float are missing successor linkages or planning well in advance of need.  For this metric, however, there are many conditions that may drive a high total float value that are perfectly legitimate.  This is especially true for longer projects and production projects that often receive materials well in advance of need.

High Duration:  The DCMA Program Analysis Pamphlet classifies incomplete activities with a baseline duration greater than 44 working days as having “high duration”, and again applies a 5% threshold to the metric.  The concern with very long tasks is that they may not provide enough precision for measurement of accomplishment and will introduce subjectivity into the statusing process.  As with total float, there may be conditions that drive high duration activities that are justifiable.  This is often the case when activities are representative of schedules outside of the IMS, such as subcontractor or manufacturing planning systems.
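
These three checks are straightforward to automate. The sketch below applies the 44 working day and 5 percent values discussed above to a small set of invented tasks (the usual refinements, such as excluding LOE and summary tasks, are omitted for brevity).

```python
# Illustrative schedule health checks over invented task data.
tasks = [
    # (id, predecessors, successors, total_float_days, baseline_duration_days, complete)
    ("A", ["Start"], ["B"], 10, 20, False),
    ("B", ["A"],     [],    60, 15, False),   # missing successor, high total float
    ("C", ["A"],     ["D"],  5, 70, False),   # high duration
    ("D", ["C"],     ["E"],  5, 10, True),    # complete tasks are excluded
]
incomplete = [t for t in tasks if not t[5]]

missing_logic = [t[0] for t in incomplete if not t[1] or not t[2]]
high_float    = [t[0] for t in incomplete if t[3] > 44]
high_duration = [t[0] for t in incomplete if t[4] > 44]

for name, flagged in [("Missing logic", missing_logic),
                      ("High total float", high_float),
                      ("High duration", high_duration)]:
    share = len(flagged) / len(incomplete)
    print(f"{name}: {share:.0%} of incomplete tasks {flagged} (threshold 5%)")
```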

Most Common Corrective Action Plans

The IMS is a critical management tool, and the purpose of the health metrics is to ensure that it provides an accurate plan and reliable forecasting for program management and execution.  The basic approach to resolving DRs written for IMS health issues is to first take all the necessary steps to improve the real health of the schedule using the metrics as indicators.  This includes a thorough review of the linkages, relationships, and task durations on an ongoing basis.  Organizations should establish a health check “rhythm” for reviewing the IMS prior to customer submittal.  This process should also require the CAMs and their scheduling support staff to justify any conditions that may trip a metric.

The contractor should work with its customer to gain a mutual understanding of the conditions that may legitimately result in high total float and high duration activities.  Contractors should avoid taking illogical actions, such as adding unnecessary linkages or arbitrarily breaking tasks into small durations simply to meet the metric requirements.  The IMS health metrics are simply indicators of potential issues.  If the nature of the program is one where relatively high total float values or high durations are to be expected, the appropriate thresholds for tripping a metric may be higher than the standard 44 days.  In these cases, it is worth having a discussion with the customer to establish new metric thresholds.

2. Data Item Noncompliance

The reports that are generated from an EVMS or IMS, and delivered to the customer, are usually placed on the contract by the incorporation of a Data Item Description (DID) and included in the Contract Data Requirements List (CDRL).  As of June, 2012, both earned value and schedule reporting are included in the Integrated Program Management Report (IPMR), DI-MGMT-81861.  It is important to generate the system reports in accordance with the appropriate DID as the requirements have changed with progressive releases.

Prior to the IPMR DID, IMS reporting was required per DI-MGMT-81650.  For EVM reporting, the previous DID was DI-MGMT-81466A, Contract Performance Report (for contracts established between March, 2005 and June, 2012), and before that was DI-MGMT-81466 or the Cost Performance Report.  The release of the Contract Performance Report DID in March 2005 also eliminated the use of the Cost/Schedule Status Report (C/SSR, DI-MGMT-81467) for new contracts.  There are, however, active contracts which use any one of the above DIDs as the requirements document for earned value and schedule reporting, and compliance of the submitted reports is evaluated against the DID that is required on each contract.

Data Item Descriptions are not just guidelines for reporting, they stipulate the contractual requirements for the documents. There are 203 uses of the word “shall” in the current IPMR DID, and some of these “shall statements” refer to a list of many requirements.  Any planned deviation, or tailoring, from the DID must be approved by the Procuring Authority and documented in the CDRL (DD 1423-1 on DoD contracts).  Section 3.0 of the “IPMR Implementation Guide” (OUSD AT&L PARCA, January 24, 2013) provides tailoring guidance for the IPMR.

Software programs used to generate the IPMR formats have reduced the amount of data specific errors in the reports; however, there are many requirements in the IPMR that are not related to data reporting.  In the requirements for the Format 5 (Explanations and Problem Analyses), for example, there are nine discussion requirements in addition to the required explanations for cost and schedule variances that exceed the variance thresholds.  The narrative portions of the IPMR cannot be generated by a software tool.

A contributing factor in the delivery of poor data items is when the customer encourages noncompliance or does not provide feedback on submitted reports.  It is easy to fall into apathy regarding compliance to the DID when there is no motivation to do so.  This situation, however, does little to convince other reviewers, such as the DCMA, that noncompliance is allowable.

Most Common Corrective Action Plans

When DID noncompliance is found and communicated to the contractor, the best immediate approach is correction and resubmittal of the document.  Noncompliance is most likely a discipline issue which requires a structured approach to developing the report, training the personnel who are responsible, and a thorough review prior to submittal.  Many organizations develop checklists that are used to ensure that all the requirements have been met prior to submittal.  There are also training materials available which can provide cell-by-cell instructions to make the proper entries into these reports (H&A has a DVD titled “Contract Performance and Funds Status Reports (CPR/CFSR) Completion and Reconciliation”).  It may be worthwhile to develop a “buddy system” with another program or another part of the company to exchange outside review and evaluation of data items.  This type of accountability can be mutually beneficial.

3. Planning Package Mismanagement

A planning package is far-term effort in a control account that cannot yet be subdivided into detailed work packages.  Planning packages have attributes similar to work packages, such as a time phased budget, a scope of work, and start and finish dates, and must have enough detail in the IMS to support the development of a critical path.  There can be no accomplishment or actual costs recorded against the scope and budget that is defined in a planning package.  When enough information is available to detail plan the planning packages, they are converted to work packages.  This is done through a process called “Rolling Wave Planning”, and it is a good practice to have the detailed information available for at least six months in advance.  Detailed planning well in advance is an effective approach to avoid unpleasant surprises, such as lack of availability of the necessary resources or the necessity to begin a hiring exercise.  In addition, near term lack of detail in the Integrated Master Schedule may drive improper or incomplete logic ties, which will impact total float and critical path analyses.

Company EVMS System Description Documents (SDDs) should provide guidance for rolling wave planning, including rules for any baseline adjustment in the current or near term periods.  It is important that planning packages are not allowed to exist in the current or past periods.  It is also improper for any actual costs (ACWP) or performance (BCWP) to be recorded against a planning package.  Most earned value engine software tools prohibit this, but some contractors have been known to override that prohibition in the toolset.  In addition to ACWP and BCWP, there should also be no cumulative BCWS in the current period for any planning package.  Cumulative BCWS is the most noticeable evidence that a planning package was not converted to a work package in a timely manner.

Most Common Corrective Action Plans

The most common corrective action is to conduct a monthly analysis of the EVM data to identify planning packages that are nearing the planning period.  While it is the responsibility of the control account manager (CAM) to convert planning packages to work packages, Project Controls can easily provide the CAMs with a list of planning packages needing conversion.  If there is no guidance or process written for rolling wave planning, these should be developed to provide instructions to the CAMs and the support staff.  It is also critical that organizations maintain the restrictions in the earned value engines that prohibit the accrual of earned value or actual costs for planning packages.
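
A minimal sketch of that monthly screen, assuming a six month look-ahead window (per the good practice noted earlier) and invented planning package data:

```python
# Illustrative planning package screen; dates, IDs, and values are invented.
from datetime import date, timedelta

status_date = date(2025, 7, 31)
look_ahead = timedelta(days=183)   # roughly six months of advance detail planning

planning_packages = [
    {"id": "PP-10", "start": date(2025, 9, 1),  "bcws_cum": 0.0, "bcwp": 0.0, "acwp": 0.0},
    {"id": "PP-20", "start": date(2026, 6, 1),  "bcws_cum": 0.0, "bcwp": 0.0, "acwp": 0.0},
    {"id": "PP-30", "start": date(2025, 5, 1),  "bcws_cum": 40_000.0, "bcwp": 0.0, "acwp": 0.0},
]

for pp in planning_packages:
    if pp["bcws_cum"] or pp["bcwp"] or pp["acwp"]:
        print(f"{pp['id']}: planning package shows cumulative BCWS, BCWP, or ACWP; "
              "it should already have been converted to work packages")
    elif pp["start"] <= status_date + look_ahead:
        print(f"{pp['id']}: starts {pp['start']}; schedule rolling wave conversion")
```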

Please contact Humphreys & Associates if you have any questions on this article.

