Keeping Track of Budgets, Changes, and IPMR Data

For projects, the moment the baseline is established it is subject to change, so a disciplined change control process must be in effect.  The source of project changes can be either external or internal.  External changes frequently affect all aspects of a contractor’s internal planning and control system and are generally for effort that is out-of-scope to the contract.  Contract changes impact the Contract Budget Base (CBB) and are distributed to the Performance Measurement Baseline (PMB), which includes the distributed budgets (control accounts and Summary Level Planning Packages), and to the Undistributed Budget.

These changes may also impact the Management Reserve (MR) budget if the decision were made to withhold reserve from the budget for the change.  The Work Breakdown Structure (WBS) serves as the framework for integrating changes within the project’s structure.  Internal changes operate much the same, but they do not change the CBB. The most common reasons for internal changes are the allocation of MR for contractually in-scope effort, replanning of future work, and converting planning packages to work packages.

The Earned Value Management Systems Guidelines require that all changes, regardless of the source, be incorporated in a timely and disciplined manner. Consequently, the project needs to have a formal change process and procedures in place. Following these processes and procedures will also help minimize disruptions in the current effort while changes are being incorporated.  An undisciplined change control process has the potential to create timing or quality issues that will lessen the baseline’s effectiveness as a management tool.

Baseline changes must also be tracked to ensure baseline integrity. The most effective way to do this is to establish baseline logs to track all approved changes. These can include the Contract Budget Base (CBB) Log, as shown below, the Management Reserve (MR) Log, and the Undistributed Budget (UB) Log.  In addition, a log may be established to track all approved, unapproved and unresolved change requests.

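As a minimal illustration (the columns and values here are hypothetical, and actual log formats vary by contractor), a CBB Log records each authorized change and the resulting total:

    Date        Change Description           Authorization    Change ($K)    CBB ($K)
    01/15/25    Contract award               Definitization   25,000         25,000
    04/02/25    ECP 0001 added scope         Mod P00003       +1,500         26,500
    07/10/25    Authorized Unpriced Work     Mod P00005       +800           27,300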

Once established, these logs must be maintained and reconciled to the data reported in the Integrated Program Management Report (or Contract Performance Report) that is delivered to the customer on a monthly basis. This reconciliation helps validate that the PMB accurately represents the project’s technical plans and requirements.

To find out more about this topic or if you have questions, feel free to contact Humphreys & Associates.


Is it OTB/OTS Time or Just Address the Variances?


No project manager or project team ever wants to go through an Over Target Baseline (OTB) or Over Target Schedule (OTS).  The idea of formally reprogramming the remaining work and adjusting variances at the lowest level can be daunting and extremely time consuming.  As painful as an OTB/OTS is, a project manager must first determine whether the reprogramming is necessary.  Several factors should be considered before an OTB/OTS is declared and implemented.

NOTE: This paper treats a formal reprogramming as including both an OTB and an OTS.  If the Contract Performance Report is the CDRL requirement, an OTS is not part of a formal reprogramming; it is a separate action.

Performance Data

Projected successful execution of the remaining effort is the leading indicator of whether an OTB/OTS is needed. Significant projected cost overruns or the inability to meet scheduled milestones play a major role in determining the need for an OTB/OTS as these indicators can provide a clear determination that the baseline is no longer achievable.

Leading indicators also include significant differences between the Estimate to Complete (ETC) and the Budgeted Cost of Work Remaining (BCWR). This is also demonstrated by major differences between the Cost Performance Index (CPI) and the To Complete Performance Index (TCPI).  These differences are evidence that the projected cost performance required to meet the Estimate at Completion is not achievable, and may also indicate that the estimated completion costs do not include all risk considerations. Excessive use of Management Reserve (MR) early in the project could also be an indicator.
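As a simple illustration of the CPI versus TCPI comparison, here is a minimal sketch using the standard index formulas with hypothetical cumulative values; the 0.10 tolerance is illustrative only, not a published standard:

    # Hypothetical cumulative values (in $K); the index formulas are standard EVM.
    bcwp, acwp = 400.0, 500.0    # earned value and actual cost to date
    bac, eac = 1000.0, 1100.0    # budget at completion and estimate at completion

    cpi = bcwp / acwp                     # demonstrated efficiency to date: 0.80
    tcpi = (bac - bcwp) / (eac - acwp)    # efficiency needed to achieve the EAC: 1.00

    # A TCPI well above the demonstrated CPI suggests the EAC may be optimistic.
    if tcpi - cpi > 0.10:                 # illustrative tolerance only
        print(f"CPI = {cpi:.2f} vs TCPI = {tcpi:.2f}: EAC may not be achievable")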

Schedule indicators include increased concurrency among remaining tasks, high amounts of negative float, significant slips in the critical path, questionable activity durations, and inadequate schedule margin for the remaining work scope.  Any of these conditions may indicate that an OTB/OTS is necessary.

Quantified Factors

Various significant indicators in both cost and schedule can provide a clear picture that an OTB/OTS is warranted.  The term “significant” is admittedly subjective and can vary from project to project.  For further evidence, other more quantified indicators can be used to supplement what has already been discussed.

Industry guidelines (such as the Over Target Baseline and Over Target Schedule Guide by the Performance Assessments and Root Cause Analyses (PARCA) Office) suggest the contract should be more than 20% complete before considering an OTB/OTS.  However, the same guidance also recommends against an OTB/OTS if the forecasted remaining duration is less than 18 months. Other indicators include comparing the Estimate to Complete with the remaining work to determine projected growth by using the following equation:

Projected Future Cost Overrun (%) = [(EAC_PMB − ACWP) / (BAC_PMB − BCWP) − 1] × 100

If the Projected Future Cost Overrun percentage is greater than 15%, then an OTB/OTS might be considered.  Certainly the dollar magnitude must be weighed as well.
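For example (hypothetical values in $M): with EAC_PMB = 1,300, ACWP = 500, BAC_PMB = 1,100, and BCWP = 450, the equation yields [(1,300 − 500) / (1,100 − 450) − 1] × 100 = (800 / 650 − 1) × 100 ≈ 23%.  That exceeds the 15% guideline and, if the dollar magnitude were also significant, would argue for considering an OTB/OTS.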

Conclusion

There is no exact way to determine if an OTB/OTS is needed, and the project personnel must adequately assess all factors to make the determination. Going through an OTB/OTS is very time consuming, and the decision regarding that implementation should not be taken lightly.

After all factors are adequately analyzed, the project manager may ultimately deem it unnecessary and just manage to the variances being reported. This may be more cost effective and practical than initiating a formal reprogramming action.

If you have any questions about this article, contact Humphreys & Associates. Comments welcome.

We offer a workshop on this topic: EVMS and Project Management Training Over Target Baseline (OTB) and Over Target Schedule (OTS) Implementation.


Schedule Health Metrics

What are Schedule Health Metrics?

At the heart of every successful Earned Value Management System (EVMS) is a comprehensive Integrated Master Schedule (IMS) that aligns all discrete effort with a time-phased budget plan to complete the project.  As such, the IMS must be complete and accurate to provide the necessary information to other EVMS process groups and users.  The IMS may be a single file of information in an automated scheduling tool, or a set of files that also includes subcontractor schedules.

For any medium to large project, the IMS may contain thousands of activities and milestones interconnected with logical relationships and date constraints to portray the project plan.  Schedule Health Metrics provide insight into the IMS integrity and viability.

Why are Schedule Health Metrics important?

For a schedule to be useable, both as a standalone product and as a component of the EVMS, standards have been developed to reflect both general scheduling practices and contractual requirements.  Schedule Health Metrics contain checks designed to indicate potential IMS issues.  Each check has a tolerance established to help focus on particular areas of concern.  The individual metrics should not be considered as a pass or fail score, but should be used as a set of indicators to guide questions into specific areas of the IMS.

For example, if there is an unusually large number of tasks with high total float properties, a review of the logic in the IMS is warranted.  At the end of the analysis, if the Control Account Manager (CAM) responsible for the work, with the help of the Planner/Scheduler, can explain why the high float exists, then the issue is moot.  Metrics are simply a method to help isolate issues in a large amount of data.  In this example, the analysis will continue to flag this CAM’s data, but those flags are not indicative of failure.

What are the standards?

Since the advent of automated scheduling systems in the 1980s, attempts have been made to take advantage of the scheduling databases for metric analysis.  Maturing scheduling software has provided better access to metrics, both through open-architecture databases and through export capabilities to tools such as Microsoft’s Excel and Access products.  With the availability of these tools, new analysis techniques were developed and implemented.

Several years ago, the Defense Contract Management Agency (DCMA) reviewed the various Schedule Health Metrics being used within the US Government and selected the 14 checks it believed best test an IMS.  Because the DCMA supports a wide variety of customers across the DoD, NASA, and DOE, it developed these checks with thresholds intended to be common to all types of programs without being specific or restrictive to any particular one.  The thresholds help bring focus to the issues in the schedule under review.  With agreement between the customer, the DCMA, and the contractor, they may be altered in some cases to reflect the unique nature of a project.

Unless otherwise indicated, the DCMA Health Metrics apply only to incomplete activities or tasks in the IMS, not milestones, with baseline durations of 1 day or longer. This set also excludes Level of Effort (LOE) and Summary tasks because they should not be driving the network.  The DCMA 14 point Schedule Health Metrics are:

1.  Missing Logic

The test: The percentage of incomplete activities that do not have a predecessor or successor.

The threshold: 5%.

For a schedule to function correctly, the tasks must be logically linked to produce a realistic mathematical model that sequences the work to be performed.
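A minimal sketch of this check, assuming a simple task list exported from the scheduling tool (the field names here are hypothetical):

    # Missing Logic check: incomplete tasks with no predecessor or no successor.
    tasks = [
        {"id": "A", "complete": False, "preds": [],    "succs": ["B"]},  # no predecessor
        {"id": "B", "complete": False, "preds": ["A"], "succs": ["C"]},
        {"id": "C", "complete": False, "preds": ["B"], "succs": []},     # no successor
        {"id": "D", "complete": True,  "preds": [],    "succs": []},     # excluded: complete
    ]

    incomplete = [t for t in tasks if not t["complete"]]
    missing = [t for t in incomplete if not t["preds"] or not t["succs"]]
    pct = 100.0 * len(missing) / len(incomplete)
    print(f"Missing logic: {pct:.1f}% of incomplete tasks (threshold: 5%)")  # 66.7% here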

2.  Activities with Leads

The test: The percentage of relationships in the project with a lag of negative 1 day or less (that is, leads).

The threshold: 0%.

The project schedule should flow in time from the beginning to the end.  Negative lags, or leads, are counter to that flow and can make it more difficult to analyze the Critical Path.  In many cases this may also indicate that the schedule does not contain a sufficient level of detail.

3.  Activities with Lags

The test: The percentage of incomplete activities that have schedule lags assigned to their relationships.

The threshold: 5%.

An excessive use of lags can distort an IMS and should be avoided.

4.  Relationship Types

The test: The percentage of all relationships that are Finish to Start.

The threshold: 90%.

A project schedule should flow from the beginning of the program to the end.  Finish to Start (FS) relationships are the easiest and most natural flow of work in the IMS, with the occasional Start to Start (SS) and Finish to Finish (FF) relationship as required.  Start to Finish relationships should not be used because they represent a backward flow of time and can distort the IMS, as does the overuse of SS and FF relationships.

5.  Hard Constraints

The test: The current definition includes any date constraint that affects both the forward and backward pass in the scheduling engine.  These include ‘Must’ or ‘Mandatory’ ‘Start On’ or ‘Finish On’ constraints, and ‘Start Not Later Than’ or ‘Finish Not Later Than’ date constraints.

The threshold: 5%.

Hard constraints limit the flexibility of the IMS to produce reliable Driving Paths or a Program Critical Path.  Techniques using soft constraints and deadlines can allow the schedule to flow and identify more issues with float values.

6.  High Float

The test: The percentage of tasks with Total Float values over 44 days.

The threshold: 5%.

A well-defined schedule should not have large numbers of tasks with high total float or slack values.  Schedules with this condition may have missing or incorrect logic, missing scope or other structural issues causing the high float condition.  The DCMA default threshold of 44 days was selected because it represents two months of effort.  Individual projects may wish to expand or contract that threshold based on the length of the project and the type of project being scheduled; however, any changes in thresholds should be coordinated with the customer first to confirm the viability of the alternate measurement.

7.  Negative Float

The test: The percentage of activities with a total float (slack) value of less than zero (0) days.

The threshold: 0%.

When a schedule contains tasks with negative float, it indicates that the project is not able to meet one or more of its delivery goals. This is an alarm requiring redress with a corrective action plan.  Please see the Negative Float blog for additional discussion.

8.  High Duration

The test: The percentage of tasks in the current planning period with baseline durations greater than 44 days.  This check excludes LOE, planning packages, and summary level planning packages.

The threshold: 5%.

Near term tasks should be broken down to a sufficient level of detail to define the project work and delivery requirements.  These tasks should be shorter and more detailed since more is known about the immediate scope and schedule requirements and resource availabilities.  For tasks beyond the rolling wave period, longer duration tasks in planning packages are acceptable, as long as the IMS can still be used to accurately develop Driving Paths to Event Milestones and a Program Critical Path to the end of the project.

9.  Invalid Dates

The test: The percentage of tasks with actual start or finish dates beyond the Data Date, or with forecast start or finish dates (no actuals recorded) before the Data Date.

The threshold: 0%.

The check is designed to ensure activities are statused with respect to the Data Date in the IMS.  Claiming actual start or finish dates in the future is not acceptable from a scheduling perspective, and can also create distortions in the EVM System by erroneously claiming Earned Value in the current period for future effort.  Alternatively, if tasks are not statused with actual start or finish dates prior to the Data Date, then they cannot logically start or finish until at least the day of the Data Date, if not later.  If the forecast dates are not moved to the Data Date or later, the schedule cannot be used to correctly calculate Driving Paths to an Event Milestone, or calculate the Program Critical Path.
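A sketch of this two-sided date check, again assuming hypothetical exported fields:

    from datetime import date

    # Invalid Dates check: no actuals after the Data Date, no forecasts before it.
    data_date = date(2025, 6, 1)
    tasks = [
        {"id": "A", "actual_finish": date(2025, 6, 15), "forecast_finish": None},             # actual in the future
        {"id": "B", "actual_finish": None,              "forecast_finish": date(2025, 5, 20)}, # forecast in the past
        {"id": "C", "actual_finish": date(2025, 5, 28), "forecast_finish": None},             # valid
    ]

    def has_invalid_dates(task):
        if task["actual_finish"] and task["actual_finish"] > data_date:
            return True   # claimed an actual finish beyond the Data Date
        if task["forecast_finish"] and task["forecast_finish"] < data_date:
            return True   # forecast finish left in the past without an actual
        return False

    pct = 100.0 * sum(has_invalid_dates(t) for t in tasks) / len(tasks)
    print(f"Invalid dates: {pct:.1f}% of tasks (threshold: 0%)")  # 66.7% here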

10.  No Assigned Resources

The test: Percentage of incomplete activities that do not have resources assigned to them.

The threshold: 0%.

This is a complex check because of two basic factors: 1) resources are not required to be loaded on tasks unless directed by the contractor’s internal management requirements, and 2) some tasks such as Schedule Visibility Tasks (SVTs) and Schedule Margin tasks should not be associated with work effort.  If the contractor chooses not to load resources into the schedule, the options are:

  1. Associate basic quantities of work with tasks and define them in a code field, transfer those quantities to the EVM cost system, and verify the traceability between the IMS quantities and the associated budgets in the cost system.
  2. Maintain the budgets entirely in the EVM cost system and provide a trace point from the activities in the IMS to the associated budgets in the cost system.  The trace points are usually in the form of control account and work package/planning package code values.

In either case, care must be exercised so that Schedule Visibility Tasks are reviewed and confirmed to ensure that work is not misrepresented to either the contractor or the customer.

11.  Missed Activities

The test: The percentage of activities with baseline finish dates on or before the Data Date that failed to finish by those dates.

The threshold: 5%.

Many people view this as a performance metric.  That is true, but it is also used to review the quality of the baseline.  For example, if a project has a 50% failure rate to date, what level of confidence should the customer have in future progress?  Is the baseline a workable plan to successfully complete the project?  Does the EVM System reflect the same issues as the IMS?  If not, are they correctly and directly connected? These are questions that should be addressed by the contractor before the customer or other oversight entities ask them.

12.  Critical Path Test

The test: Select a task on the program Critical Path and add a large amount of duration to that task, typically 600 days.

The threshold: The end task or milestone should slip by as many days as the delay in the Critical Path task.

This is a test of the integrity of the schedule tool to correctly calculate a Critical Path.  If the end task or milestone does not slip by as many days as the artificial delay, there are structural issues inhibiting the slip.  These issues may be logic links, hard constraints, or other impediments to the ability of the schedule to reflect the slip.  They should be addressed and corrected if the schedule data is to be relied upon to provide meaningful information to management.

13.  Critical Path Length Index (CPLI)

The test: (Critical Path Length + Total Float on the Critical Path) / Critical Path Length.  This formula provides a ratio that puts the Critical Path float in perspective with the Critical Path length.

The threshold: .95 or higher.

If the program is running with zero (0) Total Float on the Program Critical Path, then the ratio is 1.00.  If there is negative float on the Program Critical Path, then the ratio will fall below 1.00 which indicates that the schedule may not be realistic and that project milestones may not be met.
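For example (hypothetical values): a Critical Path of 200 working days carrying −10 days of Total Float gives a CPLI of (200 − 10) / 200 = 0.95, right at the threshold; −20 days of float would drop it to 0.90 and indicate the end milestone is unlikely to be met.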

14.  Baseline Execution Index

The test: The number of completed activities divided by the number of activities that should have been completed based upon the baseline finish dates.

The threshold: .95 or higher.

This check measures the efficiency of the performance to the plan.  As such, some people dismiss this as a simple performance metric, but as in the case of Metric #11 (Missed Activities), it is also a measurement of the realism of the baseline plan.  If the schedule performance is consistently not to the plan, how viable is the plan?  How viable is the EVMS baseline?  How accurate is the information from the baseline that Management is using to make key decisions?  Metrics #11 and #14 may reflect the result of the effort being performed on the contract, but they also represent the quality and realism of the baseline plan.
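For example (hypothetical values): if 180 activities have been completed but 200 should have finished per their baseline dates, the BEI is 180 / 200 = 0.90, below the 0.95 threshold and a signal to question the realism of the remaining plan.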

What are additional metrics that help identify schedule quality issues?

The DCMA’s 14 point schedule assessment should be considered a basic check of a schedule’s health, but it is by no means the only check that should be used to analyze an IMS.  More industry standard checks are identified in other documents, including the Planning and Scheduling Excellence Guide (PASEG) revision 2.0 (6/22/12).  The PASEG is a National Defense Industrial Association (NDIA) product and was developed in cooperation between industry and the Department of Defense.  Section 10.4, Schedule Execution Metrics, discusses in greater detail some of the Health Metrics identified above, as well as other metrics including the Current Execution Index (CEI) and the Total Float Consumption Index (TFCI).

In addition to these metrics, checks should be performed on activity descriptions, activity code field values, risk inputs, Earned Value Techniques, and other tests to assure alignment of the IMS with its partner information systems.  These systems include, but are not limited to, the MRP system, the cost system, program finance systems, and the risk management system.  The IMS is an integral component of a company’s management system; therefore, issues with the IMS data will be reflected in the other components of the EVMS.

All of the above health checks can be performed manually with the use of filters and grouping functions within the scheduling tool; however, they may take too much time and effort to be successfully sustained.  The marketplace has tools available to perform these and other checks within seconds, saving time and cost and allowing schedule analysts and management to devote valuable time to addressing and resolving the issues.  With the aid of these tools, a comprehensive schedule health check can be performed as part of the business rhythm instead of on an occasional, time-available basis.

Summary

Schedule Health Metrics are an important component of the schedule development and maintenance process.  While the DCMA has established some basic standards for schedule health assessments, the 14 metrics should not be considered the only checks, but just the beginning of the schedule quality process.

Schedule checks should be an integral part of the schedule business rhythm and when issues are identified, they should be addressed quickly and effectively. Significant numbers of tasks that trip the metrics, or persistent issues that are not resolved, may require a Root Cause Analysis (RCA) to identify the reasons for the problems and to develop a plan to address them.

Give Humphreys & Associates a call or send us an email if you have any questions about this article. 


Earned Value and Negative Float

Quick.  What do Bankers, Ship Captains and Program Managers have in common?  Answer: They all want to address negative float issues in a timely manner.

While those of us working in program management are not concerned so much with a ship’s ability to stay afloat or financial maneuvers, we should be concerned with earned value and negative float in the schedules.  It is an important warning sign that one or more of the Program’s schedule goals cannot be met with the current plan.

As the quip above suggests, the term ‘negative float’ has different meanings to different people, even within the project management community.  To be precise, the term refers to a property assigned to each task or milestone in the schedule called Total Float, or Total Slack in Microsoft Project.  The values in this property usually represent days and are assigned as a result of a scheduling analysis run.  These numbers can be positive, zero, or negative:

  1. Tasks with positive Total Float values can slip by that number of days before impacting a milestone or the end of the project.
  2. When the task Total Float value is zero, the task cannot slip at all.  Conditions 1 and 2 should be the norm, with all tasks having zero or higher total float values.  If the schedule is well constructed, has realistic task durations, and includes all discrete scope, it indicates the project has a good plan in place to achieve its goals, whether contractual or internal.
  3. When tasks have negative float values, the schedule is sounding an alarm.  Tasks with negative float values indicate probable failure to meet one or more completion goals.  These goals are represented in the schedule as date constraints assigned to tasks or, preferably, milestones.  These date constraints represent necessary delivery deadlines in the schedule, and if the current schedule construct is unable to meet those deadlines, negative float is generated on every task that is linked in that potential failure, as illustrated below.  The more tasks with negative float, and the larger the negative float values on those tasks, the more unrealistic the schedule has become.
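As a simple illustration (hypothetical dates): if a delivery milestone is constrained to finish on 10 June but the network logic cannot complete it until 17 June, every task on that driving path will carry roughly −5 days of Total Float on a five-day work week.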

If the schedule contains tasks with negative float, the first step is to quantify it. This can be performed in the tool using filters or grouping by float values.  Analysis tools, such as Deltek’s FUSE, Steelray or the DCMA’s new Compliance Interpretive Guide (CIG), are used to evaluate contractor delivered data and provide metrical analysis to Auditors prior to a review.  The tolerance threshold in the CIG (current nickname ‘Turbo’), as in all schedule analysis tools, is 0 (zero) percent of tasks with negative float.

Once identified, the next step is to determine the cause of the issue(s).  Because negative float is generated by a date constraint in the schedule, if the end point can be determined, then the predecessors can be identified that are forcing the slip to the end point.  One of the easiest ways to do this is to group the schedule by float and sort by finish date.  This is because most of the string of tasks that push a task/milestone with a delivery date constraint share the same float values; look for those groups of tasks with the same negative float values.
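A minimal sketch of that grouping step, assuming a hypothetical task export:

    from itertools import groupby

    # Group negative-float tasks by their Total Float value, sorted by finish date;
    # tasks sharing the same negative value usually share a driving path.
    tasks = [
        {"id": "FAB-010",  "total_float": -5, "finish": "2025-06-12"},
        {"id": "FAB-020",  "total_float": -5, "finish": "2025-06-14"},
        {"id": "MS-DELIV", "total_float": -5, "finish": "2025-06-17"},  # constrained milestone
        {"id": "TEST-300", "total_float": 12, "finish": "2025-07-01"},
    ]

    negative = sorted((t for t in tasks if t["total_float"] < 0),
                      key=lambda t: (t["total_float"], t["finish"]))
    for tf, group in groupby(negative, key=lambda t: t["total_float"]):
        print(f"Total Float {tf}: {[t['id'] for t in group]}")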

The final step is to take action.  Planners, CAMs and their managers should meet and collaborate to determine the cause and options available to solve the issues.  These meetings should result in a corrective action plan to solve the problem. In general, there are five options available to the program team:

  1. Change durations – if the negative float leading up to a delivery point is low, additional resources assigned to those tasks may help reduce the durations of the activities and relieve the negative float.  It is important to understand that reducing durations just to avoid a bad metric reading for negative float only puts off the issue until the ultimate surprise is delivered: a delay in delivery, and all the pain associated with that delay (penalties, lost award fees, lost business if consistently late, etc.).
  2. Change relationships – perhaps some tasks may be run in parallel instead of in series. A review of all the logic contributing to the negative float condition should be performed and adjustments should be made only if they make sense.
  3. Review date constraints in the Integrated Master Schedule (IMS) – for example, if subcontractors could deliver product earlier, that could also help solve the issue. If waiting for customer-provided equipment or information, perhaps the effort can be accelerated to relieve the stress on the schedule.
  4. Consume Schedule Margin – if there is still negative float leading up to a major contract event or contract completion, and all of the above options have been exhausted, the PM has the option to use a portion of the Schedule Margin to relieve the negative float pressure leading up to the milestone.  If the Schedule Margin is represented by a task bar, this means decrementing the bar’s forecast duration.  If the Schedule Margin is represented as a milestone, the date constraint on that milestone can be changed to a later point in time, but not later than the contractual delivery date assigned to it.
  5. Ask for relief – if, after all of the above steps have been completed, the schedule still has negative float indicating an inability to meet schedule deadlines, it is time to have a discussion with the customer.  It is usually better to have these bad news discussions earlier rather than later, when there is still time to implement work-around or corrective action plans.  The customer has been reading the same schedule and may have helpful suggestions to solve the problems, or could potentially provide contractual relief for the delivery dates.  As a last resort, the contractor can inform the customer and seek concurrence that an Over Target Schedule (OTS)* should be instituted to relieve the schedule condition and allow a more realistic schedule to be developed.  This option of last resort should not be taken until all of the other options have been thoroughly explored.  *See our blog: Is it OTB/OTS Time or Just Address the Variances?

Summary

A schedule is a time-phased plan that defines what work must be done, and when, in order to accomplish the project objectives on time.  Negative float is a condition in the schedule that indicates the project will be unable to meet one or more of its objectives.  It should not be ignored or, worse, marginalized with slapdash tricks to get rid of it, such as deleting relationships or reducing durations to zero.

Instead, negative float should be quantified, analyzed and addressed with a corrective action plan which includes steps and follow-up reviews to ensure adequate remediation of the problem.  It is a zero tolerance metric with most customers and, if not addressed internally, will most likely be identified by the customer for action.

Contact Humphreys & Associates, Inc. with questions or for information on how to set up a corrective action plan for earned value and negative float.


Reviewing Authority Data Call – Not Just a Wish List


Data Call

One of the most important items needed to prepare for an Earned Value Management System (EVMS) review is the data call. This is not just a list of random data; the reviewing authorities have a defined set of data items they want to review so they can evaluate the EVMS implementation and compliance.

Required Artifacts

Over the years the reviewing authorities have fine-tuned the review process and created a very specific list of required artifacts. They use these items to pre-determine the review focus areas so they are prepared to get right to the soft spots in the system and processes.

Formal Review Notification

The process begins when the contractor receives a notification from the reviewing authority that they will conduct a formal review of a project. This could be a Compliance Review (CR); an Integrated Baseline Review (IBR); standard Surveillance; or one of many other reviews conducted to determine the implementation or continued compliance of the EVMS processes and reports. Regardless of the type of review, one of the key items is the data call request. The data call is used to request project information, and could consist of 12 reporting periods, or more, of data. This will vary by agency, type of program, and type of review. In most cases, a minimum of three months of project data will be required; typically, however, 6 to 12 months of data would be requested.

Basic Reports

Some of the basic reports requested are the Contract Performance Reports (CPRs), Integrated Program Management Reports (IPMRs), or similar time phased project performance reports produced from the earned value (EV) cost tool database. The data call request includes the detail source data from the EV cost tool as well as the Integrated Master Schedule (IMS) from the beginning of the program. This source data is often delivered electronically to a customer following the IPMR or Integrated Program Management Data and Analysis Report (IPMDAR) Data Item Description (DID) prescribed data formats. The Baseline Logs are often also requested.

Quality Data

It is essential to provide quality data in response to the Review Authority data call. The entire review process can be derailed when data call items are incomplete or inaccurate. Some of the things to consider are:

  1. Make sure the list of requested items is fully understood (nomenclature differences can cause confusion).
  2. The data should be available in the format required in the call.
  3. Determine the best way to support the data call delivery if it is not specified in the request. The data can be provided using electronic media such as a thumb drive, as attachments to emails (the size of the files may prohibit this), or by establishing a secure access cloud server to store the data for the reviewing authority to retrieve.
  4. Contact the requesting reviewing authority to establish a meeting to discuss the data call. This meeting should be used to resolve or clarify any issues regarding the requested information, negotiate potential equivalents of the project data if it does not exactly match the requested information, and establish a method to transmit all data files.
  5. Develop an internal plan to monitor the progress of data collection. Be sure to have non-project personnel review the data for accuracy and compliance with the specifics in the data call.
  6. Submit the data call to the requesting authority, then follow up with a phone call or meeting to verify the reviewing authority received the data, can open all the files, and agrees the complete set of data has been provided.
  7. Follow up with another call a few weeks before the review to check if the reviewing authority has any issues or problems in evaluating and understanding the data call information. Be willing to work with them until the authority is comfortable with the data.

[NOTE: The number of items on the list depends on (1) the agency conducting the review and on (2) the type of review being conducted. The number of items requested could vary from around 30 to 100 or more.]

Typical Data Call

Some of the basic items typically requested in the data call are:

  1. Earned Value Management System Description including the matrix of the System Description and related system documentation mapped to the 32 guidelines in the EIA-748 Standard for Earned Value Management Systems as well as to the current version of the reviewing agency’s EVMS Cross Reference Checklist.
  2. EVMS related policies, processes, procedures, and desktop instructions. Examples include organizing the work, scheduling, budgeting, work authorization, details about earned value techniques and how each is applied, change control, material planning and control, subcontract management, and risk/opportunity management.
  3. Organization charts down to the Control Account Manager (CAM) level.
  4. Accounting calendar.
  5. Project directives including the Statement of Work (SOW) pertaining to Program Management or Statement of Objectives (SOO), EVM clauses, and EVM Contract Data Requirements List (CDRLs) or Subcontract Data Requirements List (SDRLs).
  6. Work Breakdown Structure (WBS) Index and Dictionary.
  7. Responsibility Assignment Matrix (RAM) including budget detail at the CAM level.
  8. Project and internal work authorization documents.
  9. Integrated Master Plan (IMP) or milestone dictionary.
  10. Contract Budget Base Log, Management Reserve Log, and Undistributed Budget Log.
  11. Risk/opportunity identification and assessments, risk/opportunity management plan.
  12. Cost performance reports (all applicable formats) or datasets. Provide the reports or datasets in the format delivered to the customer, such as PDF, Excel, UN/CEFACT XML, or JSON encoded data per the DID on contract such as the CPR, IPMR, or IPMDAR.
  13. Integrated Master Schedule (IMS) submissions and related native schedule file. This includes the IMS summary report if required.
  14. IMS Data Dictionary.
  15. Most recent Contract Funds Status Report (CFSR) or equivalent funding status report.
  16. Variance Analysis Reports (VARs) or equivalent progress narrative reports as well as the internal and external variance thresholds.
  17. List of subcontractors including value and type (such as cost reimbursable, firm fixed price, time and materials) including the applicable purchase orders. When EVM requirements are flowed down to the subcontractors, provide a copy of subcontractor EVM related contractual requirements (CDRLs and DIDs).
  18. Major subcontractor CPRs, IPMRs, or equivalent cost performance reports (all applicable formats) or IPMDAR datasets.
  19. Major subcontractor IMS submissions.
  20. Previous audit or surveillance findings, resulting reports, corrective action plans, and resolution and tracking Logs.
  21. List of specific software toolsets used for accounting, scheduling, cost management, resource management, risk/opportunity management, or performance analysis.
  22. EVMS Storyboard and flowcharts.
  23. Chart of accounts, including cost element definition.
  24. Staffing plans or weekly/monthly labor reports.
  25. List or copy of contract modifications.
  26. Cost Accounting Standards (CAS) disclosure statement or equivalent internal corporate procedures.
  27. Baseline Change Requests.
  28. Any other data previously provided to the customer as part of a data call.
  29. Basis of Estimates (BOE) or historical data/productivity rates and efficiency factors.
  30. Estimate to Complete (ETC) and Estimate at Completion (EAC) documentation.
  31. Budget reports or control account plans by element of cost (labor hours and dollars, material dollars, and other direct cost dollars) and associated burdens or overhead costs.
  32. Actual cost reports.
  33. Open commitment reports.
  34. Bill of material including cost detail.
  35. Quantifiable Backup Data for percent complete work packages including MRP/ERP Reports for production work packages.

Reacquaint Yourself

The list includes items that are used frequently, as well as items that are used only at specific times during the project and will probably be less familiar to the review team. As the collection of the data call items progresses, be sure to establish quick refresher sessions on the less frequently used documents and any other items where the review team might be having difficulty. As part of the process of gathering the data call items, be sure internal reviews are conducted to verify accuracy and traceability, verify the users of the data are familiar with the data content so they can be prepared to answer questions, and confirm current data are available to the review team.

NOTE: This Data Call List is intended for general guidance in preparation for any agency review (e.g., DCMA, DOE, FAA, etc.). For example, in the past, the DCMA Compliance Review Data Call item list contained 102 specific items, but this number varies from review to review and has changed over the years.  The number is not as important as the quality of the data items that are delivered to the review authority.

First Impressions

The data call items will provide the first look at the project’s EVM data and process for many of the review team members. The review team members will have the data several weeks prior to the on-site review. They will be performing multiple validation checks using various analytical software tools as well as hands-on analysis of the information. If the data is incomplete, contains errors, or does not trace well, the review team will form a more negative opinion of the EVMS application.

Double Check the Data Call

The data analysis results will be a basis for where attention is focused during the on-site review, as they emphasize areas that contain anomalies or indicate a lack of system integrity. Significant emphasis should be devoted to the data call items to ensure accuracy and compliance with the review authority’s requests; delivering a clean, complete package is a very positive way to begin the review.

A Humphreys & Associates EVM specialist is always available to answer questions. Give us a call or send an email.


Common Problems Found in EVMS and Recommended Corrective Actions – Part 5

This is the last of a five-part series regarding common findings discovered in contractors’ Earned Value Management Systems (EVMS) and the recommended corrective actions to mitigate those findings.

The previous articles in this series discussed other common findings; see Common Problems Found in EVMS and Recommended Corrective Actions – Part 4 for the earlier topics.

Part 5 of this series includes:  Inappropriate use of PERT and LOE; Misuse of Management Reserve; Administrative Control Account Managers.

1)  Inappropriate use of PERT and LOE

The Program Evaluation and Review Technique (PERT) earned value method is a simple method for calculating the BCWP, where BCWP = (ACWP / EAC) × BAC.  In this method, the earned value is completely contingent upon cumulative expenditures (ACWP) divided by an estimate of total expenditures.  Because the results of this formula often have little to do with actual progress, its use is limited to non-critical work and is generally applied only to high volume, low dollar fixed price material.  The PERT method should never be used for any critical path task, labor, or high dollar value material.  Guideline 7 of the EIA-748-C Standard requires that an EVMS “Identify physical products, milestones, technical performance goals, or other indicators that will be used to measure performance”.  The primary condition that must be satisfied in a review of earned value techniques is the application of “meaningful indicators” for use in measuring the status of cost and schedule performance.
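A worked example (hypothetical values): with ACWP = $400K, EAC = $1,000K, and BAC = $900K, the PERT method yields BCWP = (400 / 1,000) × 900 = $360K.  The “earned” value here is driven entirely by the rate of spending against the estimate, not by physical progress, which is why the technique is restricted to high volume, low dollar fixed price material.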

Level of effort (LOE) tasks consist of management or sustaining type activities that have no identifiable end products or an established relationship to other measurable effort.  The standard for the control of LOE is documented in Guideline 12, which requires that “Only that effort which is immeasurable or for which measurement is impractical may be classified as level of effort”.  There is no standard threshold for a contract or WBS level that would signify “too much” LOE. However, a common practice during review discussions with the control account managers is to challenge any LOE to assess its appropriateness.  There is always pressure on a contractor to minimize the LOE as the nature of LOE can easily mask or distort the performance of discrete work.

Most Common Corrective Action Plans

The most common response to findings regarding both PERT and LOE is to establish a screening/approval process, with thresholds, during the budgeting process.  For PERT, most Earned Value Management System Description Documents (EVM SDD) will specify the limited use of the technique for high volume, low dollar fixed price material.  Many also take the next step and create a threshold for what is considered “low dollar” and short duration.  This is dependent on the nature of the work, but it is not unusual for an SDD to require that any material item whose extended value (quantity of parts times budgeted unit value) exceeds $10,000 (or some other threshold) must be tracked discretely in the IMS and may not use the PERT method.  Some also establish a duration threshold, such as no greater than 3 months, when there are many parts of low value.  One of the signs that PERT is being used inappropriately is when the variance analysis included in the Integrated Program Management Report (IPMR) or Contract Performance Report (CPR) Format 5 consistently refers to PERT accounts as drivers, or a schedule variance explanation refers to material not being tracked in the IMS.

Level of Effort should be justified on a case-by-case basis.  A common strategy for setting the appropriate level of LOE is to require the Program Manager’s approval on all LOE accounts.  While control accounts may contain a mixture of LOE and Discrete Effort, many organizations establish rules concerning the maximum allowable LOE in a control account to prevent the distortion of status; often a threshold of 20% is established.  Above that threshold, a separate control account would be required for the LOE work.  While it is a goal to have no more LOE than is required, care must be taken not to force discrete measurement on work that is truly LOE.  CAMs have been known to say that they were required to establish a discrete account even though the nature of the work is impractical to measure.  This type of discovery by a review team can also result in a finding for use of an inappropriate earned value technique.
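As a hypothetical illustration of the 20% rule: a control account with a 1,000-hour BAC that includes 300 hours of LOE is 30% LOE; under that rule, the LOE would be moved to a separate control account so it could not mask the discrete performance.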

2)  Misuse of Management Reserve

Management Reserve (MR) is a portion of the overall contract budget held for management control purposes and unplanned events that are within the scope of the contract.  H&A has often heard in the course of our consulting or training that there are few rules regarding MR, and the restrictions are unclear.  This is not quite accurate.  The Integrated Program Management Report Data Item Description (IPMR DID, DI-MGMT-81861) lists four restrictions on the use of MR:

  • MR shall not be used to offset cost variances.
  • MR shall never be a negative value.
  • If MR includes the contractor and subcontractor amounts together, the breakout shall be discussed in Format 5.
  • Amounts from MR applied to WBS elements during the reporting period shall be listed in Block 6.b of Format 3 and explained in Format 5.
    • Format 5:  Identify the sources and uses of MR changes during the reporting period.  Identify the WBS elements to which MR was applied and the reasons for its application.

The EIA-748-C Standard adds a few more caveats for MR:

  • Held for unexpected growth within the currently authorized work scope, rate changes, risk and opportunity handling, and other program unknowns.
  • May be held at the total program level or distributed and controlled at lower management levels.
  • Held for current and future needs and is not used to offset accumulated overruns or under runs.
  • Is not a contingency that can be eliminated from prices during subsequent negotiations or used to absorb the cost of program changes.
  • Must not be viewed by a customer as a source of funding for added work scope. It is especially important to understand that MR is a budget item, not a source of funds for added work scope.

In addition, the Defense Acquisition University Evaluation Guide (EIA 748 Guideline Attributes & Verification Data Traces) requires that the internal MR Log be reconciled with the IPMR (or CPR).

The specific nature of the above requirements reflects the types of abuse experienced with the MR budgets.  The EVM SDD should establish specific organizational guidelines for the application of MR; however, those rules and the organization’s practices must fall within the bounds established by the EIA-748-C and the appropriate Data Item Description (DID).

The discrepancies found in recent reviews take two primary forms: inappropriate application of MR in the current accounting period and poor reporting and discussion involving MR use in the IPMR/CPR Formats 3 and 5.  With the exception of rate or process changes, all applications of MR must be made in association with additional work scope authorized to control accounts.  Because of the prohibition in the requirements regarding the offset of overruns or underruns, applications in the current period can be far more suspicious than in future periods.  Care must be taken to fully justify the timing and use of MR in terms of additional scope.  All applications of MR must have a full accounting in Formats 3 and 5 of the IPMR (or CPR).

Most Common Corrective Action Plans

Response to these discrepancies is usually a matter of policy, training, and discipline.  The EVM SDD must contain a policy that enforces the rules in the guidance documents mentioned above, as well as document who has the authority for approval of MR application (generally the Program Manager).  Those responsible for authorization of MR must be familiar with the approval policies, and their support staff must ensure there is complete visibility in the customer reporting and reconciliation to the program logs.  One of the more common corrective actions is to structure the IPMR (or CPR) Format 5 so that reporting of MR transactions meets the intent of the guidance cited above.

3)  Administrative Control Account Managers (CAMs)

The role of the CAM can change across organizations, and there is no standard set of criteria that defines the CAM’s duties.  The National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD) Earned Value Management Systems Intent Guide states that “The control account manager is responsible for ensuring the accomplishment of work in his or her control account and is the focal point for management control”.  The requirements for management control can be defined as three essential attributes of the CAM role: responsibility, authority, and accountability.  These attributes imply ownership of the technical, schedule, and cost aspects of the scope authorized to a Control Account Manager (CAM).  It is not always the case that the CAM must be the technical expert over the scope of the control account; sometimes that is the role of the “performing organization” rather than the “responsible organization”.

Primarily through discussion with the CAMs, it is easy to assess whether they fail to demonstrate any or all of the three essential attributes above.  Because being a CAM brings a set of duties over and above those of a technical manager or engineer, the CAM’s role is often not a welcome addition, and some organizations hand it over to employees who have little knowledge of, or responsibility for, the effort.  A comment on a Corrective Action Request (CAR) for an organization that was employing Administrative CAMs was “CAM does not stand for ‘Control Account Monitor’”.  The role of the CAM does include necessary administrative responsibilities, such as reporting status, maintaining records, and developing analysis for the control accounts.  However, these cannot be the only functions the CAM performs.

A CAM should be active in the development of the control account plans in the IMS and EVM Systems, including being the primary architect for defining the tasks, the schedule logic, and the adequacy of the budget.  The CAM should be the primary contact for the Program Manager regarding the control account, including risk management and corrective action planning.  The CAM should also have the authority to assign and coordinate work performed by other organizations.  The CAM should have enough knowledge of the scope and the executing environment to develop a realistic forecast of costs beyond mathematical extrapolation.  If control accounts contain subcontracted work, the CAM is also responsible for management of that subcontractor effort.

Most Common Corrective Action Plans

Choosing the right personnel to fulfill the requirements of the CAM role can be difficult.  One of the first considerations is the appropriateness of the organization that is given the responsibility for management.  An example is material control accounts, where the organization responsible during a program’s development phase may not be appropriate for the production phase.  It is very important for the contractor, when submitting the corrective action plan (CAP), to treat the job of the CAM as critical to project success, and not one relegated to people in the organization who do not have responsibility, authority, or accountability.

The Corrective Action Plan should also include a list of essential CAM attributes and be very clear on the responsibilities and authority of the CAM role.  Companies should make a commitment to ensure that the position is considered critical and not just created to fulfill the requirements of earned value.  This can be demonstrated not only by choosing the right individuals to perform the functions, but also by providing the necessary resources, training, and support to function successfully.

This completes our 5 part series. Thank you for your readership.

If you have any questions or would like to inquire about our services, please feel free to contact us.

