
Including Level of Effort (LOE) in the Integrated Master Schedule (IMS)


A recent H&A blog titled “Level of Effort (LOE) Best Practice Tips” discussed different approaches for handling LOE to avoid generating false variances. That discussion did not elaborate on including LOE tasks in the integrated master schedule (IMS). This blog is a follow-on to that earlier discussion, focusing on options for including LOE in the IMS along with notes on best practices, tips, and customer expectations.

In the general sense of an earned value management system (EVMS), the LOE scope of work is contained in summary level planning packages (SLPPs) or in control accounts as subordinate planning packages or work packages. The budget values for those elements will most likely come from a resource loaded IMS or a resource loading mechanism aligned with the IMS. Not all organizations resource load the IMS activities; some instead extract time buckets from the IMS for resource loading using other mechanisms. Resource loading the IMS activities is the recommended practice because it ensures cost/schedule integration, but it can be difficult.

LOE work might not appear in the IMS because some customers, such as the Department of Defense (DoD), consider its inclusion optional. The Department of Energy (DOE) requires LOE tasks to be included, so you can expect LOE to be in the IMS when DOE is the customer.

Before we talk about LOE in the IMS, we must think about the type of work the LOE tasks represent. An LOE task might be a general task such as “Control Account Management” that is not directly related to other work except perhaps in the time frame in which it occurs. But some LOE tasks, such as support tasks, are related to other discrete work. Modeling the LOE in the IMS starts by understanding what type of effort is involved, which can help determine the approach for linking activities.

LOE Best Practice Tips Related to the IMS

The Level of Effort (LOE) Best Practice Tips blog included these points related to the IMS:

  • “When LOE activities are included in the schedule, they should not drive the date calculations of discrete activities in the integrated master schedule (IMS). They should also not appear on the critical path.”
  • “LOE must be segregated from discrete work effort. In practice, this means a work package can only be assigned a single earned value method.”
  • “Consider shorter durations for the LOE when that LOE is supporting discrete effort. Should the first occurrence of the LOE trigger a data anomaly test metric, it can be proactively handled along with any future replanning. The remaining LOE would already be in one or more separate work packages so there won’t be any criticism for changing open work packages.”

Government Agency and Industry Guidance on LOE on the IMS

Is there any guidance that can help clarify how best to handle LOE tasks in the IMS? Let’s take a look at three of the guidance documents that may be useful for your environment.

  1. The Integrated Program Management Data and Analysis Report (IPMDAR) Data Item Description (DID), DI-MGMT-81861C (August 2021). This DID is typically placed on contracts with the DoD or NASA that exceed the contract value threshold for EVM reporting or EVMS compliance. Relevant mentions of the data requirements for the IMS in the DID are as follows.

“2.4.1.1 Content. The Schedule consists of horizontally and vertically integrated discrete tasks/activities, consistent with all authorized work, and relationships necessary for successful contract completion.”

Note: This is where the option to exclude LOE from the IMS appears since this requires only discrete tasks/activities. The following sections provide additional guidance when LOE is included in the IMS.

“2.4.2.7 Level of Effort (LOE) Identification. If tasks/activities within an LOE work package are included in the Schedule, clearly identify them.”

“2.4.2.9 Earned Value Technique (EVT). Identify the EVT (e.g., apportioned effort, level of effort, milestone).”

  2. National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD) Planning and Scheduling Excellence Guide (PASEG) (Version 5.0). The PASEG is a widely recognized industry guide on scheduling best practices in government contracting environments. Section 5.8, Level of Effort (LOE), provides a discussion on the topic including things to promote and things to avoid. Excerpts from the PASEG follow.

“There are pros and cons around including or excluding LOE tasks in the IMS. Including LOE tasks in the IMS allows for a more inclusive total program look at resource distribution, which aids in the maintenance and analysis of program resource distribution. However, if modeled incorrectly, including LOE tasking in the IMS can cause inaccurate total float and critical path calculations.”

“Tasks planned as LOE in the IMS should be easily and accurately identifiable. This includes populating the appropriate Earned Value Technique field (as applicable) and possibly even identifying the task as LOE in the task description.”

“Consider adding an LOE Completion Milestone to tie all LOE tasking to the end of the program.”

“LOE tasks should not be networked so that they impact discrete tasks. Incorrect logic application on LOE can lead to invalid impacts to the program critical path.”

“Level of Effort tasks should have no discrete successors and should therefore never appear on critical/driving paths.”

  3. DOE Guide 413.3-24 Planning and Scheduling (April 2022). This document provides guidance for acceptable practices in a DOE contractual environment. The discussion on LOE can be found in Section 7 Planning and Scheduling Special Topics, 7.2 Level of Effort, and 7.3 Inclusion of Level of Effort in the Integrated Master Schedule. Excerpts and image from the Guide follow.

“Overview: Activity-based methods either cannot, or impracticably can measure the performance of LOE WPs and activities. Include all activities, both discrete and LOE, in the IMS.”

“LOE is planned in the IMS so that it does not impact discrete work. Figure 6 shows the recommended linkages in the IMS for planning level of effort.”

Interpreting this DOE Guide diagram for the recommended modeling of LOE in the IMS, notice the inclusion of an “LOE Complete” milestone following the Critical Decision (CD) 4 milestone, with no constraint. CD4 in this diagram represents the end of contract effort. The purpose of this LOE Complete milestone, with no constraint, is to provide a successor for all LOE tasks where one is needed. That prevents generating issues where tasks have no successors.

This recommended modeling is done so that the LOE tasks are not linked to the end of the contract work and thus will not push it. The LOE tasks will also not appear on the critical path since they are not in the path that established the end date.

Also note that the LOE tasks in green are linked as successors to discrete work, a logic linking approach intended to keep the LOE work aligned with the discrete work but off the critical path. Study the logic and you will see that a movement to the right of a discrete task will drag along its related LOE task.

DOE requires the use of Primavera schedule tools so the relationships shown here can be accomplished in that tool. That may not be true of all tools. Know how your tools work before you generate any guidance.

Additional Relevant Guidance Search

H&A earned value consultants recently conducted a survey of the various government and non-government documents regarding the IMS and collected relevant guidance related to LOE among other things. The table below lists the results from a search for “LOE” wording. Note: this is a representative sample of typical government agency and industry IMS references. You should verify current references before you generate your own internal IMS guidance.

Source Document | Guidance for Capturing All Activities, LOE in IMS

DCMA EVMS Compliance Metrics (DECM) Checks (version 6.0)
  • 06A210a: Do LOE tasks/activities have discrete successors? (0% threshold)
  • 12A101a: Are the contractor’s Level of Effort (LOE) WPs supportive in nature and/or do not produce technical content leading to an end item or product? (≤ 15% threshold)
  • 12A301a: Does the time-phasing of LOE WP budgets properly reflect when the work will be accomplished? (≤ 10% threshold)

IPMDAR DID DI-MGMT-81861C
  • If tasks/activities within an LOE work package are included in the Schedule, clearly identify them.

DOE Guide 413.3-24 Planning and Scheduling, Appendix A Schedule Assessment Principles
  • Principle 20. No LOE on critical path.

GAO Schedule Assessment Guide: Best Practices for Project Schedules (December 2015). Selected excerpts:
  • LOE activities should be clearly marked in the schedule and should never appear on a critical path.
  • LOE activities … derive their durations from other discrete work.
  • Best Practices for confirming the critical path is valid: Does not include LOE activities, summary activities, or other unusually long activities, except for future planning packages.

NDIA IPMD PASEG (version 5.0) (as noted above)
  • Tasks planned as LOE in the IMS should be easily and accurately identifiable.
  • LOE tasks should not be networked so that they impact discrete tasks.
  • Level of effort tasks should have no discrete successors and should therefore never appear on critical/driving paths.

PMI Practice Standard for Scheduling (Second Edition)
  • Since an LOE activity is not itself a work item directly associated with accomplishing the final project product, service, or results, but rather one that supports such work, its duration is based on the duration of the discrete work activities that it is supporting.
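The recurring rule across these sources, that LOE tasks must never have discrete successors, is easy to automate as a schedule health check. The sketch below is an illustrative example in the spirit of DECM check 06A210a, not an official implementation; the task structure and field names are assumptions for this example.

```python
# Illustrative sketch (not an official DECM implementation): flag LOE tasks
# that have at least one discrete successor, in the spirit of check 06A210a.
# The task dictionary layout and field names are assumptions for this example.

def loe_with_discrete_successors(tasks):
    """Return IDs of LOE tasks that have at least one discrete successor."""
    evt = {t["id"]: t["evt"] for t in tasks}  # earned value technique per task
    flagged = []
    for t in tasks:
        if t["evt"] != "LOE":
            continue
        if any(evt.get(s) == "Discrete" for s in t["successors"]):
            flagged.append(t["id"])
    return flagged

# Hypothetical mini-network: L100 violates the rule; L200 correctly ties
# to an LOE Complete milestone instead of driving discrete work.
schedule = [
    {"id": "A100", "evt": "Discrete", "successors": ["A200"]},
    {"id": "L100", "evt": "LOE", "successors": ["A200"]},    # violation
    {"id": "A200", "evt": "Discrete", "successors": []},
    {"id": "L200", "evt": "LOE", "successors": ["LOE_DONE"]},
    {"id": "LOE_DONE", "evt": "LOE", "successors": []},      # LOE Complete milestone
]

print(loe_with_discrete_successors(schedule))  # ['L100']
```

A scheduler could run a check like this on a task export before each status cycle, rather than waiting for a customer surveillance finding.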

Conclusion

Based on the various sources of guidance, it is possible to structure the IMS to include LOE in a way that provides cost/schedule integration and keeps all work correctly aligned yet does not cause issues with the critical path and the driving paths. From this guidance, it should be a straightforward effort to generate your own internal scheduling procedure defining how to handle LOE in the IMS if you choose to include it or if you are required to include it.

Need help producing a clear and concise scheduling procedure or tool specific work instructions? H&A earned value consultants and scheduling subject matter experts have worked with numerous clients to create easy to follow guides that help to ensure schedulers are following your company’s best practices using the scheduling tools of choice. Call us today at (714) 685-1730 to get started. 


Improving Integrated Master Schedule (IMS) Task Duration Estimates

, , , ,
Improving Integrated Master Schedule (IMS) Task Duration Estimates

One of the top reasons projects fail is poor task duration estimating for the integrated master schedule (IMS). Without accurate and consistent estimates, project outcomes become unpredictable, leading to missed deadlines, budget overruns, and overall project failure. A realistic schedule is required to place the necessary resources in the correct timeframe to adequately budget the work, as well as to produce credible estimates to complete and forecast completion dates. While missed deadlines and budget overruns are detrimental for any project, there can be additional business ramifications when producing schedules in an Earned Value Management System (EVMS) contractual environment.

While there are effective methods available to improve task duration estimates, they are often underutilized. A common reason for this oversight is the lack of time allocated to developing the project schedule and determining task durations.

During the proposal phase, initial durations are typically estimated at a more summary level than the detailed execution phase. The proposed work is often defined at a level one to two steps higher than where the actual tasks will be performed. After project initiation, the team’s initial effort is to break the work down into more manageable tasks. This decomposition is crucial for achieving more accurate estimates. It’s no surprise, then, that the initial breakdown efforts often result in duration estimates that don’t align with the proposed durations.

Parkinson’s Law tells us that work expands to fill the time available. If task durations are excessively long, costs will inevitably rise. To counter this, it’s important to require estimators to provide both the estimated effort and the duration needed to accomplish the task. This approach helps to gain a better understanding of the scope of the task and to avoid unrealistic estimates. If you see a task that requires 10,000 hours with a duration of 2 weeks, then you immediately would suspect something is wrong with the estimates.

Techniques for Developing More Accurate Task Duration Estimates

What are your options? H&A earned value consultants and senior master schedulers often employ the following techniques to help a client produce a more realistic IMS.

  1. Establish a Probability Goal. It is essential to set clear expectations for the estimating team. Without guidance, teams may default to estimates with a 50/50 probability of success, which is a recipe for failure. Instead, directing the team to aim for estimates within a 75% to 80% probability range can lead to better outcomes.
  2. Break Down Tasks. Decompose tasks into smaller, more manageable components. The further out the task’s horizon, the greater the variability in estimates. For example, asking someone to estimate the drive time from Washington, DC, to Boston without specifying the vehicle, route, limitations, or conditions introduces unnecessary uncertainty.
  3. Use Professional Judgment. Engage someone with experience in the specific type of work required for the task. A seasoned expert will provide more accurate duration estimates based on their knowledge and experience. Often, we ask the potential task manager to do the estimate, but that person may not be the one with the most related experience or knowledge about the work.
  4. Leverage Historical Data. If the task or a similar one has been done before, use that historical data to inform the estimate. This approach provides a realistic benchmark for future estimates.
  5. Use Generative AI. If you have access to an AI capability along with access to historical data, that could be an option to leverage the source data using specific prompts to glean relevant information. As with all AI tools, always verify the generated results to ensure they are a useful basis to substantiate the estimate.
  6. Apply Parametric Estimating. When possible, use parametric analysis to estimate the durations. For example, if it took a specific number of days to clean up a certain amount of toxic waste under similar conditions, this data can be used to estimate the duration of a new but comparable task.
  7. Engage Multiple Estimators. Gathering estimates from more than one person helps to reduce individual biases and provides a more rounded estimate.
  8. Apply the Delphi Method. This technique involves three knowledgeable individuals providing estimates or three-point estimates. The initial estimates are analyzed, and the results are shared with the estimators without attributing specific values to any individual. After discussing the findings, the estimators revise their estimates based on the collective insights, leading to a more refined and accurate duration estimate.
  9. Use Three-Point Estimates. Ask estimators to provide best-case (BC), most likely (ML), and worst-case (WC) durations, along with their reasoning. Applying a formula like the Program Evaluation and Review Technique (PERT) duration formula (1BC+4ML+1WC)/6 can yield an adjusted and realistic estimate. You can weight the best-case and worst-case estimates for risk if you have information to support it.

    To see how this simple approach can work, walk through this exercise. Ask yourself how long it takes you to drive to work most of the time. Let’s say the answer is 45 minutes. Then ask yourself how long it would take on a Sunday morning in the summer when the roads were dry (the best case). Let’s say your answer is 25 minutes. Then ask yourself how long it would take on a Monday morning in the winter during a moderate snow event (the worst case). You tell yourself 90 minutes. Now you have enough information to calculate the PERT duration.

    Best Case = 25 minutes
    Most likely = 45 minutes
    Worst Case = 90 minutes
    PERT Duration = (25 + 180 + 90)/6 = 49 minutes

    Finally, let’s say you ask yourself how likely it is that you end up on the high side instead of the low side. If your answer is it is much more likely to encounter conditions that slow you down, you would modify the formula to use one and a half times the worst case (25 + 180 + 135)/6 = 57 minutes. That longer duration shows the impact of your impromptu risk analysis and provides a duration that has a much higher probability of being achievable.

    Now think about the same scenario but conducted by you interviewing three people who drive the same route to work. That would approximate the Delphi method.
  10. All or Something Less. It may not be necessary to analyze every task to the degree suggested. Even analyzing only the tasks along the top several critical paths would be an improvement. Applying numerical factors to the tasks in related portions of the project, for example, all mechanical design tasks or all software development tasks, would also be impactful.
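The PERT arithmetic in the drive-to-work walkthrough above can be captured in a few lines. This is a minimal sketch of the formula as described in the text, with an optional worst-case weighting to reflect the impromptu risk adjustment; the function name is our own.

```python
# Minimal sketch of the PERT three-point duration formula from the text:
# (1*BC + 4*ML + 1*WC) / 6, with an optional worst-case risk weighting.

def pert_duration(best, most_likely, worst, worst_weight=1.0):
    """Weighted PERT duration; worst_weight > 1 skews toward the worst case."""
    return (best + 4 * most_likely + worst_weight * worst) / 6

# Drive-to-work example from the text (minutes)
print(round(pert_duration(25, 45, 90)))       # 49
print(round(pert_duration(25, 45, 90, 1.5)))  # 57
```

The same function applied across a task list, or fed with three estimates per the Delphi method, gives a quick way to compare raw single-point durations against risk-adjusted ones.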

What is the best approach?

You will need to analyze your project and determine which approach or approaches would yield useful information at a reasonable cost. If you apply your own thinking on how to improve your duration estimates, you will undoubtedly find a method most suitable for your situation. Depending on a project’s complexity and risk factors, you may also find it useful to take a more formal approach. Conducting a schedule risk assessment (SRA), a probabilistic assessment of a project’s outcome, can help you gain a better understanding of where the duration risk exists in the schedule.

H&A earned value consultants and scheduling subject matter experts often assist clients to establish basic guidance to help scheduling personnel to get into the habit of adequately defining tasks and using techniques to improve duration estimates. This is critical to be able to produce well-constructed and executable schedules to improve the likelihood of achieving project technical, schedule, and cost objectives.

H&A offers a range of project scheduling training workshops that can help schedulers to implement industry best practices. These workshops also cover how to take the next step to implement advanced scheduling techniques such as schedule risk assessments to ensure the schedule is realistic and achievable. H&A earned value consultants and master schedulers often provide one-on-one mentoring using the scheduling tool of choice to help scheduling personnel work through the learning curve of using advanced network scheduling techniques to produce executable schedules.  

Call us today at (714) 685-1730 to get started.


Management Reserve Best Practice Tips


A recurring theme H&A earned value consultants find themselves discussing with clients is emphasizing that management reserve (MR) is a very precious budget set aside that must be protected and used appropriately. Unfortunately, MR is often used inappropriately, and quickly depleted in the early stages of a project.

What happens when MR is consumed for other uses than what it was intended? There is no budget available for appropriate uses of MR such as for emerging work, rework, redesign, or make/buy adjustments within the scope of the contract when it is needed in the latter stages of a project. When that happens, a project manager is forced to create a “home” for actual costs for these activities. This results in other inadvisable actions such as:

  • Zero-budget work packages, also known as estimate to complete (ETC) only work packages.
  • De-earning the budgeted cost for work performed (BCWP) and opening completed work packages to accept charges.
  • Culling budgets from future unopened work packages and, if they exist, planning packages, summary level planning packages (SLPPs), and undistributed budget (UB).

These actions will call into question the integrity of the EVMS and EVM data. The customer conducting EVMS surveillance will also be quick to point out this deficiency in the EVMS implementation and raise the issue to ensure it has management’s attention to correct. The inappropriate use of MR has created a cascade of problems that could have been avoided. In some instances, project personnel were simply not following the rules for the use of MR found in the contractor’s EVM System Description. That’s an easier problem to resolve than other root causes.

The Role of Risk and Opportunity Management in Establishing MR

What H&A earned value consultants often uncover as the root cause of inappropriate uses of MR is the lack of a robust risk and opportunity (R&O) management process, which would have made a difference in establishing a quantified set aside for MR to handle realized risks. Proactively identifying and managing risks improves project performance. The expectation of specific risks occurring leads to risk handling plans that lower the likelihood and impact of those risks. It also provides an informed basis to establish an adequate amount of MR that reflects identified and assessed risks.

The risk assessment provides additional information that assists a project manager’s decision making process to validate that a request to use MR is appropriate, and it supplies the backup data needed to justify the use of MR and the amount of MR allocated. This detail is necessary for the baseline change request (BCR) approval process as well as the Integrated Program Management Report (IPMR) Format 5 or Integrated Program Management Data and Analysis Report (IPMDAR) Performance Narrative Report (PNR). A project manager is required to identify the changes to MR during the reporting period and provide a brief explanation of the change. This explanation has the potential to pique the interest of the customer to gain a better understanding of why MR was used and the potential impact to the integrity of the EVM data.

Note: MR may increase or decrease for a variety of reasons. The primary use of MR is to handle realized risks within a control account that is within the statement of work (SOW) for the contract. All MR debits or credits should be tracked in a log for full traceability for the entire life of the project. Remember that MR can never be a negative value.
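The two rules in the note above, full life-of-project traceability and a balance that can never go negative, lend themselves to a simple transaction log. The sketch below is a hedged illustration; the class and field names are our own and do not come from any EVM System Description.

```python
# Hedged sketch of an MR transaction log enforcing the rules described above:
# every debit/credit is recorded for traceability, and MR can never go negative.
# Class and field names are illustrative, not from any EVM System Description.

class ManagementReserveLog:
    def __init__(self, initial_mr):
        self.balance = initial_mr
        self.entries = []  # full life-of-project traceability

    def record(self, amount, bcr_id, reason):
        """Positive amount = credit to MR, negative = debit from MR."""
        if self.balance + amount < 0:
            raise ValueError("MR can never be negative")
        self.balance += amount
        self.entries.append({"bcr": bcr_id, "amount": amount, "reason": reason})
        return self.balance

log = ManagementReserveLog(500_000)
log.record(-75_000, "BCR-014", "Realized risk: fourth test required")
log.record(+20_000, "BCR-021", "Make/buy adjustment credit")
print(log.balance)  # 445000
```

Tying each entry to a BCR identifier, as sketched here, is what makes the log auditable during customer surveillance.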

Acceptable Uses of MR

As highlighted in an H&A article titled “The Effective Use of Management Reserve,” examples of the appropriate uses of MR include:

  • Newly identified work is authorized and assigned to a control account manager (CAM). It may be that once the work begins, one or more tasks that were missed in the original planning process now need to be scheduled and resource loaded. Newly identified work could also be the result of internal replanning that required a change in approach or resource requirements.

    An example of this could be a project manager issued a work authorization to a CAM to conduct three tests to meet the requirements in the contract SOW. In the middle of the first test, it becomes clear to the CAM and project manager that a fourth test will be necessary. The project manager and CAM should be aware of this potential risk and be prepared to implement their risk handling strategy as a result of the R&O management process. The CAM can quickly prepare a BCR that the project manager can immediately approve to allocate MR budget to complete the fourth test. 
  • It is necessary to redo a task. This may include unanticipated redesign, remake, or retest. Hopefully, the project’s risk register identified the potential risks associated with the original tasks and management was prepared for the realized risk. 
  • Make/buy adjustments.  This could result in an MR debit or credit. 
  • Statement of work transfers from one organization to another. This could result in an MR debit or credit. 

Inadvisable Uses of MR Commonly Allowed

Although it is often allowed in a contractor’s EVM System Description, it is inadvisable to use MR for direct and indirect rate changes in the future. Note: MR should never be used to make any rate adjustments (or any other adjustments) to historical budgeted cost for work scheduled (BCWS) or BCWP data.

A rate change is not a change to the SOW for a CAM. It is merely a change to the cost of that work. Cost variances that occur because of direct and/or indirect rate changes can easily be explained in a Variance Analysis Report (VAR). Ironically, this use of MR is typically treated as a one-way street. Contractors apply MR when the direct and/or indirect rates are going up in the future but do not return to MR when the rates are projected to go down.

When a contractor’s EVM System Description allows MR to be used for future direct and/or indirect rate changes, ideally, the likely rate changes are identified as a risk and quantified when the initial MR is established for a project. This requirement should be noted in the EVM System Description. That way the set aside for MR includes budget for corporate rate adjustments that are outside of the control of the project manager or CAM. 

Another example of a commonly allowed but inadvisable use of MR is to “true up” a purchase order (PO) that is in excess of the original budget at completion (BAC) for material, equipment, or purchased services. For example, a project manager issues a work authorization to a CAM that includes purchasing material, equipment, or services from a supplier. The CAM then reaches an agreement with a supplier on scope, schedule, and budget. If that agreement is greater or less than the BAC, MR should not be applied, nor should budget be returned to MR, to make the BAC match the PO value. Assuming the scope does not change, MR should not be used to wipe out a cost variance, whether positive or negative. The cost variance can be easily explained and the estimate at completion (EAC) can be increased or decreased. This is another example where contractors treat MR as a one-way street; they apply MR when the PO value goes up but do not return budget to MR when it goes down. A contractor would not “true up” for internal work overruns/underruns, so why “true up” for material or services provided by a supplier?

Best Practice Tips

The following is a short list of best practices H&A earned value consultants often recommend clients implement for managing MR.

  • The EVM System Description should clearly spell out the appropriate and inappropriate uses of MR. It should also provide guidance to eliminate instances of the “one-way street” debit from MR. If needed, provide supplemental procedures, decision trees, or other work instructions to help project personnel follow EVM best practices and preserve MR for handling realized risks, which typically occur in the latter stages of a project.
  • Ensure that the R&O management process is integrated with the EVMS and provides the necessary risk identification and assessment information for the project manager to establish a realistic MR set aside based on quantifiable information. Where applicable, ensure likely rate changes are captured as a potential risk to the project and considered when the initial MR for the project is established, if the project intends to use MR for rate changes in the future.
  • Conduct recurring training to reinforce the purpose for MR and the appropriate use of MR. A recommended approach is to discuss a variety of use cases with project personnel so they know how to handle various situations that may occur on a project. 

Have you noticed “creative” uses of MR that are contrary to EVM best practices? Hopefully, you identified those situations as part of your EVMS self-governance process and were able to quickly implement corrective actions before your customer pointed out the issue to you. H&A earned value consultants often assist clients with producing procedures or work instructions that clearly spell out how to use MR appropriately. We also offer a range of EVMS training to reinforce EVM best practices including the appropriate use of MR. Call us today to get started.


Introduction to the Cost and Software Data Reporting (CSDR) Reporting Requirements


A common client request is to assist them with sorting through the various DoD contractual reporting requirements and contract value reporting thresholds that apply. We frequently run into situations where a contractor needs clarification on why they have a Cost and Software Data Reporting (CSDR) requirement and whether they should seek to waive the requirement. Subcontractors to a prime often question the requirement to provide actual cost data directly to the DoD, especially for Firm Fixed Price (FFP) contracts.

Background

CSDRs are the primary means the DoD uses to collect data on the development, production, and sustainment costs incurred by contractors performing DoD acquisition contracts. It is a DoD system for collecting actual costs, software data, and related business data. The resulting data repository serves as the primary source for contract cost and software data for most DoD resource analysis efforts including cost database development, applied cost estimating, cost research, program reviews, analysis of alternatives (AoAs), and life cycle cost estimates.

CSDR reporting requirements are determined by the contract value regardless of the acquisition phase and contract type. In general, CSDR reporting is required for Acquisition Category I-II programs and Information System (IS) programs valued at more than $50M. They can also be required for Middle Tier Acquisition programs (greater than $20M) and other programs (greater than $100M). Risk can also be a determining factor regardless of the contract value.

DoD Instruction (DoDI) 5000.73, Cost Analysis Guidance and Procedures (March 2020), provides additional details about cost data reporting. Table 1 in DoDI 5000.73 lists the cost reporting contract value thresholds. DoD Manual 5000.04, Cost and Software Data Reporting (May 2021), is the primary requirements document for the development, implementation, and operation of the DoD CSDR system to ensure the data reported is accurate and consistent.

About CADE

The Office of the Secretary of Defense Cost Assessment and Program Evaluation (OSD CAPE) established the Cost Assessment Data Enterprise (CADE), a secure web-based information system that hosts the controlled unclassified CSDR repository, the Defense Acquisition Cost Information Management System, and the forward pricing rate library. CADE also contains a selected acquisition report database, a contracts database, data analytics capabilities, and a library containing cost estimating content such as cost analysis requirement descriptions and cost estimates. CADE is access-controlled, and available through the public-facing CADE Portal website.

Similar to the cost estimating and proposal pricing functions within contractor’s organizations that rely on historical actual costs to assess the validity of a proposed cost estimate, independent and sound cost estimates are vital for effective DoD acquisition decision making and oversight. CADE plays a critical role in capturing the expenditure, technical, and programmatic data after contract execution in a consistent manner to enable independent cost estimating and analysis. This cost estimate data is essential to support efficient and effective resource allocation decisions throughout the planning, programming, budgeting, and execution process for the DoD.

CSDR Reporting Requirements

There are a series of Data Item Descriptions (DIDs) for this reporting requirement. Some forms are submitted electronically using DoD defined XML schemas, Excel, or JSON encoded data in accordance with a File Format Specification (FFS) and Data Exchange Instruction (DEI). The DIDs, which can be downloaded from the CADE website, are listed below.

  • Contract Work Breakdown Structure, DI-MGMT-81334D (May 2011).
  • Cost Data Summary Report, DI-FNCL-81565C (May 2011), DD Form 1921, XML Schema.
  • Functional Cost-Hour Report, DI-FNCL-81566C (September 2015), DD Form 1921-1, XML Schema.
  • Progress Curve Report, DI-FNCL-81567C (May 2011), DD Form 1921-2, XML Schema. 
  • Sustainment Functional Cost-Hour Report, DI-FNCL-81992 (May 2011), DD Form 1921-5, XML Schema.
  • Contractor Business Data Report, DI-FNCL-81765C (March 2021), DD Form 1921-3, Excel. 
  • Software Development Report, DI-MGMT-82035A (October 2022), DD Form 3026-1, XML Schema.
  • Software Maintenance Report, DI-MGMT-82035A (October 2022), DD Form 3026-2, XML Schema.
  • Enterprise Resource Planning (ERP) Software Development Report, DI-MGMT-82035A (October 2022), DD Form 3026-3, XML Schema.
  • Cost and Hour Report (FlexFile), DI-FNCL-82162 (November 2017), JSON encoded data file following FFS and DEI.
  • Quantity Data Report, DI-MGMT-82164 (November 2017), JSON encoded data file following FFS and DEI.
  • Maintenance and Repair Parts Data Report, DI-MGMT-82163 (November 2017), Excel.
  • Technical Data Report, DI-MGMT-82165 (November 2017), Excel.

The Cost and Hour Report (FlexFile) and Quantity Data Report play a critical role in collecting cost data from contractors for the DoD data repository. They are intended to replace the legacy 1921 series of paper-based formats including the DD 1921, 1921-1, 1921-2, and 1921-5, using JSON data encoding to organize the content. The FlexFile also requires contractors to provide significantly more historical cost data than the 1921 formats, giving the DoD cost estimating community additional insight into historical costs. The goal is to establish a common framework and standard nomenclature to collect data from different contractors, each with unique cost accounting structures, mapped to the DID, FFS, and DEI requirements for use in the data repository.
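
To make the JSON encoding concrete, here is a minimal Python sketch that builds and encodes an illustrative cost-and-hour record. The field names and structure shown are assumptions for illustration only; the authoritative layout is defined by the FlexFile DID, FFS, and DEI.

```python
import json

# Hypothetical record layout for illustration only -- the actual field
# names and structure are defined by the FlexFile DID, FFS, and DEI.
actual_cost_hour_data = [
    {
        "orderOrLotID": "Lot-1",             # illustrative identifier
        "wbsElementID": "1.2.1",
        "functionalCategory": "Engineering",
        "reportingPeriodID": "2023-09",
        "hours": 1240.0,
        "dollars": 186000.00,
    },
]

flexfile = {
    "actualCostHourDataRemarks": None,
    "actualCostHourData": actual_cost_hour_data,
}

# JSON encoding keeps the submittal machine-readable for the repository
encoded = json.dumps(flexfile, indent=2)
print(encoded)
```

Because the submittal is structured data rather than a formatted report, it can be generated and validated programmatically from the contractor's source systems.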

Establishing a Consistent, Repeatable Process to Produce the CSDR Data Deliverables

For contractors new to the CSDR reporting requirements, and in particular the FlexFile JSON data encoding, the process can appear daunting. That’s where software tools such as those from Midnite Dynamics can help. Midnite Dynamics specializes in assisting contractors with producing the CSDR data deliverables.

Their software tool, C*CERT+, streamlines, automates, validates, and produces the legacy 1921 family of Excel and XML reports as well as the FlexFile and Quantity Data Report JSON submittals. C*CERT+ eliminates what is otherwise a manually intensive, resource draining, tedious, and costly effort subject to recurring rejections. It is one thing to create the required legacy reports or FlexFile JSON files for submittal; it is another to pass the submittal validation process. C*CERT+ provides numerous data validations and analysis reports to ensure the data is 100% compliant before it is submitted. For example, the software includes over 90 FlexFile validations to ensure data compliance as illustrated in Figure 1.

Figure 1: Example of FlexFile data validation results.

The software includes a Validation and Remarks utility to analyze the source data details that could result in a Validation Trip. Remarks can be entered directly into the validation module for anything that requires an explanation. This is illustrated in Figure 2. This narrative is included with the data submittal.

Figure 2: Example of providing remarks about the FlexFile data content.

C*CERT+ also interfaces with existing EVM cost tools and accounting systems to produce the existing legacy 1921 reports, the FlexFile, and other data submittals as well as to consolidate separate projects/CLINs/task orders into a single contract report.

Once the C*CERT+ Standard Category Mapping Rules are set up, they can be shared throughout the corporation or business unit to establish a standard and repeatable process for producing the data deliverables. This mapping process translates the contractor’s source data into an output that matches the CSDR data submittal format rules. This saves a tremendous amount of time and makes it much easier to consistently produce the CSDR data deliverables. An example of the Mapping Rules is illustrated in Figure 3.

Figure 3: Mapping Rules translate contractor unique cost data into a format that matches the CSDR data submittal requirements.

Do your processes and procedures or training materials need an update to include specific guidance for project control teams to produce required DoD contractual reports or data submittals using your tool sets of choice? Give us a call today at (714) 685-1730 to get started.


Level of Effort (LOE) Best Practice Tips


Clients are often seeking advice from our earned value consultants about implementing a practical approach in response to government customer requirements to proactively manage level of effort (LOE) tasks. The DoD EVMS Interpretation Guide (EVMSIG), NASA guidance, and DOE guidance such as the Compliance Assessment Governance (CAG) document clearly state the requirements for contractors related to planning, maintaining, and managing LOE. DOE also specifies a limit to the percentage of LOE allowed within a control account to avoid skewing performance measurement of the discrete work effort. In addition, both the DCMA and DOE EVMS data quality test metric specifications include manual and automated tests with thresholds specific to LOE.

Common accepted best practices for LOE include:

  • Reducing the amount of LOE to the lowest level possible to minimize the number of activities that need to be actively managed. Objective measures of performance are always preferred.
  • When LOE activities are included in the schedule, they should not drive the date calculations of discrete activities in the integrated master schedule (IMS). They should also not appear on the critical path.
  • LOE must be segregated from discrete work effort. In practice, this means a work package can only be assigned a single earned value method: it is either 1) discrete effort with an assigned earned value technique such as the Milestone or Percent Complete technique, 2) apportioned effort, or 3) LOE.
  • Verify the work is truly LOE, i.e., a management or sustainment type of activity that has no identifiable end products or established relationship to other measurable effort. It is clearly not discrete effort or apportioned effort. Remember that with LOE, the passage of time is the only measurement criterion. At the end of the performance month, the budget value for that month is earned. For this reason, LOE is the least desirable earned value method.
  • The budget or estimate to complete the work effort is time phased and reflects the planned or forecast period of performance. The period of performance and resource requirements must be substantiated. Determining the basis of estimate for the LOE activity can also help to verify the work is truly LOE.
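
The "passage of time" point in the list above can be expressed as a small sketch: for an LOE work package, cumulative BCWP through the status month always equals cumulative BCWS. The monthly budget values here are illustrative.

```python
# Sketch: LOE earns its time-phased budget (BCWS) with the passage of time,
# so cumulative BCWP through the status month always equals cumulative BCWS.
monthly_bcws = {"Jan": 10.0, "Feb": 10.0, "Mar": 12.0, "Apr": 12.0}  # $K, illustrative

def loe_bcwp_cum(monthly_bcws, status_month):
    """Cumulative BCWP for an LOE work package through the status month."""
    bcwp = 0.0
    for month, budget in monthly_bcws.items():
        bcwp += budget  # each elapsed month's budget is earned in full
        if month == status_month:
            break
    return bcwp

print(loe_bcwp_cum(monthly_bcws, "Mar"))  # schedule variance is always zero
```

This is exactly why LOE generates no useful schedule performance information: BCWP tracks the plan by definition, regardless of what work was actually accomplished.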

So, what is the problem? 

H&A earned value consultants commonly encounter contractors where:

  • Managing the LOE is put on “auto pilot.” This might work for project management type of activities that span the duration of the project. It does not work so well when the LOE is associated with the occurrence of discrete work effort that is subject to change – i.e., the discrete work effort duration changes or the start date and/or the complete date changes. The result?
  • LOE tasks may incur actual cost of work performed (ACWP) with no budgeted cost for work performed (BCWP);
  • LOE tasks earn BCWP with no ACWP; or
  • The estimate at completion (EAC) is greater than the ACWP with BCWP equal to the budget at completion (BAC).

    Any one of these conditions would trip the DCMA and DOE test metrics and should be avoided. These types of situations were illustrated in a previous blog, “Level of Effort Decision Tree,” which discusses how to properly replan LOE.
  • Their EVM System Description doesn’t provide sufficient guidance to project personnel on what proactive management of LOE means. What are the rules for planning and maintaining LOE? How is LOE handled differently from discrete work packages?

    Some System Descriptions allow LOE replanning to occur within the “freeze period,” usually defined as the current reporting period, often plus one additional month. This is contrary to other best practice guidance about handling changes for open discrete effort work packages. For discrete effort work packages, changes within the freeze period are not allowed; the work package must be closed to replan the remaining work. What is the process for handling that open LOE work package? What about retroactive changes when the LOE work occurs earlier or later than planned, or the duration is different than planned? Then what?

    When project personnel lack guidance, arguments often ensue about the “correct” interpretation of wording in governing documents or test metric specifications, which are frequently inconsistent.
  • Validation checks are not routinely performed. This includes validation checks to ensure that control account managers (CAMs) are selecting the appropriate earned value method for a work package following the EVM System Description guidance during the work definition and planning phase. It also includes routine monthly data checks to identify common data anomalies typically associated with LOE such as ACWP and no BCWP or BCWP with no ACWP. The goal is to fix problems in the current reporting month and avoid making any retroactive changes. You should be catching and fixing avoidable DCMA or DOE EVMS test metric “triggers” every reporting period.
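
A routine monthly data check like the one described above can be sketched in a few lines of Python. The work package records and field names here are hypothetical; the actual DCMA and DOE metric specifications should govern a real implementation.

```python
# Sketch of a monthly data check flagging common LOE anomalies before they
# trip DCMA or DOE EVMS test metrics. Records and field names are illustrative.
loe_work_packages = [
    {"id": "WP-101", "bcwp_cum": 0.0,  "acwp_cum": 5.2, "bac": 20.0, "eac": 20.0},
    {"id": "WP-102", "bcwp_cum": 8.0,  "acwp_cum": 0.0, "bac": 16.0, "eac": 16.0},
    {"id": "WP-103", "bcwp_cum": 12.0, "acwp_cum": 9.0, "bac": 12.0, "eac": 14.5},
]

def loe_anomalies(wp):
    """Return the list of LOE data anomalies present in one work package record."""
    flags = []
    if wp["acwp_cum"] > 0 and wp["bcwp_cum"] == 0:
        flags.append("ACWP with no BCWP")
    if wp["bcwp_cum"] > 0 and wp["acwp_cum"] == 0:
        flags.append("BCWP with no ACWP")
    if wp["bcwp_cum"] == wp["bac"] and wp["eac"] > wp["acwp_cum"]:
        flags.append("BCWP complete but EAC exceeds ACWP")
    return flags

for wp in loe_work_packages:
    for flag in loe_anomalies(wp):
        print(f'{wp["id"]}: {flag}')
```

Running a check like this before the data leaves the building supports the goal stated above: fix problems in the current reporting month rather than making retroactive changes.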

Best Practice Tips

Here is a short list of best practice tips that H&A earned value consultants have helped clients to implement over the years to ensure LOE is properly planned and proactively managed. The approach is tailored for each client to reflect the type of work the company typically performs. This is documented in their EVM System Description, related procedures, and recurring training to ensure project personnel have clear, specific guidance they can follow.

  • Consider using the Percent Complete earned value technique instead of LOE. A best practice is to identify quantifiable backup data (QBD) for a work package using the Percent Complete earned value technique. The QBD for the LOE type of work package could be the milestones identified for the discrete effort work package the LOE work package is supporting. This helps to ensure the work packages are reviewed and managed together.
  • Could the Apportioned Effort method be used instead of LOE? Is it possible to establish a direct relationship between the discrete effort and supporting effort? For example, is historical data available to document that the support number of hours is a given percentage of the discrete effort labor hours? If so, then using the Apportioned Effort method is a much better alternative. When the discrete work package is statused, the apportioned effort work package would be automatically statused as well. 
  • Consider shorter durations for the LOE when that LOE is supporting discrete effort. Should the first occurrence of the LOE trigger a data anomaly test metric, it can be proactively handled along with any future replanning. The remaining LOE would already be in one or more separate work packages so there won’t be any criticism for changing open work packages. Any adjustments can be made in the current reporting period avoiding any retroactive changes that would trigger other data metric tests. What is considered to be “short duration” should be defined in the EVM System Description. An example would be LOE work packages of 3 to 4 months in duration. Be sure to provide specific guidance to project personnel on how to process these types of current reporting period LOE replanning adjustments. The LOE work package breakpoints should be technically related. For example: “Phase I Support,” “Drawing Support,” and so forth instead of generic descriptions such as “April Support,” “May Support,” or “June Support.”
  • Use rolling wave planning. This is by far one of the better solutions. This helps to ensure the discrete tasks and any supporting LOE tasks are planned together before the work is authorized to begin. Shorter durations for the LOE tasks are often used to align with the forward planning window.
  • Incorporate LOE earned value method checks into your routine status and data analysis process. Identify any upcoming LOE activities (for example, over the next 60 to 90 days), along with the responsible CAM, to verify the activities accurately reflect the current plan. It is always better to proactively replan future LOE when needed instead of defaulting to an “auto pilot” mode. The CAM should understand this is part of their responsibilities.
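
Where an Apportioned Effort relationship can be substantiated, the mechanics of the tip above are simple: the supporting work package's earned value is driven directly by the discrete base. A sketch, using a hypothetical 15% support factor:

```python
# Sketch: with the Apportioned Effort method, the supporting work package's
# earned value is a fixed factor of the discrete base work package's BCWP.
# The 15% support factor is illustrative -- a real factor should be
# substantiated with historical data.
SUPPORT_FACTOR = 0.15

def apportioned_bcwp(base_bcwp_cum):
    """BCWP for the apportioned work package, driven by the discrete base."""
    return SUPPORT_FACTOR * base_bcwp_cum

# When the discrete work package is statused, the apportioned effort
# follows automatically -- no separate status input is needed.
print(apportioned_bcwp(200.0))
```

Because the apportioned BCWP moves with the base, it cannot drift out of alignment with the discrete work the way an "auto pilot" LOE plan can.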

Does your EVM System Description or training materials need a refresh to include specific guidance for project personnel that documents the preferred approach for planning, maintaining, and managing LOE? H&A earned value consultants frequently help clients with EVM System Description content enhancements or creating specific procedures that reflect your unique business environment. Call us today at (714) 685-1730 to get started.


Timely Subcontractor Data – Mission Impossible?


With the arrival of the Integrated Program Management Data and Analysis Report (IPMDAR) requirements for electronic cost and schedule dataset submittals, DoD contractors with EVMS or EVM reporting contractual requirements have a tighter time frame for submitting their month end data. A previous blog, Introduction to the IPMDAR Data Deliverable – Tips for Producing the Outputs, summarizes these data reporting requirements. This includes the Contract Performance Dataset (CPD) for the time phased cost data and the Schedule Performance Dataset (SPD) along with a native file export out of the schedule tool.

For month end data submittals, the IPMDAR Data Item Description (DID), DI-MGMT-81861C (20210830) states:

1.8.1 Monthly Submission Requirement. IPMDAR data shall be required at least monthly. The reporting frequency shall be specified in the Contract Data Requirements List (CDRL). All reports shall reflect data from the same accounting period and shall be provided at any time after the close of the contractor’s accounting period, but no later than sixteen (16) business days after the contractor’s accounting period end date.

On the surface, you might say requiring data delivery 16 business days after the contractor’s accounting period end date doesn’t sound unreasonable or even much different from the previous Integrated Program Management Report (IPMR) DID (DI-MGMT-81861A), and you would be right.

What is the issue?

Some people think that because the IPMDAR submittals are electronic datasets instead of report formats it is easier to generate and report that information. That is not necessarily true, and shortening the data turnaround time exacerbates the problem. The tighter time requirements also apply when there are EVM reporting subcontractors providing performance data to a prime contractor.

The third sentence of 1.8.1 above states: “All reports shall reflect data from the same accounting period…” This requirement is very challenging, especially when a subcontractor operates on a different month end accounting calendar; for example, a “5-4-4” versus the prime’s “4-4-5” calendar. Even when the prime and subcontractor are on the same month end calendar, for the prime to submit IPMDAR data in 16 business days, the subcontractor has less time to provide their data to the prime. It becomes even more challenging on very large programs that have several tiers of subcontractors.

Subcontractors cry “foul” because they don’t have enough time to get all the performance data ready in the reduced time. Prime contractors cry “foul” because they are held accountable for data that may or may not arrive from one or more tiers of subcontractors in time for them to conduct basic data analysis and deliver month end data for the IPMDAR. The government customer still insists all the data must be for the same accounting month end date, even though that date may not be well defined when calendars differ. Customers also do not want the subcontractor data delayed by a month just to get it “caught up” – that would mean comparing apples to oranges.

Is incremental delivery of IPMDAR the answer?

The government suggests that incremental delivery could resolve this dilemma. The DoD IPMDAR Implementation and Tailoring Guide (August 24, 2021) expands on the paragraph from the IPMDAR DID:

1.8.1.1 Incremental Delivery. Reports may be provided incrementally, including preliminary data, with the number of days for delivery of each submittal tailored in the CDRL. Data delivered is not considered authoritative until the final submission and signature. The recommended incremental delivery process is the Schedule, followed by the CPD and the Executive Summary, Government review of submittals, Government directed Detailed Analysis, Contractor Detailed Analysis delivery and all final data.

The IPMDAR Implementation and Tailoring Guide also provides a notional example of how an incremental delivery could be handled:

1. SPD – To be delivered with native file five (5) working days after the end of the contractor’s accounting period (may be labeled preliminary)

2. CPD – To be delivered with the Executive Summary ten (10) working days after the end of the contractor’s accounting period (may be labeled preliminary)

3. Contracting Office to select items for detailed analysis (variances) – to contractor thirteen (13) working days after the end of the contractor’s accounting period

4. Performance Narrative Analysis – to be delivered NLT sixteen (16) working days after the end of the contractor’s accounting period along with any other “final” versions of previously submitted files

Note: The notional incremental delivery plan above is not additive.

Doing the above might demonstrate to the government customer that the prime contractor is at least trying their best to make the prime/subcontract situation work – even though they would be using “estimated data” until the final versions come in from the subcontractors. Does this approach really make the timely delivery of the data easier to attain? The bottom line does not change. Per number 4 above, the prime still has to deliver all the data in “final versions” by the 16th business day following the close of their accounting calendar. The note at the bottom specifies the days indicated in each step are not “additive” – i.e., the contractor does not get 5+10+13+16 = 44 business days.
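
The notional schedule above can be made concrete by computing each incremental deadline from the same accounting period end date. This sketch counts working days as Monday through Friday and ignores holidays; the period end date is illustrative.

```python
from datetime import date, timedelta

def working_days_after(start, n):
    """Date n working days (Mon-Fri) after start; holidays ignored for simplicity."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            n -= 1
    return d

period_end = date(2024, 3, 29)  # illustrative accounting period end (a Friday)
for label, days in [("SPD", 5), ("CPD + Executive Summary", 10),
                    ("Gov't analysis selections", 13), ("Final data", 16)]:
    print(f"{label}: {working_days_after(period_end, days)}")
```

Because each deadline is measured from the same period end rather than from the previous step, the increments overlap within the single 16-working-day window; they do not extend it.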

In some circumstances, incremental delivery might allow some subcontractors a bit more time to get the data to the prime contractor, but there would still have to be tighter delivery dates for the incremental deliveries, so the problem does not really go away.

What are your options?

This difficult situation arises because few contractors consider the implications of having to deliver all data by their accounting month end. Not all the subcontractor work elements are set up the same way. Contractors with an EVM reporting requirement who have, or will have, EVM reporting subcontractors should address this basic difference as part of the contract negotiation process. One possible part of this negotiation could be to use the IPMDAR DID, paragraph 1.4, to help level the playing field for reporting purposes. This paragraph states:

1.4 Direct Reporting Contractor Role.

1.4.1 A Direct Reporting Contractor is any contractor required to provide the IPMDAR directly to the Government. This includes prime contractors, subcontractors, intra-government work agreements, and other agreements, based on the contract type, value, duration, nature of the work scope, and the criticality of the information. In this document, instances of “Contractor” are synonymous with “Direct Reporting Contractor.”

There is a footnote to this paragraph that states:

In the event that the Direct Reporting Contractor is a contractor other than the prime, the Direct Reporting Contractor will additionally report to the prime. Subcontractor data shall be provided to the prime in a manner that supports the contractor’s submission to the Government.

One solution is to negotiate to have each EVM reporting subcontractor deemed a “Direct Reporting Contractor” that submits its data directly to the government customer as well as to the prime contractor. The prime and subcontractors each submit their IPMDAR electronic deliverables to the DoD EVM Central Repository (EVM-CR).

Each contract level, from the prime through however many tiers of subcontractors there may be, will have the same 16 business days after its own accounting month end date to provide all interested parties with the EVM data. The prime contractor still must at least obtain estimated subcontractor data to do their monthly assessment, making corrections in the next month after they have received the final version of the data from their subcontractors.

Should the government customer want analysis performed on subcontracted effort, the IPMDAR dataset submittals will be in the DoD EVM-CR. They can do that analysis independently of the prime contractor’s analysis that would be provided after the prime’s 16th business day.

This approach would put pressure on the prime because they will not have seen the subcontractor’s data prior to it being delivered to the government, but that could be addressed in the next reporting period’s “errata” variance analysis narrative. The government would also have the detailed data from the subcontractors when the subcontractor is providing a reduced set of data such as only total cost data to the prime. Should the government customer not want to do that level of analysis, the government customer may need a different contracting solution to avoid requiring EVM reporting down through various levels of contracts. This is often determined by the contract value and risk factors associated with the subcontractor. 

This approach also requires the subcontractor to produce two deliverables. One for the prime contractor in an agreed upon format and one for the government customer following the IPMDAR DID electronic submittal requirements for the DoD EVM-CR. These reporting requirements should be negotiated with the subcontractor well in advance; the subcontractor needs to know their data deliverable and reporting requirements when they bid on the work effort for the prime.

This subcontractor data incorporation issue has been around for many years and can be very confusing. H&A earned value consultants can help you work through the various responses to this requirement in the best possible way for your situation. Call us today at (714) 685-1730 to get started.

