
Revitalizing Earned Value Management Systems (EVMS)


Quick Summary

  • Regulatory changes and updated standards are creating an opportunity to revitalize EVM Systems. The FAR overhaul, revised agency thresholds, and the EIA-748-E have streamlined requirements while reinforcing the continued need for an effective EVMS.
  • Organizations have an opportunity to refocus on value-driven EVM practices. Rather than treating EVMS as a check-the-box requirement, this is an opportunity to renovate bloated processes and remove non-value-added activities to establish a flexible “living” system that supports proactive project management and credible forecasting.
  • BI and AI tools can transform EVM data into a real-time decision-making advantage. When supported by reliable, integrated data, these tools can rapidly organize information, improve visibility, identify risks early, and help project teams respond faster to changing priorities as well as technical, schedule, and cost challenges.

With the recent changes in the government regulatory requirements, the publication of the EIA-748-E Standard for EVMS, and evolving Business Intelligence (BI) and AI tools, the components for revitalizing Earned Value Management Systems (EVMS) are falling into place. This is an opportunity to refocus on the original purpose of an EVMS and effective use of real-time EVM data to quickly address problems before they become critical.

As highlighted in a previous blog, “Earned Value Management (EVM): How Much is Enough?”, being merely “compliant” with the EIA-748 Guidelines should not be the goal. That strategy fails to take advantage of the benefits of an EVMS; it is also short-sighted. Too often an EVMS is perceived as a contractual check-the-box exercise or focused on detailed score keeping.

The goal should be to become efficiently expert at EVM, with a commitment to be “best-in-class” practitioners. Following this strategy, an organization’s EVMS is actively maintained and used to ensure it provides the relevant, useful information needed to manage projects for success. EVM is a powerful project management methodology that integrates scope, schedule, and cost management to provide a clear picture of project performance, the forecast completion date, and the estimate at completion. BI and AI tools are enhancing the ability to rapidly organize and analyze real-time EVM data for proactive management and clear, transparent communication with the customer. This also aligns with the need for speed in delivering capabilities to the customer when trade-offs between requirements, schedule, and cost must be made.

Trimming Contractual and Guideline Requirements

The regulatory environment has been evolving; government entities are either simplifying or changing the requirements for an EVMS. As a reminder, the Capital Programming Guide Supplement to the Office of Management and Budget (OMB) Circular A-11 Planning, Budgeting, and Acquisition of Capital Assets establishes the government major acquisition requirements for an EVMS. This Guide states contractors must use an EVMS that meets the EIA-748 guideline requirements to monitor contract performance. All agency EVMS regulations point to the A-11.

A summary of recent changes follows.

Revolutionary Federal Acquisition Regulation (FAR) Overhaul, which began in May 2025, focused on removing most non-statutory rules and rewriting requirements in plain language. Subpart 34.2 – Earned Value Management System was trimmed to the basic EVMS and Integrated Baseline Review (IBR) requirements. The Pre-Award IBR and Notice of EVMS Post-Award IBR clauses were removed; it now simply states an IBR is required. The text of the EVMS contract clause at 52.234-4 was streamlined. Key takeaway: the value of an EVMS and IBRs was reaffirmed. What is unchanged: an EVMS is required for major acquisitions on development contracts, requirements flow down to subcontractors, and IBRs are required.

Defense Federal Acquisition Regulation Supplement (DFARS) Class Deviation 2026-O0011 (February 2026) was issued in response to the FAR Overhaul. Subpart 234.2 Earned Value Management System, 234.201 Policy raised the contract value threshold for EVMS reporting from ≥ $20M to ≥ $50M and incorporated the 2015 Class Deviation Memo increasing the contract value threshold for compliance reviews to ≥ $100M. There are also new related Class Deviation clauses: 252.234-7001 is now 252.234-7998 Notice of EVMS, and 252.234-7002 is now 252.234-7999 EVMS.

NASA FAR Supplement 1834.201 Policy Class Deviation (June 2025) as well as their solicitation clause (1852.234-1) and contract clause (1852.234-2) align with the DoD contract value threshold changes and revised clauses.

National Nuclear Security Administration (NNSA). Although NNSA is part of the DOE, as of September 2025 it is the Cognizant Federal Agency (CFA) for NNSA projects. NNSA purposely simplified its compliance and surveillance process to be able to respond rapidly to threats. Contractors self-assess their EVMS; NNSA uses an EIA-748 Guideline checklist, reviews data artifacts, and conducts interviews for evidence of compliance. Certification reviews are required when the Total Project Cost is greater than $300M, and certified projects remain subject to surveillance reviews.

EIA-748-E Standard for EVMS, approved and published in February 2026. This long-overdue update reduced the number of guidelines to 27 and reflects current business system capabilities. To improve clarity, guidelines from the previous set of 32 were revised or merged, two were added, and four were deleted.

With the publication of the EIA-748-E, industry guides as well as government agency compliance and surveillance review materials have been or are in the process of being updated. The NDIA IPMD Intent Guide for EIA-748-E will be available on the NDIA IPMD web site once it completes the membership review and approval process. The DoD Earned Value Management System Interpretation Guide (EVMSIG) is also being updated to reflect the EIA-748-E. Once the EVMSIG is published the DCMA EVMS Group will be updating their Business Practices, appendices, and EVMS Compliance Metrics (DECM). DCMA has already trimmed their DECMs to a set of 60 standard, 10 conditional, and 72 low priority tests.

Impact of BI and AI Enabled Tools and Apps

BI and AI tools speed up the process to pull data from different sources for defined use cases and to organize it for analysis. The time lag to view current data can be eliminated with the right business system interfaces and tools. These tools can quickly produce a variety of dashboards or data views with the ability to drill down into the data as well as to sort and filter as needed for root cause analysis. AI agents designed for specific use cases can also speed up the process to organize and present data for real-time decision making. These dashboards and views can be tailored for specific users such as project managers, control account managers (CAMs), functional managers, schedulers, finance, material or subcontract management, and others.

Taking advantage of BI and AI does require a defined enterprise strategy to successfully leverage these powerful tools. Data is the backbone of any AI model – data is needed to “teach” AI how to spot patterns and make predictions. This includes the vast volume of an organization’s transaction records, analytics, and proprietary information across multiple systems.

The problem? Organizations often lack a consistent, verified version of data (the single source of truth) – there is uncertainty about what data should be used to analyze and “feed” their AI models. Internal proprietary data must not be exposed to the outside world. The single source of truth must exist in a governed and curated environment; it must be organized and integrated with a defined data model to be able to analyze real-time streams of data while avoiding multiple versions of the truth.

The challenge is that many organizations are still doing their enterprise planning, including estimating, budgeting, and many other functions, in spreadsheets. That data is not accessible to others or captured in a common database. Employees end up debating discrepancies between spreadsheets rather than analyzing the data in question.

Once the system that contains the official single source of truth has been determined, and how the data is organized and integrated has been defined, there are a variety of commercial off-the-shelf (COTS) tools available for the next step. Employees (the power users) familiar with BI and AI tools can turn ideas into apps in a matter of hours or days that help them and their teams get things done. They can quickly build business-environment-specific dashboards, analyze real-time data pulled from various data sets, and produce outputs designed for different users or use cases.
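As an illustration of the kind of consolidation a power user might build, the following Python/pandas sketch merges hypothetical cost and schedule extracts into a single view and flags control accounts that breach a variance threshold. The column names, account IDs, and threshold are assumptions for the example, not any specific tool's schema.

```python
# Hypothetical sketch: merge cost and schedule extracts into one view and
# flag control accounts that breach a variance threshold for drill-down.
import pandas as pd

cost = pd.DataFrame({
    "control_account": ["CA-100", "CA-200", "CA-300"],
    "bcwp": [450.0, 300.0, 120.0],   # earned value ($K)
    "acwp": [520.0, 290.0, 118.0],   # actual cost ($K)
})
schedule = pd.DataFrame({
    "control_account": ["CA-100", "CA-200", "CA-300"],
    "bcws": [500.0, 310.0, 115.0],   # planned value ($K)
})

view = cost.merge(schedule, on="control_account")
view["cv_pct"] = (view["bcwp"] - view["acwp"]) / view["bcwp"] * 100
view["sv_pct"] = (view["bcwp"] - view["bcws"]) / view["bcws"] * 100
# Flag accounts exceeding a +/-5% cost or schedule variance threshold.
view["flag"] = (view["cv_pct"].abs() > 5) | (view["sv_pct"].abs() > 5)
print(view[view["flag"]][["control_account", "cv_pct", "sv_pct"]])
```

In a real deployment the two DataFrames would be fed from the governed single source of truth rather than hard-coded, and the flagged rows would drive a dashboard tile rather than a print statement.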

Putting All the Pieces Together

What are the three primary takeaways?

The requirement to provide a fact-based assessment of project progress and forecast isn’t going away. The FAR overhaul didn’t do away with EVMS or the related fundamental requirements. It does, however, require organizations to be efficiently expert at EVM. A “living” EVMS (i.e., actively maintained and used) that can be scaled/tailored to management needs for each project is essential.

Changes to the requirements provide an opportunity to update “bloated” processes and procedures, or those that haven’t been updated to reflect new tools. Since the EVMS will need to be reviewed anyway to verify it supports the revised guidelines as well as updated agency requirements, there may be non-value-added content or steps that can be eliminated.

BI and AI tools are useful for organizing real-time data into actionable information. Organizations taking advantage of these tools can rapidly respond to realized or emerging risks and changing scope or priorities in response to evolving threats. This creates a competitive advantage.

Returning to a Focus on Proactive Management

This is an opportunity to return to the original objective of an EVMS: timely and relevant information for proactive decision making to ensure project success and a happy customer. The effectiveness of an EVMS should be measured by the technical, schedule, and cost performance metrics. Product acceptance and in-process controls are examples of technical performance metrics. Schedule status and forecast, cumulative to date cost performance index (CPI), estimate at completion (EAC), and the to complete performance index (TCPI) are examples of schedule and cost performance metrics.
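The formulas behind these metrics are simple enough to sketch directly. The following Python snippet uses the standard EIA-748 definitions; the function name and sample values are hypothetical and not tied to any particular tool.

```python
# Standard cumulative EVM formulas (EIA-748 terminology); sample numbers
# below are hypothetical, in thousands of dollars.
def evm_metrics(bcws, bcwp, acwp, bac):
    """Compute common EVM metrics for a control account or total project."""
    cv = bcwp - acwp                    # Cost Variance
    sv = bcwp - bcws                    # Schedule Variance
    cpi = bcwp / acwp                   # Cost Performance Index
    spi = bcwp / bcws                   # Schedule Performance Index
    eac = bac / cpi                     # Estimate at Completion (CPI method)
    vac = bac - eac                     # Variance at Completion
    tcpi = (bac - bcwp) / (bac - acwp)  # TCPI to BAC, for the CPI comparison
    return {"CV": cv, "SV": sv, "CPI": round(cpi, 3), "SPI": round(spi, 3),
            "EAC": round(eac, 1), "VAC": round(vac, 1), "TCPI": round(tcpi, 3)}

metrics = evm_metrics(bcws=500.0, bcwp=450.0, acwp=520.0, bac=1000.0)
print(metrics)
```

Comparing CPI (0.865 here) against TCPI to BAC (1.146 here) is the classic EAC realism check: a wide gap signals that the remaining work would have to be performed far more efficiently than the work to date for the budget to hold.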

Too often the perceived approach to a “compliant” EVMS is to drive the data to an excessive level of detail along with restrictive rules and guidance that result in a system that is cumbersome and painful to use. It reinforces the perception that EVMS is too costly – something the customer doesn’t want to pay for because they don’t see the value.

The alternative? An organization that is efficiently expert at EVM where the customer has directly experienced the value of using real-time performance data to successfully manage their program. Non-value activities have been eliminated. An actively maintained and used EVMS is also resilient; project teams can quickly respond to evolving priorities and threats. Taking advantage of the power and agility of BI and AI tools/apps can help project teams to focus on what matters with real-time data and analytics.

Taking Advantage of the Opportunity to Revitalize EVM

Changing the view that EVMS is burdensome, costly, and of no value will take time. It depends upon organizations choosing to become efficiently expert at EVM.

Recent changes in requirements and the guidelines will require organizations to review the state of their EVM Systems. This creates an opportunity to eliminate non-value-added activities. At the same time, powerful BI/AI tools enable real-time data analysis, so project teams can be more proactive while renovating EVMS functions. The effectiveness of the EVMS is apparent because it provides real-time visibility into project performance with a credible forecast completion date and estimate at completion.

There is no need for excessive oversight by government customers that drives up the cost of managing projects when the customer has confidence the organization’s EVMS provides the visibility they need – and that earned value based project management is a valuable tool.

Next Steps

Consider having an independent third party complete a thorough assessment of your EVMS process areas and documentation to identify where content can be trimmed and clarified or where non-value added steps can be removed – particularly if you are starting to integrate BI and/or AI tools into your EVMS and other business systems. Call us today to get started.


Video Release – Assessing Schedule Risk Using Deltek’s Acumen Risk 6.1 | Part 2 of 2


The conclusion of our review of the foundational elements of performing a schedule risk assessment (SRA) using Acumen Risk 6.1

0:17 – Risk Exposure Chart
1:03 – Tornado Chart
2:14 – Parting Thoughts

Read the blog post at:

Assessing Schedule Risk Using Deltek’s Acumen Risk 6.1 | Part 2 of 2


Video Release – Assessing Schedule Risk Using Deltek’s Acumen Risk 6.1 | Part 1 of 2


How confident are you that your project will finish on time? Review the foundational elements of performing a schedule risk assessment (SRA) using Acumen Risk 6.1

2:22 – Schedule Health Diagnostics
4:55 – Duration Uncertainty
6:35 – Risk Events
8:20 – Simulation Process
Read the blog post at:

Assessing Schedule Risk Using Deltek’s Acumen Risk 6.1 | Part 1 of 2


Assessing Schedule Risk Using Deltek’s Acumen Risk 6.1 | Part 1 of 2


Why Perform Schedule Risk Assessments?

Before a project is ready to be baselined, a typical question the customer asks the project manager is, “How confident are you that the project will finish on time?”

This is a more difficult question than you might think.  In competitive environments, guessing is not an option.  The probability of success on a project must be quantified.  The risks that impact the odds for success must also be quantified.  If the risk is managed, the probability of completing the project on time and under budget is improved.

Customers are not blind to the importance of risk management.  This is evidenced by recent changes in government contracting requirements that call for formal risk assessments of project schedules.  Even if risk management were not a contractual requirement, it would be irresponsible for any project manager to ignore the need for risk management and proceed without identifying and assessing the project’s risks.

Schedule risk exists in every project.  This risk can be quantified, analyzed, and mitigated, or it can be ignored.  However, ignoring schedule risk does not make it go away.  Fortunately, there are advanced software tools, such as Deltek’s Acumen Risk, that can help model the expected impacts of risk in the schedule. Then, the answer to “how confident are you that the project will finish on time?” can be answered with quantifiable information.

In the following sections, a few of the foundational elements of performing a schedule risk assessment (SRA) using Acumen Risk 6.1 will be discussed.  The software was designed with the understanding that not everyone is an expert in schedule risk analysis.  The software provides beginners with an easy to follow path to perform in-depth schedule risk analysis as well as advanced features for experienced risk experts.

Along with quick start guides and help documentation, the menu structure is laid out like a schedule maturity timeline.  From left to right, the menu selections take you from the start-up step of importing the schedule, through analyzing the schedule, assessing schedule risk, and accelerating the schedule, to advanced customization features.

Deltek Acumen – Top-Level Menus


Schedule Health Diagnostics

Before delving into schedule risk assessments, let’s take one minor detour from risk into schedule diagnostics.

Would you trust a broken watch to tell you the correct time?  The same goes for a schedule risk assessment.  A broken schedule network cannot be trusted to yield reliable, and therefore actionable, SRA results.

The National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD) Planning & Scheduling Excellence Guide (PASEG), is widely regarded as one of the premier references on scheduling best practices.  The PASEG was created by a joint team of both government and industry scheduling experts, thus it has no particular point of view to promote or defend.  One of the scheduling best practices the PASEG discusses is that the integrated master schedule (IMS) should be validated before any SRA is performed.  “Validated” means that the tasks, logic, durations, constraints, and lags in the IMS should be analyzed and corrected as necessary.

Acumen Fuse provides a complete set of schedule diagnostics.  When I first clicked on the “Diagnostics” tab, I saw an initial set of metrics.

Acumen Fuse – Schedule Diagnostics

Each of these metrics was applied to the project’s timeline, which makes it easy to see both where and when the issues occur.  What I did not notice at first was that these metrics were just one subset; I was only looking at the “Schedule Quality” subset of the diagnostics.  There were similar subsets in the areas of Logic, Duration, Constraints, Float, and the DCMA 14-point Schedule Assessment, just to name a few.  All of these diagnostic tests can be modified to reflect your company’s or customer’s standards.

Before leaving the topic of schedule health, a few words of caution.  No matter how useful a schedule analysis tool may be, there is no substitute for the task managers taking ownership of the IMS and ensuring that it is in good working order.  For example, analysis software can check whether a task has a predecessor and a successor, but only someone familiar with the effort can determine if a task has the “correct” predecessor and successor.  Analysis software is becoming more and more sophisticated, but people still control the success or failure of the project.

Duration Uncertainty

Once a sound schedule has been developed, the next foundational elements of an SRA are the duration uncertainty estimates.  There are two widely accepted methods of assigning duration uncertainty.

The preferred and more precise method is to obtain three-point duration estimates (best case, worst case, and most likely) from the task owners.  At a minimum, this should be performed on all critical and near-critical tasks (and driving and near-driving tasks supporting significant events).  For larger schedule networks, it may not be reasonable to gather this type of information for every task.  If custom three-point estimates are not available, templated duration uncertainty could be applied based on the type of work, the task owner, historical performance, or any other applicable task characteristic.
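As an illustration of the first method, three-point estimates are commonly modeled with a triangular distribution. The Python sketch below shows the mechanics with hypothetical task values; it is not a representation of any tool's internals.

```python
# Sample a task duration from a triangular distribution built from
# best case / most likely / worst case estimates. Values are hypothetical.
import random
import statistics

rng = random.Random(42)  # fixed seed so the sketch is repeatable

def sample_duration(best, likely, worst):
    """Draw one duration (in days) from a triangular distribution."""
    return rng.triangular(best, worst, likely)

# Hypothetical task: 8 days best case, 10 most likely, 18 worst case.
samples = [sample_duration(8, 10, 18) for _ in range(10_000)]
mean = statistics.fmean(samples)
print(f"mean sampled duration: {mean:.1f} days")  # near (8 + 10 + 18) / 3 = 12
```

Note that the mean of a skewed three-point estimate (12 days here) sits above the most likely value (10 days), which is exactly why risk-adjusted forecasts run later than deterministic ones.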

Acumen Risk handles both methods very easily.  Custom three-point estimates can be entered for each task in days (or hours), or as a percentage of the current remaining duration of the task.  Standard duration uncertainty templates are easily applied to a task by selecting the appropriate risk level on the calibration bar.  To streamline the process, setting the calibration at any summary level cascades the uncertainty template down to all the “children” tasks.

Acumen Risk – Calibration

Risk Events

One thing traditional Critical Path Method (CPM) networks do poorly is model unexpected results.  For example, if there is a 90% success rate on fatigue testing, the IMS will generally be constructed to assume the test will be successful, with no disruption to downstream tasks.

Critical Path Method

But what happens if the test fails?  While unlikely, there is still a very real possibility that the results will be unfavorable.  If the test does return unfavorable results, there will likely be a significant delay while re-work is performed in the areas of design, build and test.  A traditional CPM network can model a successful test or an unsuccessful test, but not both.  This is not a problem with a schedule risk assessment.  Information from the project’s risk register can be used to model the likelihood of a test failure, as well as the consequence, or delay to downstream tasks resulting from that failure.
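A minimal sketch of that idea follows, with hypothetical probabilities and durations; a real SRA tool would draw these values from the risk register rather than hard-coding them.

```python
# Model a discrete risk event on top of a deterministic task duration:
# a fatigue test with a 10% chance of failure, where failure adds a
# rework delay. All numbers are hypothetical.
import random

rng = random.Random(7)

def test_duration(base_days=15, fail_prob=0.10, rework_days=40):
    """Test task duration including the chance of failure and rework."""
    if rng.random() < fail_prob:
        return base_days + rework_days  # test fails: redesign, rebuild, retest
    return base_days                    # test passes: no downstream disruption

runs = [test_duration() for _ in range(20_000)]
risk_adjusted = sum(runs) / len(runs)
# Expected value is 15 + 0.10 * 40 = 19 days vs. the deterministic 15,
# which a single-outcome CPM network cannot represent.
print(f"risk-adjusted expected duration: {risk_adjusted:.1f} days")
```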

CPM Risk Events – Consequence

Is this an acceptable risk?  An SRA can quantify the risk and provide information on the likelihood of successful deliveries.  Acumen does not stop there though.  One of its newest features is to organize and track all risk events within its built-in risk register, as well as to track the steps being taken to help mitigate that risk.  Or, if your organization already maintains an external risk register in Excel, it can be imported into Acumen to eliminate the duplicate tracking of risk events.  Whether the risk register is imported from Excel or built from scratch within Acumen, a single risk event can then be mapped to one or more activities, or a single activity can be associated with one or more risk events.

Risk registers

Simulation Process

A typical SRA uses Monte Carlo techniques to simulate hundreds or thousands of potential project outcomes using the risks and uncertainties that have been supplied.

For most users, simply accepting the default settings and pushing the “Run Risk Analysis” button would be sufficient.  But if terms like “Convergence”, “Correlation Coefficient”, “Central Limit Theorem”, and “Seed Value” are part of your normal working environment, Acumen provides a variety of settings that can be customized to tune the SRA to best model your project.

No matter which approach you take, the Acumen toolset provides a quick and easy simulation process.
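For readers who want to see the underlying mechanics, here is a minimal Monte Carlo sketch for a hypothetical three-task serial path that combines triangular duration uncertainty with one discrete risk event. A real tool such as Acumen simulates the entire schedule network; every value below is an assumption for illustration only.

```python
# Minimal Monte Carlo schedule simulation for a three-task serial path.
# Durations use triangular (best, worst, most-likely) uncertainty; the
# test task carries a 10% discrete risk of a 25-day rework delay.
# All values are hypothetical.
import random

rng = random.Random(2026)
TRIALS = 10_000
TARGET_DAYS = 60  # hypothetical committed finish for this path

finishes = []
for _ in range(TRIALS):
    design = rng.triangular(18, 30, 20)  # low, high, mode (days)
    build = rng.triangular(15, 28, 18)
    test = rng.triangular(8, 14, 10)
    if rng.random() < 0.10:              # discrete risk event: test failure
        test += 25                       # rework delay
    finishes.append(design + build + test)

finishes.sort()
p_on_time = sum(f <= TARGET_DAYS for f in finishes) / TRIALS
p80 = finishes[int(0.80 * TRIALS)]       # 80th-percentile finish duration
print(f"P(finish within {TARGET_DAYS} days) = {p_on_time:.0%}; "
      f"P80 = {p80:.1f} days")
```

The output is the quantified answer to “how confident are you that the project will finish on time?”: a probability of meeting the target and a confidence-level (e.g., P80) completion duration.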

What to Expect in Part 2

Part 2 of this blog will delve into the interpretation of SRA results.


Yancy Qualls, PSP

Engagement Director, Schedule Subject Matter Expert (SME)

Humphreys & Associates, Inc.


EVMS Variance Analysis — EVMS Analysis and Management Reports


A Variance Analysis Report (VAR) that includes specific information about the cause, impact, and corrective action “provides management with early insight into the extent of problems and allows corrective actions to be implemented in time to affect the future course of the program” [reference: NDIA, IPMD EIA-748 (Revision D) EVMS Intent Guide]. Unfortunately, variance analysis is an easy target for criticism during EVMS reviews. There are many examples of inadequate variance analysis to choose from, but what they all have in common is the lack of specific information on the “why, what, how, when, and who” of any variance. The variance analysis reporting requirements are found in the EIA-748 (Revision D) Guidelines in Section IV., Analysis and Management Reports, Guidelines 22-27.

EIA-748 Guidelines
Section IV. Analysis and Management Reports

22  (2-4a)  Control Account Monthly Summary, Identification of CV and SV
23* (2-4b)  Explain Significant Variances
24  (2-4c)  Identify and Explain Indirect Cost Variances
25  (2-4d)  Summarize Data Elements and Variances thru WBS/OBS for Management
26* (2-4e)  Implement Management Actions as Result of EVM Analysis
27* (2-4f)  Revise EAC Based on Performance Data; Calculate VAC

A VAR that includes specific information and data about a problem will allow management to make informed decisions and mitigate project risk. Getting specific about variance analysis reporting includes the following elements.

Overall:

  • Emphasis on the quantitative, not qualitative
  • Emphasis on the specific, not the general
  • Emphasis on significant problems, not all problems
  • Define abbreviations and acronyms at first use
  • The Control Account Manager (CAM) is the most knowledgeable person to write the variance analysis report but will need information from the business support team

Cause:

  • Isolate significant variances
  • Discuss cost and schedule variances separately
  • Clearly identify the reason (root cause) for the variance (ties to the corrective action plan)
  • Clear, concise explanation of the technical reason for the variance
  • Provide cost element analysis
    • Labor – hours, direct rates, skill mix, overtime (rate & volume)
    • Material – unplanned requirements, excess quantities, unfavorable prices (price & usage)
    • Subcontracts – changing requirements, additional in-scope work, schedule changes
    • Other Direct Costs – unanticipated usage, in-house vendor
    • Overhead (indirect) – direct base, rate changes
  • Identify what tasks are behind schedule and why
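For the labor element above, the standard split of a cost variance into rate and volume (efficiency) components can be sketched as follows; the hours and rates are hypothetical.

```python
# Decompose a labor cost variance into rate and volume (efficiency)
# components. The identity checked below: CV = rate variance + volume
# variance. All hours and rates are hypothetical.
def labor_variance(earned_hours, actual_hours, planned_rate, actual_rate):
    bcwp = earned_hours * planned_rate   # earned value of the labor
    acwp = actual_hours * actual_rate    # actual labor cost
    cv = bcwp - acwp                     # total labor cost variance
    rate_var = (planned_rate - actual_rate) * actual_hours
    volume_var = (earned_hours - actual_hours) * planned_rate
    assert abs(cv - (rate_var + volume_var)) < 1e-9  # components reconcile
    return cv, rate_var, volume_var

cv, rate_var, volume_var = labor_variance(
    earned_hours=1000, actual_hours=1100, planned_rate=100.0, actual_rate=105.0)
print(f"CV = {cv:,.0f} (rate: {rate_var:,.0f}, volume: {volume_var:,.0f})")
```

The same decomposition pattern applies to material (price and usage) variances, giving the VAR the quantitative, cost-element-level specifics called for above.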

Impact:

  • Describe specific cost, schedule, and technical impact on the project
  • Project future control account performance (continuing problem)
  • Address effect on immediate tasks, intermediate schedules, critical path, driving paths, risk mitigation tasks
  • Describe erosion of schedule margin, impacts to contractual milestones or delivery dates, and when the schedule variance will become zero (this may only mean the work was completed late (BCWPcum = BCWScum) and does not necessarily mean getting “back on schedule”)
  • Describe any impact to other control accounts
  • Assess the need to revise and provide rationale for the Estimate at Completion (justify ETC realism – CPI to TCPI comparison, impacts of corrective action plan, risk mitigation, open commitments, staffing changes, etc.)
  • Note: If there is a root cause, there will be an impact. It could be related to cost, schedule, lessons learned to be applied to future activity, an update required to a process to support the corrective action or a re-prioritization of resources to meet a schedule.

Corrective Action Planning:

  • Describe specific actions being taken, or to be taken, to alleviate or minimize the impact of the problem
  • Include the individual or organization responsible for the required action
  • Include schedules for the actions and estimated completion dates (ECD)
  • If no corrective action is possible, explain why
  • Include results of corrective action plans from previous VARs

Ask yourself: is the analysis presented in a manner that is understandable? Does the data support the narrative? Does the variance explanation provide specifics of:

“why” the problem occurred,
“what” is impacted now or in the future,
“how” the corrective action is being taken,
“when” the corrective actions will occur,
“when” the schedule variance will become zero and/or the work gets “back on schedule”, and
“who” is responsible for implementing the corrections?

Remember, a well-developed Variance Analysis Report can reduce the risk of a Corrective Action Request (CAR) during an EVMS review.

