Earned Value Management System (EVMS)

Earned Value Management: How Much Is Enough?

How Much EVMS Is Enough

I took the scenic route to selecting the theme of this blog. First, it was suggested that I write a blog on the benefits and costs of the earned value process as it applies to program management. Next it was suggested that I describe the harm of not using any of the elements of the earned value process.

In the case of the benefits and costs of the earned value management process, it would be difficult to improve upon Dr. Christensen’s 1998 paper on this heading or to attempt to improve other papers and studies done by Wayne Abba, Gary Humphreys, Gary Christle, Coopers & Lybrand and others. So I will not make citations to these past studies. Rather I will leave them undisturbed, as the monuments they have become.

This blog will summarize my observations of how companies have chosen “how much EVM is enough” for them and share my observations of the results of these decisions. Each company has selected an EVM implementation strategy and each company’s strategy falls along a bounded continuum.

I will describe this continuum of company EVM strategies with a left hand and a right hand goal post, and the space between as a cross bar. The “left hand goal post” represents companies that elect to be very poor at EVM or to not use EVM at all. The “right hand goal post” represents companies that have committed to being “best-in-class” practitioners of the EVM process and are the polar opposite of the companies at the left hand goal post. There are few companies at either the left or right hand goal posts. The “cross bar” represents the vast majority of companies that have selected an EVM strategy somewhere between the left and right goal posts.

Two Goal Posts And A Cross Bar; Recalcitrant, Merely Compliant, Efficiently Expert

There are as many strategies to earned value management as there are companies using EVM to manage their programs and projects.

Left Goal Post; The Recalcitrant

I have firsthand experience with a company that, at the time I joined it, had decided to ignore earned value management even though it was a requirement in several of its contracts. After many painful years of attempting to maintain this recalcitrant EVM strategy, the company decided that a better strategy would be to become “efficiently expert” at EVM.

Cross Bar; Merely Compliant at EVM

It has been my experience that most companies desire to “become EVM compliant,” which generally means being compliant to the 32 guidelines and not failing those guidelines so as to be de-certified. This is the vast middle ground between the two goal posts. I will now share five observations regarding companies in the “cross bar” majority.

Observation #1: Compliance As A Goal; Golf and EVM

Compliance should be a “given,” or a “pre-condition,” not a “goal.” Remaining merely compliant implies a status quo or static posture.

I will use the game of golf as an analogy. Golf is a game of honor and compliance to well established rules. All PGA professional tour golfers “comply” with the rules that govern golf. Although all PGA tour pro golfers comply with these rules, their performance on tour differs dramatically.

Fifty-three percent of all PGA golf pros, past and present, have no tour wins. That means only 47% of all PGA Tour pros have won at least one tour event. There are seven players in the history of the PGA Tour with fifty or more tour wins. If the bar is lowered to forty or more wins, only three players are added to the list. If the bar is lowered again to thirty or more tour wins, only eight more players are added. In all, only 18 golfers have won 30 or more PGA Tour events.

Professional golfers do not confuse compliance with performance, nor do these professionals assume that “being compliant” will improve their performance.

Observation #2: “The Tyranny of The Status Quo”

With apologies to Milton Friedman and his book of the same name, companies that attempt to maintain mere guideline compliance will do no better than the status quo, and more often than not, regress toward non-compliance. Maintaining status quo is a myth – you either improve or regress.

All professionals, companies included, must compete in their markets and selected fields. To succeed in this competition requires constant improvement in areas critical to success. A company, organization, or individual without the means or the desire to improve will eventually fail and perhaps perish.

Observation #3: Blaming The Scoreboard

As a program manager, I considered EVM as my scoreboard. I reacted to the EVM data – the scoreboard – and made decisions based on that data (GL #26).

I recall the final score of Super Bowl XLIX, which capped the 2014 season: Patriots 28, Seahawks 24. Did the scoreboard cause the Seahawks to lose the game, or did a poor decision by their coach cause the loss? Imagine a coach who cannot see the scoreboard. That coach does not know the score or how much time remains. That coach cannot react to the realities of the game.

Observation #4: EVM Causes Poor Program Performance

I have witnessed several company leaders assert that the use of EVM on a poorly performing program is the cause of that program’s poor cost and schedule performance. A correlation between two variables, or a sequence of two events (use of EVM followed by poor performance), does not imply that one caused the other. This is the logical fallacy of “X happened, then Y happened, therefore X caused Y.” Night follows day, but day does not cause night. Use of EVM does not cause poor program performance. What often leads to poor outcomes is failing to react to the EVM data and to take prompt corrective action on the program’s cost and schedule performance.

Observation #5: It Takes More Energy To Be Poor At EVM Than To Be Expert

Returning to the earlier golf analogy, professional golfers make very difficult shots appear easy. I played in one pro/am tournament years ago. The pro I was teamed with took me to the range hours before our tee time. He asked me how many balls I hit before each round. I told him sometimes none and sometimes 50. He hit 1,000 balls before our round. When we finished our round, he was ready for another 18 holes. I was not. Both of us “complied” with the rules of golf. His score was significantly lower than mine. His game was effortless and produced a below par score. My game was labored and produced a poor result.

And so it is with EVM or any other process. The better you are at a skill, the easier it becomes. Experts consume far fewer calories at their craft than ambivalent amateurs.

Right Goal Post; Efficiently Expert At EVM

The polar opposite of a recalcitrant strategy to EVM is a strategy to become “efficiently expert.” As I mentioned earlier, I joined a company that attempted to sustain a recalcitrant EVM strategy. Their recalcitrant EVM strategy led to de-certification, large dollar withholdings, and significant damage to their corporate reputation.

After the most ardent EVM recalcitrants in this company “sought employment elsewhere,” a new strategy was adopted. This company embraced a strategy to become “best-in-class” as expert practitioners of EVM. This company’s goal was EVM perfection. EVM perfection is an impossible ambition, but wiser than “mere compliance.” And as with the PGA tour golf pro, EVM became nearly effortless.

Which EVM strategy will your company choose?

 

Robert “Too Tall” Kenney
H&A Associate

Clarification on the New Department of Defense Earned Value Management System (EVMS) Thresholds | DOD & DPAP

New Department of Defense Earned Value Management System (EVMS) Thresholds

On September 28, 2015, the Defense Procurement and Acquisition Policy Directorate (DPAP) released a memorandum entitled “Class Deviation – Earned Value Management System Threshold”. In this memo the DoD changed the threshold for EVMS application to $100 million for compliance with EIA-748 for cost or incentive contracts and subcontracts. That same memorandum stated that no EVMS surveillance activities will be routinely conducted by the Defense Contract Management Agency (DCMA) on contracts or subcontracts between $20 million and $100 million. Attached to the memorandum were reissues of the Notice of Earned Value Management System DFARS clause (252.234-7001) and the Earned Value Management Systems DFARS clause (252.234-7002), both reflecting the new $100 million threshold.

In response to this guidance, a series of questions from both contractors and other government personnel were submitted to Shane Olsen of the DCMA EVM Implementation Division (EVMID). Below are the salient points from this communication:

  • There will be no EVMS surveillance of DFARS contracts under $100 million. Contracts without the DFARS clause, such as those under other agencies using the FAR EVM clause, will continue surveillance under their current thresholds.
  • The $100 million threshold is determined by the larger of the contract’s Ceiling Price or Target Price, as reported on the Integrated Program Management Report (IPMR) or Contract Performance Report (CPR) Format 1 (see the sketch after this list).
  • The threshold is based on the Contract Value including fee (at Price) as noted above. If there is an approved Over Target Baseline (OTB) which increases the Total Allocated Budget (TAB), this cannot push a contract over the threshold.
  • The new thresholds apply not only to subcontracts, but also to inter-organizational work orders with an EVMS flow-down.
  • Regardless of the circumstances, the DCMA will not conduct surveillance on contracts less than $100 million. However, if there are Earned Value issues that the buying command or other parties believe need to be reviewed, then the DCMA may conduct a Review for Cause (RFC) of the system against potentially affected guidelines.
  • The DCMA Operations EVM Implementation Division (EVMID) will not be conducting Compliance Reviews in FY-2016 unless there is an “emergent need”.
  • If a site is selected for a Compliance Review, only contracts greater than $100 million would be in the initial scope of the Implementation Review (IR). However, if an issue is discovered that requires the team to “open the aperture”, other contracts are not precluded.
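
To make the threshold arithmetic above concrete, here is a minimal sketch of the decision logic as summarized in this post. The constant names and the simplified two-tier treatment are illustrative assumptions, not official DCMA policy or tooling.

    # Minimal sketch of the EVMS threshold logic summarized above (illustrative only).
    EVMS_COMPLIANCE_THRESHOLD = 100_000_000  # EIA-748 compliance and routine surveillance at/above $100M
    SURVEILLANCE_FLOOR = 20_000_000          # $20M-$100M band: no routine DCMA surveillance

    def contract_value(ceiling_price: float, target_price: float) -> float:
        """Use the larger of Ceiling Price or Target Price (per IPMR/CPR Format 1).
        An approved OTB that raises the TAB does not change this value."""
        return max(ceiling_price, target_price)

    def evms_treatment(ceiling_price: float, target_price: float) -> str:
        value = contract_value(ceiling_price, target_price)
        if value >= EVMS_COMPLIANCE_THRESHOLD:
            return "EIA-748 compliant EVMS required; routine DCMA surveillance"
        if value >= SURVEILLANCE_FLOOR:
            return "No routine DCMA surveillance (a Review for Cause is still possible)"
        return "Below the EVMS application thresholds"

    # Example: a contract with a $95M ceiling and a $90M target falls in the $20M-$100M band.
    print(evms_treatment(95_000_000, 90_000_000))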

The DCMA is still working on a response to the following questions:

  • How do I handle a contract that is currently below $100 million but has options that, in aggregate, would exceed $100 million?
  • How is the contract value determined on:
    • Indefinite Delivery/Indefinite Quantity (ID/IQ) Contracts
    • Non-ID/IQ with Multiple CLIN-Level or Task Order reports?

This blog will be updated and reposted as answers to these questions are given.

EVMS Variance Analysis — EVMS Analysis and Management Reports

A Variance Analysis Report (VAR) that includes specific information about the cause, impact, and corrective action “provides management with early insight into the extent of problems and allows corrective actions to be implemented in time to affect the future course of the program” [reference: NDIA, IPMD EIA-748 (Revision D) EVMS Intent Guide]. Unfortunately, variance analysis is an easy target for criticism during EVMS reviews. There are many examples of inadequate variance analysis to choose from, but what they all have in common is the lack of specific information on the “why, what, how, when, and who” of any variance. The variance analysis reporting requirements are found in the EIA-748 (Revision D) Guidelines in Section IV., Analysis and Management Reports, Guidelines 22-27.

EIA-748 Guidelines, Section IV. Analysis and Management Reports:

  • Guideline 22 (2-4a): Control Account Monthly Summary, Identification of CV and SV
  • Guideline 23* (2-4b): Explain Significant Variances
  • Guideline 24 (2-4c): Identify and Explain Indirect Cost Variances
  • Guideline 25 (2-4d): Summarize Data Elements and Variances through WBS/OBS for Management
  • Guideline 26* (2-4e): Implement Management Actions as a Result of EVM Analysis
  • Guideline 27* (2-4f): Revise EAC Based on Performance Data; Calculate VAC
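
As a concrete illustration of Guidelines 22 and 23, the short sketch below computes control account cost and schedule variances and flags those that breach reporting thresholds. The field names and the $50K/10% thresholds are assumptions for illustration only; actual thresholds come from the contract, the IPMR CDRL, or internal command media.

    from dataclasses import dataclass

    @dataclass
    class ControlAccountPeriod:
        """Cumulative EVM data for one control account (names are illustrative)."""
        ca_id: str
        bcws: float   # Budgeted Cost for Work Scheduled
        bcwp: float   # Budgeted Cost for Work Performed (earned value)
        acwp: float   # Actual Cost of Work Performed

        @property
        def cv(self) -> float:   # Guideline 22: identify cost variance
            return self.bcwp - self.acwp

        @property
        def sv(self) -> float:   # Guideline 22: identify schedule variance
            return self.bcwp - self.bcws

    def is_significant(variance: float, base: float,
                       dollar_threshold: float = 50_000,
                       percent_threshold: float = 0.10) -> bool:
        """Assumed dual threshold: dollar amount AND percent of the comparison base."""
        return abs(variance) >= dollar_threshold and base > 0 and abs(variance) / base >= percent_threshold

    ca = ControlAccountPeriod("1.2.3.4", bcws=1_200_000, bcwp=1_050_000, acwp=1_300_000)
    if is_significant(ca.cv, ca.bcwp) or is_significant(ca.sv, ca.bcws):
        print(f"{ca.ca_id}: CV={ca.cv:,.0f}, SV={ca.sv:,.0f} -> VAR required (Guideline 23)")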


A VAR that includes specific information and data about a problem will allow management to make informed decisions and mitigate project risk. Getting specific about variance analysis reporting includes the following elements.

Overall:

  • Emphasis on the quantitative, not qualitative
  • Emphasis on the specific, not the general
  • Emphasis on significant problems, not all problems
  • Define abbreviations and acronyms at first use
  • The Control Account Manager (CAM) is the most knowledgeable person to write the variance analysis report but will need information from the business support team

Cause:

  • Isolate significant variances
  • Discuss cost and schedule variances separately
  • Clearly identify the reason (root cause) for the variance (ties to the corrective action plan)
  • Clear, concise explanation of the technical reason for the variance
  • Provide cost element analysis (a labor rate and volume sketch follows this list)
    • Labor – hours, direct rates, skill mix, overtime (rate & volume)
    • Material – unplanned requirements, excess quantities, unfavorable prices (price & usage)
    • Subcontracts – changing requirements, additional in-scope work, schedule changes
    • Other Direct Costs – unanticipated usage, in-house vendor
    • Overhead (indirect) – direct base, rate changes
  • Identify what tasks are behind schedule and why
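
For the labor element in the list above, one way to get specific about “rate & volume” is to split the labor cost variance into a rate component and an efficiency (hours) component. The decomposition below is the standard one; the hours and rates are made-up numbers for illustration.

    # Standard labor cost variance decomposition (illustrative numbers).
    # CV = BCWP - ACWP, with BCWP = earned_hours * planned_rate
    # and ACWP = actual_hours * actual_rate.

    earned_hours, actual_hours = 1_000.0, 1_150.0   # hours
    planned_rate, actual_rate = 85.0, 92.0          # $/hour

    bcwp = earned_hours * planned_rate
    acwp = actual_hours * actual_rate
    cost_variance = bcwp - acwp

    # Efficiency (volume/usage) variance: hours over/under-run priced at the planned rate.
    efficiency_variance = (earned_hours - actual_hours) * planned_rate
    # Rate variance: rate over/under-run applied to the hours actually worked.
    rate_variance = actual_hours * (planned_rate - actual_rate)

    assert abs(cost_variance - (efficiency_variance + rate_variance)) < 1e-6
    print(f"CV = {cost_variance:,.0f} (efficiency {efficiency_variance:,.0f} + rate {rate_variance:,.0f})")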

Impact:

  • Describe specific cost, schedule, and technical impact on the project
  • Project future control account performance (continuing problem)
  • Address effect on immediate tasks, intermediate schedules, critical path, driving paths, risk mitigation tasks
  • Describe erosion of schedule margin, impacts to contractual milestones or delivery dates, and when the schedule variance will become zero (this may only mean the work was completed late (BCWPcum = BCWScum) and does not necessarily mean getting “back on schedule”)
  • Describe any impact to other control accounts
  • Assess the need to revise the Estimate at Completion and provide supporting rationale (justify ETC realism – CPI to TCPI comparison, impacts of the corrective action plan, risk mitigation, open commitments, staffing changes, etc.; see the sketch after this list)
  • Note: If there is a root cause, there will be an impact. It could be related to cost, schedule, lessons learned to be applied to future activity, an update required to a process to support the corrective action or a re-prioritization of resources to meet a schedule.
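
To support the ETC/EAC realism check referenced in the list above (CPI versus TCPI), here is a minimal sketch of the standard indices. The figures are invented, and the CPI-based EAC shown is only one of several accepted cross-check formulas.

    # Standard EAC realism indices (illustrative numbers).
    bac = 10_000_000.0       # Budget at Completion
    bcwp_cum = 4_000_000.0   # cumulative earned value
    acwp_cum = 4_600_000.0   # cumulative actual cost
    eac = 11_200_000.0       # the CAM's current Estimate at Completion

    cpi = bcwp_cum / acwp_cum                       # cost efficiency demonstrated to date
    tcpi_eac = (bac - bcwp_cum) / (eac - acwp_cum)  # efficiency needed to achieve the EAC
    vac = bac - eac                                 # Variance at Completion (Guideline 27)
    eac_cpi = acwp_cum + (bac - bcwp_cum) / cpi     # simple CPI-based cross-check

    # A TCPI well above the demonstrated CPI suggests the EAC is optimistic.
    print(f"CPI={cpi:.2f}, TCPI(EAC)={tcpi_eac:.2f}, VAC={vac:,.0f}, CPI-based EAC={eac_cpi:,.0f}")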

Corrective Action Planning:

  • Describe specific actions being taken, or to be taken, to alleviate or minimize the impact of the problem
  • Include the individual or organization responsible for the required action
  • Include schedules for the actions and estimated completion dates (ECD)
  • If no corrective action is possible, explain why
  • Include the results of corrective action plans reported in previous VARs.

Ask yourself: Is the analysis presented in a manner that is understandable? Does the data support the narrative? Does the variance explanation provide specifics of:

“why” the problem occurred,
“what” is impacted now or in the future,
“how” the corrective action is being taken,
“when” the corrective actions will occur,
“when” the schedule variance will become zero and/or the work gets “back on schedule,” and
“who” is responsible for implementing the corrections?

Remember, a well-developed Variance Analysis Report can reduce the risk of a Corrective Action Request (CAR) during an EVMS review.

DoD Earned Value Management System Interpretation Guide | EVMSIG

The updated DoD Earned Value Management System Interpretation Guide (EVMSIG), dated February 18, 2015, was released in March 2015.

This DoD update, per the GAO, focuses on “(1) problems facing the cost/schedule control system (CS2) process; (2) progress DOD has made with reforms; and (3) challenges DOD faces in fostering and managing potentially significant changes”.

The update commences with:

EVMSIG INTRODUCTION

1.1 Purpose of Guide

Earned Value Management (EVM) is a widely accepted industry best practice for program management that is used across the Department of Defense (DoD), the Federal government, and the commercial sector. Government and industry program managers use EVM as a program management tool to provide joint situational awareness of program status and to assess the cost, schedule, and technical performance of programs for proactive course correction. An EVM System (EVMS) is the management control system that integrates a program’s work scope, schedule, and cost parameters for optimum program planning and control. To be useful as a program management tool, program managers must incorporate EVM into their acquisition decision-making processes; the EVM performance data generated by the EVMS must be timely, accurate, reliable, and auditable; and the EVMS must be implemented in a disciplined manner consistent with the 32 EVMS Guidelines prescribed in Section 2 of the Electronic Industries Alliance Standard-748 EVMS (EIA-748) (Reference (a)), hereafter referred to as “the 32 Guidelines.”

The DoD EVMS Interpretation Guide (EVMSIG), hereafter referred to as “the Guide”, provides the overarching DoD interpretation of the 32 Guidelines where an EVMS requirement is applied. It serves as the authoritative source for EVMS interpretive guidance and is used as the basis for the DoD to assess EVMS compliance to the 32 Guidelines in accordance with Defense Federal Acquisition Regulation Supplement (DFARS) Subpart 234.2 and 234.201 (References (b) and (c)). The Guide provides the DoD Strategic Intent behind each guideline as well as the specific attributes required in a compliant EVMS. Those attributes are the general qualities of effective implementation that are tested in support of determining EVMS compliance as it relates to the 32 Guidelines. As applicable, the DoD Strategic Intent section may clarify where differences in guideline interpretation exist for development and production type work. DoD agencies and organizations charged with conducting initial and continuing EVMS compliance activities will establish amplifying agency procedures and/or guidance to clarify how they are implementing this Guide to include the development of evaluation methods for the attributes associated with each of the 32 Guidelines.

1.2 EVM Policy

The Office of Management and Budget Circular No. A-11 (Reference (d)), the Federal Acquisition Regulation (FAR) Subpart 34.2 and Part 52 (References (e) through (h)) require federal government agency contractors to establish, maintain, and use an EVMS that is compliant with the 32 Guidelines on all major capital asset acquisitions. Based on these federal regulations and the DoD Instruction 5000.02 (DoDI 5000.02) (Reference (i)), the DoD established the Defense Federal Acquisition Regulation Supplement (DFARS) 234.201 (Reference (c)), which prescribes application of an EVMS, via the DFARS 252.234-7002 EVMS clause (Reference (j)). When EVM reporting is contractually required, the contractor must submit to the government an Integrated Program Management Report (IPMR) (DI-MGMT-81861) (Reference (k)) to report program cost and schedule performance data. The IPMR is being phased in to replace the Contract Performance Report (CPR) (DI-MGMT-81466) and the Integrated Master Schedule (IMS) (DI-MGMT-81650). Hereafter, for simplicity purposes, the term “IPMR” is used to reference legacy or current CPR/IMS DIDs. There are times in this Guide when the IMS reference is to an output of the contractor’s internal management system, i.e., a work product, which may not be referred to in the same context as the IPMR. [The full EVMSIG update is found here.]

Furthermore, in March 2015 the GAO released its “Report to the Committee on Armed Services, House of Representatives: Defense Acquisition | Better Approach Needed to Account for Number, Cost, and Performance of Non-Major Programs”.

An overview:

The Department of Defense (DOD) could not provide sufficiently reliable data for GAO to determine the number, total cost, or performance of DOD’s current acquisition category (ACAT) II and III programs (GAO-15-188). These non-major programs range from a multibillion dollar aircraft radar modernization program to soldier clothing and protective equipment programs in the tens of millions of dollars. GAO found that the accuracy, completeness, and consistency of DOD’s data on these programs were undermined by widespread data entry issues, missing data, and inconsistent identification of current ACAT II and III programs. See the figure below for selected data reliability issues GAO identified. [The full GAO-15-188 document is found here.]

Corrective Action Response: Planning and Closure – Part 2 of 2

Review Part 1 of Corrective Action Response, which addresses Sources; this Part 2 addresses Planning and Closure.


Responding to a Corrective Action Request (CAR) – Planning and Closure

It is important that the contractor develop a disciplined, standardized approach for responding to a corrective action request.  This not only helps ensure that the responses are complete and contain compliant corrective actions, but also that they represent the position of the entire contractor team.  Below are nine suggested steps for successful Corrective Action Plan (CAP) development.

1)    Review the DRs/CARs with the customer

Prior to developing a corrective action in response to a Corrective Action Request (CAR), the first step is to ensure that both parties, the contractor and the review team, have a mutual understanding of the finding.  This also serves to screen those findings that may have been the result of a misunderstanding with the data or an incorrect statement from a member of the contractor’s team.  It is also recommended that DRs/CARs with similar or duplicative findings be grouped together so that a single Corrective Action Plan (CAP) can be used to address the issue.  When doing this, it is imperative that this approach is communicated to the review team lead and the grouping strategy approved before beginning corrective actions.  This is generally an acceptable approach providing the CAP closures can be traced to the original findings.

2)     Organize for successful CAP management

Once a mutual understanding has been reached on the corrective actions, the contractor must then begin the process of correcting or mitigating the identified issues.  It is critical that the corrective action process has the participation of key management and organizations that can effect change.  When there are a significant number of findings to be corrected, the establishment of a senior management Review Board is a recommended method for managing the process.  The roles of the board are:

    • Ensure a CAP is developed and supported by a structured CAR/DR resolution process;
    • Assign an individual from the responsible organization to lead the corrective action efforts;
    • Review the proposed schedule for the CAP, and monitor progress towards CAP closure;
    • Review and approve all CAR/DR root cause assessments and proposed corrective action including the closure criteria;
    • Serve as the primary point of contact with the Customer for CAR/DR resolution and closure.

3)     Begin a thorough Root Cause Analysis

A tempting direction at this stage is to allow for a quick fix of the identified issue.  This may be acceptable for “just fix it” types of findings such as typos, formula errors, incorrect data runs, etc.; but most findings require a more in-depth approach to ensure that the underlying drivers of the issue are being addressed.  Most organizations have employees who are specialized in root cause analysis, such as Six Sigma or LEAN process improvement advisers. This would be a good time to employ their skills.  Tools such as “The 5 Whys” and the Ishikawa Fishbone Diagram are excellent methods for identifying the root causes.  These tools and processes are extremely effective in uncovering the sources of the problem.

A customer review team often samples a subset of CAMs, processes, or data in its review because of a limited amount of time or resources.  It is often the case that a more thorough root cause analysis conducted by the contractor team will uncover additional issues that need to be addressed and corrected.   The contractor’s obligation to the customer is to provide full visibility regarding the corrective actions associated with those findings identified by the customer.  While it is important that all issues are corrected or mitigated, it is, however, the contractor’s choice to allow visibility into those issues that were not discovered by the customer review team.

4)     Develop and evaluate Corrective Action Plans

A single DR or CAR issued by a customer team may have numerous corrective actions identified in the solution process.  Often a single problem may have corrective actions that entail changes in processes, training, tools, or management approach, or any combination of all of these.  Regardless, it is important to identify corrective actions that will prevent recurrence of similar outcomes, and will not cause or introduce other new or additional problems.  One important benefit of including senior management in the CAP Review Board process is the capability to reach beyond the owners of a particular CAP to influence other stakeholders in the organization who have the responsibility to incorporate corrective actions or who may be impacted by the solutions being identified.

5)     Develop verification closure steps

It is critical that verification methods, objective measures, metrics, artifacts, and evidential products are identified that will verify that the corrective actions are effective.  This includes any exit criteria for any activities in the CAP Integrated Master Schedule (IMS), which is a schedule network that contains all the detailed work packages (including activities and milestones) and planning packages to support the events, accomplishments, and criteria of the Integrated Master Plan  (if applicable). It is directly traceable to the Contract Work Breakdown Structure (CWBS) and the contract statement of work. The IMS is critical to CAP success.  On data driven findings, the criteria for verification often involves producing several accounting periods of results as evidence that the corrective actions were effective.  The CAP Review Board is responsible for reviewing the status of the exit criteria, and verifying that the required objective measures have been satisfied.

6)     Develop a detailed Integrated Master Schedule for CAP implementation

A critical component of any project, including corrective action development and implementation, is a detailed IMS containing the project scope and the required dates of completion.  There should be a unique IMS for each CAP that includes:

    1. Root Cause Analysis
    2. Changes to processes, tools, training, and other required system adjustments
    3. Management Review and regular team meetings
    4. Responsibility assignment for each activity
    5. Development of products and artifacts which will demonstrate effectiveness
    6. Validation and Verification steps with Closure Criteria

Resource loading the IMS is an important process, as it communicates to the management team the required personnel to accomplish implementation of the Corrective Action Plans, and can serve as a commitment on its part to support the process until closure.  If there is a lack of resources available to support the process, this may impact the completion dates established for the corrective actions.  All tasks should be logically networked (with predecessors and successors) without any constraints.  Progress should be based on a 0 to 100% scale without subjective interpretation.  As mentioned above, data validation normally requires several months of data submittals, and these deliveries should be milestones in the IMS.  Completion milestones should include notifying the customer of corrective action implementation and confirmation by the customer that the implementation is complete.  Each activity should also have fields which identify the CAR or DR number, the EV Process Area and Guideline, the responsible manager for the CAP, and a unique ID number for each task.
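
As one way to picture the task attributes described above, here is a minimal sketch of a CAP IMS task record. The field names and sample values are illustrative assumptions, not a prescribed schema or scheduling-tool format.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class CapImsTask:
        """One activity in a CAP IMS (field names are illustrative)."""
        task_id: str                       # unique ID for each task
        description: str
        car_or_dr_number: str              # ties the task back to the originating CAR/DR
        ev_process_area_guideline: str     # e.g., "Analysis and Management Reports, GL 23"
        responsible_manager: str           # responsible manager for the CAP
        predecessors: List[str] = field(default_factory=list)
        successors: List[str] = field(default_factory=list)
        percent_complete: int = 0          # 0 to 100% scale, objective status
        estimated_completion: Optional[date] = None

    verify = CapImsTask(
        task_id="CAP-042-060",
        description="Deliver three accounting periods of corrected IPMR data",
        car_or_dr_number="CAR-042",
        ev_process_area_guideline="Analysis and Management Reports, GL 23",
        responsible_manager="Program Controls Manager",
        predecessors=["CAP-042-050"],
        estimated_completion=date(2016, 3, 31),
    )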

Reviewing the CAP IMS and the accomplishment status is a critical role of the CAP Review Board.

7)     Submit CAP and CAP IMS to the customer for approval prior to implementing the Corrective Actions

While some corrective actions may be straightforward responses to simple findings, it is important to reach mutual agreement on the CAP approach prior to implementation.  Often the customer’s approval of the CAP is a required step before proceeding.  Important in this agreement is consensus on the artifacts and data sets that will be delivered, along with the timing of the deliveries.

One topic that may need to be addressed with the customer review team is a cutoff date for data corrections.  For example, it is important to reach agreement on the “as of” date for clean data, because changing historical data is usually an unnecessary step.  Occasionally a corrective action is delayed until a new contract modification is implemented or a new contract is baselined before the correction can be implemented and verified.  These conditions need to be agreed upon with the customer prior to proceeding.

8)     Implement Corrective Action Plans and track progress to successful completion

One path to the escalation of a CAR to Level IV*, and possibly the introduction of Business System payment withholds, is the failure to successfully implement an agreed upon Corrective Action Plan.  Many organizations discover that the actual implementation of the approved corrective actions is the most difficult part of the process.  Sometimes a successful plan will include interim modifications or fixes in the short term, with long term changes identified as well.  If, for example, the issue were with the integration between the MRP and EVM systems, an interim solution may involve a change in the interface or translation of data between the systems, while in the long term a replacement of the MRP is required.  It is important to have CAP solutions that not only mitigate the findings, but can also be implemented in an acceptable period of time.

It is also important to meet interim commitments of data, processes, or any agreed to delivery of an artifact.  If the execution of a CAP will be delayed for any reason, this should be communicated quickly to the customer.

9)     CAR closure and follow-up

When the issuer of the CAR is satisfied that the contractor’s corrective actions are appropriate to prevent recurrence of the noncompliance, and the solutions have been verified to be effective, the contractor will be notified that the CAR is considered closed.  Even after closure, the areas identified as needing improvement are often targeted for periodic follow-on reviews; so it is important that management attention is maintained to sustain the corrective action.  A well organized and disciplined internal surveillance program is often the best safeguard against future discrepancy reports.

For more information about responding to Corrective Action Requests, contact our consultants at Humphreys & Associates.

*Link to part 1 of Corrective Action Response: Sources
