The Office of the Inspector General (OIG) says that following the recommendations of a High-Level Panel in 2011 to improve management and internal controls, the Global Fund took significant steps to establish appropriate governance structures, processes, controls and systems for managing its portfolio of grants. However, the OIG added, “several control improvements are still needed to ensure that governance, risk management and internal controls over grant monitoring are fully effective and sustainable.”
These observations are contained in a report of the OIG’s audit into monitoring processes for grant implementation. The report was released on 3 November.
The audit considered all active grants as of 31 March 2017, but focused particularly on a representative sample of 27 grants. The review covered an implementation period of January 2016 to March 2017.
The OIG noted that its audit occurred at a time when the Global Fund’s Grant Management System (GMS) was undergoing significant change as a result of the introduction of a comprehensive new platform, the Grant Operating System (GOS). The GOS is still being implemented. The OIG said that some of the issues it identified in the audit will be addressed by the new platform, but that other issues had root causes that go beyond what the GOS can accomplish.
Editor’s Note: The audit report describes the limitations and deficiencies of the current grant monitoring processes in considerable detail (and often in rather technical terms). In the space we have available, we can only provide highlights. Readers are advised to consult the audit report for more information.
Annual funding decision process
The audit singled out for particular attention control gaps in the annual funding decision process. The OIG said these gaps could compromise the ability of the Global Fund to apply its performance-based funding principles effectively.
The purpose of the annual funding decision is to assess recent grant performance to ensure that funds are used as intended, to evaluate if expected results are being achieved, and to provide an appropriate level of funding for the coming year.
The OIG noted that although there are several monitoring processes in place, the annual funding decision is the only time when performance is comprehensively and holistically assessed by Secretariat staff independent of the country team.
The audit found that (a) policies and procedures related to the annual funding decision were not clear; and (b) there were limited controls to ensure the validity, accuracy and completeness of the information that led to the decision.
Current policies contain several requirements to support the funding decision, the OIG said. However, due to insufficient guidance, interpretations of these requirements differ significantly across different country teams. Although grant performance ratings are a key component of the funding decision, the OIG said, clear criteria are not in place to guide adjustments to the initial quantitative indicator ratings automatically generated by the GMS. The result is that in some cases expected adjustments are not made, while in other cases significant adjustments are made but are not supported.
The OIG found that:
- ratings adjustments for known data quality issues were not made by the country team in 47% of the funding decisions it reviewed. In another 33% of the funding decisions, required adjustments to the ratings were not made to account for major program management and financial issues. In both situations, the rationale for not adjusting the ratings was not documented by the country team;
- when funding decisions included exceptions requiring further approval, as defined in the operational policy, they did not receive the requisite level of approval in 60% of the cases reviewed and there was no documentation to support the deviation from policy; and
- when conditions precedent were attached to the grant agreement and were not met, these were not taken into account in 53% of the funding decisions reviewed, and no explanation was documented for the exception.
Regarding data quality issues, the OIG said that the annual funding decision process requires that ratings be adjusted if a grant has known data quality issues such as, for example, overstated program results. The Fund’s operational policies do not define the severity of data quality issues to be considered by the country team and do not provide guidance on potential triggers or scope of the downgrades, the OIG stated. In the grants it reviewed, the OIG said, some of the data quality issues were grant-specific, such as the PR’s inability to report impact and outcome indicators due to delays in its management information system. Others were more systemic, such as identified weaknesses in the national health information systems, lack of data quality assurance mechanisms, or weak supervision systems.
Due to the lack of guidelines in the current process, the OIG said, it was unclear which of these issues should have triggered an adjustment to the ratings. The annual funding decision tool did not document how these data quality issues were factored into the performance ratings, or the rationale for not adjusting those ratings. In addition, the OIG said, it is not possible to ascertain the completeness of the potential downgrades as data quality issues are currently not formally recorded and tracked.
In conclusion, the OIG found that processes related to the annual funding decision “need significant improvement.” This is the second-lowest rating in the OIG’s four-tiered rating scheme.
Grant performance framework and grant budget
The OIG noted that in recent years, the Global Fund has developed comprehensive guidelines to support the development of the grant performance framework and budget. However, the OIG said, existing controls are not effective to ensure that the framework and budget are consistent with those guidelines.
The audit found that there was limited alignment between the performance framework and the budget. The OIG said that of 27 grants it reviewed, on average, 28% of the total grant budget was allocated to activities or line items for which there were no related performance indicators.
The OIG said that after the grant agreement is signed, the country team’s Public Health, Monitoring and Evaluation Officer is responsible for entering key data from the performance framework into the GMS. However, the OIG added, effective controls are not in place to ensure that the officer accurately and completely captures all relevant information. For 14 indicators in seven of the 27 grants that the OIG reviewed, target values were either missing in the performance framework, or the target value captured in the GMS was inconsistent with the framework. In these same grants, the OIG said, there were also instances where target values were incorrectly captured in the GMS, with percentages used instead of the actual values.
The OIG said that several grants received the highest performance rating of A1 while absorbing as little as 25% of the funds. The OIG added that although there can be a number of valid reasons for variances between programmatic performance and absorption, such as contributions from other donors or reporting results on national targets, this may indicate a misalignment between the performance framework and the budget in some cases.
The audit report documents several weaknesses with respect to how grant ratings are determined. For example, the OIG stated, there is inadequate guidance on rating upgrades. In addition, in the current grant management system, grant ratings can be changed by anyone in the country team at any time – even after a rating has been formally approved and signed off for purposes of the annual funding decision. There is no audit trail of changes made to the ratings, the OIG added.
The OIG noted that a new quality assurance framework process was formalized in August 2017 to ensure better compliance with the guidelines. While the new process was tested and found generally adequate, the OIG observed, it still needs enhancing by including controls to ensure accountability for validity, accuracy and completeness of the performance frameworks.
In summary, the OIG found that processes related to the use of the performance framework and budget “need significant improvement.”
Ongoing monitoring of grant implementation
The OIG noted that the Global Fund has developed various reporting requirements to routinely monitor grant performance throughout the implementation period. As these reporting requirements have expanded over time, the OIG said, complex and often duplicative processes have developed without being sufficiently tailored to the specific country context or to the nature and timing of the various grants.
As a result, the OIG said, compliance with the reporting requirements has become increasingly challenging. For example, it said, routine progress updates from the principal recipient (PR) take an average of 129 days to submit compared to the 75-day requirement. Some of the progress updates were submitted as late as 11 months after the end of the period. Similarly, the OIG observed, performance letters took an average of two months to complete. Some of the letters were not issued at all while others were sent as late as 11 months after receiving the progress update from the PR.
The audit report included a diagram (see figure) illustrating the sheer volume of tools and sources of information that the fund portfolio manager and the rest of the country team need to manage, analyze and use for decision-making.
The audit also found that there is a need for more systematic monitoring of overall portfolio performance at the senior management level. Although grant management is the core business of the Global Fund, the OIG said, “an effective oversight body at the executive management level, with cross functional representation, is not yet in place to monitor the high-level performance of the grant portfolio, evaluate emerging trends and guide the related responses at a strategic level.”
The OIG found that, overall, the adequacy and effectiveness of the Secretariat grant monitoring structures were “partially effective.” This is the second-highest rating in the OIG’s four-tiered rating scheme.
Figure: Set of reports for monitoring grant risks and performance
Source: OIG audit report on Monitoring Processes for Grant Implementation
The OIG found that although the Secretariat has improved grant revision policies and processes, there is a lack of effective mechanisms, whether system-based or manual, to track grant revisions. Until such a mechanism is instituted, the OIG said, there is no reasonable assurance that grant revisions are performed for valid reasons, and are accurately and completely captured; that their impact on grant performance is effectively assessed; and that the revisions are approved at the right level.
The OIG concluded that the processes related to grant revisions “need significant improvement.”
Differentiation for impact
The OIG said that the implementation of the Differentiation for Impact Project represents a significant step forward in the management of grants. “This realignment is a continuous journey,” the OIG said, “and, if implemented effectively, some of its benefits may take years to fully materialize. However, in the short term and two years after its launch, it still remains unclear to what extent the immediate objectives of differentiating processes and optimizing resource allocations have been achieved.” In general, the OIG stated, up until now, the project has not yet achieved as radical a shift in resources and processes as initially envisaged.
The OIG said that key post-implementation processes were designed to support and embed the differentiated model, but that many of these supporting processes were not implemented. For example, it said, a dedicated team was supposed to be established to assist country teams on a day-to-day basis to answer questions, and to resolve challenges and issues in the transition to the new systems and revised processes. However, the OIG said, this support has been limited and not at the planned level, due to a number of conflicting priorities at the Secretariat. “This has resulted in country teams struggling to adapt to the new processes and, in many cases, reverting to the old processes,” the OIG stated.
In addition, the OIG said, the Secretariat was supposed to have evaluated the operational effectiveness of the differentiated model in July 2017, approximately six months after the implementation of the project. This evaluation has not yet occurred “due to competing organizational priorities,” the OIG said, and it is not scheduled to occur in the balance of 2017.
Although the Secretariat has made significant efforts to amend grant management processes to reflect the differentiated framework, the OIG said, the changes have been limited for focused countries despite a reduction in staffing. For example, processing an annual funding decision still takes the same amount of time for a focused country team as it does for a high-impact or core country team. The same applies to the progress update, the OIG said. The scope of the progress update for a focused country has not changed although the update is now only required on an annual basis. However, the OIG said, the local fund agent no longer verifies and analyzes the results of the progress update for a focused country. As a result, the time gained by the country team by no longer having to review the progress update semi-annually has been offset by the additional time needed to analyze the progress update in order to issue the PR with a performance letter.
The Differentiation for Impact project was meant to align the staff resources allocated to the country teams with the size of each country’s grant allocation, its disease burden and the strategic priorities of the organization, the OIG explained. The project aimed to split the allocation of grant management resources at 42% for high-impact countries, 37% for core and 21% for focused. Following the implementation of the project, the grant management resources are currently split at 41% (high-impact), 35% (core) and 24% (focused). Although this result was a significant achievement and very close to the initial target set by the project, the OIG said, the reallocation did not consider several key factors that impact the actual level of effort required in the day-to-day grant management activities of country teams.
For example, additional complexities such as the number of grants, the number and capacity of PRs in each portfolio, and the management of regional grants were not considered, the OIG said. In the case of regional grants, the OIG noted that responsibility for managing 13 regional grants amounting to $301 million – increasing to approximately $450 million in the 2017-2019 allocation period – was assigned to eight high-impact country team portfolios. In one case, a single fund portfolio manager and country team are responsible for six high-impact country grants amounting to $442 million in addition to a high-impact regional grant of $116 million. As a result of these imbalances, the OIG said, while three high-impact portfolios (Nigeria, DR Congo and India) received an increase in staff, the rest of the high-impact country teams had an increase in portfolio responsibilities without a corresponding increase in resources.
Agreed management actions
According to the audit report, the Global Fund Secretariat has plans to address the risks identified in the audit in several ways, including, most notably, through a follow-up stage (Phase II) of the new GOS. This project will be launched at the beginning of 2018 and will run for 18 months. Specifically, the Secretariat will:
- review grant implementation processes and controls implemented in the GOS re-design;
- update controls and associated operational guidance, as appropriate; and
- develop business intelligence reports to regularly monitor operational performance and compliance.
In addition, as part of the Impact through Partnership Transformation project, a Portfolio Review Committee will be established to review portfolio-wide performance on a regular basis. The Secretariat will also complete a post-implementation review of the Differentiation for Impact Project with a specific emphasis on focused countries.
Finally, with respect to the annual funding decision process, the OIG noted that the new GOS contains a module on this topic, that it was implemented by the Secretariat in September 2017, and that it should be operational by the end of this year.
The audit report presented several examples of best practices. The following is a summary.
Grants within the Cameroon portfolio had a performance framework that adequately covered all of the key objectives in the grant agreement. “This was due to the country team’s integrated and cascading approach with the PR for the development of the performance framework,” the OIG said. Technical specialists first reviewed and assessed the framework for appropriate coverage of the grant objectives, and alignment with the country’s other grants. In addition to the Public Health, Monitoring and Evaluation Specialist, the Health Products Management Specialist, the Legal Counsel, the Risk Officer and finance officers were involved.
The country team for DR Congo, a high-impact country, developed a comprehensive performance improvement plan, and used it to guide its day-to-day grant management activities. The plan covered all relevant PR actions from different monitoring tools, including OIG audit reports and investigations. It assigned accountability to the country team members for each area in the plan. The country team also included the results of the plan in its performance letter to ensure there was an adequate record of performance-related issues involving the PR. At year-end, the fund portfolio manager (FPM) sent a letter taking stock of the state of the portfolio to all interested parties, including the country coordinating mechanism, all PRs, and development partners. In addition, the FPM circulated an internal memo outlining the country team’s strategy for the year ahead.
For Haiti, a core country, the country team refined the structure of its performance letter to clearly distinguish programmatic, financial and other issues. The letter also identified which issues were considered major and which were minor. Management actions and deadlines were spelled out and the completed actions from the previous period were noted.
For El Salvador, a focused country, a summary of management actions, open and closed, is routinely disclosed to all PRs.
For Africa and the Middle East, and for AELAC (Asia, Europe, Latin America and the Caribbean), the regional teams were the only ones in the Grant Management Division that systematically assessed country and grant performance on a monthly basis. The teams developed internal tools and reporting templates to present country and grant performance. Performance issues are discussed, and action plans are created and then followed up at the next regional team meeting.
Message from the Executive Director
In a message attached to the audit report, Interim Executive Director Marijke Wijnroks referred to the fact that the audit was conducted while the Global Fund was implementing its new grant operating system. “The grant monitoring processes that were the subject of the audit are no longer in use,” Ms Wijnroks said. “The system improvements introduced by the Secretariat, including through the implementation of AIM [Accelerated Integration Management], address many of the issues identified in the audit.”
Ms Wijnroks said that the new Annual Funding Decision module that was launched in September 2017 is now fully operational, and that it addresses most of the gaps identified in the audit report. “The module allows Global Fund staff to review the implementation progress of each grant, identify implementation issues, risks and corresponding mitigating measures, [and] determine and commit funding to be disbursed on a pre-set disbursement schedule,” she said. “Every grant funding decision and disbursement request is now processed through this system. Since mid-September, a total of 132 disbursements have been processed for a total of approximately $180 million. At present, an additional 615 disbursements are in the pipeline for the remainder of 2017 and in 2018.”
In her message, Ms Wijnroks said that a Grant Revisions module was launched in May 2017, and enhanced in August 2017. This module enables the initiation, management and monitoring of revisions in line with the updated operational policy, she said. To date, 145 revisions have been initiated or processed through this module.
Ms Wijnroks said that the achievements of the GOS “represent a tremendous milestone in strengthening the grant monitoring process. They are the result of an energetic and sustained effort by the Accelerated Integration Management (AIM) project that began in 2015 to review, re-design, and optimize the grant management processes and to deliver an integrated solution to support efficient portfolio management.”
Throughout 2016 and 2017, Ms Wijnroks said, “the AIM team worked extremely hard to streamline and automate the core grant management processes resulting in aligned processes, strengthened controls and enhanced efficiency. It is a credit to that team that monitoring processes for grant implementation are now in much stronger shape.”
The OIG said that it will perform a follow-up audit once Phase II of the GOS has been completed, the new system is fully operational, and the revised processes and controls have been in place sufficiently long to enable a test of their effectiveness.