First Evaluations Spark Reflection on OGP Effort

1 November 2013

By Toby McIntosh

The official launch Nov. 1 of the reports by independent reviewers on the national action plans of the founding members of the Open Government Partnership provides preliminary answers to several key questions about the OGP experiment.

Is it making a difference?

How is the review process doing?

The first eight evaluation reports were announced officially Nov. 1, covering the founding OGP countries: Brazil, Indonesia, Mexico, Norway, the Philippines, South Africa, the United Kingdom, and the United States. (Most have been on the OGP website for some days. Public comment on them is being invited.)

“I think the jury is still out on whether the IRM is giving real teeth to the OGP,” said Mo Ibrahim, founder of Celtel International and the Mo Ibrahim Foundation, at the plenary session. Ibrahim is a senior advisor to the process, which is known as the OGP Independent Reporting Mechanism (IRM).

Francis Maude, the minister for the UK Cabinet office, called the evaluations “crucial to the credibility of the whole OGP.”

“The whole point about the IRM is that it holds us to account. You don’t just get to mark your own paper,” he said during the plenary session. The evaluator was “bang to rights” that the UK consultation process needed improvement, he said, and the message caused changes.

Commitment Fulfillment Tally

The overall numbers presented in a spreadsheet paint a mixed picture of accomplishment.

The eight countries made a total of 175 commitments; the reports assess progress in the first year since the plans were issued. Of the 175 commitments, 79 were completed. Substantial progress was reported for 40, and limited progress in 42 instances. Accounting for the others: eight were withdrawn, three were not started, two were “unclear” and one received “no answer.”

Is the OGP making a difference? “A qualified yes,” said Rosemary McGee, a technical expert to the IRM process from the UK Institute of Development Studies, during the plenary session.

“There is a social capital being built,” said Warren Krafchik, an OGP co-chair, who called the development of “a conversation between civil society and governments” the signal achievement of the first two years.

During the many sessions at the two-day London summit Oct. 31 – Nov. 1, attended by more than 1,000 persons, enthusiasm was tempered with suggestions. One indication of intensifying attention on the OGP was that major nongovernmental groups held a strategy session.

The independent reviewers’ reports are particularly tough on the process through which the national action plans were prepared, tending to agree with persistent civil society complaints that it was not sufficiently consultative.

The main caveat on the numbers, said one person involved with the process, is that countries that picked easy goals will score better than countries that picked ambitious ones.

How is the review process going?

The preparation of the first eight reports is revealing strengths and weaknesses in the process, according to those involved.

The lessons were frankly discussed at a meeting of the reviewers, including some of those hired to evaluate 35 more countries in the coming months.

The first evaluations ruffled some feathers.

Some countries have complained that they weren’t consulted about the final reports and didn’t like the content.

The process allowed countries to comment on the facts in the draft reports. This spawned requests for more time, and one country, Brazil, submitted 200 comments. Some countries wanted another crack at the final report before its release. A South African representative said during the Nov. 1 plenary session that the reviewers should meet with heads of state.

At the closed OGP Steering Committee meeting Oct. 29, several countries complained about the process. OGP officials said a review is under way to clarify the terms of engagement.

Informal Session on Lessons

Some of the first reviewers shared their experience with the next generation of reviewers.

A common experience was that the OGP’s profile, which appears glitteringly high at the London summit, is actually quite low among civil society groups and government officials around the world. One reviewer described her role as being a “marketer” explaining the OGP.

The reviewers’ reports attracted considerable media attention in South Africa and the Philippines, some press in the United Kingdom, but very little elsewhere.

The reports on South Africa and the Philippines are judged the most critical, but other factors were credited for the low visibility of the reports, including the lack of practical support for publicizing them and the nature of the subject matter.

Some reviewers said their job was made more difficult by disorganized governments and vague commitments.

Another topic that remains under active discussion is the scope of the reviews, particularly whether they should look at goals not chosen or comment on constraints on civic space.

Process Affects Reviews

The reviewers “had difficulties, largely because the quality of what we can do depends very much on how the OGP process has been approached in country,” according to McGee of the Institute of Development Studies.

Kevin Dunion, the former Scottish information commissioner who reviewed the UK plan, said one “shock” was that it was hard to find the architects of the plan, who were surprised to find someone coming along to write a follow-up.

Similarly, Brazilian plan reviewer Laura Waisbich said it “was difficult to reconstruct the process.”

South African reviewer Ralph Mathekga, whose report criticized the government for not engaging the public in developing the action plan, said this meant that few CSOs “had a clue” and that knowledge within the government was “very caged.” He said, “The big challenge was just knowing if the commitments had been completed.”

In Mexico, where there was a “lot of involvement,” the rush to produce a report generated quantity (37 commitments) but not quality, observed Paulina Gutierrez. It was “a challenge” to document the results. “One thing is what they wrote about, one thing is what they do, and another thing is what they report to OGP,” she said.

Jordanian reviewer Mai E’llemat reported frustration when officials replied to her questions by saying “it is all in the action plan.” CSOs didn’t really understand the process, she said, and the government claimed no one told it to do a public consultation, a clear requirement in the OGP guidelines.

Session moderator Simon Burall summarized, “It’s been the process that has made the challenge of doing the reporting more difficult.”

Promoting and Reviewing

Several reviewers reported spending a lot of time explaining the OGP.

Norwegian reviewer Christopher Wilson said CSOs had “never heard of it” and “didn’t have a lot of time for us.”

He said, “We were doing special promotion for OGP at the same time we were supposed to be doing an evaluation.”





