The Carter Center April 23 unveiled a tool for assessing the implementation of access to information laws.
The goal is to evaluate the “plumbing” that makes access laws work. The assessment tool uses 65 questions, and the results are displayed as red, yellow and green ovals.
The tool is not designed to rank countries or evaluate compliance with the laws. Rather, the goal is to suggest where improvements should be made and to foster collaboration and reform.
Almost five years in development, the Information Assessment Tool (IAT) method has been tested in six to seven ministries in eleven countries through three pilot phases.
Phase I covered Bangladesh, Mexico and South Africa. In Phase II, the original countries reapplied the tool and were joined by Chile, Indonesia, Scotland and Uganda. Pilot Phase III was completed in April 2014; researchers in Georgia, Guatemala, Jordan and the United States applied the tool along with those in the previous seven countries, for a total of 11 countries in the final pilot phase. A final review of the methodology and indicators was just completed, and minor changes will be made to establish the final set of indicators.
Although the findings have not been publicized in the countries, and are not yet available online, they have been shared with some of the government agencies. Several of the in-country experts who used the tool said at a Washington presentation that they have noticed some improvements as a result of the evaluations and see promise in its use.
After access laws are passed, said Mukelani Dimba, Executive Director of the Open Democracy Advice Centre in South Africa, “there are many stories of disappointment.”
“While it is important to spend time and energy on enactment, it is even more important to spend time on implementation.” Dimba called the tool “a magnifying glass that helps us understand the system a bit better,” one that can lead to “targeted interventions.”
65 Questions to Answer
The questions are built around four core areas:
– fundamental functions;
– the ability to receive and respond to requests;
– proactive disclosure; and
– records management.
The questions probe whether the leaders are committed to access, and whether rules and administrative systems have been established. The adequacy of resources and the presence of monitoring systems are also tested.
For example, question 55 asks:
“Has the agency created or adopted a system to manage its paper records?”
This is followed by three options:
a. The agency has created or adopted a system to manage its paper records that includes all of the following:
- Creation and classification;
- Survey and inventory;
- Indexes and circulation logs;
- Security rights and access permission; and
- Retention and disposal
b. The agency has created or adopted a system for managing paper records but it does not include all of the above.
c. The agency has not created or adopted a paper record management system.
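The three-tier answer format maps naturally onto the tool’s traffic-light display. As an illustration only (the article does not describe the Carter Center’s actual scoring rules), a minimal sketch of how graded answers might be rolled up into the red, yellow and green ratings could look like this, assuming option “a” corresponds to green, “b” to yellow and “c” to red:

```python
# Hypothetical illustration only: the Carter Center's scoring rules are not
# published here. Assumes option "a" = green, "b" = yellow, "c" = red,
# as suggested by the three-tier answer format.

RATINGS = {"a": "green", "b": "yellow", "c": "red"}

def rate_answers(answers):
    """Map each question's chosen option (a/b/c) to a traffic-light rating."""
    return {qid: RATINGS[opt] for qid, opt in answers.items()}

# Example: question 55 on paper records management answered with option "b"
sample = {55: "b", 56: "a", 57: "c"}
print(rate_answers(sample))  # {55: 'yellow', 56: 'green', 57: 'red'}
```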
The research was carried out by experts hired by the Carter Center, whose work included interviewing government officials and preparing narrative reports. Most of the questions are objective, but some ask for the reviewers’ opinions.
The findings were then vetted by focus groups of national experts and officials, further examined by a separate “blind peer reviewer” and evaluated by Carter Center staff. The process took about four months and cost approximately $3,000 per agency, officials said.
The 65 questions, though not in final form, are available on the Carter Center website, along with background materials about the process and the methodology. The findings will be posted “shortly.”
Future Plans for Greater Use
The tool, though still being given final tweaks, is now available for wider use, said Laura Neuman, manager of the Carter Center’s Global Access to Information Initiative.
The Carter Center hopes that governments will engage the Atlanta-based organization to administer the tool in their countries. It is encouraging civil society groups to propose such a commitment in the national action plans of Open Government Partnership countries.
Under preparation are additional explanatory materials justifying each question; guidance for evaluators using “the tool” (the term speakers at the program preferred to “IAT”); and toolkits for agencies seeking to improve.
Testimony of Value
The tool “tells us why” agencies are performing badly in providing information, said Manfredo Marroquin, chairman of the Guatemalan Chapter of Transparency International, adding that it identified “weaknesses in many areas,” including poor management and a lack of leadership. “It gives us now the opportunity to engage with the institutions, especially those who are not doing well,” he said.
“We need to know exactly what we need in terms of capacity and that is what this tool brings for us,” commented Gilbert Sendugwa, coordinator of the African Freedom of Information Centre.
From Chile, Alberto Urzua Toledo, Lecturer of Law and Social Sciences at Alberto Hurtado University, said that observers initially felt the systemic problems in Chile related to resources and a lack of proper records management, but the evaluation showed that “the actual problem was the lack of internal routine guidelines and instructions.”
“As a whole I do believe that the use of the IAT gives us a very clear and accurate picture of the general system in Mexico,” said Juan Pablo Guerrero, Secretary General of the Federal Institute of Access to Information and Data Protection. He said it “gives us hints as to where there are challenges and where there are areas for improvement.”
The short oral reports on the findings in the subject countries often cited poor training, problems with records management and lack of leadership as key issues.