The source of data is indicated for each table, figure and annex. Data were mainly sourced from annual reports by CGIAR Research Programs (CRPs) and Platforms, using standard reporting templates and indicators. Altmetric data are drawn from online sources.
This was the second year of reporting against new CGIAR-wide templates and common results reporting indicators. The templates and guidance were modified in 2018 in response to detailed comments received in 2017. All reporting templates and guidance documents for 2018, along with Frequently Asked Questions (FAQs), can be viewed on the CGIAR reporting website.
The process went more smoothly in this second year. However, some details of the templates and guidance still require improvement. This will be particularly important given the weight placed on high-quality reporting in 2020 to meet the new Program Management Performance Standard on Quality of Results Reporting.
Data Quality Assurance
The agreed principles behind reporting include checkability and evidence for all claims. Checks on data for 2018 were carried out at several levels: by Flagship leaders, by CRP Program Management Units, by Management Information System (MIS) managers (where relevant), and finally by a quality assurance team.
Time frames were tight, teams were stretched, and it is unlikely that the compiled databases are completely error-free. However, researchers were aware that all claims would be visible in the public domain and potentially scrutinized by their immediate colleagues and partners, as well as by Funders; this is a strong incentive against over-claiming.

For System-level quality checks, most attention was paid to verifying claims of outputs, outcomes and impacts: data on progress towards System-level targets, Outcome Impact Case Reports, policies and innovations, as well as gender scoring. Before carrying out the full quality assurance (QA), the System Organization managed a participatory (peer review) QA exercise involving around 30 members of the CGIAR Monitoring, Evaluation and Learning Community of Practice (MELCOP) to pre-test the QA criteria and to make recommendations for improvements to the reporting guidance and templates.