
Justice Sector Reform

Results and Indicators for Development
Governance

Results and Indicators

Result Indicator(s)
Output:
Legal aid schemes (in the broad sense, i.e. including legal representation) for people without financial means developed and operational.
Number of cases referred to/processed through alternative dispute resolution provided by the project
(Numeric)
Data Source: Project M&E system – database of referrals disaggregated by sex, location, case topic and outcome of alternative dispute resolution 
Additional Information:
Number of LAS Advocates and legal aid providers trained who can demonstrate increased knowledge in the relevant areas (disaggregated by sex, training topic, duration and location)
(Numeric)
Data Source: Project M&E system – database of training participants (disaggregated by sex, training topic, duration and location)
Additional Information: Please specify the areas of capacity building. Examples of training from PLEAD: LAS Advocates / Legal Aid Providers jointly trained on trial advocacy; monitoring and reporting on legal aid issues; guidelines for assisting victims and vulnerable groups; Alternative Dispute Resolution. Please add more examples if needed.
Number of people who were able to access justice thanks to EU support (disaggregated by sex, gender, age, disability, as well as type of support provided)
(Numeric)
Data Source: Project M&E system – database of beneficiaries to be established by the project (disaggregated by sex, age, disability, as well as type of support)
Additional Information: This indicator is appropriate at output level if the project will provide direct support (i.e. free legal aid, sign language or minority language interpretation of proceedings, etc.). Please add additional criteria of disaggregation if other factors represent a disadvantage in the local context (i.e. ethnic origin, religion). 
Number of people directly benefitting from legal aid programmes supported by the EU (disaggregated by sex, age and disability/social group, and type of case: criminal, civil or administrative)
(Numeric)
Data Source: Project M&E system: database of direct beneficiaries (disaggregated by sex, age and disability/social group)
Additional Information: This indicator could be made more specific, e.g. "number of people directly benefitting from legal representation", which could be reported separately. Qualitative aspects of the legal aid provided should also be checked.
Number of people reached through public campaigns on legal aid [who can demonstrate increased knowledge of the relevant legal aid topics]
(Numeric)
Data Source: Project M&E system. For measuring campaign reach: listenership of radio/TV programmes and number of event participants. For assessing the level of knowledge before and after the campaign: two specialized surveys of the target communities.
Additional Information: Where possible and practical it is best to ascertain whether people have absorbed the knowledge and understand where/how to look for information on legal aid. This requires a survey of the communities which were targeted by the campaign.
Result Indicator(s)
Output:
Strengthened capacity of individuals and CSOs to scrutinise institutions' performance and support individuals in claiming and defending their rights (e.g. through awareness-raising and advocacy campaigns, legal advice, monitoring of trials, etc.).
Number of CSO coalitions established/strengthened by the project
(Numeric)
Data Source: Project M&E system – database of CSO coalitions
Additional Information: Please specify whether you will count only networks that are officially registered or also informal coalitions. Please note if there is a particular level you are focusing on (i.e. only national coalitions or only coalitions with more than a certain number of members). You may also specify whether you are focusing on coalitions of CSOs working on a certain issue, i.e. women’s rights.
Number of CSO representatives trained by the project who can demonstrate increased knowledge in the relevant areas (disaggregated by sex, training topic, duration and location)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by sex, CSO, training topic, duration and location); b. database with results of pre- and post-training tests
Additional Information: Please provide a list of areas in which CSO representatives should be trained (e.g. public awareness raising, advocacy, international good practices in justice sector monitoring and accountability tools…). Please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.
Number of CSOs supported by the project in organizing an advocacy event or publishing advocacy material on the justice sector’s independence, impartiality or accountability (including budget tracking and other monitoring activities)
(Numeric)
Data Source: Project M&E system – database of event participants (disaggregated by sex and location), reports or links to online advocacy material produced by CSOs with project support
Additional Information: Advocacy material can include a report, brochure, website, YouTube video or similar. An advocacy event could be a public hearing, conference, workshop or similar.
Number of people reached through CSO awareness campaigns to inform the public of their rights
(Numeric)
Data Source: Project M&E system: a. listenership statistics to be obtained by the project team for radio/TV programmes they broadcast; b. database of event participants (disaggregated by type, date and location of event, as well as sex of participant)
Additional Information: Estimated listenership of radio/TV programmes, number of conference or other public event participants 
Result Indicator(s)
Output:
Promoted access to legal information for the public (e.g. about the justice system in general, court fees, how to initiate a claim, etc., through bulletin boards, leaflets, websites and media)
Number/status of official internet sites/portals/leaflets developed by the action to provide information about the justice system/free legal aid
(Numeric)
Data Source: Project M&E system 
Additional Information: Content freely available on the internet site/portal/leaflets may include: legal texts (e.g. codes, laws, regulations, etc.), case law of higher court(s), other documents (e.g. forms/online registration), and information to help victims of crime. Links to internet sites/portals should be provided in progress reports.
Number of downloads of leaflets developed by the action to provide information about the justice system/free legal aid
(Numeric)
Data Source: Project M&E system 
Additional Information:
Number of individual website visitors on portals developed by the action to provide information about the justice system/free legal aid
(Numeric)
Data Source: Project M&E system 
Additional Information:
Result Indicator(s)
Output:
Improved capacities of justice actors to meet the justice needs of the population (e.g. support to the revision/optimisation of the judicial map/ geographical coverage/ of courts, funding of mobile courts, set up of front desks in courts etc.)
Number of courts deployed and/or supported in under-served (or poor) regions by this project (disaggregated by type of court and location)
(Numeric)
Data Source: Project M&E system – list of courts disaggregated by location and type of court
Additional Information: This issue is particularly pertinent to access to justice: greater territorial coverage and specialisation make it easier for parties to take their cases to court. Please define which areas count as "underserved" prior to the start of the project. The indicator could be made more specific if only certain types of courts are supported by the project, e.g. "Number of mobile courts deployed to underserved areas by the project".
Number of people provided with access to court as result of project-supported [mobile*] courts (disaggregated by sex, age, disability, location and type of case)
(Numeric)
Data Source: Project M&E system –database of beneficiaries (disaggregated by sex, age, disability, location and type of case)
Additional Information: * replace with a type of court that your project is supporting
Number of policies/ regulations/ legislations on the demarcation of courts [developed/amended/adapted/ implemented *] supported by the project
(Numeric)
Data Source: Policies/regulations/legislation before and after changes 
Additional Information: *delete as appropriate. The level will depend on the exact nature of the project and the local context. An indicator on the number of policies supported is not a good indicator on its own, as this reflects an activity only. If only one policy is supported, please use the phrase "status of policy X". Please specify in the target what the resulting system should look like.
Result Indicator(s)
Output:
Promoted access to legal information for professionals (e.g. legal databases and publications, publication of relevant case law, websites, statistics, etc.)
Number of legal professionals trained by the action to use new sources of legal information (i.e. journals, databases), disaggregated by sex
(Numeric)
Data Source: Database of training participants (disaggregated by type and sex of official, training topic, duration and location) 
Additional Information:
Number of legal professionals with access to legal information thanks to the project (disaggregated by sex)
(Numeric)
Data Source: Project database of beneficiaries
Additional Information: If possible, please be more specific about the type of information or the means that the project will provide.
Result Indicator(s)
Specific Objective - Outcome:
Improved access to affordable justice for all, including to effective and accessible complaint and redress mechanisms at national and local level
Annual public budget allocated to justice sector as a percentage of the State budget
(Percentage)
Data Source: Budget data provided by the government, at the beginning and end of intervention
Additional Information: Share of the overall State budget allocated to the Ministry of Justice.
Annual public budget allocated to legal aid
(Numeric)
Data Source: Budget data provided by the government, at the beginning and end of intervention
Additional Information: Value of state budget allocated for legal aid
Annual ratio of allocated vs. executed budget for legal aid
(Percentage)
Data Source: Analysis of budget data provided by the government, at the beginning and end of intervention
Additional Information:
Annual ratio of allocated vs. executed budget for the justice sector
(Percentage)
Data Source: Budget data provided by the government, at the beginning and end of intervention
Additional Information:
Average score of expert perception on access to redress for miscarriage of justice
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: UN Rule of Law indicators (47): Whether victims of miscarriage of justice have access to effective legal recourse and redress Question: “To what extent do you agree that people who are wrongfully convicted are able to receive compensation or other forms of redress?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Please define X before the start of the project. Please consider/check availability and quality of administrative records (e.g. completeness, accuracy, quality control, timeliness, etc.) to judge whether they are appropriate to use for monitoring purposes. If public records have gaps or inconsistencies, the project needs to conduct its own study.  
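Where the project computes such expert-survey averages itself, a minimal sketch of the aggregation is shown below (in Python); the response labels, function name and sample data are illustrative assumptions, not part of the indicator definition.

# Minimal sketch: averaging expert responses on the four-point agreement scale.
# Labels and sample responses are illustrative only.
SCALE = {"fully agree": 4, "partly agree": 3, "disagree": 2, "strongly disagree": 1}

def average_expert_score(responses):
    """Return the mean score of expert responses mapped onto the four-point scale."""
    scores = [SCALE[r] for r in responses]
    return sum(scores) / len(scores)

baseline = average_expert_score(["partly agree", "disagree", "fully agree", "disagree"])
endline = average_expert_score(["partly agree", "partly agree", "fully agree", "disagree"])
print(f"baseline: {baseline:.2f}, endline: {endline:.2f}, change: {endline - baseline:+.2f}")

The same aggregation applies to the other UN Rule of Law perception indicators in this section; only the response labels change (e.g. very good/good/poor/very poor).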
Average score of expert perception on the availability of free legal assistance for indigent defendants
(Numeric)
Data Source: At least two rounds of expert surveys as part of the Project M&E system 
Additional Information: UN Rule of Law indicators (49): Whether and to what extent indigent defendants receive free legal assistance (both legal advice and legal representation for people without financial means, particularly women) at all stages of criminal and/or civil proceedings against them. Survey question: How often do indigent people accused of serious crimes actually receive free legal assistance at all stages of proceedings against them? Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very often (4); often (3); rarely (2); never (1). Dynamic: Direction and level of change in average score over time. NOTE - Applicable for all UN Rule of Law Indicators: these are indicators proposed by the UN for projects/ programmes working in the justice sector; the relevant data collection is carried out by the projects/programmes themselves, based on different data collection methods: public survey, expert survey, document review, or administrative and field data. Thus, it is up to the project/programme to design the appropriate data collection mechanism to gather data on the indicator of interest.    
Average score of expert perception on the availability of interpreter
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: UN Rule of Law indicators (45): How available are the services of interpreters (e.g. for criminal, civil, administrative procedures). Survey question: “How would you rate the availability of interpreters to assist victims and defendants during criminal proceedings?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to help determine whether interpreters are available to only one of these groups. Please consider/check availability and quality of prison records/statistics (e.g. completeness, accuracy, quality control, timeliness, etc.) to judge whether they are appropriate to use for monitoring purposes.
Average score of expert perception on the quality of legal representation
(Numeric)
Data Source: At least two rounds of expert surveys as part of the Project M&E system
Additional Information: UN Rule of Law indicators (50): The quality of the legal representation generally available to defendants during criminal proceedings. Survey question: “How would you rate the legal representation generally available to defendants during criminal proceedings?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. NOTES: • Expert perception can also be coupled with information on outcomes of such proceedings. • This indicator can be expanded to legal representation available to defendants during civil proceedings.
Existence of court fees required to start a proceeding at a court of general jurisdiction (disaggregated by type of case)
(Qualitative)
Data Source: Public records from e.g. the National Institute of Statistics or the Ministry of Justice
Additional Information: European Commission for the Efficiency of Justice (CEPEJ) Scheme for Evaluating Judicial Systems (2014-2016), Question 8: Are litigants required in general to pay a court tax or fee to start a proceeding at a court of general jurisdiction: (a) for criminal cases? (b) for other than criminal cases? There may be a general rule in some states according to which a party is required to pay a court tax or fee to start a proceeding at a court of general jurisdiction. Court taxes or fees do not include lawyers' fees. If this general rule has exceptions, please indicate them. For the purposes of this question, courts of general jurisdiction are those courts which deal with civil law and criminal law cases. A portion of the budget of courts can be financed by income resulting from the payment by the parties of such court taxes or fees. As regards the method for calculating the court fees, in certain countries this can be a set sum, whereas in others it can consist of a percentage of the contested amount or of an amount determined by the nature of the proceedings.
Number of cases heard in project-supported [mobile*] courts (disaggregated by type of case: civil, criminal or administrative)
(Numeric)
Data Source: Project M&E system –database maintained by the project team based on the courts’ data
Additional Information: * replace with the type of court that your project is supporting. Please specify additional disaggregation if needed: within civil cases, you could disaggregate by sub-categories such as property, family, etc., and within criminal cases, you could disaggregate SGBV cases from others.
Number of cases of human rights abuses, including gender based and sexual violence, brought to court
(Numeric)
Data Source: Performance data from the LACON Pro Bono Clearing House Database, NPS CMS and other records
Additional Information:
Number of cases which are investigated, prosecuted and adjudicated by the relevant institutions, by type of case, e.g. criminal case, civil case
(Numeric)
Data Source: For criminal cases: prison records; for civil cases: reports of the Ministry of Justice
Additional Information:
Number of judges per population (disaggregated by urban/rural area)
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation 
Additional Information:
Number of people who received public representation free of charge (disaggregated by sex, age, disability, and type of case: civil, criminal or administrative)
(Numeric)
Data Source: Ministry of Justice reports to be requested by the project team
Additional Information: Please use this indicator only to measure the general situation of the legal aid system in the country, rather than the outputs of the EU-funded project (which is monitored at output level). Please add additional criteria of disaggregation if other factors represent a disadvantage in the local context (i.e. ethnic origin, religion). Public data (statistics) on access to justice for vulnerable/marginalized groups may not be collected by the government/courts so the project should be strongly encouraged from the inception phase to report on available data and plans for filling any gaps.
Number of reported cases of miscarriage of justice
(Numeric)
Data Source: Administrative records; project M&E system (survey)
Additional Information:
Number of sentences passed in project-supported [mobile*] courts (disaggregated by type of case: civil, criminal or administrative)
(Numeric)
Data Source: Project M&E system –database maintained by the project team based on the courts’ data
Additional Information:
Percentage of public survey respondents who report that victims of crime have to pay official or unofficial fees often or very often in order to have their complaints proceed to court
(Percentage)
Data Source: At least two rounds of public surveys to be conducted by the project
Additional Information: UN Rule of Law indicators (48): Whether, according to public survey respondents, victims of crime have to pay an official or unofficial fee to have their complaints proceed to court Question: “How often do victims of crime have to pay an official or unofficial fee to have their complaints proceed to court?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: never (1); rarely (2); often (3); very often (4). Dynamic: Direction and level of change in average score over time.  
Proportion of the population within [x] hours of a fully functioning court
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation 
Additional Information: Please specify the [x] in the indicator, as well as the basic services a “fully functioning court” needs to provide.
Proportion of requests for legal assistance and free interpreters being met (criminal and civil proceedings) annually
(Percentage)
Data Source: Prison statistics (monthly/quarterly/annually collected by the prison records unit and prison case management system)
Additional Information:
Proportion of victims of violence in the previous 12 months who reported their victimization to competent authorities or other officially recognized conflict resolution mechanisms (disaggregated by gender, sex, age, marginalised groups)
(Percentage)
Data Source: SDG monitoring (likely country level only); at least two rounds of local surveys as part of the project M&E system
Additional Information: Note that SDG indicator 16.3.1 is a Tier II indicator (i.e. the indicator is conceptually clear, with an established methodology and standards available, but data are not regularly produced by countries). Possible custodian agency: UNODC. Alternative: this indicator can be expanded to legal aid with respect to civil proceedings (both legal advice and legal representation for people without financial means, particularly women).
Proportion of victims who received compensation within X months
(Percentage)
Data Source: Administrative records; project M&E system (survey)
Additional Information: Please specify the relevant period (in terms of the number of months) that is relevant (expected, or stipulated in policies) in your local context.
Result Indicator(s)
Specific Objective - Outcome:
Improved transparency and accountability of the judicial system
Average expert review score on whether courts have performance guidelines and a system for monitoring performance that holds judges accountable for unnecessary delays in proceedings, case backlog, or absenteeism
(Numeric)
Data Source: At least two rounds of expert review of relevant official documents as part of the project M&E system
Additional Information: Measurement: Review of documents to determine whether courts have performance guidelines and a performance monitoring system that holds judges accountable for unnecessary delays in proceedings, case backlog, or absenteeism. Rating: Very good performance guidelines and monitoring system (4); good performance guidelines and monitoring system (3); poor performance guidelines and monitoring system (2); very poor performance guidelines and monitoring system (1). Dynamic: Direction and level of change in average score over time.  
Average expert review score on whether internal procedures and mechanisms exist within prosecution services to assess and monitor compliance with departmental performance guidelines
(Numeric)
Data Source: At least two rounds of expert review of relevant official documents as part of the project M&E system 
Additional Information: Measurement: Review of documents to determine whether prosecution services have performance guidelines and a performance monitoring system that holds prosecutors accountable for unnecessary delays in proceedings, case backlog, or absenteeism. Rating: Very good performance guidelines and monitoring system (4); good performance guidelines and monitoring system (3); poor performance guidelines and monitoring system (2); very poor performance guidelines and monitoring system (1). Dynamic: Direction and level of change in average score over time  
Country score for public sector accountability and transparency according to the Ibrahim Index of African Governance (IIAG)
(Numeric)
Data Source: Ibrahim Index of African Governance (IIAG). Data can be explored online here http://mo.ibrahim.foundation/iiag/ 
Additional Information: This indicator captures the extent to which the executive and public employees can be held to account by the electorate, legislature and judiciary. It consists of two sub-indicators: 1. Public sector accountability and transparency (AfDB); 2. Public sector accountability and transparency (WB) – IDA Resource Allocation Index. NOTE: Only African countries are covered. For IDA-eligible countries, the public sector accountability and transparency score, a sub-indicator of the IDA Resource Allocation Index, could be used instead. Scores range from 1 to 100.
Number of sanctions pronounced against judges and public prosecutors (disaggregated by type of sanctions)
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: “Sanctions” refers to disciplinary action (not criminal charges).
Percentage of people who partly or fully agree that judges and prosecutors are generally respectful of the rights of defendants and victims (disaggregated by sex)
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system
Additional Information: UN Rule of Law indicators (42): Public perception of how respectful judges and prosecutors are of the rights of defendants and victims Question: “To what extent do you agree that judges and prosecutors are generally respectful of the rights of defendants and victims?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Percentage of people who partly or fully agree that courts treat people fairly regardless of their income, race, national or social origin, gender or religion (disaggregated by sex)
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system
Additional Information: UN Rule of Law indicators (43): Whether the courts are perceived by the population to be treating people fairly and impartially regardless of their income, race, national or social origin, gender or religion Question: “To what extent do you agree that courts treat people fairly regardless of their income, race, national or social origin, gender or religion?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Percentage of people who partly or fully agree that judges are able to make decisions without direct or indirect interference by Government or politicians (disaggregated by sex)
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system
Additional Information: Whether the population believes that judges are able to make decisions free from direct or indirect interference by Government or politicians Question: “Do you think that judges are able to make decisions without direct or indirect interference by Government or politicians?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: always able (4); sometimes able (3); rarely able (2); never able (1). Dynamic: Direction and level of change in average score over time.  
Percentage of people who partly or fully agree that prosecution decisions are made in a fair, efficient and effective manner (disaggregated by sex)
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system
Additional Information: Whether the public believes that prosecution decisions are made in a fair, efficient and effective manner. Question: “To what extent do you agree that prosecution decisions are made in a fair, efficient and effective manner?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.
Percentage of the population who perceive the overall quality of justice dispensed as good or very good
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system 
Additional Information: The project will need to conduct at least two surveys in order to obtain data for this indicator. The survey may combine direct questions (i.e. “How do you perceive the overall quality of justice dispensed by the court in your city/village/community?”) with proxy questions (i.e. “If you were wrongly accused of a crime, would you expect to be treated justly by the courts?”, or similar). Dynamic: Direction and level of change in average score over time.
Proportion of formal investigations of persons with judicial functions resulting in disciplinary action or prosecution
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information:
Proportion of persons with judicial functions (e.g. judges and prosecutors) formally investigated for breach of duty, irregularity, abuse (e.g. for corruption)
(Percentage)
Data Source: EU intervention monitoring and reporting systems (progress and final reports for the EU-funded intervention; EU-funded feasibility or appraisal reports; baseline and endline studies conducted and budgeted by the EU-funded intervention)
Additional Information:
Status of quality standards body for judicial officials (including judges, lawyers etc.), or a complaints body for dealing with judicial officials
(Qualitative)
Data Source: Project M&E system: government/ parliamentary decision on the establishment of the new body
Additional Information: Please be specific in the target: what type of body is sought, whether it should be independent or appointed, and with what mandate and membership.
Result Indicator(s)
Output:
Increased management, administrative and technical capacities of the justice actors, including the Ministry of Justice, Supreme Council of the Judiciary, courts, prosecution services, and lawyers
Amount of material resources (in EUR or specific items) delivered through the project to the courts to consult the law, record proceedings, schedule cases, and store and maintain records (disaggregated by type of resource, beneficiary institution, location and date)
(Numeric)
Data Source: Project M&E system – list of resources provided by the project, disaggregated by type of resource, beneficiary institution, location and date
Additional Information: Please specify the types of material resources that can be provided by the project and the target institutions.
Number of guidelines, policy notes and regulations for improved court administration and case / file management developed / implemented with support of the project
(Numeric)
Data Source: Project M&E system – text of guidelines, policy notes or regulations; reports on implementation 
Additional Information: Specific formulation will depend on the specific project context – for example, if regulations already exist, then we need to monitor the project’s support for their implementation. If they do not, we might start with development and aim for implementation at a later time.
Number of judges/courts protected with EU support from threats, harassment, assault, assassination or intimidation
(Numeric)
Data Source: Project M&E system – list of means provided by the project (disaggregated by type, location, date/duration)
Additional Information: This indicator is appropriate at output level if the project provides security staff, scanners, fences, etc.
Number of judiciary staff (prosecutors, judges, court clerks), prison officers, law enforcement officers and other legal officials trained by the project in the penal system and civil justice who can demonstrate increased knowledge in the relevant areas (disaggregated by sex, training topic, duration and location)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by sex, training topic, duration and location)  b. database with results of pre- and post- training tests  
Additional Information: Please specify the main training topics. If the action does not have a high M&E budget, you can use a simpler version of this indicator, “Number of judiciary staff (prosecutors, judges, court clerks), prison officers, law enforcement officers and other legal officials trained by the project in the penal system and civil justice” (without going into the level of their knowledge before and after the training). You may also decide to use the more advanced version of this indicator, requiring the implementing partner to show whether the beneficiaries increased their knowledge. In this case, please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.
Number of prosecutors’ offices equipped by the EU-funded intervention to record testimonies, store and maintain evidence, and keep track of pending cases and hearing dates
(Numeric)
Data Source: Project M&E system  – list of resources provided by the project, disaggregated by type of resource, beneficiary institution, location and date
Additional Information: Please specify the types of material resources that can be provided by the project.
Number of registry clerks and administrative staff trained by the project who can demonstrate increased knowledge in relevant topics (disaggregated by sex, training topic, duration and location)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by sex, training topic, duration and location)  b. database with results of pre- and post- training tests  
Additional Information: Please specify the main training topics. If the action does not have a high M&E budget, you can use a simpler version of this indicator, “Number of registry clerks and administrative staff trained by the project on case/records management” (without going into the level of their knowledge before and after the training). You may also decide to use the more advanced version of this indicator, requiring the implementing partner to show whether the beneficiaries increased their knowledge. In this case, please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.
Number of staff from prosecution and investigation services trained with support of the project in various specialisations of interest (e.g. gender-based violence, economic crimes, etc.) who can demonstrate increased knowledge in the relevant areas (disaggregated by type and sex of official, training topic, duration and location)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by type and sex of official, training topic, duration and location)  b. database with results of pre- and post- training tests  
Additional Information: Please specify the main training topics. If the action does not have a high M&E budget, you can use a simpler version of this indicator, “Number of staff from prosecution and investigation services trained with support of the project in various specialisations of interest (e.g. gender-based violence, economic crimes, etc.)” (without going into the level of their knowledge before and after the training). You may also decide to use the more advanced version of this indicator, requiring the implementing partner to show whether the beneficiaries increased their knowledge. In this case, please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.
Result Indicator(s)
Output:
Established mechanisms/platforms for improving cooperation and ...
Number of mechanisms e.g. Memorandum of Understanding, periodical coordination meetings, new policies / regulations on specialised courts, to improve [cooperation / coordination / efficiency*] [developed / adopted / implemented*]
(Numeric)
Data Source: Project M&E system: text of MoU/policy the project supported, or minutes of coordination meetings the project organized, including the list of participants
Additional Information: *delete as appropriate. The level will depend on the exact nature of the project and the local context. An indicator on the number of policies supported is not a good indicator on its own, as this reflects an activity only. If only one mechanism is supported, please use the phrase "status of mechanism X". Please specify in the target what the resulting system should look like.
Status of process or structured dialogue between lawyers and courts as regards the way cases are presented before courts
(Qualitative)
Data Source: Project M&E system; official reports on the structured dialogue process
Additional Information: Please describe the desired process/dialogue in the target, e.g. the organisation of the process, including how the number and planning of hearings will be organised, duty periods for urgent cases, selection of simplified modes of prosecution, etc. A binary "existence of" is not a very good target.
Result Indicator(s)
Output:
Increased capacities of different justice actors to develop and enforce codes of ethics and professional conduct (e.g. support to drafting of code of ethics and professional conduct, support to judicial inspections etc.)
Number of professionals from the justice sector (judges, prosecutors, lawyers etc.) who have received deontological training and can demonstrate increased knowledge in the relevant areas (disaggregated by type and sex of official, specific training topic, duration and location)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by type and sex of official, training topic, duration and location)  b. database with results of pre- and post- training tests  
Additional Information: Please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.
Status of internal procedures within prosecution services to assess and monitor compliance with departmental performance guidelines
(Qualitative)
Data Source: Procedures document that the project helped develop and analysis of key elements in progress reports 
Additional Information: This indicator is appropriate at output level if the project will help to develop performance guidelines and a performance monitoring system that holds prosecutors accountable for unnecessary delays in proceedings, case backlog, or absenteeism. You can add other indicators if the project, for example, trains staff on the application of new procedures (i.e. number of legal and administrative staff trained on…).  
Status of performance guidelines for monitoring performance of courts that hold judges accountable for unnecessary delays in proceedings, case backlog, or absenteeism
(Qualitative)
Data Source: Guidelines document that the project helped develop and analysis of key elements in progress reports
Additional Information: This indicator is appropriate at output level if the project will help to develop performance guidelines that hold judges accountable for unnecessary delays in proceedings, case backlog, or absenteeism. You can add other indicators if the project, for example, trains staff on the application of the new performance guidelines (i.e. number of legal and administrative staff trained on…).  
Result Indicator(s)
Output:
Developed or revised legal framework for civil, criminal and administrative proceedings in line with international best standards (e.g. revision of relevant codes etc.)
Number of supported policies / legislation to improve efficiency within the justice sector [developed/ revised/ implemented*] (report separately for policies/legislation specifically related to juveniles)
(Numeric)
Data Source: Expert assessment as part of the Project M&E system, text of draft law/amendment
Additional Information: This is a process indicator assessing whether the appropriate legal framework and regulations are in place. Improvements may include, for example: improved civil law (e.g. status of a law on alternative dispute resolution); due process of law (equality of arms, accusatorial processes, procedures for access to information, financial legal aid, interpreters, the right to a speedy trial, etc.); for human rights, law/policy alignment with relevant international human rights treaties, accreditation of national human rights institutions under the rules of procedure of the International Coordinating Committee of National Institutions, framework improvements on security, and the handling of criminality and abuse by law enforcement officials, etc. * Please specify the most appropriate term for the intervention, e.g. development may be appropriate where there is no law, while amendment is more appropriate where draft legislation is already in place.
Status of special jurisdictions and detention system for juvenile crime in line with international standards
(Qualitative)
Data Source: Public records, project M&E system – specifically an assessment to be conducted as a minimum at the beginning and end of the project
Additional Information: Please specify the relevant standard. This could include: the CRC (art. 37, 39, 40) and its optional protocol (art. 8 and 9), the Mandela Rules, the UN Standard Minimum Rules for the Administration of Juvenile Justice (the Beijing Rules), the UN Rules for the Protection of Juveniles Deprived of their Liberty (the Havana Rules), the UN Guidelines on Justice in Matters involving Child Victims and Witnesses of Crime, and guidelines on child-friendly justice at regional level, e.g. the CoE Guidelines on Child-Friendly Justice. It could also include other relevant standards; please specify.
Result Indicator(s)
Specific Objective - Outcome:
Improved efficiency and effectiveness of the judicial system
Annual enforcement / execution rate of decisions (disaggregated by type, e.g. criminal and civil matters)
(Percentage)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Percentage of decisions enforced / executed compared to total number of decisions per year (disaggregated by type, e.g. criminal, administrative and civil matters)
Average expert assessment score of the courts’ access to material resources needed to consult the law, record proceedings, schedule cases and store and maintain records
(Numeric)
Data Source: Project M&E system: expert surveys conducted at the beginning and end of project implementation
Additional Information: Whether the material resources available to the courts are adequate. Indicator 75 – Question: “With respect to the courts across most of the country (not just the capital), to what extent do you agree that courts have the material resources they need to consult the law, record proceedings, schedule cases, and store and maintain records?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to determine the specific challenges faced by the courts in this regard.
Average expert assessment score of the Prosecution’s material resources to record testimonies, store and maintain evidence, and keep track of pending cases and hearing dates
(Numeric)
Data Source: Project M&E system: expert surveys conducted at the beginning and end of project implementation
Additional Information: Whether prosecutors have the material resources necessary to record testimonies, store and maintain evidence, and keep track of pending cases and hearing dates  Indicator 77 - Question: “To what extent do you agree that prosecutors have the means and resources to record testimonies, store and maintain evidence, and keep track of pending cases and hearing dates?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to determine which aspects of this capacity are particularly lacking.  
Average expert assessment score on the courts’ strategic planning and budgeting capacity
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How would you rate the courts’ capacity to plan their operations strategically and to budget efficiently?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A document review will also establish whether recent strategic plans and budget forecast documents exist.  
Average expert assessment score on the effectiveness of the courts’ administrative systems
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Whether the courts have in place effective administrative systems to support key management functions such as the management of finances, assets, procurement and human resources  Question: “How would you rate the administrative systems on which the courts rely to perform key management functions such as the management of finances, assets, procurement and human resources?” Rating: Average score of relevant experts on a four-point scale corresponding to the four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in score over time. Note: A supplementary question is asked to identify the main strengths and weaknesses of these systems.  
Average expert assessment score on the effectiveness of the prosecutors’ administrative systems
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Whether the prosecutors have in place effective administrative systems to support key management functions such as the management of finances, assets, procurement and human resources. Question: “How would you rate the administrative systems on which prosecutors rely to perform key management functions such as the management of finances, assets, procurement and human resources?” Rating: Average score of relevant experts on a four-point scale corresponding to the four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in score over time. Note: A supplementary question is asked to identify the main strengths and weaknesses of these systems.  
Average expert assessment score on the prosecutors’ strategic planning and budgeting capacity
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: The public prosecution office’s strategic planning and budgeting capacity Question: “How would you rate the public prosecution office’s capacity to plan its operations strategically and to budget efficiently?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A document review will also establish whether recent strategic plans and budget forecast documents exist.  
Average expert assessment score on the sufficiency of entry-level prosecutors' salaries for recruiting and retaining qualified professionals
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that entry-level prosecutors’ salaries are sufficient to recruit and retain qualified professionals?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on the sufficiency of judges’ salaries for attracting and retaining qualified judges
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that judges’ salaries are sufficient to attract and retain qualified judges, enabling them to live in a reasonably secure environment without having to resort to other sources of income?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Average length of enforcement of decisions (number of days)
(Numeric)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Number of days
Average length of time a case takes from registration to judgement (first instance only, disaggregated by category or type of cases, e.g. for criminal/for civil proceedings)
(Numeric)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Number of days
Average score of expert perception of undue delays in the hearing and conclusion of criminal cases
(Numeric)
Data Source: At least two rounds of experts surveys as part of the project M&E system
Additional Information: Ability of the judicial system to hear and conclude criminal cases without undue delays  Question: “How would you rate the ability of the judicial system to hear and conclude criminal cases without undue delays?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time.  
Calculated or measured disposition time – number of days required to close a pending case (disaggregated by different type of cases, e.g. administrative, commercial, etc.)
(Numeric)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: The disposition time refers to the time needed to clear the backlog of pending cases. It estimates the number of days required to close a pending case. A short version of the formula is: disposition time = (number of unresolved cases at the end of the period / number of resolved cases in the period) × number of days in the period. Annex 6 to the CEPEJ Study provides more information on calculations.
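As an illustration only, the disposition-time formula above can be computed as in the following Python sketch; the figures are hypothetical.

# Minimal sketch of the CEPEJ disposition-time formula quoted above (illustrative figures).
def disposition_time(unresolved_at_end, resolved_in_period, days_in_period=365):
    """Estimated number of days required to close a pending case."""
    return unresolved_at_end / resolved_in_period * days_in_period

# e.g. 400 cases still pending at year end and 800 cases resolved during the year
print(round(disposition_time(400, 800)))  # about 182 days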
Clearance rate (disaggregated by different type of cases, e.g. administrative, commercial, civil, etc.)
(Percentage)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: The clearance rate is the ratio of the number of resolved cases to the number of incoming cases. It measures whether a court is keeping up with its incoming caseload. When the clearance rate is about 100% or higher, the judicial system is resolving at least as many cases as come in; when it is below 100%, the courts are resolving fewer cases than the number of incoming cases. It shows how the court or judicial system is coping with the inflow of cases. clearance rate (%) = (resolved cases in a period / incoming cases in a period) × 100. Please define "resolved", e.g. at first instance, at appeals, etc.
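A corresponding sketch for the clearance-rate formula, again with hypothetical figures:

# Minimal sketch of the clearance-rate formula quoted above (illustrative figures).
def clearance_rate(resolved_in_period, incoming_in_period):
    """Resolved cases as a percentage of incoming cases; 100% or more means the court keeps up."""
    return resolved_in_period / incoming_in_period * 100

print(f"{clearance_rate(950, 1000):.1f}%")  # 95.0% – fewer cases resolved than received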
Number and percentage of cases [handled/referred/resolved*] through alternative dispute resolution (ADR)
(Numeric)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: * delete as appropriate depending on the context. You may also want to use two indicators, e.g. one covering referrals and one covering resolution.
Number of enforcement agents in the judicial system
(Numeric)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information:
Number of pending cases in courts – first instance per 100 inhabitants (disaggregated by type of case: civil, commercial, administrative and other)
(Numeric)
Data Source: Judiciary statistics in State of the Judiciary reports
Additional Information: Number of cases
Number of specialised chambers/courts/departments established with support of the project
(Numeric)
Data Source: Project M&E system: expert surveys conducted at the beginning and end of project implementation
Additional Information: Please specify type of specialised chambers/courts/departments
Percentage of cases that are overruled or where the sentence is reduced on appeal
(Percentage)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information:
Percentage of citizens who fully or partly agree that the courts complete criminal proceedings without any unnecessary delay
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system
Additional Information: Whether the public perceives that the courts complete criminal proceedings without unnecessary delays  Question: “To what extent do you agree that the courts complete criminal proceedings without any unnecessary delay?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Percentage of court records on pending cases that include at a minimum the date the case was transferred to the court, the charge(s) involved and the date of the next hearing or other action
(Percentage)
Data Source: Field data gathered by UN (survey of a random sample of court records on pending cases – conducted at a minimum in the inception and final phase of the project)
Additional Information: Measurement: Field data gathered from a sample of court records to determine whether they contain complete information on the date the case was transferred to the court, the charge(s) involved and the date of the next hearing or other action. Rating: The indicator is rated using a four-point scale corresponding to the following four categories: 100% of files contain the relevant information (very good = 4); 75-99% of files contain the relevant information (good = 3); 50-74% of files contain the relevant information (poor = 2); less than 50% of files contain the relevant information (very poor = 1). Dynamic: Direction and level of change in percentage over time.  
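Where the project scores record completeness itself, the rating bands described above can be applied as in this illustrative Python sketch; the function name and sample figures are assumptions, not part of the indicator definition.

# Minimal sketch of the four-point rating bands for record completeness described above.
def completeness_rating(pct_complete):
    """Map the share of complete files (0-100) to the four-point rating."""
    if pct_complete >= 100:
        return 4  # very good
    if pct_complete >= 75:
        return 3  # good
    if pct_complete >= 50:
        return 2  # poor
    return 1      # very poor

samples = [82, 100, 47]  # illustrative completeness percentages from three sampled courts
print([completeness_rating(p) for p in samples])  # [3, 4, 1]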
Percentage of judges who are women [or another relevant group]
(Percentage)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Please specify relevant group in the indicator.
Percentage of prosecution records that are apparently complete in the following categories: (a) cases accepted for prosecution, (b) cases dismissed, and (c) charges for each case
(Percentage)
Data Source: Field data gathered by United Nations field personnel (survey of a sample of prosecution records – conducted at a minimum in the inception and final phase of the project)
Additional Information: Whether prosecutors’ offices maintain apparently complete records on: (a) all cases accepted for prosecution; (b) cases dismissed; and (c) charges for each case  Measurement: Field data gathered from a sample of active prosecution files to determine whether they contain complete information on: (a) when the case was accepted for prosecution; (b) the action taken in the case; (c) the nature of the charges for each case; and (d) the date of the next appearance. Rating: The indicator is rated using a four-point scale corresponding to the following four categories: 100% of files contain the relevant information (very good = 4); 75-99% of files contain the relevant information (good = 3); 50-74% of files contain the relevant information (poor = 2); less than 50% of files contain the relevant information (very poor = 1). Dynamic: Direction and level of change in percentage over time.  
Proportion of unresolved cases over X days/years
(Percentage)
Data Source: Public sector administrative data (may require a specialized study to be commissioned by the project – at least twice during implementation in order to enable evaluation of results)
Additional Information: Please specify the number of days or years in the indicator.
Prosecution success rates
(Percentage)
Data Source: Public sector administrative records, e.g. office of the Director of Public Prosecutions statistics and reports
Additional Information:
The percentage of cases (appeals/original proceedings) disposed of within established time guidelines
(Percentage)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Please define the "established time guidelines" suitable for the local context.
Time from filing to disposition in cases of small financial value (disaggregated by filers' age, sex, race, ethnicity, disability, etc.)
(Numeric)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Please define a "small financial value" threshold suitable for the local context. Unit: number of days.
Total backlog rate (disaggregated by category or type of cases)
(Numeric)
Data Source: Public sector administrative data (may require a specialized study to be commissioned by the project – at least twice during implementation in order to enable evaluation of results)
Additional Information: The total backlog refers to cases remaining unresolved at the end of the period, defined as the difference between the total number of cases pending at the beginning of the period and the cases resolved within the same period. Example: if there were 1,000 cases pending at the beginning of the calendar year, and the court terminated 750 cases during the calendar year, there would be 250 cases remaining at the end of the year, which are counted as the total backlog.
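A minimal sketch of the backlog calculation defined above, reproducing the worked example from the text:
```python
# Total backlog as defined above: cases still pending at the end of the
# period = cases pending at the start of the period minus cases resolved
# during the same period.

def total_backlog(pending_at_start, resolved_in_period):
    backlog = pending_at_start - resolved_in_period
    if backlog < 0:
        raise ValueError("resolved cases exceed pending cases under this definition")
    return backlog

# Worked example from the text: 1,000 cases pending at the start of the
# year, 750 terminated during the year -> 250 cases of total backlog.
print(total_backlog(1000, 750))  # 250
```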
Result Indicator(s)
Output:
Developed legal and regulatory frameworks related to individual ...
Number of staff trained to supervise a merit-based recruitment exam for judicial officials (disaggregated by sex and category: administrative or judicial staff)
(Numeric)
Data Source: Database of training participants
Additional Information: This may include administrative and/or judicial staff.
Status of law/regulation supported by the project to provide a guaranteed tenure for judges (or prosecutors, please specify) who are appointed for fixed terms
(Qualitative)
Data Source: Document review: new law/regulation and analysis of key elements by the project
Additional Information: This indicator is appropriate at output level if the project will directly support the development or review of the new law or regulation. Tenure should be protected until retirement age or the expiration of a defined term of substantial duration. You can add more indicators if the project will also support training of staff on the application of the new law/regulation, or organize events to promote good practices in this field.  
Status of legislation on the composition of the Council for the Judiciary according to the nomination process
(Qualitative)
Data Source: Project progress reports
Additional Information: This legislation, drafted with support of the Action, should outline the effective independence of the self-governing body from the other State Powers, particularly the executive. NB: this Council does not exist in all countries. The EU JUSTICE SCOREBOARD 2018 includes numerous categories of the Council of the Judiciary members, e.g. judges (elected by their peers), judges (appointed or proposed by their peers), court presidents (ex officio), prosecutors (elected by their peers), members appointed by associations of lawyers/legal practitioners, members elected/appointed by the Parliament and others (Figure 49 on p. 38). However, there are no firm rules so the Action needs to examine different international good practices and standards in helping to develop this legislation (in particular, see Council of Europe).  
Status of regulation/law supported by the project for the merit-based selection/appointment of judicial officers
(Qualitative)
Data Source: Text of the law/regulation and analysis of key elements by the project
Additional Information: This indicator is appropriate at output level if the project will directly support the development or review of the new law or regulation. Please specify objective criteria for merit-based selection/appointment and who selects the judges (CEPEJ p. 31 no 111) – minimum criteria that the project needs to pursue. Issues to consider: composition of the Supreme Council for the Judiciary, e.g. to see if the members have political ties.   You can add more indicators if the project will also support training of staff on the application of the new law/regulation, or organize events to promote good practices in this field.  
Status of regulations on safeguards regarding the transfer/dismissal of judges (or prosecutors, please specify) without their consent (irremovability)
(Qualitative)
Data Source: Text of the law/regulation and analysis of key elements by the project  
Additional Information: This indicator is appropriate at output level if the project will directly support the development or review of the new law or regulation. Please define what is considered a "safeguard" – what exactly is this particular project meant to pursue (i.e. a standard to be defined in a new regulation or law amendment)? You could consider the following aspects (EU Justice Scoreboard 2018): whether the transfer of judges without their consent is allowed, which authority can take the decision (the project could provide guidance on best practice…), and whether it is possible to appeal.
Status of the regulation/law supported by the project on the protection of judges (or prosecutors, please specify) from arbitrary removal or punishment
(Qualitative)
Data Source: Text of the law/regulation and analysis of key elements by the project
Additional Information: This indicator is appropriate at output level if the project will directly support the development or review of the new law or regulation. You can add more indicators if the project will also support training of staff on the application of the new law/regulation, or organize events to promote good practices in this field.  
Status of legislation on the powers of the Councils for the Judiciary
(Qualitative)
Data Source: Project progress reports
Additional Information: This legislation, drafted with the support of the Action, should ensure that the Council of the Judiciary, as a self-governing body in charge of preserving the independence of judges, is responsible for dismissal, promotion, transfer, disciplinary proceedings and career management in general, so that these matters are not under the power of the executive. NB: this Council does not exist in all countries.
Result Indicator(s)
Output:
Developed legal and regulatory framework on the system of judicial self-administration in line with the principle of the separation of powers
Number of government/judicial/parliamentary staff trained on the importance of separation of powers and good practices in this field (disaggregated by sex and branch: executive, parliamentary, judicial)
(Numeric)
Data Source: Database of training participants (disaggregated by sex, branch, location and duration of training)
Additional Information:
Number of government/judicial/ parliamentary staff whose awareness of the importance of separation of powers and good practices in this field is increased (disaggregated by sex and branch: executive, parliamentary, judicial)
(Numeric)
Data Source: Database of pre- and post-training test results
Additional Information:
Number of laws/regulations supported by the project operationalising the system of judicial self-administration in line with the principle of the separation of powers
(Numeric)
Data Source: Project progress reports
Additional Information:
Result Indicator(s)
Output:
Promoted provision of adequate resources to the justice sector (e.g. through awareness raising of relevant state actors and/or special conditions for budget support disbursement and policy dialogue in budget support operations)
Number of representatives of justice sector actors trained by the Action on resource management (or another topic, please specify), disaggregated by sex, institution, training topic, location, duration
(Numeric)
Data Source: Database of training participants (disaggregated by sex, institution, training topic, duration and location)
Additional Information: Please specify the training topic and if possible, target institution.
Result Indicator(s)
Output:
Developed or revised penitentiary legal and regulatory framework in line with international best standards (e.g. set up of a specialised body of prison guards, revision of prison conditions, etc.)
Number of convicts benefitting from work reintegration programmes through EU support (disaggregated by sex)
(Numeric)
Data Source: Project database of beneficiaries, disaggregated by sex
Additional Information:
Number of detention facilities constructed or renovated by the project (disaggregated by type of facility and location)
(Numeric)
Data Source: Project M&E system: technical reports on the construction/renovation process, list disaggregated by type of facility and location, also specifying the estimated number of detainees and staff who will use the newly constructed or renovated facilities
Additional Information:
Number of detention facilities supported by the project to comply with the UN Standard Minimum Rules for the Treatment of Prisoners
(Numeric)
Data Source: Project’s technical reports on new infrastructure, expert needs assessment reports
Additional Information:
Number of detention facilities supported by the project to provide prisoners with clean water and sanitation installations
(Numeric)
Data Source: Project’s technical reports on new water and sanitation installation and water quality
Additional Information:
Number of detention facility staff trained on the UN Standard Minimum Rules for the Treatment of Prisoners
(Numeric)
Data Source: Training participants database (disaggregated by sex, training topic, duration and location), database of pre- and post- training scores, etc.
Additional Information:
Number of independent inspection visits of prisons carried out with project support
(Numeric)
Data Source: Project M&E system: reports on inspections the project supported
Additional Information:
Number of juvenile detention centres and rehabilitation/recuperation centres for children in conflict with the law constructed/renovated by the project
(Numeric)
Data Source: Project’s technical reports on the construction/renovation of detention centres
Additional Information:
Number of juveniles in adult prisons supported by the project
(Numeric)
Data Source: Project M&E system: database of beneficiaries who received services with project support (disaggregated by sex and age of beneficiary, as well as the type and location of detention facility)
Additional Information: In the indicator, please specify the type of support that the project will provide.
Number of probation officers trained by the project who can demonstrate increased knowledge in the relevant areas (disaggregated by sex, training topic, location and duration)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by sex, training topic, duration and location) b. database with results of pre- and post- training tests  
Additional Information: Please specify key topics that probation officers should be trained in. Please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.  
Number of water points and sanitation facilities constructed/upgraded in supported detention facilities
(Numeric)
Data Source: Project progress reports
Additional Information:
Status of computerised management system used by detention institutions for transfer management
(Qualitative)
Data Source: Project M&E system – progress reports
Additional Information:
Status of the [development/revision/implementation*] of [X]
(Qualitative)
Data Source: Project M&E system: progress reports
Additional Information: Please specify the penitentiary law/policy the project is working on directly, for example, the "Administration of Criminal Justice Act (ACJA) and Administration of Criminal Justice Laws at federal and state levels". Specific measurement will depend on the project context – for example, if laws/regulations already exist, then we are aiming for implementation. If they do not, we might start with drafting and aim for implementation at a later time.
Result Indicator(s)
Output:
Alternative measures to detention and imprisonment, and diversion and rehabilitation mechanisms developed and promoted (incl. for children in conflict with the law)
Number of detainees provided with VET, psycho-social support, access to healthcare (or please specify the type of support the project will provide) – disaggregated by sex, type of detention facility, location and age
(Numeric)
Data Source: Project M&E system: database of beneficiaries who received services with project support (disaggregated by sex and age of beneficiary, as well as the type and location of detention facility)
Additional Information: In the indicator, please specify the type of support that the project will provide.
Number of judges trained by the project to impose penalties alternative to incarceration who can demonstrate increased knowledge/use of alternative sentencing (disaggregated by sex of participants, duration, topic and location of the training)
(Numeric)
Data Source: Project M&E system: a. database of training participants (disaggregated by type and sex of official, training topic, duration and location) b. database with results of pre- and post-training tests c. primary and secondary data, if an analytical study of sentencing before and after the training is required
Additional Information: Please specify that the implementing partner should develop pre- and post-training tests to be administered to all training beneficiaries, with results to be reported on in each Progress Report. The design of pre- and post-training tests needs to be a separate activity to be conducted by a specialist once the training curriculum is approved.
Result Indicator(s)
Specific Objective - Outcome:
Improved independence and impartiality of the judiciary
Average expert assessment score of the courts’ means to protect judges from threats, harassment, assault, assassination or intimidation
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system  
Additional Information: Whether the courts have the means and resources to protect judges from threats, harassment, assault, assassination or intimidation  Indicator 76 - Question: “To what extent do you agree that courts have the means and resources to protect judges from threats, harassment, assault, assassination or intimidation?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to determine the specific challenges faced by the courts in this regard.  
Average expert assessment score on whether judges experience delays in receiving their salaries
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Whether judges experience delays in receiving their salaries Question: “How frequently do judges experience delays in receiving their salaries?" Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very rarely (4); sometimes (3); often (2); very often (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on whether prosecutors experience delays in receiving their salaries
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Whether prosecutors experience delays in receiving their salaries Question: “How frequently do prosecutors experience delays in receiving their salaries?" Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very rarely (4); sometimes (3); often (2); very often (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on whether publicly funded defence counsels experience delays in receiving their professional fees or salaries
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How frequently do publicly funded defence counsels experience delays in receiving their salaries or professional fees?" Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very rarely (4); sometimes (3); often (2); very often (1). Dynamic: Direction and level of change in average score over time.  
Country score for judicial independence according to the Ibrahim Index of African Governance (IIAG)
(Numeric)
Data Source: Ibrahim Index of African Governance (IIAG); data can be explored online and downloaded from the IIAG website
Additional Information: This indicator captures the independence of the judiciary from the influence of external actors; whether the judiciary has the ability and autonomy to interpret and review existing laws, legislation and policy; and the integrity of the process of appointing and removing national-level judges. It consists of three sub-indicators: 1. Independent Judiciary (BS) – Bertelsmann Transformation Index (BTI) (source: Bertelsmann Stiftung) 2. Judicial Autonomy (WEF) – Global Competitiveness Report (source: World Economic Forum) 3. Judicial Independence (GI) – IIAG calculation * NOTE: Only African countries covered. The Judicial Independence index from the WEF could be used for non-African countries. Scores range from 1 to 100.
Country score for the judicial process according to the Ibrahim Index of African Governance (IIAG)
(Numeric)
Data Source: Ibrahim Index of African Governance (IIAG)
Additional Information: This indicator captures the extent to which the legal process is free from interference, and the existence of formal judicial reasoning. It consists of two sub-indicators: 1. Strength and fairness of the judicial system (EIU) – Economist Intelligence Unit Dataset 2. Judicial Decisions (GI) – Africa Integrity Indicators * NOTE: Only African countries covered. Scores range from 1 to 100.
Existence of legal safeguards preventing the transfer/dismissal of judges without their consent
(Qualitative)
Data Source: Baseline and endline assessments to be conducted by the Action
Additional Information: Qualitative information (Yes/No).
General government expenditure on law courts as a % of GDP
(Percentage)
Data Source: Government budget and report on GDP
Additional Information:
Length of a probationary period for judges
(Numeric)
Data Source: Baseline and endline assessments to be conducted by the Action
Additional Information:
General government total expenditure on law courts (per inhabitant)
(Numeric)
Data Source: Government budget, population data from the National Statistics Institute
Additional Information: Unit: Euros. Please ensure that the same source for population data is used for the inception and final calculations. If any calculations are made (i.e. to estimate population increase over a number of years), they should be explained in project reports.
Number of transfers/dismissals of judges without their consent
(Numeric)
Data Source: Baseline and endline assessments to be conducted by the Action
Additional Information:
Percentage of judges who have permanent tenure
(Percentage)
Data Source: Baseline and endline assessments to be conducted by the Action
Additional Information: This indicator helps measure the security of tenure for judges.
Percentage of members of the Judiciary Council who are elected by their peers
(Percentage)
Data Source: Baseline and endline assessments to be conducted by the Action
Additional Information: Judiciary Councils exist in some civil law countries (though in common law countries, we may also see a technical body with members from the judiciary, which plays a similar role in ensuring fair and merit-based criteria for the appointment of judges). This Council is in charge of appointment procedures for judges, transfer, discipline, career management, etc.
Percentage of the Ministry of Justice budget compared to overall public budget
(Percentage)
Data Source: Government budget
Additional Information:
Proportion of experts who agree that the government does not overturn judicial decisions
(Percentage)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Based on agreement with the statement  "The government does not overturn judicial decisions" (Likert-scale: agree/neutral/disagree)
Share of the implemented (executed) budget allocated to the justice sector
(Percentage)
Data Source: Government budget and report on execution
Additional Information: Reports on budget execution may be delayed so data availability should be checked during the Inception Phase.
Result Indicator(s)
Specific Objective - Outcome:
Improved prison management and detention conditions in line with human rights standards
Average duration of pre-trial detention by type of case
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Please specify the unit (days or months) and the reference period (e.g. the calendar year or another period).
Average expert assessment score on the adequacy of facilities used to detain children
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How adequate are the facilities used to detain children?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very adequate (4); mostly adequate (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to determine the main issues with respect to the conditions of detention of children.  
Average expert assessment score on the adequacy of resources for transporting inmates to court hearings
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that prisons have adequate resources to transport inmates to court hearings?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to help determine what the main issues are in this respect.  
Average expert assessment score on the efficiency of the mechanism for regular prison inspections and for following up on the issues identified during such inspections
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that an efficient mechanism is in place for regular prison inspections and for following up on the issues identified during such inspections?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on the adequacy of the existing vetting process for ensuring that individuals who committed gross human rights abuses and other serious crimes are identified and prevented from serving as prison officers
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that those who committed gross human rights abuses and other serious crimes are identified and prevented from serving as prison officers?” Rating: Average score of respondents on a four-point scale corresponding to the four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on the adequacy of the facilities used to detain women and girls
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How adequate are the facilities used to detain women and girls?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: completely adequate (4); mostly adequate (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to determine the main issues with respect to the conditions of detention of women.  
Average expert assessment score on the adequacy of the prison service’s resources and capacity for properly training new recruits
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How would you rate the prison service’s resources and capacity to properly train new recruits?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time.
Average expert assessment score on the adequacy of the prison staff’s human rights training
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How adequate is the human rights training received by prison staff?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: completely adequate (4); mostly adequate (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to determine how such training should be improved.  
Average expert assessment score on the adequacy of training and skills of prison officers for responding to various prison situations without excessive use of force
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that prison officers generally have the necessary skills and training to respond to various prison situations without excessive use of force?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on the effectiveness of the prison service’s administrative systems to support key management functions such as the management of finances, assets, procurement and human resources
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How would you rate the administrative systems on which the prison service relies to perform key management functions such as the management of finances, assets, procurement and human resources?” Rating: Average score of relevant experts on a four-point scale corresponding to the four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in score over time. Note: A supplementary question is asked to identify the main strengths and weaknesses of these systems.  
Average expert assessment score on the extent to which prisons are managed in compliance with human rights standards
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that prisons are managed in compliance with human rights standards?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note 1: A supplementary question is asked to determine which aspects of prison management are particularly problematic from the point of view of human rights and children’s rights. Note 2: A second supplementary question is asked to determine whether experts believe that there is a difference with respect to compliance with children’s rights.  
Average expert assessment score on the prison service’s strategic planning and efficient budgeting capacity
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system. Document review (prison strategies and budget plans)
Additional Information: Question: “How would you rate the prison service’s capacity to plan its operations strategically and to budget efficiently?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: A document review will also establish whether recent strategic plans and budget forecast documents exist  
Average expert assessment score on the strength of the prison service’s record keeping and information management capacity
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system  
Additional Information: Question: “How would you rate the prison service’s record keeping and information management capacity?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time. Note: Field data will also be collected on the quality of the information contained in a sample of prison files.  
Average expert assessment score on the sufficiency of entry-level salaries of prison officers for recruiting and retaining qualified professionals
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How adequate are entry-level salaries for prison officers in terms of recruiting and retaining qualified professionals?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very adequate (4); barely adequate (3); inadequate (2); grossly inadequate (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on whether overcrowding is a serious problem in the country’s prisons
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How serious is the problem of overcrowding in the country’s prisons?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: not a problem (4); a minor problem (3); a serious problem (2); a very serious problem (1). Dynamic: Direction and level of change in average score over time. Note 1: A supplementary question is asked to provide more information on where (region, type of institution) the problem is most severe. Note 2: Supplementary information is collected and reported, if possible, on the percentage of inmates housed in “overcrowded prisons” based on review of administrative data, when available, on prison capacity and prison population.  
Average expert assessment score on whether prison staff experience delays in receiving their salary
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How frequently do prison staff experience delays in receiving their salary?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very rarely (4); sometimes (3); often (2); very often (1). Dynamic: Direction and level of change in average score over time.
Average expert assessment score on the adequacy of existing mechanisms for hearing complaints registered by prisoners about their treatment in prison
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that there exist adequate mechanisms for hearing complaints registered by prisoners about their treatment in prison?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: Supplementary questions are asked to determine whether adequate complaint mechanisms also exist in juvenile detention facilities and how they could be improved.  
Average expert assessment score on the prison leaders’ ability and determination to improve the capacity, integrity and performance of the prison service
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How would you rate the ability and determination of prison leaders/managers to improve the performance of the prison service?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on whether and to what extent prisoners of all faiths and denominations are permitted to freely practise their religion
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that prisoners of all faiths and denominations are permitted to freely practise their religion in prison?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Average expert assessment score on whether families are allowed to visit their imprisoned relatives without any kind of official or unofficial fee
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “Do you agree that families of prisoners are generally allowed to visit their imprisoned relatives without any kind of official or unofficial fee?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note 1: A supplementary question is asked to help determine whether this is also true of family visits in the case of children in detention. Note 2: Where administrative data exist, the percentage of children in detention who have been visited by, or visited, a parent, guardian, or family member in the last three months will be calculated and reported.  
Average expert assessment score on whether the professional health care generally available to prisoners is adequate
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “How adequate is the professional health care generally available to prisoners?” Rating: Average score of all relevant experts on a four point scale corresponding to the following four response categories: very adequate (4); adequate (3); inadequate (2); very inadequate (1). Dynamic: Direction and level of change in average score over time. Note: A supplementary question is asked to help determine whether the same is true for women prisoners.  
Average expert perception score on whether prisons provide food of sufficient nutritional value for the prisoners to remain healthy and strong
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that prisons generally provide prisoners with food of sufficient nutritional value to remain healthy and strong?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: Supplementary information is collected and reported, if available (based where possible on field data), on the average percentage of minimum recommended daily calories received by prisoners in selected prisons.  
Average expert score on the quality of the prisons' clean water and sanitation installations
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: Question: "How would you rate the prisons' supply of clean water and sanitation installations?" Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very good (4); good (3); poor (2); very poor (1). Dynamic: Direction and level of change in average score over time.  
Average expert score on whether detention of children is used only as a last resort
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: UN Rule of Law indicators ( indicator 73): Question: “Would you agree that detention is used only as a measure of last resort and for the shortest possible period of time in all cases involving children as defendants?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: strongly agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time. Note: When national juvenile justice sentencing data exist, the percentage of sentenced children receiving a custodial sentence in a given year will be calculated and reported together with the main findings in order to help quantify the justice system’s reliance on detention as a response to youth crime.  
Extent to which the prison system adheres to the UN Standard Minimum Rules for the Treatment of Prisoners (also known as the Nelson Mandela Rules), with regard to different aspects of prison conditions (such as prisoners' nutrition...*)
(Qualitative)
Data Source: Expert surveys for specific aspects of prison conditions – to be commissioned by the project
Additional Information: * such as prisoners' nutrition, clean water and sanitation, separate detention of male and female prisoners as well as children, family visits, quality of health care services, etc. Use this indicator if information is available at the overall level. If comprehensive data are not available, please use the more specific indicators below.
Number of cases of use of bail and alternatives to imprisonment
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information:
Number of children in detention per 100,000 child population (disaggregated by sex)
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: Number of children in detention, divided by the total child population, multiplied by 100,000. Dynamic: Direction and level of change in the number over time.
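A short sketch of the per-100,000 rate implied by this indicator; the counts used are hypothetical administrative figures.
```python
# Sketch of a rate per 100,000 of the reference population, as used for
# the child detention indicators. The input figures are hypothetical.

def rate_per_100k(count, population):
    """Number of cases per 100,000 of the reference population."""
    if population <= 0:
        raise ValueError("population must be positive")
    return 100_000 * count / population

# Example: 420 children in detention, a child population of 3.5 million.
print(round(rate_per_100k(420, 3_500_000), 1))  # 12.0
```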
Number of children in pre-sentence detention per 100,000 child population
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: Number of children in pre-sentence detention per 100,000 child population. Rating: Not rated. Dynamic: Direction and level of change in the number over time.  
Number of implemented recommendations of the National Preventive Mechanism (under the Optional Protocol to the UN Convention against Torture)
(Numeric)
Data Source: Government and civil society shadow reports (external analysis may need to be commissioned by the project)
Additional Information: Measurement: number of recommendations Qualitative aspects to be analysed: •    To what extent have recommendations been implemented? •    Which recommendations have been implemented and which ones are delayed and why?
Number of non-violent deaths per 1,000 prisoners within the last 12 months
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: Number of non-violent deaths of prisoners within the last 12 months divided by the prison population (e.g., average monthly count), multiplied by 1,000. Dynamic: Changes in the number of non-violent deaths over time. Note: When possible, data disaggregated by gender and by age will be used. This makes it possible to determine how many children, if any, died a non-violent death while in prison.  
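A brief sketch of the measurement described above, assuming the prison administration supplies monthly population counts; all figures are hypothetical.
```python
# Sketch of the deaths-per-1,000-prisoners measurement described above,
# using the average monthly prison population as the denominator.
# The monthly counts and the death figure are hypothetical.

def deaths_per_1000(non_violent_deaths, monthly_population):
    if not monthly_population:
        raise ValueError("at least one monthly population count is required")
    avg_population = sum(monthly_population) / len(monthly_population)
    return 1_000 * non_violent_deaths / avg_population

monthly_counts = [8_200, 8_350, 8_400, 8_150, 8_300, 8_250,
                  8_100, 8_050, 8_200, 8_300, 8_400, 8_450]
print(round(deaths_per_1000(14, monthly_counts), 2))  # 1.69
```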
Number of prisoners per prison officer
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: The number of prisoners divided by the number of prison officers on a representative, specified day of the year. Dynamic: Changes in the ratio over time. Note: Wherever possible, administrative data on the number of children in detention will also be obtained and reported.
Number of reported cases of arbitrary detention (e.g., as reported to the Working Group on Arbitrary Detention) in the reporting period
(Numeric)
Data Source: Project M&E system: baseline and endline studies to be conducted by the project
Additional Information:
Percentage of all detainees who have been held in detention for more than 12 months while awaiting sentencing or a final disposition of their case (disaggregated by sex)
(Percentage)
Data Source: Public sector administrative data to be requested and analysed by the project at least twice during the implementation period
Additional Information: Measurement: Percentage of prison detainees on a given date who have been held in detention for more than 12 months while awaiting sentencing or another final disposition of their case (excluding appeals). Rating: Not rated. Dynamic: Direction and level of change in the percentage over time. Note: Data on child detainees should also be collected and reported when available.  
Percentage of all detainees who have been held in detention for more than 12 months while awaiting sentencing or a final disposition of their case (excluding appeals)
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: Percentage of prison detainees on a given date who have been held in detention for more than 12 months while awaiting sentencing or another final disposition of their case (excluding appeals). Rating: Not rated. Dynamic: Direction and level of change in the percentage over time. Note: Data on child detainees should also be collected and reported when available.  
Percentage of children in detention not wholly separated from adults
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: Number of children in detention who are not wholly separated from adults, divided by the total number of children in detention, multiplied by 100.
Percentage of female prisoners who are held completely separately from male prisoners
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Women detained separately from male prisoners – whether and to what extent female prisoners are kept separate from male prisoners. Measurement: Percentage of female prisoners who are held completely separately from male prisoners. Dynamic: Changes in the percentage over time.
Percentage of pre-trial detainees who are held completely separated from convicted prisoners (disaggregated by sex)
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information:
Percentage of prison population with access to vocational education and training / medical care (disaggregated by sex)
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: This is a general outcome, about the situation overall. If the project is directly providing vocational education and training, or medical care, please move this indicator to output level.
Percentage of prisoners who have been examined by a qualified medical professional at the time of their admission to prison (disaggregated by sex)
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Measurement: The percentage of prisoners admitted to prison during a year who were examined by a qualified medical professional at the time of their admission. Dynamic: Changes in the percentage over time.  
Percentage of sentenced children receiving a custodial sentence in reporting period
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Please specify the reporting period (i.e. last 12 months, the calendar year, or another).
Percentage of survey respondents who fully or partly agree that discrimination against certain groups of prisoners is a problem in the country’s prisons (disaggregated by sex)
(Percentage)
Data Source: At least two rounds of public surveys as part of the project M&E system
Additional Information: Question: “To what extent do you agree that discrimination against certain groups of prisoners is a problem in the country’s prisons?” Rating: Average score of respondents on a four-point scale corresponding to the following four response categories: fully agree (4); agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.  
Proportion of bail applications accepted by the court in the reporting period
(Percentage)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: Please specify the reporting period (i.e. last 12 months, the calendar year, or another).
Unsentenced detainees as a proportion of overall prison population
(Percentage)
Data Source: SDG database
Additional Information: SDG Tier I indicator 16.3.2 (i.e. Indicator conceptually clear, established methodology and standards available and data regularly produced by countries) Possible custodian agency: UNODC  
Result Indicator(s)
Specific Objective - Outcome:
Right to a fair trial and equality before the law is ensured
Average expert assessment score on whether judges impose different punishments for the same type of crime based on a defendant’s or victim’s personal or ethnic characteristics
(Numeric)
Data Source: At least two rounds of expert surveys as part of the project M&E system
Additional Information: UN Rule of Law indicators (n. 69): Equal application of the law by judges Question: “How likely are judges to impose different punishments for the same type of crime, for example an armed assault, based on the defendant’s or the victim’s personal or ethnic characteristics?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: very unlikely (4); somewhat unlikely (3); likely (2); very likely (1). Dynamic: Direction and level of change in average score over time.  
Average score of expert perception on the protection of the rights of defendants and victims
(Numeric)
Data Source: At least two rounds of expert survey as part of the project M&E system
Additional Information: UN Rule of Law indicators (46): Whether the rights of victims and defendants are sufficiently protected during criminal court proceedings. Question: “To what extent do you agree that the rights of victims and defendants are sufficiently protected during criminal court proceedings?” Rating: Average score of all relevant experts on a four-point scale corresponding to the following four response categories: fully agree (4); partly agree (3); disagree (2); strongly disagree (1). Dynamic: Direction and level of change in average score over time.
Conviction rates for [group X] defendants provided with legal representation (expressed as a ratio of conviction rates for defendants with a lawyer of their own choice)
(Percentage)
Data Source: Ministry of Justice records if available. Otherwise the project will need to conduct specialized studies (surveys) at the beginning and end of implementation
Additional Information: Please define group X, e.g. indigent/ethnic minorities/ another appropriate marginalised group if necessary. Public data (statistics) on this issue may not be collected by the government/courts so the project should be strongly encouraged from the inception phase to report on available data and plans for filling any gaps.
Result Indicator(s)
Impact:
To promote and protect the rule of law and human rights for all
Bertelsmann Transformation Index (BTI) – Rule of law index score
(Numeric)
Data Source: BTI Atlas
Additional Information: Bertelsmann Stiftung Foundation - The Bertelsmann Transformation Index (BTI) analyses and evaluates the quality of democracy, a market economy and political management in 129 developing and transition countries. It measures successes and setbacks on the path toward a democracy based on the rule of law and a socially responsible market economy. The state of political transformation is measured in terms of five criteria (Stateness; Political Participation; Rule of Law; Stability of Democratic Institutions; Political and Social Integration), which in turn are derived from assessments made in response to 18 questions. Rule of law is measured by the normative statement: "State powers check and balance one another and ensure civil rights." Thresholds determine five categories: Excellent (8.50 – 10.00), Sound (6.50 – 8.49), Fair (4.50 – 6.49), Flawed (2.50 – 4.49), Poor (1.00 – 2.49).
Country score for rule of law according to the Worldwide Governance Indicators (WGI) Project
(Numeric)
Data Source: Worldwide Governance Indicators (WGI)
Additional Information: The WGI are a research dataset summarizing the views on the quality of governance provided by a large number of enterprise, citizen and expert survey respondents in industrial and developing countries. These data are gathered from a number of survey institutes, think tanks, non-governmental organizations, international organizations, and private sector firms.  This indicator reflects perceptions of the extent to which agents have confidence in and abide by the rules of society, and in particular the quality of contract enforcement, property rights, the police, and the courts, as well as the likelihood of crime and violence. Estimates of governance range from approximately -2.5 (weak) to 2.5 (strong) governance performance. Please check the standard errors when comparing over time or between countries. NOTE: This indicator also corresponds to EU RF Level 1 #4  
Country score or ranking in the World Justice Project Rule of Law Index (please select whether you prefer to use the country score or its global ranking in the world)
(Numeric)
Data Source: The World Justice Project
Additional Information: The WJP Rule of Law Index captures adherence to the rule of law (as defined by the WJP’s universal principles above) based on 9 factors: 1. Constraints on government power; 2. Absence of Corruption; 3. Open Government; 4. Fundamental Rights; 5. Order and Security; 6. Regulatory Enforcement; 7. Civil Justice; 8. Criminal Justice; 9. Informal Justice, further disaggregated into 47 specific sub-factors. 
Ibrahim Index of African Governance (IIAG) – Overall Rule of Law score
(Numeric)
Data Source: Ibrahim Index of African Governance (IIAG)
Additional Information: The 2016 IIAG consists of one Overall Governance score and four categories (1. Safety & Rule of Law; 2. Participation & Human Rights; 3. Sustainable Economic Opportunity; 4. Human Development). Here we are interested in the overall Rule of Law score (under the Safety & Rule of Law category). More specific components of RoL also measured by the IIAG (e.g. judicial independence) are used for lower levels of the results chain. Please check the standard errors when comparing over time or between countries. * NOTE: Only African countries covered.
Result Indicator(s)
Specific Objective - Outcome:
Improved transparency and accountability of the judicial system
Number of sanctions pronounced against lawyers (disaggregated by type of sanctions)
(Numeric)
Data Source: Public sector administrative data to be requested by the project at least at the beginning and end of implementation
Additional Information: “Sanctions” refers to disciplinary action (not criminal charges).