Research Article
Open Access

Can Public Rankings Improve School Performance?

Evidence from a Nationwide Reform in Tanzania

Jacobus Cilliers, Isaac M. Mbiti and Andrew Zeitlin
Journal of Human Resources, July 2021, 56 (3) 655-685; DOI: https://doi.org/10.3368/jhr.56.3.0119-9969R1
Jacobus Cilliers is at Georgetown University. Correspondence: ejc93@georgetown.edu

Isaac M. Mbiti is at the University of Virginia, J-PAL, BREAD, NBER, and IZA.

Andrew Zeitlin is at Georgetown University and the Center for Global Development.

Figures & Data

Figures

Figure 1: Exam Performance by Within-District Decile Rank—Pre- vs. Postreform

Notes: All panels show regression coefficients and 90 percent confidence intervals, estimated using Equation 1. In Panels A and C, the light lines refer to the prereform period and the dark lines to the postreform period. Panels B and D show the difference in these ranking-decile effects between the pre- and postreform periods. In both periods, schools are compared to schools in the middle two deciles of the district rank. In Panels A and B the outcome is school average exam performance, scaled from 0 to 250; in Panels C and D the outcome is the pass rate.
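The horizontal axis in Figure 1 is a school's decile of within-district exam performance in the previous year. A minimal sketch of how such a ranking variable can be constructed; the data and column names here are invented for illustration and are not taken from the paper:

```python
import numpy as np
import pandas as pd

# Invented data: 20 schools in one district with last year's average marks.
rng = np.random.default_rng(1)
schools = pd.DataFrame({
    "district": ["A"] * 20,
    "lag_marks": rng.uniform(60, 200, size=20),
})

# Within-district decile of lagged performance (0 = bottom, 9 = top),
# computed separately for each district via groupby.
schools["decile"] = schools.groupby("district")["lag_marks"].transform(
    lambda s: pd.qcut(s, 10, labels=False)
)
```

Indicator variables for the bottom deciles of a ranking like this, versus the middle deciles, are what the regression tables report coefficients on.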

Tables

    Table 1

    Descriptive Statistics

|  | 2011 | 2012 | 2013 | 2014 | 2015 | 2016 |
| --- | --- | --- | --- | --- | --- | --- |
| Panel A: NECTA |  |  |  |  |  |  |
| Average marks | 111.31 | 85.96 | 102.76 | 108.65 | 120.41 | 119.17 |
|  | (26.76) | (22.28) | (24.51) | (26.14) | (30.17) | (27.60) |
| Average number of test-takers |  | 56.34 | 53.97 | 49.92 | 47.44 | 48.29 |
|  |  | (37.82) | (37.92) | (36.44) | (36.08) | (37.49) |
| Average number of passed candidates |  | 17.31 | 27.31 | 28.45 | 32.18 | 33.97 |
|  |  | (22.89) | (28.95) | (28.67) | (30.48) | (31.42) |
| Average pass rate |  | 0.29 | 0.49 | 0.56 | 0.67 | 0.69 |
|  |  | (0.25) | (0.28) | (0.28) | (0.27) | (0.25) |
| Total number of test-takers | 1,010,084 | 865,534 | 844,921 | 792,118 | 763,602 | 789,236 |
| National pass rate | 0.58 | 0.31 | 0.51 | 0.57 | 0.68 | 0.70 |
| Number of schools | 14,939 | 15,362 | 15,656 | 15,867 | 16,096 | 16,344 |
| Number of districts | 136 | 136 | 151 | 166 | 166 | 184 |
| Panel B: EMIS Data |  |  |  |  |  |  |
| Private school |  |  |  |  | 0.05 | 0.05 |
|  |  |  |  |  | (0.22) | (0.22) |
| Grade 4 enrollment |  |  |  |  | 67.32 | 63.46 |
|  |  |  |  |  | (50.76) | (49.45) |
| Grade 5 enrollment |  |  |  |  | 63.33 | 61.83 |
|  |  |  |  |  | (47.68) | (46.71) |
| Grade 6 enrollment |  |  |  |  | 61.68 | 61.58 |
|  |  |  |  |  | (47.23) | (45.93) |
| Grade 7 enrollment |  |  |  |  | 49.36 | 48.82 |
|  |  |  |  |  | (38.99) | (38.11) |
| Total number of schools |  |  |  |  | 16,178 | 16,351 |
| Panel C: Service Delivery Indicators |  |  |  |  |  |  |
| Average classroom presence |  |  |  | 0.52 |  | 0.59 |
|  |  |  |  | (0.23) |  | (0.22) |
| Average number of teachers |  |  |  | 17.47 |  | 17.48 |
|  |  |  |  | (1.73) |  | (2.39) |
| Capitation grants received (TSh/student) |  |  |  | 5,134 |  | 5,966 |
|  |  |  |  | (3,891) |  | (17,796) |
| Textbooks received per student |  |  |  | 0.12 |  | 0.63 |
|  |  |  |  | (0.45) |  | (0.84) |
| Number of inspections |  |  |  | 1.56 |  | 1.45 |
|  |  |  |  | (2.65) |  | (1.74) |
| Number of schools |  |  |  | 395 |  | 395 |
• Notes: Panel A shows school-average examination performance, collected by the National Examination Council of Tanzania (NECTA); Panel B reports enrollment data from the Education Management Information System (EMIS). School-level EMIS data are only available for 2015 and 2016. Panel C reports summary data for key variables from the Service Delivery Indicators (SDI) data set. The SDI data were collected in 2014 and 2016, but some variables were collected using the previous year (2013 or 2015) as the reference period. Means and standard deviations (in parentheses) are reported, unless otherwise noted. Capitation (or per-pupil) grants are reported in nominal Tanzanian shillings (TSh).

    Table 2

    Impacts of the Reform on School Exam Performance

|  | Marks (1) | Marks (2) | Pass Rate (3) | Pass Rate (4) | Number Passed (5) | Number Passed (6) |
| --- | --- | --- | --- | --- | --- | --- |
| 0–10th percentile in previous year | 4.406*** | 6.147*** | 0.058*** | 0.079*** | 1.828* | 2.180*** |
|  | (1.004) | (0.849) | (0.012) | (0.011) | (1.026) | (0.754) |
| 10–20th percentile in previous year | 2.049*** | 2.382*** | 0.024*** | 0.030*** | 0.930 | 0.852** |
|  | (0.563) | (0.578) | (0.006) | (0.007) | (0.845) | (0.379) |
| Diff-diff | Yes | Yes | Yes | Yes | Yes | Yes |
| School fixed effects | No | Yes | No | Yes | No | Yes |
| Control lagged exam score | Yes | Yes | Yes | Yes | Yes | Yes |
| Control mean, post-BRN | 109.46 | 109.46 | 0.58 | 0.58 | 30.82 | 30.82 |
| Observations | 77,731 | 77,431 | 77,731 | 77,431 | 77,731 | 77,431 |
| R² | 0.655 | 0.801 | 0.607 | 0.763 | 0.425 | 0.912 |
    • Notes: Each column represents a separate regression. All specifications include district-by-year fixed effects, flexible controls for lagged test scores, and indicators for prereform associations between district-rank deciles and subsequent outcomes. Reported coefficients correspond to the differential effect of being ranked in the associated decile of within-district performance in the post- (vs. pre-) reform period, compared to the middle six deciles. In even columns, the specification is augmented with school fixed effects. In Columns 1 and 2 the outcome is the average PSLE score (ranging from 0–250), in Columns 3 and 4 it is the pass rate, and in Columns 5 and 6 it is the number of pupils who passed. 300 singleton schools are dropped when results are estimated using school fixed effects. Standard errors are clustered at the district level.
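The "Diff-diff" row indicates that each reported coefficient is a difference-in-differences: the pre-to-post change for low-ranked schools net of the change for the comparison group. A toy version of that arithmetic, with invented group means:

```python
import pandas as pd

# Invented cell means (not the paper's data): average marks by
# ranking group and period.
means = pd.DataFrame({
    "bottom_decile": [1, 1, 0, 0],
    "post_reform":   [0, 1, 0, 1],
    "marks":         [80.0, 96.0, 110.0, 120.0],
}).set_index(["bottom_decile", "post_reform"])["marks"]

# Change for bottom-ranked schools minus change for the comparison group.
did = (means[1, 1] - means[1, 0]) - (means[0, 1] - means[0, 0])
# did == (96 - 80) - (120 - 110) == 6.0
```

The regressions in the table implement this same contrast with district-by-year and school fixed effects and lagged-score controls rather than raw cell means.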

    Table 3

    Number of Test-Takers and Enrollment

|  | PSLE Exam-Sitters: All Years (1) | PSLE Exam-Sitters: 2015–2016 (2) | EMIS Enrollment: Grades 4–6 (3) | EMIS: Grade 6 (4) | EMIS: Grade 7 (5) | EMIS: Grade 7/Grade 6 (6) |
| --- | --- | --- | --- | --- | --- | --- |
| 0–10th percentile in previous year | −2.039*** | −1.676*** | −0.773 | −0.329 | −1.646** | −0.028** |
|  | (0.763) | (0.589) | (1.787) | (0.882) | (0.737) | (0.013) |
| 10–20th percentile in previous year | −1.846*** | −1.068** | 1.929 | 0.865 | −0.479 | −0.008 |
|  | (0.635) | (0.423) | (1.334) | (0.659) | (0.480) | (0.010) |
| Diff-diff | Yes | No | No | No | No | No |
| Control lagged exam score | Yes | Yes | Yes | Yes | Yes | Yes |
| Fixed effects | Yes | Yes | Yes | Yes | Yes | Yes |
| Control mean | 52.01 | 50.07 | 197.14 | 64.10 | 51.55 | 0.84 |
| Observations | 77,431 | 31,520 | 31,150 | 31,150 | 31,150 | 15,828 |
| R² | 0.910 | 0.953 | 0.968 | 0.930 | 0.931 | 0.127 |
• Notes: Each column represents a separate regression. All specifications include flexible controls for lagged test scores and school and district-by-year fixed effects. Column 1 is estimated on outcomes from 2012–2016, including indicators for prereform associations between district-rank deciles and subsequent outcomes. Reported coefficients in that column correspond to the differential effect of being ranked in the associated decile of within-district performance in the post- (vs. pre-) reform period, compared to the middle six deciles. In Columns 2–6, data are restricted to the postreform years 2015–2016 in which EMIS data are available; this does not allow for a difference-in-differences specification. In Columns 1 and 2 the outcome variable is the number of students sitting the exam, using PSLE data. In Columns 3–6 outcomes are different constructions of enrollment, based on EMIS data. The outcome in Columns 3–5 is the number of students enrolled in the corresponding grade(s). In Column 6 the outcome is Grade 7 enrollment in 2016 divided by Grade 6 enrollment in 2015. Standard errors are clustered at the district level.
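The Column 6 outcome is a simple grade-progression ratio. A small sketch with invented enrollment counts:

```python
import pandas as pd

# Invented enrollment counts for two hypothetical schools.
grade6_2015 = pd.Series({"school_1": 60, "school_2": 45})
grade7_2016 = pd.Series({"school_1": 51, "school_2": 45})

# Column 6's outcome: Grade 7 enrollment in 2016 divided by the same
# school's Grade 6 enrollment in 2015, a rough retention rate for the
# cohort that would sit the PSLE.
progression = grade7_2016 / grade6_2015
```

A ratio below one, as estimated for bottom-decile schools, is consistent with some students leaving (or being held back) before the exam year.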

    Table 4

    District Ranking Impacts on Monitoring, Teacher Effort, Resource Spending, and Allocation

|  | Teachers (1) | Textbooks (2) | Capitation Grants (3) | Inspections (4) | Teacher Presence (5) |
| --- | --- | --- | --- | --- | --- |
| 0–10th percentile in previous year | 0.921 | 0.110 | 0.202 | −0.298 | −0.036 |
|  | (0.665) | (0.174) | (0.976) | (0.544) | (0.082) |
| 10–20th percentile in previous year | −0.202 | −0.139 | −1.052 | 0.782 | −0.015 |
|  | (0.500) | (0.084) | (0.669) | (0.552) | (0.057) |
| Post-BRN mean: 20–80th percentile | 17.88 | 0.38 | 3,857.17 | 1.49 | 0.54 |
| Observations | 760 | 754 | 756 | 758 | 760 |
• Notes: Each column represents a separate regression, estimated using SDI data, with flexible controls for lagged test scores and district-by-year and school fixed effects. Coefficients correspond to the effect of being ranked in the associated decile of within-district performance in the postreform period, compared to the middle six deciles. The SDI data were collected in 2014 and 2016 (the postreform period), but some variables were collected using the previous year (2013 or 2015) as the reference period. For each column, only two years of data are available: Columns 1, 4, and 5 use outcomes for 2014 and 2016, and Columns 2 and 3 use outcomes for 2013 and 2015. The dependent variables in Columns 2 and 3 are inverse hyperbolic sine transformations (an approximation to the natural logarithm), calculated at a per-student level using enrollment data from 2014. Column 3 is reported in Tanzanian shillings. The mean values reported in the penultimate row are of the untransformed outcomes. We adjust for outliers in the following way: (i) we adjust the per-student capitation grant downward to the maximum a school can receive, 10,000 Tanzanian shillings; (ii) we set as missing one school that reported receiving 600 textbooks per student. Since the specifications include school fixed effects, schools with only one observation are dropped. Standard errors are clustered at the district level.
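The inverse hyperbolic sine used for the textbook and grant outcomes behaves like a log but is defined at zero. A small numeric illustration; the grant values below are invented:

```python
import numpy as np

# asinh(x) = ln(x + sqrt(x**2 + 1)): defined at zero, unlike the log,
# which matters because some schools receive zero textbooks or grants.
grants = np.array([0.0, 500.0, 5_000.0, 10_000.0])  # TSh/student, invented
transformed = np.arcsinh(grants)

# For large x, asinh(x) is close to ln(2x), so coefficients on the
# transformed outcome read approximately as log-point effects.
gap = np.abs(transformed[1:] - np.log(2 * grants[1:]))
```

The approximation error `gap` is already negligible at a few hundred shillings per student, so only the zero-valued observations are treated differently from a plain log.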

    Table 5

    Robustness Checks

|  | District Perf. Below (1) | District Perf. Above (2) | STEP (3) | Other (4) | Shock (5) | None (6) | Smallest (7) | Large (8) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0–10th percentile in previous year | 4.155*** | 5.572*** | 5.871*** | 4.313*** | −18.558 | 2.940*** | 3.817* | 4.329*** |
|  | (1.207) | (1.636) | (1.509) | (1.305) | (13.720) | (0.482) | (2.263) | (0.953) |
| 10–20th percentile in previous year | 1.833*** | 2.770*** | 2.764*** | 1.942** | −3.379 | 0.917*** | 2.068 | 1.722*** |
|  | (0.679) | (0.856) | (0.702) | (0.763) | (33.090) | (0.344) | (1.453) | (0.551) |
| Diff-diff | Yes | Yes | Yes | Yes | No | No | Yes | Yes |
| Fixed effects | No | No | No | No | Yes | Yes | No | No |
| Observations | 41,096 | 36,635 | 25,395 | 52,336 | 932 | 56,463 | 15,108 | 62,623 |
| R² | 0.580 | 0.642 | 0.669 | 0.650 | 0.864 | 0.815 | 0.656 | 0.664 |
• Notes: Each coefficient refers to the indicated decile of within-district performance rank, compared to the middle six deciles. The outcome variable is average school performance, which takes values from 0 to 250. In Column 1 the sample is restricted to the bottom half of districts, in terms of average school performance on the previous year's exam; in Column 2 it is restricted to the top half. In Column 3 the sample is restricted to districts where the STEP remedial education training took place; in Column 4 it is restricted to districts where it did not. In Column 5 the sample is restricted to schools that dropped 30 percentiles in their national rank between year t − 1 and year t; schools in Column 6 did not experience such test score declines. In Columns 5 and 6 we do not difference out the baseline relationship between rank and performance, since we lack performance data for 2010 and so cannot identify which schools experienced a large drop in 2011. In Column 7 the sample is restricted to the smallest quintile of schools, measured by the number of test-takers in 2012. Column 8 is the complement of Column 7. Standard errors are clustered at the district level.

Additional Files

  • Free alternate access to The Journal of Human Resources supplementary materials is available at https://uwpress.wisc.edu/journals/journals/jhr-supplementary.html

    • 0119-9969R1_supp.pdf
Keywords

JEL classification: I21, I25, I28, O15
© 2025 Board of Regents of the University of Wisconsin System
