Abstract
Merit‐based school choice often presents an unfulfilled promise of educational mobility. In Ghana, where a standardized exam determines secondary school admission, students from low‐performing elementary schools apply to weaker secondary schools than equally qualified students from high‐performing elementary schools. This study investigates why students with the same academic potential make different application choices. I outline a theoretical model and empirical strategy to analyze heterogeneity in student demand. Using administrative data, I show that disadvantaged students value school proximity more and live farther away from high‐performing schools, suggesting the interaction between demand and the spatial distribution of schools limits educational mobility.
I. Introduction
Few issues in public discourse on education have attracted as much attention as the imperative of reducing inequality. Policymakers and researchers increasingly recognize open enrollment school choice systems as a way to address disadvantage by giving children in low‐performing schools the chance to move to higher‐performing schools. Merit‐based systems are particularly appealing because admission priorities reward a student’s individual performance instead of favoring luck, residential location, or family connections. Yet school choice can only foster educational mobility if parents and students make expedient choices.
It is therefore concerning that studies of school choice consistently find that disadvantaged students forgo opportunities to attend better schools. Figure 1 plots secondary school application behavior by educational background for elementary school students in Ghana. Although a standardized exam determines admission, students from low‐performing elementary schools apply to lower‐performing secondary schools than students from high‐performing elementary schools with the same exam scores. Disadvantaged students applying to less selective schools than they merit is a global phenomenon that has been widely documented, especially regarding college applications in the United States (for example, Manski and Wise 1983; Avery and Kane 2004; Bowen, Chingos, and McPherson 2009; Hoxby and Avery 2013).
Exam Performance and Application Choices
Notes: Figure illustrates differences in academic performance of selected secondary schools for students with the same ninth grade basic education certificate exam (BECE) scores but from low‐performing public, high‐performing public, and private junior high schools (JHSs). Secondary school performance is a normalized measure of the average percentage of students earning a credit in the math and English 12th grade secondary school certification exam (SSCE) for the period 2003–2008. Low‐performing public JHSs are those where the average BECE score of students is below the median for public schools in the country; high‐performing public JHSs are those with above‐median performance.
This study investigates why students with the same academic potential make different application choices. Existing literature has identified at least two possible explanations: (i) utility maximization and (ii) decision‐making ability. Under the first hypothesis, disadvantaged students make utility‐maximizing decisions not to apply to higher‐performing schools. Their choices could stem from a preference to attend nearby schools or those with peers from a familiar background (Hastings, Kane, and Staiger 2008; Burgess et al. 2015; Walters 2018). They could also reflect lower actual or expected returns from attending high‐performing schools, which may arise from differences in expected completion rates (for example, Dustan, de Janvry, and Sadoulet 2017). Additionally, students’ choices could reflect their concerns about school attendance costs, given their budget constraints. Under the second hypothesis, disadvantaged students would optimally choose to attend high‐performing schools if they were fully informed and strategically sophisticated, but they have limited information and are prone to application mistakes because of difficulties identifying or adopting the optimal application strategy (for example, Abdulkadiroğlu et al. 2006; Hastings and Weinstein 2008; Pathak and Sönmez 2008; Lai, Sadoulet, and de Janvry 2009; Lucas and Mbiti 2012; Hoxby and Turner 2013; Pallais 2015).
I use rich administrative data on secondary school applications in Ghana to test the first hypothesis, that students from disadvantaged backgrounds have a weaker demand for attending selective schools. Although this paper focuses on the role of demand, the goal is not to reject the second hypothesis definitively and rule out the role of information challenges or other potential explanations, but rather to examine whether there is any evidence supporting the first hypothesis that students from disadvantaged backgrounds are less willing to attend high‐performing schools. I do not directly observe any indicators of family background, so I proxy for students’ socioeconomic status using the type of junior high school they attended and classify disadvantaged students as those coming from low‐performing public schools relative to those coming from high‐performing public schools or from private schools.
A key identification challenge is how to uncover preferences from observed application lists in a setting where students have incentives to select their choices strategically. I formulate a generalizable school choice model and generate precise predictions to test for differences in preferences using two complementary empirical methodologies. The first approach looks at the ordering of schools in students’ ranked lists, based on the insight that students do not have any incentives to misrepresent the rank ordering of their submitted choices once they have decided on the set of schools to include on their list. Analyzing the ordering of schools in each student’s ranked list, I show that disadvantaged students are significantly more likely to rank nonselective schools above more selective schools on their lists, indicating a weaker preference for attending selective schools that cannot be explained by differences in strategic considerations. Using secondary school exam performance as an alternative measure of school quality, I similarly find that disadvantaged students are relatively more likely to rank a lower‐performing secondary school above higher‐performing schools on their list.
The second approach evaluates students’ preferences for observable school characteristics. Here, I focus on the characteristics of a student’s first choice school. To eliminate the possibility of strategic application behavior, I restrict students’ choice sets to schools of similar selectivity and then estimate a discrete choice model. The results indicate that disadvantaged students have stronger preferences for schools in close proximity and place relatively less weight on schools’ academic performance. Combined with evidence that disadvantaged students are less likely to have high‐performing schools nearby, this analysis supports the argument that the interaction between demand and the spatial distribution of schools limits educational mobility. This estimation approach relies on a simplifying assumption about student information. In modeling students’ beliefs about their admission chances, I assume that students have perfect information about school selectivity, even though they may still face uncertainty about their own exam performance. To provide support for my assertion that student application behavior is not driven by a lack of information about school selectivity in this context, I show that school rankings in Ghana are stable over time, and I conduct several robustness checks, including focusing on students in areas with historically stable school rankings. Nevertheless, information remains a potential explanation for some of the differences in school choice behavior that I observe.
As an alternative approach to address the possibility that students omit more‐preferred selective schools from their rank‐ordered lists because of low expected admission chances, I estimate an additional set of results using the method proposed by Artemov, Che, and He (2021). This strategy estimates students’ likelihood of selecting their assigned school out of a set of other clearly feasible choices. I find similar preference patterns, reinforcing my conclusions that students from disadvantaged backgrounds place a higher value on school proximity.
My results broaden the perspective of research on this topic. Much of the existing school choice literature analyzes lottery‐based admission systems. My analysis of Ghana’s merit‐based system provides new insights about application differences in situations where individual performance determines admission and where strategic behavior may affect outcomes. Moreover, much of the educational mobility literature focuses on high‐income settings. These findings from Ghana temper growing optimism over the efficacy of providing information or simplifying the application process as a means to encourage disadvantaged students to apply to higher‐performing schools. Efforts such as these are likely to succeed in contexts with free schooling or where financial aid programs enable low‐income students to attend selective schools at a subsidized cost. In lower‐income settings without these provisions, the barriers to educational mobility will likely be harder to overcome.1
Understanding what restricts educational mobility has crucial implications for students’ future outcomes. Using a randomized experiment in Kenya, Duflo, Dupas, and Kremer (2011) find positive effects of academic tracking on test scores, suggesting that high‐achieving students in low‐performing schools would benefit from moving to schools with similarly high‐achieving peers. A growing literature also finds that attending a better school improves nonacademic outcomes (Cullen, Jacob, and Levitt 2006; Deming 2011; Lavy 2010). Additionally, Black, Denning, and Rothstein (2020), analyzing the effect of the Texas Top Ten Percent rule, find that access to selective colleges increased graduation rates and earnings for high‐performing students from nontraditional sending schools.2
II. Institutional Background
Compulsory education in Ghana consists of six years of primary school and three years of junior high school (JHS). Each year, all 350,000 students graduating from the more than 9,000 JHSs compete for admission to senior high school (SHS) and may apply to any of the 700 SHSs in the country that follow the national curriculum. This provides an ideal context in which to study educational mobility because there is considerable variation in school quality at both levels of schooling.
In 2005, Ghana introduced a centralized merit‐based admission system to improve the transparency and fairness of the secondary school transition process. The Computerized School Selection and Placement System (CSSPS) allocates JHS students to SHS based on students’ ranking of their preferred program choices and their performance on a standardized exam (Box 1 in Online Appendix Section A provides more detail). I refer to students as the decision‐makers throughout this paper for simplicity; however, application choices often result from discussions between students and their parents, teachers, or friends.3 In practice, admission occurs in three stages:
Students submit a ranked list of choices, stating a senior high school and a program track within that school for each choice.4
Students take the nationally administered Basic Education Certificate Exam (BECE).
All students who qualify for admission to SHS are admitted to a school and program.
On average, only half of all candidates received a sufficient grade in the BECE to qualify for admission to SHS during the first five years of the CSSPS.5 Programs have a predefined capacity and admit students using a deferred‐acceptance algorithm (Gale and Shapley 1962; see Online Appendix A.1 for a more detailed discussion of the matching properties of this algorithm), with priority determined strictly by academic merit as follows:
Step 1: Each student applies to the first program in their ordered list of choices. Each program tentatively assigns its seats to applicants one at a time in the order of students’ academic performance (based on BECE scores). Each program rejects any remaining applicants once all of its seats are tentatively assigned.
In general, at
Step k: Each student who was rejected in the previous round applies to the next choice on their list. Each program compares the set of students it has already tentatively accepted with the set of new applicants. It tentatively assigns its seats to these students one at a time in the order of students’ academic performance and rejects remaining applicants once all of its seats are tentatively assigned.
The algorithm terminates when no spaces remain in any of the programs selected by rejected students. Each student is then assigned to their final tentative assignment.6 Rejected students who do not gain admission to any of their chosen programs are administratively assigned to an undersubscribed program with available spaces. Efforts are made to place these students in their preferred district or region wherever possible, but there is limited regard for students’ initially ranked choices. As such, there are high stakes involved in the application decision.
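To make the assignment mechanics concrete, the sketch below implements a score-based, student-proposing deferred-acceptance routine of the kind described above. It is a simplified illustration rather than the CSSPS production algorithm: the student and program identifiers, scores, and capacities are hypothetical, ties in exam scores are not handled, and the administrative placement of unassigned students is omitted.

```python
from typing import Dict, List

def deferred_acceptance(prefs: Dict[str, List[str]],
                        scores: Dict[str, float],
                        capacity: Dict[str, int]) -> Dict[str, str]:
    """Student-proposing deferred acceptance with exam score as the single priority.

    prefs:    student -> ranked list of program ids (first choice first)
    scores:   student -> exam score (higher is better; assumed unique)
    capacity: program -> number of seats
    Returns student -> assigned program ('' if no listed program admits the student).
    """
    next_choice = {s: 0 for s in prefs}                    # index of next program to try
    tentative: Dict[str, List[str]] = {p: [] for p in capacity}
    unassigned = set(prefs)

    while unassigned:
        s = unassigned.pop()
        while next_choice[s] < len(prefs[s]):
            p = prefs[s][next_choice[s]]
            next_choice[s] += 1
            tentative[p].append(s)
            tentative[p].sort(key=lambda st: scores[st], reverse=True)
            if len(tentative[p]) <= capacity[p]:
                break                                      # s is tentatively held
            rejected = tentative[p].pop()                  # lowest-scoring applicant is rejected
            if rejected == s:
                continue                                   # s moves on to the next listed choice
            unassigned.add(rejected)                       # a previously held student re-enters
            break

    assignment = {s: '' for s in prefs}
    for p, admitted in tentative.items():
        for s in admitted:
            assignment[s] = p
    return assignment

# Hypothetical example: three students competing for two one-seat programs.
prefs = {'ama': ['sch_A', 'sch_B'], 'kofi': ['sch_A', 'sch_B'], 'esi': ['sch_B']}
scores = {'ama': 410.0, 'kofi': 395.0, 'esi': 402.0}
capacity = {'sch_A': 1, 'sch_B': 1}
print(deferred_acceptance(prefs, scores, capacity))
# -> {'ama': 'sch_A', 'kofi': '', 'esi': 'sch_B'}
```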
Two aspects of the Ghanaian school choice system are especially noteworthy. First, students can only submit a limited number of choices. This is partly a legacy of the manual application system that preceded the CSSPS. Initially, students were allowed to list up to three choices as they had been under the manual system. This increased to four choices in 2007 and to six choices in 2008. Between 94 and 100 percent of students listed the maximum number of choices each year.
Second, the application process includes a substantial degree of uncertainty. Students must submit their applications before taking the entrance exam,7 and admission cutoffs are endogenously determined by the quality of applications to a given program each year since schools only define the number of available spaces, but not an explicit exam score required for admission. Therefore, students have incomplete information about their admission chances when they select their choices, even though admission is definitively based on exam performance.
These features of constrained choice and incomplete information are remarkably common. In the United States, school choice systems in Denver, Chicago, and New York limit the number of choices to 5, 6, and 12, respectively (Pathak and Sönmez 2013). Other countries use Ghana‐like merit‐based systems for secondary school admission, including Kenya (limit three), Trinidad and Tobago (limit four), and the United Kingdom (varies by local council but typically three to six), and for college entry in Canada (for example, limit five in Ontario), Chile (limit eight), Hungary (limit four), and Spain (limit eight).8 Moreover, students generally apply under uncertainty—either due to random assignment or endogenous cutoffs in merit‐based systems in which schools predefine their vacancies but not an explicit exam score that guarantees admission.9 Theoretical studies by Chade and Smith (2006), Haeringer and Klijn (2009), and Chade, Lewis, and Smith (2014) model the problem of constrained choice with incomplete information and show that these features severely complicate the decision‐making process. Although Calsamiglia, Haeringer, and Klijn (2010) confirm several theoretical predictions in a lab experiment, we know little about how the effects of constrained choice and uncertainty differ based on students’ socioeconomic backgrounds or the extent to which they limit the ability of high‐achieving students to attend better schools in practice.
Another notable component of Ghana’s school choice environment is the variety of available alternatives. Although students can apply to any school in the country, cost and distance may be key considerations for disadvantaged families because education in public schools was tuition free for primary and JHS but not for SHS. Junior high schools often still charged various nontuition fees, but senior high school was substantially more expensive. For example, fees for day students in public SHSs in 2007 were c33,000 ($30) per term, and boarding fees were c784,700 ($75) per term. The feeding fee was c746,700 ($72), and the approved list of total fees payable on admission (covering admission, school uniform, house attire, and physical education kits) was c442,000 ($42) (Ghanaian Times 2007). A boarding student therefore had to pay up to $190 per term or $570 per year. Annual GDP per capita was $1,100, and the minimum wage was approximately $650 per year.
Thus, both the complexity of the school choice process and the varying characteristics of available options are potential factors that could deter disadvantaged students from applying to better schools. The next section describes the detailed administrative data from Ghana that allow me to empirically examine school choice and educational mobility in this setting.
III. Data
I use CSSPS administrative data on senior high school applicants and supplementary data on school characteristics to analyze application behavior and admission outcomes in Ghana.
A. Student Applications
CSSPS data cover the universe of students who took the BECE in Ghana between 2005 and 2009 and report their background characteristics, application choices, BECE scores, and admission outcomes. Panel A in Online Appendix Table A.1 presents descriptive statistics by year. Data on admission outcomes are incomplete for 2006 (administrative assignments are missing), so I use a panel of data for 2007–2009 as the core of my analysis.
For each student, I observe the junior high school attended and the complete ordered list of choices submitted to the CSSPS—the student’s “application portfolio.”10 In most specifications, I focus on the 50 percent of students who qualify for senior high school admission because I also observe individual exam scores and admission outcomes for these students. I convert BECE scores for each year into a standardized score with a mean of zero and standard deviation of one, so that they are comparable across years.
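As an illustration of this normalization, the following minimal sketch standardizes raw BECE scores within each exam year; the data frame and column names ('year', 'bece_raw') are hypothetical placeholders rather than the actual CSSPS variable names.

```python
import pandas as pd

def standardize_bece_by_year(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of df with year-specific z-scores of the raw BECE score.

    Assumes hypothetical columns 'year' and 'bece_raw'; the standardized score has
    mean zero and standard deviation one within each exam year.
    """
    out = df.copy()
    by_year = out.groupby('year')['bece_raw']
    out['bece_std'] = (out['bece_raw'] - by_year.transform('mean')) / by_year.transform('std')
    return out
```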
To examine the extent of educational mobility, I proxy for students’ socioeconomic status using the type of junior high school they attended, since I do not directly observe any indicators of family background. I calculate junior high school performance on the BECE exam for the period 2005–2009 and split the sample into three groups: students from public JHSs where the average BECE score is above the median average for all public schools in the country, students from public JHSs with performance below the median, and students from private JHSs. Table 1 presents a comparison of background characteristics, application choices, and admission outcomes for students in my 2007–2009 analysis sample. The majority of students attend public junior high schools (82 percent overall and 73 percent among students who qualify for senior high school admission).
Summary Statistics by JHS Performance
There are large differences in BECE performance across schools; 49.8 percent of students overall qualify for SHS admission. The qualification rate is 33.0 percent for students from low‐performing public JHSs, 67.4 percent for those from high‐performing public JHSs, and 75.6 percent for those from private JHSs.
B. Secondary School Characteristics
I supplement the CSSPS student data with information on secondary school characteristics. Ghana Education Service updates a register of schools each year to provide information on each school’s location and to indicate whether a school is public or private, single sex or coeducational, and day or boarding. The register also lists the types of programs offered and the number of vacancies in each program. Additionally, I obtained school‐level distributions of grades in the Secondary School Certificate Examination (SSCE) from the West African Examination Council. The SSCE is taken at the end of senior high school and used for admission to university. It is centrally administered to students at a national level, so exam scores are comparable across schools. As a measure of schools’ academic performance, I standardize the average percentage of students earning a credit on the SSCE math and English exams for the period 2003–2008 to have mean zero and a standard deviation of one.11 I use the CSSPS data on students’ admission outcomes to construct a measure of school selectivity based on the distribution of BECE scores of students admitted each year.
Finally, I use three additional indicators of school quality. The first measure indicates the 34 “colonial” schools that were constructed before Ghana gained independence in 1957. These pre‐independence schools have a historical prestige similar to that of the Ivy League universities in the United States. The second measure indicates the 65 top‐ranked (“Category A”) schools according to a categorization scheme introduced by the government in 2009 to reflect schools’ available facilities. The third measure indicates the 22 “elite” schools that are both colonial and Category A schools. Panel B in Online Appendix Table A.1 summarizes the senior high school data.
C. Evidence of Systematic Differences in Student Outcomes
Figure 1 presents a descriptive illustration of students’ application behavior. I plot the average academic performance of senior high schools in students’ application portfolios against students’ individual BECE scores for each of the three groups of applicants: students from public JHSs where the mean BECE score is below the median average for all public schools in the country, students from public JHSs above the median performance, and students from private JHSs. The upward slope for all three groups in the figure indicates that students with higher BECE scores apply to higher‐performing schools. However, there is a persistent divergence in application behavior based on students’ JHS backgrounds. For a given BECE score, students from private and high‐performing JHSs apply to a higher‐performing set of schools than students from low‐performing JHSs.
This divergence in application behavior implies qualified students from low‐performing schools are not taking full advantage of the opportunity to attend higher‐performing schools. Throughout this study, I interpret BECE scores as a measure of students’ true underlying ability. An alternative interpretation would be to presume that students who received a high BECE score but came from a low‐performing school were simply lucky. Empirical analysis in a related paper (Ajayi 2014) and presented in Online Appendix Table A.2 informs my interpretation of BECE scores. Conditioning on individual BECE scores, I find that students who attended low‐performing JHSs perform better on the SSCE exam than their senior high school peers who attended high‐performing JHSs. Thus, BECE scores likely understate the true ability of students from low‐performing schools, and observed differences in application behavior would be even more extreme if we could compare students with the same academic potential.12
To provide additional evidence on the extent of educational mobility, Table 2 evaluates the main predictors of students’ application choices and admission outcomes. Each column presents coefficients from a linear regression of the following form:
$$Y_{ijd} = \alpha_1 \mathrm{BECE}_i + \alpha_2 \phi_j + \alpha_3 \mathrm{Public}_j + \delta_d + \varepsilon_{ijd}, \qquad (1)$$
where Yijd is some characteristic of the application set or admission outcome of student i from junior high school j in district d, BECEi is student i's standardized BECE score, ϕj is the mean BECE score in student i's junior high school j, and Publicj is an indicator for whether student i attended a public junior high school. I also include district fixed effects, δd, for each of Ghana's 138 administrative districts in the study period to ensure that comparisons focus on students in the same geographical area and to account for any district‐specific factors that may influence school choice. I cluster standard errors at the junior high school level to allow for correlations in unobserved factors within schools. The key parameters of interest, α2 and α3, indicate how JHS background relates to application behavior and admission outcomes.
Differences in Application Choices and Admission Outcomes
This analysis examines how application behavior and admission outcomes vary with individual ability (BECEi) and separately with socioeconomic background (proxied by JHS type). I use two indicators of JHS characteristics to capture two components of family background. Average BECE test scores ϕj indicate the average educational ability of peers, as well as the average productivity of the school. Attending a public over a private JHS (Publicj) proxies for a family's ability and willingness to invest in education (capturing both the value placed on education as well as available resources). Both ϕj and Publicj have independent explanatory power: both significantly predict application and admission outcomes, and the R2 of the equation rises from 0.349 to 0.375 with the inclusion of ϕj and further to 0.390 with the inclusion of Publicj. Although we may expect that higher‐ and lower‐ability students from different backgrounds behave differently, I do not include an interaction between BECEi and ϕj to maintain the focus on the question of whether equally qualified students from different backgrounds make the same application choices.
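The following sketch shows how a specification like Equation 1 could be estimated with off-the-shelf tools, using district fixed effects and standard errors clustered by junior high school. The synthetic data and all column names are hypothetical stand-ins for the CSSPS variables, so the printed coefficients are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data standing in for the CSSPS sample; all column names are hypothetical.
rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    'jhs_id': rng.integers(0, 150, n),
    'district': rng.integers(0, 30, n),
    'bece_std': rng.normal(size=n),
    'jhs_mean_bece': rng.normal(size=n),
    'public_jhs': rng.integers(0, 2, n),
})
df['portfolio_performance'] = (0.2 * df['bece_std'] + 0.1 * df['jhs_mean_bece']
                               - 0.15 * df['public_jhs'] + rng.normal(size=n))

# Equation 1: outcome on individual BECE score, JHS mean score, a public-JHS indicator,
# and district fixed effects, with standard errors clustered by junior high school.
model = smf.ols('portfolio_performance ~ bece_std + jhs_mean_bece + public_jhs + C(district)',
                data=df)
results = model.fit(cov_type='cluster', cov_kwds={'groups': df['jhs_id']})
print(results.params[['bece_std', 'jhs_mean_bece', 'public_jhs']])
```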
Panel A of Table 2 presents models that take as a dependent variable the average academic performance of the portfolio of schools to which student i applies, where academic performance is the standardized percentage of students earning a credit in the SSCE English and math exams. Column 1 indicates that a standard deviation (σ) increase in individual BECE scores implies a 0.231σ increase in the average academic performance of schools to which a student applies, while a standard deviation increase in JHS average performance implies a 0.203σ increase in the performance of selected schools. Column 2 includes controls for whether a student attended a public JHS, and the coefficient on JHS average performance falls to 0.163. The negative coefficient on attending a public JHS (α3 = −0.180) indicates that students from public schools apply to lower‐performing senior high schools holding all else equal. Column 3 adds an indicator for having an elite, colonial, or Category A school in a student’s JHS district. The correlation with portfolio selectivity is positive and the coefficient on JHS average performance falls again to 0.145. Column 4 includes district fixed effects, and the coefficient on JHS average performance decreases to 0.084, which suggests that geographical location is an important determinant of differences in portfolio choices, and that application differences by JHS background exist even within districts.
The final column looks for evidence of school‐specific determinants of application choices by controlling for the average number of students in a given JHS who applied to an elite SHS in the first two years of the CSSPS. The coefficient on previous application behavior is positive and significant. Meanwhile, the coefficient on JHS average performance decreases to 0.048, suggesting a strong persistence in application behavior within schools.
Panel B of Table 2 presents an additional set of results on the academic quality of students’ chosen schools and their admission outcomes. Column 1 repeats my preferred specification from Column 4 of Panel A, using the regression specified in Equation 1 to examine the average performance of schools in a student’s application portfolio. Column 2 analyzes school selectivity as an alternative measure of school quality and finds similar results. Columns 3–5 focus on admission outcomes: the performance, selectivity, and peer quality in a student’s assigned senior high school. The results consistently indicate that while individual ability predicts the quality of schools to which students apply and gain admission, junior high school background is also a significant predictor of students’ choices and admission outcomes.
Higher‐performing students and students from higher‐performing and private schools are more likely to apply to schools located far from their homes. Geographical mobility appears to be a dimension that separates students from different educational backgrounds: 54.7 percent of students from low‐performing public schools are admitted to high school as boarding students, compared to 69.4 percent and 73.0 percent of students from high‐performing public schools and private schools, respectively. Students from low‐performing public schools apply and gain admission to high schools that are approximately 5 and 10 miles closer than those chosen by students from high‐performing public schools and private schools, respectively.
Despite these differences in geographical mobility by educational background, I find persistent gaps in application behavior and admission outcomes when I restrict the sample to the 55 percent of students who only applied to schools within their JHS region or the 5 percent who only applied within their JHS district (results reported in Online Appendix Table C.1). The coefficients on a student’s individual BECE score decrease (indicating that a student’s academic performance becomes a weaker predictor of application choices and admission outcomes). Meanwhile, the coefficients on measures of students’ educational background remain statistically significant, with larger magnitudes in some cases compared to the full sample (indicating that disadvantaged students apply and gain admission to less selective high schools even when compared to more advantaged students choosing from schools in the same local market).
IV. School Choice Model
My primary objective is to evaluate whether differences in demand can explain differences in the application choices of equally qualified students from different educational backgrounds. The central contribution of the model below is to establish the optimal application behavior given the school choice mechanism (allowing me to interpret the rank order of selected schools as a truthful preference ordering) and to outline an approach to estimate the parameters of underlying student preferences (allowing me to characterize the determinants of demand for schools, despite the presence of strategic application behavior). An alternative methodology would be to estimate a fully structural model of students’ application behavior and to then simulate the choices that students would submit under counterfactual scenarios. The large number of alternatives and interdependence between admission chances in a typical merit‐based system prevent simplification of the school choice problem. Thus, a fully structural solution requires a substantial number of additional assumptions and is extremely challenging to implement.
A. Setup
I model the application decision in terms of a portfolio choice problem, following Chade and Smith (2006). Consider a finite set of students I = {1,. . .,K}, each with ability ai, which is unknown to the student, and a finite set of schools S = {1,. . .,M}, each with a known selectivity level qs. Each student receives some utility Uis from attending a school.13 Given the uncertainty about their actual ability, each student forms a belief about their expected exam performance Ti and associates this belief with some subjective probability pis of being admitted to a given school. Thus, each student has some expected value of applying to a school: pisUis.
Students face the task of selecting an application portfolio A that is an ordered subset of schools. Finally, there is an application cost c(|A|) associated with selecting a portfolio of |A| schools. For the CSSPS case, institutional restrictions permit a fixed number of applications n, so c(|A|) = 0 if |A| ≤ n and c(|A|) = ∞ if |A| > n.
In the resulting portfolio choice problem, each student i chooses an application portfolio Ai = {1,. . .,N} to maximize net expected utility, the sum of expected utilities across chosen schools minus the application cost. The optimal portfolio A*i consists of a ranked set of chosen schools and solves:
$$A_i^* = \arg\max_{A_i \subseteq S,\; |A_i| \le n} \; \sum_{c=1}^{N} \hat{p}_{ic}\, U_{ic} - c(|A_i|), \qquad (2)$$
where the numerical subscript indicates the cth‐ranked choice in the application set, and N ≤ n. Further, p̂ic is the conditional probability of being admitted to choice c given that a student is not admitted to any of their more preferred choices.14
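To illustrate the objective in Equation 2, the sketch below evaluates the expected utility of a given ranked portfolio under the belief structure introduced later in Section IV.D: admission cutoffs are treated as known, and the student's uncertainty about their own exam score is represented, purely for illustration, by a normal belief. The cutoffs, utilities, and belief parameters are hypothetical.

```python
from math import inf
from statistics import NormalDist

def portfolio_expected_utility(cutoffs, utilities, belief_mean, belief_sd):
    """Expected utility of a ranked portfolio when cutoffs are known but the own score is not.

    cutoffs:   known selectivity cutoffs q_c, in the order the schools are ranked
    utilities: utilities U_ic for the same schools, in the same order
    The student's exam score is modeled as Normal(belief_mean, belief_sd). The student
    is assigned to choice c if the score clears q_c but falls below the cutoff of every
    higher-ranked choice, which captures the conditional admission probabilities in (2).
    """
    belief = NormalDist(belief_mean, belief_sd)
    expected_utility = 0.0
    best_cutoff_above = inf              # lowest cutoff among higher-ranked choices
    for q, u in zip(cutoffs, utilities):
        if q < best_cutoff_above:
            prob_assigned = belief.cdf(best_cutoff_above) - belief.cdf(q)
            expected_utility += prob_assigned * u
        best_cutoff_above = min(best_cutoff_above, q)
    return expected_utility

# Hypothetical three-school list: an ambitious first choice followed by two safer options.
print(portfolio_expected_utility(cutoffs=[1.2, 0.4, -0.3],
                                 utilities=[1.0, 0.7, 0.5],
                                 belief_mean=0.5, belief_sd=0.6))
```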
B. Optimal Application Strategy
Haeringer and Klijn (2009) theoretically analyze the problem of constrained school choice and conclude that while there is no weakly dominant strategy, an undominated strategy is to select a set of n schools and to rank these schools truthfully (p. 1929, Proposition 4.2). Although students might be strategic in selecting the set of n schools to include in their application list, they do not gain by misrepresenting their relative preferences for these schools when it comes to submitting their rank ordering. This means that any school included in a student’s list of submitted choices should be weakly preferred to any schools ranked lower on that list (that is, Ui1 ≥ Ui2 ≥ . . . ≥ UiN). I therefore take students’ ranking of schools in their submitted list as a truthful preference ordering.15
Another notable implication of the portfolio choice problem is that a student does not necessarily apply to their most preferred school overall—the school that satisfies max(Uis)—because admission chances are uncertain and students are constrained in the number of schools to which they can apply.16 Instead, students pick schools based on their expected utility, pisUis. This implies that although a school in a student’s application portfolio may not be preferred overall, it will be preferred to all other schools that are perceived to be equally selective (that is, all other schools to which a student believes they have equal admission chances). This key insight provides the foundation for my empirical estimation of demand‐related factors responsible for application differences.
C. Empirical Estimation Strategy
We would need to know students’ expected admission chances at each school to compute the optimal application portfolio. Without this information, I focus on estimating the demand parameters that characterize student preferences. Given that students pick schools based on their expected utility, pisUis, any school in the application portfolio will be preferred to all other schools that are perceived to be equally selective. This allows us to estimate students’ revealed preferences for schools selected from alternatives in a restricted choice set.
Taking the highest‐ranked choice, for example, student i’s expected utility satisfies the following statement:
$$p_{i1} U_{i1} \;\ge\; p_{is} U_{is} \quad \text{for every school } s \text{ with } q_s = q_{i1}, \qquad (3)$$
where qi1 denotes the selectivity of student i’s first‐choice school. Consequently, Ui1 ≥ Uis for all schools that the student perceives to be equally selective as their first choice. We can therefore specify a discrete choice estimation framework for the selection of a first choice school based on the fact that student i chooses the most preferred school s out of the set of all schools with equal selectivity as their first choice school, which I will denote by S1i = {s ∈ S : qs = qi1}. Defining the dependent variable of interest as:
$$y_{is} = \begin{cases} 1 & \text{if student } i \text{ lists school } s \text{ as their first choice} \\ 0 & \text{otherwise,} \end{cases}$$
it follows that the probability that student i lists school s as a first choice is:
$$\Pr(y_{is} = 1) = \Pr\!\left(U_{is} \ge U_{ik} \;\text{ for all } k \in S_i^1\right).$$
If we assume that Uis = Xisβ + ϵis, where Xis is a vector of observable school characteristics and the idiosyncratic component ϵis of student utility is independently and identically distributed (i.i.d.) extreme value, then the probability that student i chooses school s as a first choice can be written as:
$$\Pr(y_{is} = 1) = \Pr\!\left(X_{is}\beta + \epsilon_{is} \ge X_{ik}\beta + \epsilon_{ik} \;\text{ for all } k \in S_i^1\right) \qquad (4)$$
$$\phantom{\Pr(y_{is} = 1)} = \frac{\exp(X_{is}\beta)}{\sum_{k \in S_i^1} \exp(X_{ik}\beta)}. \qquad (5)$$
This yields the log‐likelihood function:
$$\ln L(\beta) = \sum_{i=1}^{K} \sum_{s \in S_i^1} y_{is} \ln \Pr(y_{is} = 1), \qquad (6)$$
which we can estimate using maximum likelihood. The independence from irrelevant alternatives property of the multinomial logit model implies that focusing on the subset of alternatives in this restricted choice set (S1i) still provides a valid estimate of the parameters determining student preferences.17
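A minimal sketch of the conditional logit estimator in Equations 4–6, fit by maximum likelihood on simulated toy data. In an actual application the choice sets would be the restricted sets S1i constructed from the CSSPS data and standard errors would be clustered by JHS; both are omitted here, and all names and parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def neg_log_likelihood(beta, X, group_ids, chosen):
    """Negative log-likelihood of the conditional logit in Equations 5-6.

    X:         (rows, k) school characteristics X_is, stacked over every alternative
               in every student's restricted choice set
    group_ids: (rows,) id of the student whose choice set each row belongs to
    chosen:    (rows,) 0/1 indicator y_is marking the first-choice school
    """
    v = X @ beta
    ll = 0.0
    for g in np.unique(group_ids):
        mask = group_ids == g
        log_probs = v[mask] - logsumexp(v[mask])      # log of Equation 5 within one choice set
        ll += log_probs[chosen[mask] == 1].sum()
    return -ll

# Simulated toy data: 200 students, choice sets of 5 schools, 3 characteristics.
rng = np.random.default_rng(1)
n_students, n_alt, k = 200, 5, 3
X = rng.normal(size=(n_students * n_alt, k))
group_ids = np.repeat(np.arange(n_students), n_alt)
true_beta = np.array([1.0, -0.5, 0.3])
utility = X @ true_beta + rng.gumbel(size=n_students * n_alt)
chosen = np.zeros(n_students * n_alt, dtype=int)
for g in range(n_students):
    block = slice(g * n_alt, (g + 1) * n_alt)
    chosen[block][np.argmax(utility[block])] = 1      # first choice = highest utility in the set

fit = minimize(neg_log_likelihood, x0=np.zeros(k), args=(X, group_ids, chosen))
print(fit.x)                                          # estimates should be near true_beta
```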
D. Student Beliefs
A core component of linking the theory to data is to specify students’ formulation of beliefs about their admission chances in the absence of complete information. The model’s setup implies that uncertainty about pis stems from uncertainty about individual ability and assumes that admission cutoffs are known. This setup abstracts from year‐to‐year variation in school selectivity (taking admission cutoffs qs as fixed) and models uncertainty as coming entirely from students’ incomplete information about their exam performance. In essence, this simplification requires that students’ subjective expectations of their admission chances are a rank‐preserving transformation of their actual admission chances, so that pis = g(p*is), where p*is denotes the actual admission chance and the function g(·) is a rank‐preserving transformation which ensures that p*is ≥ p*is′ if and only if pis ≥ pis′.18 Empirically, I assume that students form expectations about their admission chances based on the selectivity of schools in the previous year. School selectivity in Ghana is indeed relatively stable over the period studied, with a correlation in selectivity levels above 0.90 for all years. Figure 2 illustrates the correlation in school selectivity between 2007 and 2008.
Correlation in Secondary School Selectivity over Time
Notes: School selectivity measures the average exam score of students admitted to a secondary school in a given year.
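The selectivity measure and its year-to-year correlation can be computed along the following lines; the input is assumed to be a table of admitted students with hypothetical columns 'school_id', 'year', and 'bece_std'.

```python
import pandas as pd

def selectivity_by_year(admits: pd.DataFrame) -> pd.DataFrame:
    """School selectivity: mean standardized BECE score of admitted students, by school-year."""
    return (admits.groupby(['school_id', 'year'])['bece_std']
                  .mean()
                  .unstack('year'))

def selectivity_correlation(admits: pd.DataFrame, year_a: int, year_b: int) -> float:
    """Pearson correlation of school selectivity between two admission years."""
    wide = selectivity_by_year(admits)
    return wide[year_a].corr(wide[year_b])
```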
E. Student Utility
My empirical approach also relies on standard assumptions about students’ utility functions. I assume that student i’s utility from attending school s depends on a set of observed and unobserved factors, where the observable component is a linear function of school selectivity qs and a vector of student‐specific school characteristics Xis. Thus, demand for school selectivity is additively separable from demand for other school characteristics:
$$U_{is} = \alpha q_s + X_{is}\beta + \epsilon_{is}. \qquad (7)$$
The error term in this utility function denotes students’ valuation of school characteristics that are unobserved by the researcher. The subscript is indicates that school characteristics result from an interaction between school attributes and student characteristics. For example, proximity to a given school varies across students.
To examine heterogeneity in preferences, I allow demand for school characteristics Xis to vary by student performance, BECEi, the mean performance in a student’s junior high school, ϕj, and attendance of a public junior high school, Publicj. I therefore parameterize students’ utility function in the following way:
$$U_{is} = \alpha q_s + X_{is}\beta_1 + (X_{is} \times \mathrm{BECE}_i)\beta_2 + (X_{is} \times \phi_j)\beta_3 + (X_{is} \times \mathrm{Public}_j)\beta_4 + \epsilon_{is}. \qquad (8)$$
My key objective is to evaluate whether preferences for school characteristics significantly vary by JHS background—that is, whether β3 ≠ 0 and β4 ≠ 0. In this analysis, I would like to see how preferences for school characteristics vary with individual ability and separately with educational background (proxied by JHS type), after accounting for differences in expected admission chances. If students from disadvantaged backgrounds have significantly different preferences for school attributes, then this would suggest that demand‐related factors limit educational mobility. Once again, I cluster standard errors at the junior high school level to allow for correlation in the preferences of students within a school.
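Operationally, the heterogeneity in Equation 8 amounts to interacting each school characteristic with the three student-background variables before estimating the conditional logit. A minimal sketch, with hypothetical column names:

```python
import pandas as pd

def add_preference_interactions(alts: pd.DataFrame, school_chars: list) -> pd.DataFrame:
    """Interact each school characteristic X_is with BECE_i, phi_j, and Public_j (Equation 8).

    alts: one row per (student, alternative school), with hypothetical columns for the
    listed school characteristics plus 'bece_std', 'jhs_mean_bece', and 'public_jhs'.
    """
    out = alts.copy()
    for col in school_chars:
        out[f'{col}_x_bece'] = out[col] * out['bece_std']
        out[f'{col}_x_jhs_mean'] = out[col] * out['jhs_mean_bece']
        out[f'{col}_x_public'] = out[col] * out['public_jhs']
    return out

# Example usage with hypothetical characteristics:
# alts = add_preference_interactions(alts, ['distance', 'ssce_performance', 'boarding'])
```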
V. Results
The theoretical model presented in the preceding section suggests two approaches for analyzing differences in students’ application behavior in the face of constrained choice under uncertainty. First, we can focus on heterogeneity in the rank ordering of schools in a student’s submitted list of choices to establish whether students from disadvantaged backgrounds are more likely to rank a high‐performing school below a lower‐performing one. Second, we can estimate the parameters of student preferences by restricting their choice sets to schools of equal selectivity and can then examine heterogeneity by educational background. In the remainder of this section, I therefore adopt these two approaches to explore whether demand‐related factors appear to explain observed differences in application choices. I begin by looking for differences in students’ rank ordering of their selected schools. I then look for differences in students’ preferences for observable school characteristics.
A. Heterogeneity in Ranking of Selected Schools
I examine revealed preferences for selective schools using the fact that it is an undominated strategy for applicants to rank their chosen schools in truthful order of preference. I measure the selectivity of schools using the performance distribution of students admitted to a school in previous years. I then assess the extent to which students rank a selective school lower than a less selective school on their submitted list. I use four measures to characterize students’ application portfolios according to school selectivity (qs): (i) choices are strictly ranked in order of selectivity, (ii) choices are weakly ranked in order of selectivity, (iii) the highest‐ranked choice is the most selective school in a student’s portfolio and the lowest‐ranked choice the least selective school, and (iv) the highest‐ranked choice is weakly more selective than the lowest‐ranked choice. Each row of Table 3 indicates the share of students who satisfy each condition.
Differences in Ordinal Ranking of Selected Schools
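For a single submitted list, the four ordering measures can be computed as in the sketch below, where the input is the selectivity of each chosen school in submitted rank order; the example values are hypothetical.

```python
from typing import Dict, List

def ranking_consistency(selectivity: List[float]) -> Dict[str, bool]:
    """Indicators for how a submitted list orders schools by selectivity (the Table 3 measures).

    selectivity: selectivity q_s of each chosen school, in submitted rank order
    (first choice first). Ties count as weakly ordered.
    """
    pairs = list(zip(selectivity, selectivity[1:]))
    return {
        'strictly_ordered': all(a > b for a, b in pairs),
        'weakly_ordered': all(a >= b for a, b in pairs),
        'first_most_and_last_least': (selectivity[0] == max(selectivity)
                                      and selectivity[-1] == min(selectivity)),
        'first_weakly_above_last': selectivity[0] >= selectivity[-1],
    }

# Example: a list whose second choice is more selective than its first choice.
print(ranking_consistency([0.8, 1.1, 0.2, -0.4]))
```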
Overall, students from low‐performing junior high schools are more likely to rank a more selective secondary school below a less selective school on their list. As Panel A of Table 3 shows, less than 8 percent of students in the full sample ranked their choices strictly in order of selectivity. This figure was 11.4 percent among students who qualified for high school admission. The likelihood that students consistently list selective schools above less selective schools on their list varies with students’ educational background: 6.5 percent of qualified students from low‐performing public schools, 12 percent of qualified students from high‐performing public schools, and 17.4 percent of students from private schools satisfied this criterion. The likelihood that students ranked their most selective choice first and their least selective choice last shows a similar pattern, with respective rates of 23.5 percent, 34.9 percent, and 46.3 percent. Similarly, 82.5 percent of students ranked their selected schools such that their first‐choice school was equally or more selective than their lowest‐ranked school; among students who qualified for high school admission, 88.6 percent did. Notably, only 82.4 percent of qualified students from low‐performing public schools did, compared to 90.5 percent of qualified students from high‐performing public schools and 94.4 percent of students from private junior high schools. These differences are all statistically significant at the 1 percent level.
Altogether, these statistics indicate that students from lower‐performing schools are substantially more likely to rank selective schools below less selective ones they choose. In addition, students from more disadvantaged backgrounds apply to a less diversified portfolio of schools. The final row in Table 3 reports the standard deviation of selectivity of schools in the application portfolio. The standard deviation in selectivity of selected schools is 0.601 for qualified students from low‐performing public schools, 0.696 for those from high‐performing public schools, and 0.792 for private school students.
Panel B of Table 3 presents a similar analysis instead using the SSCE performance of selected schools as the key measure of interest. Again, I find that students from high‐performing and private JHSs are significantly more likely to rank their selected schools in order of SSCE performance, indicating that they have a stronger preference for attending high‐performing schools. Notably, students’ rank‐ordered lists reflect the value they place on school quality as well as their concerns about other school characteristics, including location or distance. These results thus confirm that students from disadvantaged backgrounds are less willing to attend higher‐performing schools, but do not provide additional insights into the underlying reasons why.
B. Heterogeneity in Preferences for School Characteristics
To further examine heterogeneity in preferences, I estimate the discrete choice model outlined in Section IV.C. Table 4 summarizes portfolio characteristics and confirms that the subset of five equally or less selective schools used in the discrete choice estimation represents schools of more similar selectivity than those in students’ actual application portfolios. Table 5 reports coefficient estimates from the resulting multinomial logit regressions. Columns 1–4 include all students who qualified for admission to SHS. Column 1 estimates average preferences and shows that students generally prefer schools with higher SSCE performance and those established before Ghana gained independence (a signal of historical prestige). They also prefer public schools, single‐sex schools, and schools with boarding facilities, as well as schools closer to their JHS districts.
Summary Statistics for Discrete Choice Analysis
Discrete Choice Model Estimates
Column 2 allows for heterogeneity by student BECE score, Column 3 includes heterogeneity by JHS performance, and Column 4 includes heterogeneity by public JHS attendance. The coefficients on these additional interaction terms indicate that higher‐performing students and students from higher‐performing JHSs have a stronger preference for attending high‐performing and single‐sex secondary schools. Moreover, students from high‐performing JHSs have a weaker preference for attending public schools and are more willing to attend boarding schools and schools that are farther away, presumably because they can afford the associated increases in cost. Students from public JHSs have a stronger preference for school proximity, are less inclined to select boarding schools, and have a weaker preference for attending high‐performing, colonial, and single‐sex secondary schools.19
To provide a more conservative estimate of heterogeneity in preferences for senior high school characteristics, in Column 5 I limit my analysis to public school students. Public JHS students are more likely to comply with their admission outcomes instead of opting out into the handful of elite international schools that have independent admissions procedures but charge substantially more than schools following the national curriculum. Despite restricting my analysis to the more homogeneous group of public JHS students, the differences in preferences by JHS background remain statistically significant and reflect the same patterns as within the full sample.20
Overall, these results indicate that students from disadvantaged educational backgrounds are less willing to travel to attend higher‐performing schools. While the coefficient on the interaction between individual exam scores and school distance is of equal magnitude to the coefficient on the interaction with JHS mean scores, individual student scores are four times as important as JHS mean scores in determining heterogeneity in preferences for academic performance. This suggests that students from different educational backgrounds differ more in their preferences for school proximity than they do in their preferences for school quality.
As an additional approach to address the possibility that students omit more‐preferred selective schools from their rank‐ordered lists because they have low expected admission chances, I estimate another set of results using the strategy proposed by Artemov, Che, and He (2021). I begin by restricting the sample to students who are admitted to a clearly feasible choice (where they scored above the fifth percentile of admitted students). I then construct a choice set consisting of five randomly selected schools that are also clearly feasible and estimate the probability that students select their admitted choice out of the set of other clearly feasible choices. Column 6 in Table 5 reports coefficient estimates from the resulting multinomial logit regression, again restricting the sample to students from public schools. These estimates reflect the same patterns as those from my main specifications, with students from low‐performing schools having a stronger preference for school proximity and a weaker preference for school quality.
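A sketch of how such a feasible choice set could be constructed for one student, under my reading of the procedure described above: "clearly feasible" schools are those whose fifth-percentile admitted BECE score lies below the student's own score, and the estimation choice set pairs the assigned school with five randomly drawn clearly feasible alternatives. The data structures and names are hypothetical.

```python
import numpy as np
import pandas as pd

def clearly_feasible_schools(student_score: float, school_p5: pd.Series) -> list:
    """Schools whose 5th-percentile admitted BECE score lies below the student's own score."""
    return list(school_p5.index[school_p5 < student_score])

def build_feasible_choice_set(student_score: float,
                              admitted_school: str,
                              school_p5: pd.Series,
                              n_other: int = 5,
                              seed: int = 0) -> list:
    """Assigned school plus n_other randomly drawn clearly feasible alternatives.

    school_p5 is a hypothetical Series mapping school_id -> 5th percentile BECE score of the
    previous year's admitted students. Students whose assigned school is not itself clearly
    feasible should be dropped before this step.
    """
    rng = np.random.default_rng(seed)
    others = [s for s in clearly_feasible_schools(student_score, school_p5)
              if s != admitted_school]
    draw = rng.choice(others, size=min(n_other, len(others)), replace=False)
    return [admitted_school] + list(draw)

# Hypothetical example with five schools.
school_p5 = pd.Series({'sch_A': 1.4, 'sch_B': 0.6, 'sch_C': 0.1, 'sch_D': -0.5, 'sch_E': -1.0})
print(build_feasible_choice_set(student_score=0.8, admitted_school='sch_C', school_p5=school_p5))
```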
To address the fact that students’ feasible choice sets differ based on their exam achievement, I also separately estimate this discrete choice model by BECE score decile. Table 6 reports these results. To focus on differences between students with similar choice sets, each column presents estimates from a regression that compares students within the same BECE score decile. I find that preferences for school proximity still significantly differ by JHS mean scores across the whole BECE score distribution, indicating that JHS background still predicts preferences for school attributes for students with similar individual BECE scores. Students from disadvantaged backgrounds are more likely to value schools that are close by and less likely to value schools with boarding facilities. Differences in preferences for SSCE performance are significant in the top BECE decile, but there are no significant differences for lower‐performing students, suggesting that the aggregate heterogeneity in preferences for academic performance may largely be driven by higher‐performing students.
Discrete Choice Model Estimates by BECE Score Decile—Artemov, Che, and He (2021) Method
C. Additional Robustness Checks
I conduct several sensitivity tests to provide additional supportive evidence for my assumptions about student beliefs and to show the robustness of my results to alternative econometric specifications. My baseline estimates measure school selectivity using the mean BECE scores of students admitted to a school in the previous year. As a robustness check, I estimate these same models using the fifth percentile BECE score of students admitted to a school in the previous year as an alternative measure of school selectivity. The idea here is that if students are primarily concerned about their admission chances, they might be focused on the lower tail of the exam score distribution for admitted students instead of focusing on the mean. The results estimated using this alternative measure of school selectivity are reported in Online Appendix Tables A.3, A.4, and A.5. The one result that changes is that the estimated difference in preferences for SSCE performance by JHS mean exam score in the discrete choice analysis becomes smaller and switches sign, providing further indication that school proximity likely matters more than preferences for school quality, although students from private schools continue to place more value on school quality than students from public schools. Models using the fifth percentile BECE scores of admitted students as a measure of selectivity generally produce a lower R2, suggesting that using mean BECE scores provides a better fit to the data.
I also examine whether the estimates are robust to including students who do not qualify for secondary school admission, by imputing a constant value for students’ missing BECE scores and running my analysis on the full sample of students with an indicator to control for having a missing BECE score. My main findings remain.
To isolate the potential role of geographical mobility, I estimate separate regressions for the 55 percent of students who only applied to schools within their JHS region and the 5 percent who only applied within their JHS district. Online Appendix Tables C.1, C.2, and C.3 present these results. Students who apply within their local district are almost twice as likely to rank their schools weakly in order of selectivity, while there are no differences for students who apply to their local region. Yet, there is a persistent divergence in application behavior and admission outcomes by JHS background for both students applying to their local districts and regions, suggesting that any improved information students may have about local schools is not sufficient to overcome constraints to educational mobility.
D. Areas with Stable School Rankings
As a final empirical exercise, I present suggestive evidence from areas with exceptionally stable secondary school rankings over time. If student uncertainty about school selectivity is driving the main results, then we should expect to see an increase in the coefficient on individual student performance and a decrease in the coefficients on measures of JHS background when we estimate Equation 1 for students in stable areas. If students are generally well informed about school selectivity, then we should expect the main results to hold even when focusing on areas with especially predictable school rankings. Since most students apply to schools across the country, I separately analyze the behavior of students in stable areas who exclusively applied to schools within their districts.
Cape Coast Municipal district has the highest stability in school selectivity among districts with more than three schools.21 The district was the original capital of the British colonial administration and has a long educational legacy. Its 11 secondary schools include the oldest secondary school in the country and five of Ghana’s 22 elite schools (established before independence and ranked in the top category by the 2009 classification scheme). As Figure 3 illustrates, the correlation in school selectivity was 0.9974 between 2007 and 2008. It remained above 0.99 during the period of study. Eleven percent of Cape Coast JHS students who qualified for secondary school admission exclusively applied to schools within their district, which is more than twice the nationwide rate.
Correlation in Secondary School Selectivity over Time (Cape Coast District)
Notes: School selectivity measures the average exam score of students admitted to a secondary school in a given year.
As Table 7 indicates, the correlation between individual exam scores and student outcomes is higher than elsewhere. Column 1 analyzes the average performance of students’ selected schools. The coefficient on individual BECE scores is 0.230 for students nationwide (Panel A), 0.283 for students in Cape Coast (Panel B), and 0.380 for local applicants in Cape Coast (Panel C). Similarly, the correlation between individual performance and the selectivity of selected schools also increases (Column 2). The remaining three columns indicate an associated increase in the correlation between individual performance and students’ admission outcomes. Column 3 analyzes the academic performance of a student’s assigned school. The coefficient on individual BECE scores is 0.614 for students nationwide, 0.654 for Cape Coast students, and 0.784 for Cape Coast local applicants. Thus, individual academic performance becomes a stronger predictor of student outcomes.
Differences in Application Choices and Admission Outcomes (Cape Coast District)
Nevertheless, the importance of educational background also increases. The coefficients on JHS mean performance for Cape Coast local applicants are 0.246 and 0.267 in Columns 1 and 3 (compared to 0.084 and 0.141 for students nationwide). This indicates that even when students are able to predict school rankings, students from higher‐performing junior high schools are more likely to apply and gain admission to high‐performing secondary schools. Breaking down the estimates by JHS type indicates that these results are largely driven by high‐performing students and by differences within the set of high‐performing public and private schools (see Figure 4 and results reported in Online Appendix Tables D.5 and D.6). While the role of JHS background becomes insignificant for students in the bottom half of the JHS performance distribution, students from the highest‐performing JHSs are substantially more likely to apply and gain admission to more selective schools when compared to equally qualified students from lower‐performing schools in the top half of JHSs.
Exam Performance and Application Choices (Cape Coast District)
Notes: Figure illustrates differences in academic performance of selected secondary schools for students with the same ninth grade basic education certificate exam (BECE) scores but from low‐performing public, high‐performing public, and private junior high schools (JHSs). Secondary school performance is a normalized measure of the average percentage of students earning a credit in the math and English 12th grade secondary school certification exam (SSCE) for the period 2003–2008. Low‐performing public JHSs are those where the average BECE score of students is below the median for public schools in the country; high‐performing public JHSs are those with above‐median performance.
I next turn to analyze the rank ordering of students’ selected schools. Students in Cape Coast district are substantially more likely to rank their schools in order of selectivity, and this is particularly true for students who exclusively apply to schools in that district. Table 8 reports characteristics of students’ application portfolios for all students in the country (Panel A), for Cape Coast district students (Panel B), and for Cape Coast district students who restrict themselves to applying to schools in their district (Panel C). Ten percent of all students ranked their schools weakly in order of selectivity, 19 percent of Cape Coast students did, and 58 percent of Cape Coast local applicants did. Among local applicants, students from low‐performing public JHSs are more likely to rank their schools in order of school quality than students from high‐performing public JHSs.
Differences in Ordinal Ranking of Selected Schools (Cape Coast District)
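The share of students who rank their schools weakly in order of selectivity amounts to a simple monotonicity check on each application portfolio. A minimal sketch, assuming each portfolio is represented as a list of selectivity values ordered from first to last choice (an illustrative data structure, not the paper's code):

```python
def ranked_weakly_by_selectivity(portfolio):
    """Return True if the listed choices are weakly decreasing in selectivity,
    that is, no school is ranked above a strictly more selective school."""
    return all(first >= second for first, second in zip(portfolio, portfolio[1:]))

# Example: selectivity of four listed choices, first choice first.
print(ranked_weakly_by_selectivity([0.82, 0.82, 0.61, 0.40]))  # True
print(ranked_weakly_by_selectivity([0.55, 0.82, 0.61, 0.40]))  # False
```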
I analyze application behavior and student outcomes in other areas with stable rankings over time to assess the extent to which these results generalize. Online Appendix Section D presents results for three additional districts—Sunyani, which has a correlation of 0.9960 in school selectivity and no elite schools; Kumasi Metro, which has a correlation of 0.9920 and three elite schools; and East Akim, which has a correlation of 0.9903 and no elite schools (illustrated in Online Appendix Figure D.1).
In contrast to the Cape Coast case, JHS background becomes a less significant predictor of application behavior and admission outcomes in Sunyani, Kumasi, and East Akim, while individual student performance becomes a stronger predictor of admission outcomes (Online Appendix Table D.1). Strikingly, in a regression analyzing the performance of a student’s assigned school for local applicants in Kumasi, the coefficient on individual performance is 0.987 and the coefficient on JHS performance is −0.024. In Sunyani, where students do not have local access to an elite school, the coefficient on individual student performance is 0.654, while the coefficient on JHS performance is −0.099 (statistically insignificant). The respective coefficients in East Akim are 0.212 and −0.073 (statistically insignificant). In the two districts without elite schools, the coefficient on individual exam scores remains substantially lower than one, despite the reduced significance of JHS background. This result suggests the importance of geographical mobility as a determinant of educational mobility in a nationwide merit‐based school choice system—in many cases, high‐performing students would need to move outside their district to attend high‐performing schools. Across the board, local applicants in stable areas are substantially more likely to rank their schools in order of selectivity (Online Appendix Table D.3).
Looking at all students in districts with stable rankings, individual exam scores have a stronger correlation with students’ application behavior and admission outcomes, but JHS background remains a significant factor: students from higher‐performing JHSs apply and gain admission to more selective schools than equally qualified students from lower‐performing JHSs (Panel B in Online Appendix Table D.2).
Altogether, these results suggest that application differences may decrease in some cases when students are better able to predict school rankings, whereas in other settings (such as Cape Coast), students appear to make informed choices to apply to lower‐quality schools than they merit. A caveat is that these results are tenuous because they focus on the small minority (2–11 percent) of students who exclusively apply to schools within their JHS districts. Additionally, districts with stable rankings tend to be in urban areas with well‐established schools. The decision to apply only to local schools could also reflect unobserved factors: students who apply locally in stable areas might be those who particularly value school quality, so the observed differences in application behavior and admission outcomes may reflect selection bias rather than the causal effect of living in an area with predictable school rankings.22 More broadly, school ranking stability is likely to be endogenous. The stronger sorting by individual exam scores in these areas, with higher‐achieving students consistently more likely to attend high‐performing secondary schools, could indicate that students there place more value on schools’ academic performance, which in turn generates both stronger educational mobility and more stable rankings.
VI. Conclusions
A key objective of school choice systems is to offer disadvantaged students access to better schools, yet eligible students often forgo available opportunities. Understanding this puzzle is crucial to the success of efforts to reduce inequality.
This work empirically examines why equally qualified students make different application choices based on their educational backgrounds. The first contribution has been to outline a theoretical model and to formulate two methods for estimating preferences in the presence of strategic application behavior. The second contribution has been to apply the theory to rich administrative data on secondary school applications in Ghana, generating new insights on the barriers to educational mobility in a large merit‐based school choice system. I find that students from disadvantaged educational backgrounds are significantly more likely to rank low‐performing schools above higher‐performing ones on their application lists. I also find considerable heterogeneity in demand for school attributes: disadvantaged students are more sensitive to school distance, and suggestive evidence indicates that they place less value on academic performance. These findings suggest firm limits on the extent to which merit‐based school choice systems can increase educational mobility without additional support for disadvantaged students.
My results have broader implications for other contexts in which policymakers seek to provide disadvantaged students the opportunity to attend better schools. Research on educational mobility in the United States and other high‐income settings increasingly emphasizes information and decision‐making ability as a dominant part of the policy solution. I provide contrasting evidence from a developing economy and demonstrate that household demand and geographical access to high‐performing schools are persistent impediments. Thus, while realizing the potential for school choice mechanisms to reduce inequality will require ensuring that students have the ability to make informed decisions, complementary interventions, such as offering merit‐based financial assistance for students from disadvantaged backgrounds or improving the quality of secondary schools in underserved areas, may also be necessary to address the fundamental challenges of affordability and geographical access.
Acknowledgments
Part of this work was completed while Ajayi was at Boston University, the University of California at Berkeley, and the World Bank. The author is grateful to SISCO Ghana, Ghana Education Service, the Computerised School Selection and Placement System Secretariat, Ghana Ministry of Education, and the West Africa Examinations Council for providing data and background information. She thanks David Card, Caroline Hoxby, Patrick Kline, Kevin Lang, Ronald Lee, David Levine, Justin McCrary, Edward Miguel, Claudia Olivetti, Parag Pathak, Sarath Sanga, Modibo Sidibe, and conference and seminar participants for helpful comments. This research was supported by funding from the Spencer Foundation, the Institute for Business and Economic Research, the Center for Effective Global Action, and the Center for African Studies at UC Berkeley. The findings, interpretations, and conclusions expressed in this paper are entirely those of the author and do not necessarily represent the views of the World Bank, its Executive Directors, or the governments of the countries they represent. This paper uses confidential data from Ghana Education Service. The data can be obtained by filing a request with the Director General, Ghana Education Service, P.O. Box M45, Ministries, Accra, Ghana. Tel: +233 302 674247.
Footnotes
↵1. In related work, Ajayi, Lucas, and Friedman (2020) find that providing information to students in Ghana did not increase their likelihood of attending higher‐performing secondary schools.
↵2. Research on the academic effects of attending a selective school has produced mixed findings, but most studies do not identify the effects of merit‐based educational mobility. Researchers studying merit‐based admission systems typically use regression discontinuity designs that focus on students marginally admitted to selective schools and therefore cannot directly speak to the effects of high‐achieving students gaining admission to schools with similarly high‐achieving peers (for example, Clark 2010; Jackson 2010; Pop‐Eleches and Urquiola 2013; Abdulkadiroğlu, Angrist, and Pathak 2014; Dobbie and Fryer 2014; Lucas and Mbiti 2014; Dustan, de Janvry, and Sadoulet 2017). Researchers studying lottery‐based admission systems typically estimate effects of random assignment to a preferred school that need not be high‐performing and therefore identify average effects for randomly selected students, rather than for high‐achieving students moving to schools with high‐achieving peers (for example, Cullen, Jacob, and Levitt 2006; Hastings, Kane, and Staiger 2008). Notably, Hastings, Kane, and Staiger (2008) find positive effects of lottery‐based admission exclusively for the subgroup of students who applied to high‐performing schools.
↵3. Less than half of parents have attended senior high school, and parents with higher levels of education are more likely to assist their children with selecting schools (Ajayi and Telli 2013). Parental involvement may therefore be one channel through which the gaps between privileged and underprivileged children persist.
↵4. Available programs include: General Arts, General Science, Agriculture, Business, Home Economics, Visual Arts, and Technical Studies.
↵5. The requirements for admission to SHS are that students receive a passing grade in the four core subjects (mathematics, English, integrated science, and social studies), as well as in any two additional subjects. All students who qualify are guaranteed admission to a school.
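As a concrete illustration of this eligibility rule, a minimal sketch with hypothetical subject names and a generic is_pass predicate standing in for the official grade thresholds (neither is drawn verbatim from the official rules):

```python
CORE_SUBJECTS = {"mathematics", "english", "integrated science", "social studies"}

def qualifies_for_shs(grades, is_pass):
    """Check the requirement described above: a passing grade in the four core
    subjects and in at least two additional subjects.

    grades:  dict mapping subject name -> grade
    is_pass: function mapping a grade -> True if it counts as a pass
    """
    core_passed = all(
        subject in grades and is_pass(grades[subject]) for subject in CORE_SUBJECTS
    )
    electives_passed = sum(
        is_pass(grade) for subject, grade in grades.items() if subject not in CORE_SUBJECTS
    )
    return core_passed and electives_passed >= 2
```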
↵6. Note that the deferred acceptance feature of this assignment mechanism means that a student who lists a program as their second choice could displace a student who has a lower score but listed that same option as their first choice and was tentatively assigned to that program in an earlier round.
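To make this displacement logic concrete, the following is a minimal sketch of score‐based deferred acceptance under a common merit priority; it is an illustration under simplifying assumptions (a single seat type per program and no tie‐breaking rule), not the CSSPS implementation:

```python
def deferred_acceptance(students, capacities):
    """Illustrative score-based deferred acceptance (not the CSSPS code).

    students:   dict mapping student id -> (exam score, list of ranked program ids)
    capacities: dict mapping program id -> number of seats
    """
    next_choice = {s: 0 for s in students}      # index of the next program to try
    tentative = {p: [] for p in capacities}     # program -> list of (score, student)
    unassigned = set(students)

    while unassigned:
        student = unassigned.pop()
        score, choices = students[student]
        if next_choice[student] >= len(choices):
            continue                            # list exhausted; student stays unplaced
        program = choices[next_choice[student]]
        next_choice[student] += 1
        tentative[program].append((score, student))
        tentative[program].sort(reverse=True)   # highest scores first
        if len(tentative[program]) > capacities[program]:
            _, displaced = tentative[program].pop()  # lowest-scoring proposer loses the seat
            unassigned.add(displaced)           # displaced student tries the next choice

    return {student: program
            for program, admits in tentative.items()
            for _, student in admits}
```

In this sketch, a higher‐scoring student who proposes to a program in a later round displaces the lowest‐scoring tentatively admitted student, matching the displacement described in the footnote above.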
↵7. Logistical concerns largely dictate the timing of the application process. Because students have dispersed on vacation by the time they receive their BECE scores in August, it is easier to register their senior high school application choices earlier in the year, while they are still enrolled in school.
↵8. Kenya (Lucas and Mbiti 2014), Trinidad and Tobago (Jackson 2010), the UK (Coldron et al. 2008), Ontario (http://www.ontariocolleges.ca/apply), Chile (Hastings, Neilson, and Zimmerman 2013; Busso et al. 2016), and Hungary and Spain (Calsamiglia, Haeringer, and Klijn 2010).
↵9. Additionally, these contexts have strong parallels to the context of college choice in the United States. Students apply to a limited number of schools because of application costs (both in terms of time and money), and students are relatively uncertain about their admission chances because admission is partly based on measures of students’ academic performance which may be known at the time of applying (for example, SAT/ACT scores and high school GPA), but also on subjective assessments of other background information (such as personal statements, extracurricular activities, and recommendation letters).
↵10. These data are especially informative because more than 95 percent of students submit a complete list of senior high school choices. This is an extremely high participation rate compared to most school choice programs that have been studied in existing literature. For example, in the United States, less than 50 percent of students in Boston Public Schools listed the full number of available choices (Abdulkadiroğlu et al. 2006), and 40–60 percent did in Charlotte‐Mecklenburg Public Schools (Hastings, Kane, and Staiger 2008). This comprehensive coverage of applications allows me to compare students from a wide range of backgrounds.
↵11. I aggregate the SSCE data over multiple subjects and years to create a measure of persistent rather than transitory academic performance. This measure is missing for a small number of new schools that had not yet presented any SSCE candidates.
↵12. Consider three students from three different backgrounds: Student 1 attends a rural public school with few resources and has illiterate parents, Student 2 attends an urban public school and comes from a wealthy family, and Student 3 attends an expensive private school with state‐of‐the‐art facilities. All three students get the same BECE score and go on to senior high school. Once in an environment with more similar resources, Student 1 outperforms the other two. Presumably, Student 1 has higher intrinsic academic ability.
↵13. We can think of this utility as a comprehensive measure of the positive and negative factors associated with attending a school (including the costs of tuition payments and distance traveled, as well as the benefits of available facilities, peer quality, and the net present value of expected future income).
↵14. This basic notion of conditional probabilities captures the more general observation that merit‐based assignment implies that a student’s admission chances are correlated, so rejection from choice c reduces the expected admission chances at all other chosen schools. For example, a student who receives a negative shock and performs poorly on the BECE will have a lower chance of gaining admission to all schools than if they had performed as well as expected. As such, their realized admission outcome for a given school provides additional information on their admission chances for their lower ranked choices.
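In illustrative notation (not taken verbatim from the paper's model), the point can be written as a comparison of conditional and unconditional admission probabilities:

```latex
% Rejection at choice c is informative about the student's realized exam score,
% so admission chances at any other listed school c' are revised downward:
\[
  \Pr\bigl(\text{admit at } c' \mid \text{reject at } c\bigr)
  \;\le\;
  \Pr\bigl(\text{admit at } c'\bigr),
\]
% because rejection at c shifts beliefs about the realized BECE score toward
% lower values, and admission chances at every school increase in that score.
```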
↵15. In Ghana, the CSSPS explicitly encourages students to order their set of selected schools truthfully. The official handbook issued each year provides limited advice to students about their selection of schools, and the guidelines specifically instruct students to be truthful about their ordering of choices, urging that “choices must be listed in order of preference” (MOES 2005, p. 5).
↵16. See Online Appendix A.2 for an analysis of the equilibrium solutions in the cases of perfect information and unconstrained choice.
↵17. The theoretical foundations of this discrete choice estimation approach are reviewed in Train (2003). Several recent empirical studies use application data to analyze revealed preferences in school choice settings, including Agarwal and Somaini (2018); Avery et al. (2013); Calsamiglia, Fu, and Guell (2020); Fack, Grenet, and He (2019); Griffith and Rask (2007); and Hastings, Kane, and Staiger (2008). This paper contributes an alternative estimation strategy to this growing literature.
↵18. The key requirement of this assumption is that students should be able to accurately gauge which set of schools is as selective as their preferred choice. Although this assumption is somewhat restrictive, it allows students to have different beliefs about their absolute admission chances and only imposes that they have correct beliefs about their relative chances of gaining admission to various schools (that is, certain students may be more or less confident than others, but they must be uniformly biased about their chances of gaining admission to all schools).
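One illustrative way to formalize this restriction (the notation is not drawn from the paper) is a uniform multiplicative bias in beliefs:

```latex
% If student i's subjective admission chance at school s scales the true chance
% by a student-specific factor,
\[
  \hat{p}_{is} \;=\; \lambda_i \, p_{is}, \qquad \lambda_i > 0,
\]
% then absolute beliefs may be too optimistic or pessimistic (\lambda_i \neq 1),
% but relative beliefs are correct for every pair of schools s and s':
\[
  \frac{\hat{p}_{is}}{\hat{p}_{is'}} \;=\; \frac{p_{is}}{p_{is'}} .
\]
```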
↵19. An alternative interpretation of distance to schools could be that it reflects a lack of information, if students are more likely to be aware of schooling options within close proximity. While distance could be an especially important barrier to information acquisition for disadvantaged students, this alternative interpretation does not explain the heterogeneity in preferences for public and boarding schools.
↵20. Nonetheless, I cannot entirely rule out the possibility that students from high‐performing public JHSs apply more aggressively because they have the outside option of attending a private school; see Shorrer (2019) for a theoretical discussion of this possibility.
↵21. There are 20 districts with two or three schools and a correlation in school selectivity above 0.99 for all years. Very few students exclusively apply to schools within these districts, so the samples are not large enough to analyze differences by JHS background.
↵22. A suggestive indication of this is the fact that, in general, lower‐performing students are more likely to apply exclusively locally (7 percent of nonqualifying students do, and 5 percent of qualifying students do). In Cape Coast and Kumasi (both areas with stable rankings and a high number of elite schools), higher‐performing students are more likely to apply exclusively locally—local applicants make up 7 and 1 percent of students who do not qualify for admission, respectively, and 11 and 5 percent of students who do qualify. In Sunyani and East Akim, the pattern reverses again (3 and 9 percent for students who fail, and 3 and 4 percent for students who pass).
- Received April 1, 2017.
- Accepted January 1, 2022.
This open access article is distributed under the terms of the CC BY-NC-ND 3.0 IGO DEED license (https://creativecommons.org/licenses/by-nc-nd/3.0/igo/) and is freely available online at: https://jhr.uwpress.org.