Treating Schools to a New Administration: Evidence of the Impact of Better Practices in the System-Level Administration of Schools

Abstract

A large body of literature focusses on how efficient practices in national and sub-national administrations can improve the welfare of citizens. Yet it is difficult to demonstrate this effect empirically, partly because citizens are not randomly assigned to different administrations and standard impact evaluation techniques are thus not viable. Within education, randomised trials that attribute educational improvement to specific interventions have been influential. Yet critics have argued that these studies, by focussing on single interventions, fail to prove their validity in a context of multiple interventions and, above all, weak system governance. The current paper takes advantage of provincial boundary changes occurring in South Africa, where provinces manage schools, to measure the effects of better administration, using a quasi-experimental approach. Changing to a more effective province was found to improve the mathematics performance of secondary school students, in national examinations, by a magnitude that is half of that seen in the fastest improving countries in international testing programmes. The replacement of teachers or school principals does not explain the improvement, though the addition of administrative support staff does emerge as a likely contributing factor. Greater efficiency, and not additional funding, appears to account for most of the change. It seems noteworthy that one of the provinces found to be relatively effective makes use of fixed-term contracts for senior managers in the administration, as opposed to permanent tenure, to enhance organisational capacity.

1. Introduction

Identifying the causes behind improvements in schooling is inherently difficult, even when good data are available, and often such data are not available. The primary contribution of this paper is to provide causal estimates of the effect of better public administration by exploiting a natural experiment in which schools were ‘treated’ to a new administration. We find a substantial positive impact of a better provincial government administration, with the performance gap between treated schools and the counterfactual growing over time.

A further contribution made by this paper is to unpack, to a limited extent, what it is about better administrations that makes a difference to school performance. Neither changes in overall funding, nor changes in the educator personnel at affected schools, played a significant role, though better access to schools-based administrative support staff appears to have made a difference. Mostly, improvements seem to have been achieved through better accountability mechanisms facing officials, better curriculum support to schools, and a prioritisation of specific learning support materials.

Analysts are often warned against the use of traditional examinations data to gauge systemic improvements in education (UNESCO, 2014: 6). This paper confirms that while such data are difficult to use, they can yield robust findings if one exercises sufficient caution. This is important given how scarce the ideal, standardised test data, can be in developing countries. Finally, this paper illustrates the potential use of quasi-experimental methods using spatial dynamics. The specific context which this paper takes advantage of is the limited redrawing of South Africa’s provincial boundaries in 2005.
For 710 schools across the country, this meant a new provincial education administration. The boundary changes did not occur for any reason linked to education, yet they had profound implications for many of the schools affected since provincial governments play a central role in the administration of schools. Random assignment of students or schools to governance systems is extremely rare. This partly explains why there is so little hard evidence on the impact of governance on education, although one suspects governance does play a large role. Two research questions stand out. Does the governance system under which one falls make a difference to educational outcomes? If so, what elements in the system make the difference?

The only study we know of which, like ours, makes the assumption that the redrawing of political boundaries constitutes a natural experiment with sufficient elements of randomisation, and then examines impacts on education, is that of Cogneau and Moradi (2014), who focus on West Africa in the early 20th century. They conclude that who governs did make a difference to literacy levels, and attribute the difference to more generous and cost-effective school funding provided by British colonial authorities, relative to French ones. Like us, they make use of difference-in-differences models to answer the first research question, but also draw from a wider range of sources to address the policy issues (the second question).

The value of answering the first question, on whether governance matters, and by how much, should not be under-estimated. Such answers can in themselves help to shift the emphasis in the policy debates towards what occurs in the administration. This is a shift advocated by, for instance, Pritchett et al. (2012), who argue that in developing countries too much ‘mimicry’ of administrative functionality, sometimes facilitated by donor countries, has diverted attention away from administrative reform.

Apart from Cogneau and Moradi, the only other study we found which used a ‘natural experiment’ to estimate the effects of different packages of education policies is that of Hahn et al. (2015). This study exploits an unusual system in Seoul, South Korea, whereby students are assigned randomly to different sub-systems of secondary schools within the city, the one featuring more school autonomy than the other. Which sub-system one is assigned to is found to impact on results in standardised tests. Differences are attributed to ‘autonomy in personnel decisions, together with strong incentives for principals and teachers to perform’ (Hahn et al., 2015: 30). These policy conclusions draw in part from the interaction between, on the one hand, time-variant and school-level variables relating to the employment of teachers and, on the other hand, a dummy representing the sub-system, within a difference-in-differences regression. Our methods moreover resemble those of Häkkinen et al. (2003), who used a panel of examinations and other administrative data to conclude that spending cuts during a 1990s recession in Finland did not influence educational outcomes.

Section 2 discusses the various datasets we use, and introduces the institutional elements of South Africa’s schooling system, and the 2005 boundary changes. Section 3 presents the econometric analysis. Section 4 discusses the findings in the context of additional information. Section 5 concludes. The current paper builds on an earlier working paper by the authors (Gustafsson and Taylor, 2016).
2. The data and institutional background

2.1 The examinations data

South Africa’s grade 12 examinations, written in around 5,600 schools, have offered for many decades the only reliable measure of school performance. Consequently, much attention has focussed on grade 12 indicators, in particular ‘pass rates’, the percentage of examination candidates successfully obtaining a national certificate or surpassing minimum thresholds in individual subjects. Around 50% of youths have obtained the grade 12 certificate in recent years. Roughly a further 20% of youths participate in grade 12 but do not obtain the certificate, while the bulk of the remaining 30% of youths do not reach grade 12 or any equivalent level of education outside a school. Both public and some private schools participate in the national public examinations. The examinations approach changed between 2007 and 2008. Subjects were redesigned, and the distinction between standard grade and higher grade examination papers across all subjects was removed.

This paper focuses on improvements in mathematics, a subject taken by around half of students across virtually all schools. Apart from being the subject taken by most schools, it is also the subject most likely to produce consistent school rankings from one year to the next, when using school means. The mathematics examination papers are national, and security procedures established over several decades ensure that the ‘leaking’ of papers before all students sit for the examination is extremely rare. After examinations are written, scripts are sent to provincial marking centres, which are monitored by the national authorities.

Poor student performance, in particular in mathematics, is widely acknowledged as a hurdle to skills development and economic growth in South Africa. Typically, a mark of at least 60 or 70 out of 100 is a requirement for entry into under-graduate university programmes requiring strong mathematical competencies, for instance engineering. In 2013, the mathematics student at the 95th percentile obtained a mark of 62, while the mean was as low as 35. The basic pass mark is 30.

We had at our disposal student records of the grade 12 examination results, for the years 2005–2013. These data were obtained from the national Department of Basic Education (DBE), where both of the authors are employed. Though public access for researchers to school-level examinations data, through the University of Cape Town’s DataFirst portal, is now possible, student-level data are not widely available, in part due to privacy concerns and weak capacity in the administration to anonymise data. In the case of the 2005 to 2007 data, we used overlaps between participation in mathematics and in physical science higher grade, in order to put all mathematics students, whether higher or standard grade students, on a common scale.

The standard of the examinations, it could be argued, is set too high, as seen in the low mean. Pritchett and Beatty (2012) have argued that overly demanding examinations are a common feature in developing countries, and go on to say that ‘Paradoxically, learning could go faster if curricula and teachers were to slow down’. The fact that the examinations are demanding is likely to create floor effects, or weaknesses in the ability of the examinations to differentiate between relatively weak students.
For this reason, and because high-level achievement is important for development, much of our analysis focusses on data collapsed to the school level, and specifically each school’s 95th percentile mark. Marks in traditional examinations systems are never fully equivalent across years, despite mark adjustments by authorities (including the South African ones) to enhance across-year comparability (the marks in our data were post-adjustment marks). This is a problem inherent to examinations which lack secure anchor items (common questions kept secret). We control for this comparability problem by using year dummies in our analysis. Moreover, our analysis rests strongly on the assumption that provinces apply similar marking standards within each year, and that national controls in this regard work well. Not reported in this paper is an analysis of relatively well-performing schools with stable demographics, similar to what appears in Gustafsson (2016). The ranking of these schools did not shift over time according to province, a finding which seemed to confirm that provincial marking standards were sufficiently constant.

2.2 Province-switching

Seven of the nine South African provinces saw their boundaries change in 2005 (see list in Table 1 note). A total of 710 schools, 158 of which offer grade 12, experienced a change in provincial administration. The distribution of the 158 schools is shown in Figure 1. The five largest province-to-province ‘migrations’ (the five rows of Table 1) account for 151 of the 158 schools, and it is these 151 we consider province-switching schools in all our models in Section 3.

Table 1: Count of Schools

Movement | Sending province (schools before change) | Switching group | Receiving province (schools before change)
EC > KN | 736 | 15 | 1254
LP > MP | 1052 | 83 | 321
MP > LP | 321 | 13 | 1052
NW > GP | 278 | 29 | 445
NW > NC | 278 | 11 | 82
Total | | 151 |

Note: Numbers reflect schools considered in models A to D of Table 3. The seven provinces involved in the boundary changes are: Eastern Cape (EC), Gauteng (GP), KwaZulu-Natal (KN), Limpopo (LP), Mpumalanga (MP), Northern Cape (NC) and North West (NW).

Figure 1: Schools experiencing a province change after 2005. Note: Provincial boundaries are those created by the 2005 boundary changes. Schools which moved from a neighbouring province are marked by large points. Schools which moved from Limpopo to Mpumalanga are represented by large grey points, to make them distinct from schools which moved in the opposite direction, from Mpumalanga to Limpopo. Small points represent non-switching schools participating in the grade 12 examinations.
All but 2 of the 151 schools had grade 12 groups which were between 90% and 100% black African (the two exceptions were in the MP > LP switching group). The schools are thus interesting in terms of understanding educational improvements for the most historically disadvantaged segments of the South African population.

The boundary changes occurred to ensure that no municipality straddled two provinces. Municipalities do not play a role in the administration of schools; this is the responsibility of the provincial government. Reporting directly to the provincial authorities are a number of ‘education districts’, whose boundaries often coincide with those of the municipalities, though institutionally the two are independent of each other. Provincial education departments are funded by the provincial department of finance.

Though legislation changing the provincial boundaries was passed in December 2005, shortly after the 2005 examinations had been written, the transfer of schools to their new provincial administrations was not immediate. The official transfer for all affected provinces appears to have occurred in April 2007, a few months into the 2007 academic year, which starts in January. The 2008 academic year would therefore have been the first in which a new province was in full administrative control of the transferred schools. The fact that from December 2005 schools knew that the transfer was imminent could have influenced behaviour in the schools from as early as the 2006 school year. For instance, school principals may have felt invigorated or dejected by the fact that they were being transferred to what was perceived as a better or worse province.

2.3 TIMSS data and measures of provincial effectiveness

TIMSS secondary school data for 2002, 2011 and 2015 were used to gain a sense of whether a move from one province to another could be considered an improvement or a deterioration for a school. This was obviously important for understanding the nature of the ‘treatment’. Even roughly objective measures of the effectiveness of any administration, national or sub-national, are hard to come by. An advantage with the TIMSS data is that they allow one to control for the socio-economic status of students. Values for δ in equation (1) were derived through a series of simple education production functions. Four regressions, one for each of the four pairs of provinces involved in the inter-provincial movements, were run. Data from the three years were pooled.

$$e_i = \hat{\beta}_0 + \hat{\beta}_1 s_i + \hat{\beta}_2 s_i^2 + \hat{\delta} D + \hat{u}_i \qquad (1)$$

Here e is the TIMSS mathematics score of student i converted to a z-score, and s is the socio-economic status of student i calculated using a principal components analysis that drew from around 10 household items declared in the student’s background questionnaire (the number varied slightly by year). s was also converted to a z-score, and D is a dummy indicating that the student attends a school in the receiving province. Table 2 below reports the values of δ and their standard errors, where the p value was no more than 0.1.
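As an illustration, the production function behind Table 2 could be estimated roughly as follows. This is a minimal sketch rather than the authors' code: the column names (math_z, ses_z, in_receiving, school_id, weight) are hypothetical, and statsmodels is simply one convenient tool for a weighted regression with school-clustered standard errors.

```python
import statsmodels.formula.api as smf

def provincial_quality_gap(df):
    """Estimate delta from equation (1) for one sending/receiving pair of provinces.

    df is assumed to hold pooled TIMSS grade 9 records for the two provinces, with
    hypothetical columns:
      math_z       - mathematics score expressed as a z-score
      ses_z        - SES index (principal components of household items), as a z-score
      in_receiving - 1 if the student attends a school in the receiving province
      school_id    - school identifier, used to cluster standard errors
      weight       - student weight rescaled so that each survey year counts equally
    """
    model = smf.wls(
        "math_z ~ ses_z + I(ses_z ** 2) + in_receiving",
        data=df,
        weights=df["weight"],
    )
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
    # delta is the coefficient on the receiving-province dummy.
    return result.params["in_receiving"], result.bse["in_receiving"]
```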
The regressions used weights for students adjusted in a manner that gave an equal weight to each of the three years. Clustering by school was taken into account throughout. The pooled data thus represent the period 2002 to 2015, which encompasses the 2005 to 2013 period from which the examinations data come.

Table 2: Conditional Inter-Provincial Quality Differences (δ) Using TIMSS

Movement | Year-specific estimates (2002, 2011, 2015) | Pooled
EC > KN | 0.203 (0.114) | 0.159 (0.088)
LP > MP | 0.342 (0.118); 0.178 (0.101) | 0.190 (0.095)
NW > GP | 0.258 (0.108); 0.491 (0.125) | 0.292 (0.091)
NW > NC | 0.407 (0.164) | 0.247 (0.115)

Source: Own calculations using (for 2011 and 2015) data available at https://timssandpirls.bc.edu. For 2002, grade 9 data were obtained from the Human Sciences Research Council. Thus all 3 years use grade 9 data. Standard errors appear in parentheses; estimates are shown only where the p value was no more than 0.1.

All values of δ are positive, indicating that for these four movements, the move was to a better province. Only the MP > LP movement emerges as a move to a worse province; here the value for δ would be the negative of δ for LP > MP. It is not entirely accidental that moves to better provinces feature so prominently. The town of Khutsong was supposed to move from Gauteng to North West, but protests against this, which was seen as a move to a province with worse service delivery, eventually stopped the boundary demarcation. It was the only instance of a reversal of the changes announced in 2005.

2.4 Other administrative data

Apart from the examinations data, two other administrative data sources were used: school census data (specifically the so-called Snap Survey) and payroll data. The school census data were used, above all, to provide estimates of the cohort of students in each year who would have been in grade 12, had there been no selection effects in the form of dropping out. These selection effects, which may change by school over time, play a large role in influencing each school’s relative performance in the examinations. Dropping out is not random. Academically struggling students are more likely to drop out, and different schools, and provinces, employ different methods to reduce, or even encourage, dropping out. There is in fact a strong incentive for schools to allow weaker students to drop out, as this improves examination pass rates. Grade 10 enrolment 2 years prior to the grade 12 examination event was considered the optimal estimate of who would have reached the examination in the absence of dropping out. This estimate has serious drawbacks, mostly because grade repetition rates are high in grade 10, with considerable variation across schools. The ideal would have been the number of 15-year-olds 2 years prior to the examination.
Around 98% of the population remains in school until the year they turn fifteen. However, age data in the school census are missing for a considerable number of schools. Payroll data were used mainly to gauge whether province-switching schools experienced exceptional changes in their staffing as a result of the switch.

3. Analysis of student performance and province-switching

3.1 The estimation model

Equations for three estimation models, which serve as our point of departure, appear below. Equation (2) is a basic difference-in-differences (DiD) model with panel data. For this model, we have collapsed our student data to the level of the school, meaning we have a panel with schools linked across years. E is the performance of school i in year t. To begin with, E is the score of the student at the 95th percentile, relative to all students who took or might have taken mathematics, meaning the school’s grade 10 enrolment 2 years previously. In calculating the 95th percentile, it was assumed that students who did not reach the mathematics class, their number calculated as the earlier grade 10 enrolment minus the number of grade 12 mathematics students, would have performed poorly in mathematics. These non-mathematics students were assigned a mark of zero in our data. We also explore other versions of E, specifically the mean and statistics where selection effects are not taken into account, in Section 3.3.

The variables S in brackets refer to dummy 0–1 variables for each of the schools i, except for one (school i = 0). Each school, except for one, thus carries its own intercept λ. This means that unobserved and time-invariant phenomena influencing the general level of performance of each school are controlled for. T is a 0–1 dummy variable indicating which school in which year was subject to the treatment. T would thus be 1 for schools in the four switching categories other than MP > LP, and for the ‘treatment’ years 2008 to 2013. Finally, eight year dummies D for the periods 1–8, or 2006–2013, are entered, making the model a two-way fixed effects model (school and year are fixed). In terms of the difference-in-differences method, the first difference is controlled for by the school fixed effects λ. The second difference, or the extent to which the difference between the treated and control schools changed, is captured in β1 (Imbens and Wooldridge, 2009: 70; Schlotter et al., 2011: 127). A positive treatment effect would be reflected in a positive value for β1.

$$E_{it} = \hat{\lambda}_0 + (\hat{\lambda}_1 S_{i=1} + \ldots + \hat{\lambda}_n S_{i=n}) + \hat{\beta}_1 T_{it} + \hat{\beta}_2 D_{t=1} + \ldots + \hat{\beta}_9 D_{t=8} + \hat{u}_{it} \qquad (2)$$

We assume that the selection of treated schools, though not strictly random since they had to be close to borders that moved, can be considered random for the purpose of causal inference. Put differently, we assume that these schools were not on a distinct performance trajectory even before the move. We test this assumption below, both by looking at trends in the pre-treatment years, 2005 to 2007, and by testing the trends for schools which were geographically close to the moving schools in the ‘sending’ province but did not move.

The problem with equation (2) in the context of several treatment years, and we have treatments extending across 6 years, 2008–2013, is that whilst β1 may tell us that on average treated schools performed better, it will not provide information on the slope of E over the years 2008 to 2013.
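Before turning to the slope extensions, a rough sketch of how the dependent variable and the basic model of equation (2) could be computed follows. The column names (school_id, year, mark, grade10_lag2, treated) are hypothetical, and the linearmodels package, with school-clustered standard errors, stands in for the bootstrap procedure actually used in the paper.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

def school_p95(students, enrol):
    """Collapse student marks to the school-year 95th percentile used as E.

    students: one row per grade 12 mathematics candidate (school_id, year, mark).
    enrol:    grade 10 enrolment two years before each examination year
              (school_id, year, grade10_lag2).
    The gap between the lagged grade 10 cohort and the mathematics candidates is
    treated as non-selected students, each assigned a mark of zero.
    """
    def p95(group):
        cohort = int(group["grade10_lag2"].fillna(0).iloc[0])
        marks = group["mark"].to_numpy()
        padded = np.concatenate([marks, np.zeros(max(cohort - len(marks), 0))])
        return np.percentile(padded, 95)

    merged = students.merge(enrol, on=["school_id", "year"], how="left")
    return merged.groupby(["school_id", "year"]).apply(p95).rename("E").reset_index()

def model_a(panel):
    """Equation (2): school and year fixed effects plus the 0-1 treatment dummy.

    panel must contain school_id, year, E and a 'treated' dummy equal to 1 for
    'to better' schools in the years 2008-2013.
    """
    panel = panel.set_index(["school_id", "year"])
    spec = PanelOLS.from_formula("E ~ 1 + treated + EntityEffects + TimeEffects",
                                 data=panel)
    return spec.fit(cov_type="clustered", cluster_entity=True)
```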
We are interested in a slope, and would expect a positive one where the move is to a more effective province, given that effects are likely to take time, largely because grade 12 results reflect progress in earlier grades. To gauge the slope, we need equation (3) below. Here the treatment variable of interest is the interaction term PT, meaning the interaction between period and the 0–1 treatment dummy. P carries value 0 for 2008, the first treatment year (T on its own takes on a value of 1 for 2008 and later years). If the treated schools display a slope for E which is more positive (or less negative) than non-treated schools, β2 will be positive.

$$E_{it} = \hat{\lambda}_0 + (\hat{\lambda}_1 S_{i=1} + \ldots + \hat{\lambda}_n S_{i=n}) + \hat{\beta}_1 T_{it} + \hat{\beta}_2 PT_{it} + \hat{\beta}_3 D_{t=1} + \ldots + \hat{\beta}_{10} D_{t=8} + \hat{u}_{it} \qquad (3)$$

One further adaptation is needed. Equation (3) is appropriate if we have one treatment group, being all schools moving to a better province, and consider as irrelevant the sending province from which each school comes. However, if we want to test each group of switching schools as a separate treatment group, with its own distinct treatment experience, we should consider not only more than one treatment group, but also more than one control group. This is necessary if we want to avoid running four separate models, one for each combination of sending province and group of schools switching to a better province. (For now we ignore the group which moved to a worse province.) Equation (4) illustrates the required model. Here four treatment variables, T1–T4, are employed, but also three control variables, K1–K3, representing the three sending provinces EC, LP and NW. Each K is a dummy taking on a value of 1 for a specific sending province, and if the year is in the treatment period, in other words 2008–2013. We interact the dummies with period as defined previously. Using this approach, we for instance consider the LP > MP group of schools a subset of all schools which started off in LP.

$$E_{it} = \hat{\lambda}_0 + (\hat{\lambda}_1 S_{i=1} + \ldots + \hat{\lambda}_n S_{i=n}) + \hat{\alpha}_1 K1_{it} + \hat{\alpha}_2 PK1_{it} + \ldots + \hat{\alpha}_5 K3_{it} + \hat{\alpha}_6 PK3_{it} + \hat{\beta}_1 T1_{it} + \hat{\beta}_2 PT1_{it} + \ldots + \hat{\beta}_7 T4_{it} + \hat{\beta}_8 PT4_{it} + \hat{\beta}_9 D_{t=1} + \ldots + \hat{\beta}_{16} D_{t=8} + \hat{u}_{it} \qquad (4)$$

3.2 Panel model with school fixed effects

Table 3 reports on the results of four key models, all using as the dependent variable the school’s performance at the 95th percentile, with non-selected students (based on the earlier grade 10 enrolment) being assigned a score of zero. In Model A, being in one of the four moving groups involving a move to a better province meant a value of 1 was assigned to the variable ‘Is to better’ for observations in the treatment years, 2008 to 2013. ‘Is to better’ is thus the treatment dummy T of equation (2). A similar ‘Is to worse’ dummy for the only group involving a move to a worse province, MP > LP, was also created. Both dummy variables produce statistically significant coefficients of the expected sign, and of similar absolute magnitudes. Moving to a better (worse) province is associated with a gain (loss) of between 3 and 4 marks with respect to E. That this should emerge so clearly is remarkable, given that ‘to better’ schools were spread across four ‘migrations’. One might expect a single ‘migration’ to display gains in the case of a ‘receiving province’ which was particularly good at fixing problems experienced in the ‘sending province’, but a general positive effect of moving to a better province seems striking.
Table 3: Regression Outputs for School Fixed Effects Panel Model

Dependent variable: mark at the 95th percentile, with non-selected students (based on earlier grade 10) assigned a mark of zero.

Variable | A | B | C | D
Constant | 41.2*** (0.27) | 40.0*** (0.31) | 40.0*** (0.00) | 40.0*** (0.32)
Is to better (B) | 3.480*** (0.69) | −1.494 (0.94) | |
Int. B & Period | | 1.317*** (0.26) | |
Is to worse (W) | −3.745* (2.23) | −8.380*** (2.45) | |
Int. W & Period | | −0.049 (0.70) | |
Is EC > KN | | | 2.781* (1.41) |
Int. EC > KN & Period | | | −1.488** (0.38) |
Is LP > MP | | | −1.362 (2.40) |
Int. LP > MP & Period | | | 1.663*** (0.84) |
Is MP > LP | | | −8.380*** (2.79) |
Int. MP > LP & Period | | | −0.049 (0.58) |
Is NW > GP | | | −3.052 (2.63) |
Int. NW > GP & Period | | | 1.882*** (0.63) |
Is NW > NC | | | −5.521** (0.60) |
Int. NW > NC & Period | | | 1.490** (0.30) |
Provincial diff. (δ) | | | | 1.643 (3.71)
Int. δ & Period | | | | 6.011*** (0.98)
Is EC | | −3.755*** (0.46) | −3.785*** (0.36) | −3.764*** (0.35)
Int. EC & Period | | 0.528*** (0.12) | 0.585*** (0.10) | 0.535*** (0.10)
Is LP | | −0.511 (0.40) | −0.549 (0.55) | −0.549 (0.39)
Int. LP & Period | | 0.953*** (0.15) | 0.953*** (0.14) | 0.997*** (0.14)
Is MP | | 2.736*** (0.54) | 2.736*** (0.52) | 2.407*** (0.58)
Int. MP & Period | | 0.886*** (0.12) | 0.860*** (0.14) | 0.900*** (0.09)
Is NW | | 0.464 (0.69) | 0.712 (0.10) | 0.371 (0.65)
Int. NW & Period | | −0.291* (0.18) | −0.356*** (1.46) | −0.343 (0.22)
N | 42,392 | 42,392 | 42,392 | 42,392
Number of schools | 4,711 | 4,711 | 4,711 | 4,711
R2 overall | 0.087 | 0.089 | 0.089 | 0.089

Note: ‘Period’ here makes the years 2005–2008 zero, 2009 value 1, 2010 value 2, and so on. All listed variables would be zero for the years 2005–2007. Not reported here are single-year dummies, used in all models, with 2013 being the reference year (because of the exceptionally low values for E in the starting year 2005, this year is not the reference). All models make use of bootstrap estimation of standard errors, which appear in parentheses. Here and in tables that follow *** indicates that the estimate is significant at the 1% level of significance, ** at the 5% level and * at the 10% level.
Model B essentially implements equation (4), though with still one positive treatment group and one negative treatment group.
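For concreteness, the period variable and the interactions used in Models B and C (equations (3) and (4)) can be built along the following lines. The column names and pandas idiom are our illustrative assumptions rather than the authors' actual code.

```python
def add_period_interactions(panel):
    """Add treatment, sending-province control and period-interaction columns.

    panel is assumed to hold school-year rows with:
      year           - examination year (2005-2013)
      switch_group   - e.g. 'LP>MP', 'MP>LP', or missing for non-switching schools
      province_2005  - the province a school belonged to before the changes
    """
    in_treatment_years = panel["year"] >= 2008
    # 'Period' is 0 up to and including 2008, then 1 for 2009, 2 for 2010, ...
    period = (panel["year"] - 2008).clip(lower=0)

    # Model B style dummies: move to a better province, plus the period interaction.
    to_better = panel["switch_group"].isin(["EC>KN", "LP>MP", "NW>GP", "NW>NC"])
    panel["is_to_better"] = (to_better & in_treatment_years).astype(int)
    panel["better_x_period"] = period * panel["is_to_better"]

    # Model C style dummies: one pair per switching group, illustrated for LP>MP.
    lp_mp = (panel["switch_group"] == "LP>MP") & in_treatment_years
    panel["is_lp_mp"] = lp_mp.astype(int)
    panel["lp_mp_x_period"] = period * panel["is_lp_mp"]

    # Sending-province control (K in equation (4)), illustrated for LP.
    k_lp = (panel["province_2005"] == "LP") & in_treatment_years
    panel["is_lp"] = k_lp.astype(int)
    panel["lp_x_period"] = period * panel["is_lp"]
    return panel
```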
The interaction between ‘Is to better’ and period (where 2008 is zero, 2009 is one, and so on) produces a coefficient that is statistically significant at the 1% level. For each year that passes in the treatment period, 1.3 marks are added to E. The results obtained if one enters each year separately for just the treatment group are explored below. The negative coefficient on the ‘Is to worse’ dummy is now much larger, mostly because it is now relative to the new dummy ‘Is MP’, the sending province for the ‘Is to worse’ group.

In Model C, which implements equation (4) fully, the performance slope during the treatment period is examined at the level of the five switching groups. Three of the four groups moving to better provinces display the expected positive slope, significant at least at the 10% level. The exception is EC > KN, where a positive coefficient on the dummy appears together with a negative (and significant) coefficient for the period interaction. The MP > LP group displays the same results as in Model B: a statistically significant negative coefficient on the 0–1 dummy, without a statistically significant slope.

Finally, Model D considers the effectiveness gap between the sending and receiving provinces as a non-binary treatment variable representing the ‘dosage’ of the treatment. Values from the last column of Table 2 were used for δ, with MP > LP schools taking the negative of the LP > MP value. The coefficient on the interaction with period indicates that a ‘step-up’ (step-down) of one standard deviation of performance in TIMSS, associated with the move to a new province, comes with an annual gain (loss) of 6 marks at the 95th percentile of the mathematics examination.

The relationship between the step-up and the actual annual gain is illustrated in Figure 2 below. The confidence intervals (at the 95% level) for the step-up (horizontal lines) draw from Table 2. The estimated gain by 2013 (midpoints of the vertical lines) was calculated using the coefficients for both the dummy and the dummy–period interaction in Model C, in the case of each of the five groups. The confidence intervals represented by the vertical lines use the standard errors around the predicted means produced when Model C as a whole is calculated. Clearly one cannot say much about the differences between the four step-ups. Their confidence intervals overlap to a large degree. Yet the general pattern is telling. Three groups moving to better provinces display confidence intervals, against both axes, which are mostly within the ‘correct’ quadrant, namely the top-right one, representing a step-up associated with actual improvements. The move to a worse province for MP > LP puts this group in the correct bottom-left quadrant, with a deterioration according to the vertical axis (minus 8.6) which is not too different in absolute terms from the improvement of 7.0 experienced by the group moving in the opposite direction, LP > MP. Only EC > KN produces an anomalous result: a loss in performance, relative to the sending province, though the move was to a better performing province.

Figure 2: Gain by 2013 and size of δ. Note: Three-point vertical and horizontal lines represent the point estimate for the coefficient with a confidence interval at the 95% level.
The fact that there were movements between LP and MP in both directions, and that these are associated with opposite effects of roughly equivalent magnitudes, is particularly important for our conclusions. This points strongly, independently of any use of TIMSS-based measures of province effectiveness, to significant quality differences between provinces, and to significant and fairly rapid impacts of these quality differences on incoming schools.

The descriptive statistics in Table 4 illustrate the unconditional trends with respect to the dependent variable. The fact that all slopes are positive indicates that gains and losses must be understood in terms of faster and slower positive change. The general upward trend could be due to changing standards, occurring for instance with the introduction of the new examination system in 2008, but TIMSS has also pointed to a general and real improvement over this period in secondary schools. Of note is the fact that four of the five switching groups, all except for NW > GP, display means that are below both the sending and receiving provinces. In this sense, the switching schools are not representative of their sending provinces. Also of note is the fact that LP > MP and NW > GP, the two largest groups of the five, display slopes which are not just steeper than those of their sending provinces, but also their receiving provinces, suggesting these groups could catch up to the higher means of their new provinces.

Table 4: Descriptive Statistics for 95th Percentile of all Potential Students

Country (all schools): Mean 36, S.d. 19, Slope 1.8

Movement | Sending province (Mean, S.d., Slope) | Moving group (Mean, S.d., Slope) | Receiving province (Mean, S.d., Slope)
EC > KN | 31, 18, 1.4 | 30, 14, 0.7 | 37, 17, 2.1
LP > MP | 32, 17, 2.2 | 27, 15, 3.3 | 36, 19, 2.6
MP > LP | 36, 19, 2.6 | 31, 20, 1.3 | 32, 17, 2.2
NW > GP | 34, 20, 1.6 | 35, 17, 2.6 | 43, 22, 1.4
NW > NC | 34, 20, 1.6 | 23, 12, 1.9 | 37, 25, 0.4

Note: The statistics all describe the dependent variable of models A to D, namely the mark at the 95th percentile relative to earlier grade 10. The slope and overall mean are calculated using the year-group means for all the years 2005–2013. The slope is the annual slope across the year-group means.
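The slopes in Table 4 are simply the annual slope of a straight line fitted through the year-group means of the dependent variable; a minimal sketch, with column names assumed:

```python
import numpy as np

def annual_slope(df, value_col="E", year_col="year"):
    """Marks-per-year slope of a line fitted through the yearly means, 2005-2013."""
    year_means = df.groupby(year_col)[value_col].mean()
    slope, _intercept = np.polyfit(year_means.index.to_numpy(float),
                                   year_means.to_numpy(), 1)
    return slope
```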
Regressions represented by equation (5) below were run for sub-samples of the observations: just the years 2005–2007, and for just one province at a time. School fixed effects (not shown in the equation) and the bootstrapping of standard errors were applied. The aim was to detect whether the trajectory of schools about to move was different from that of other schools in the province. Here T is a 0–1 dummy indicating schools which will move, and P is period (carrying values 0, 1 and 2 for 2005, 2006 and 2007). Any pre-2008 group-specific trend could suggest a group was on a trajectory that would be sustained after the switch, which could change the interpretation of the results in Table 4. The only statistically significant (at the 10% level) coefficient emerging for β2 was that for the LP > MP group, the value being a negative 1.5, meaning this group was improving more slowly than the rest of LP. This obviously removes the possibility that the large improvements seen by this group from 2008 were part of a trend which preceded the change in province.

$$E_{it} = \hat{\lambda} + \hat{\beta}_1 P_{it} + \hat{\beta}_2 PT_{it} + \hat{u}_{it} \qquad (5)$$

In order to test whether trends in the switching groups were attributable to broader trends in the wider geographical areas of these groups, an additional control was introduced in the form of non-switching schools near the switching schools, within the sending province. This was done for the two improving groups with the clearest trends, LP > MP and NW > GP, and for the one worsening group, MP > LP. These were also groups for which a reasonable number of nearby schools existed. Nearby schools from the sending province were added, starting with the school closest to any of the switching schools, until the number of schools in the new control group equalled the number of schools in the treatment, or switching, group. This occurred upon reaching a distance of 41 km for schools near NW > GP, 99 km for LP > MP and 9 km for MP > LP. Dummies for the three control groups, and interactions between the dummies and period, were added to Model C, along the lines of what had been done for the switching groups. Of the six new coefficients, none of the interactions with period were significant (p values of 0.573 or higher), and just one of the coefficients on the dummies emerged as significant. This was the dummy for schools near the LP > MP group, where the coefficient was −4.1 (p of 0.001). This indicates that these nearby schools experienced a deterioration in their results, relative to the rest of LP, while schools which moved to MP saw relatively strong annual improvements. The possibility of wider geographical trends unrelated to the border changes being an alternative explanation is thus not supported by the data.

The possibility that better students from non-switching LP schools started attending the switching LP > MP schools in the hope of receiving a better education is also not supported by the data; this would otherwise have been a possible explanation for the performance decline in nearby schools. The group of 83 non-switching schools near the LP > MP schools is in fact mostly far from the switching schools: only two of the 83 were within 50 km, with the remaining 81 being in the range of 50–99 km. Such distances would largely preclude substantial ‘voting with one’s feet’, in particular considering that the area in question is relatively poor.

To obtain a more precise idea of improvements over time among schools moving to a better province, Model A was adapted.
To the eight general year dummies were added another eight single-year dummies which carried a value of 1 only for schools moving to a better province, with 2005 serving as the reference year. The coefficients, and their confidence intervals, for the additional year dummies are illustrated in Figure 3 below. Bootstrapping was applied. In aggregate, for the 138 schools moving to a better province, most of the improvement had been achieved by 2011, implying an initial period of relatively intense improvement over the years 2008–2011. The dip in 2008 is noteworthy, and a bit puzzling. It is possible that the switch to the new province brought about initial disruptions to the schooling process. However, this disruption was clearly overcome after a year, with 2009 displaying the best outputs since 2005.

Figure 3: Year-specific gains for ‘to better’ schools. Note: Three-point vertical lines represent the point estimate for the coefficient with 95% confidence intervals. Values along the vertical axis are changes in mathematics marks relative to the 2005 reference year.

3.3 Alternative dependent variables and student-level models

The coefficients shown in Table 5 below are from models similar to B and C in Table 3, but with the complexity of interactions with period removed. They are thus the coefficients for dummy variables taking on the value 1 for the group in question if the year is in the range 2008–2013. Each row of Table 5 draws from two regressions, one to obtain the first five columns, and another for the final column. The overall objective was to examine the impact of using the mean, instead of the 95th percentile, of removing observations with zero representing non-selected students, and of running the analysis with student observations, as opposed to school observations.

Table 5: Alternative Dependent Variables and Level

Level E Scope MP > LP EC > KN LP > MP NW > GP NW > NC To better
School p95 All −8.55*** 4.42*** 3.59* 3.10***
School p95 Grade −3.44** 5.71***
School p95 Class 7.78***
School Mean All 0.64** 0.41**
School Mean Grade −1.85*** 1.60*** −0.87***
School Mean Class −3.86*** 4.87*** −3.97** −1.73***
Student Mean All −2.16***
Student Mean Grade −4.92*** −3.21*** 1.68** −1.05***
Student Mean Class −3.60*** −4.56*** 3.23*** −5.07*** −1.52**

Note: The student-level models used around 7.9, 4.0 and 2.3 million observations for the ‘All’, ‘Grade’ and ‘Class’ models respectively. For the student-level models, standard errors consider clustering by school. *** indicates that the estimate is significant at the 1% level of significance, ** at the 5% level, and * at the 10% level.
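The ‘Scope’ rows of Table 5 differ only in which students are counted before the school-year statistic is computed. A minimal sketch of the three variants, under the same hypothetical column names used earlier (took_maths flags grade 12 candidates who wrote mathematics):

```python
import numpy as np

def school_year_statistic(group, scope="All", stat="p95"):
    """One school-year cell in the spirit of Table 5.

    group: all grade 12 candidates of one school in one year, with columns
           mark, took_maths and grade10_lag2 (assumed names).
    scope: 'Class' = mathematics candidates only;
           'Grade' = all grade 12 candidates, non-mathematics ones scored zero;
           'All'   = the lagged grade 10 cohort, with everyone who did not write
                     mathematics scored zero.
    """
    marks = group.loc[group["took_maths"] == 1, "mark"].to_numpy()
    if scope == "Grade":
        n_zeros = int((group["took_maths"] == 0).sum())
    elif scope == "All":
        n_zeros = max(int(group["grade10_lag2"].iloc[0]) - len(marks), 0)
    else:  # 'Class'
        n_zeros = 0
    padded = np.concatenate([marks, np.zeros(n_zeros)])
    return float(np.mean(padded)) if stat == "mean" else float(np.percentile(padded, 95))
```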
The variations to the model specification do bring about important differences in the results. The last column suggests that if the mean mark is the dependent variable, moving to a better province carries a positive and significant coefficient only if one controls for all students who drop out, or who do not reach the grade 12 mathematics class. If one controls only for those who reach grade 12 but do not take mathematics (row ‘Grade’), or if one considers only those in the mathematics class, all significant coefficients come out negative. If one uses performance at the 95th percentile, however, moving to a better province never produces a negative coefficient, regardless of the ‘Scope’ of students considered.

Turning to the group-specific statistics of the middle four columns, for the NW > GP group the coefficients are positive (wherever a significant coefficient emerges), no matter how one deals with non-selected students, whether one uses school or student observations, and regardless of which of the two dependent variables E one chooses. The NW > GP results thus seem consistent with what one might expect, and with the analysis of the previous section. The coefficients in the MP > LP column are all negative, which is also a consistent result, given that this is a move to a worse province. The counter-intuitive negative coefficients in the last column are driven by negative coefficients for LP > MP, a group comprising more than half of the schools moving to a better province.

Why would moving to a better province improve performance at the 95th percentile, while mostly lowering the mean, in a relative sense? (Where one controls for all non-selected students, and uses the mean, the expected positive coefficient for LP > MP does emerge—see the value 0.64.) Examination of the mean for the mathematics class, and the mean taking into account all non-selected students, along the lines of the descriptive statistics of Table 4, confirms that LP > MP experienced a steeper positive improvement than non-switching LP schools with respect to the latter indicator, but a less steep improvement with respect to the first indicator.
Crucially, while LP > MP schools saw no change in the percentage of students making it into the mathematics class (relative to earlier grade 10), this indicator saw a considerable decline for the rest of the sending province (an annual slope of −1.1 percentage points). Even the receiving province of MP was seeing a similar decline. One can conclude that schools other than LP > MP schools experienced an artificial additional ‘improvement’ in the mean for the mathematics class because they kept weaker students out of the class. All this underscores the importance of controlling for selection effects when analysing trends in examination results.

3.4 Policy-related explanatory variables

So far, there has been no discussion of what in the province-switching experience led to educational improvement. With respect to human resources, two distinct possibilities exist, and differentiating between the two is important for policy purposes. On the one hand, it is possible that the new provincial administrations changed the effectiveness of incoming schools by inserting new and better education staff. Alternatively, by changing incentive structures, or providing certain physical resources, the new provincial administration could have changed the behaviour of inherited staff in positive ways, without changing the composition of this staff. This has policy implications insofar as the latter possibility would support more strongly the policy argument that change can be brought about simply through changes in incentive structures (or physical resourcing), in other words through a policy reform that is more widely replicable.

Payroll data from 2005 and 2012 were used to calculate a staffing stability indicator:

$$s_i = \frac{T_{i2}}{T_{i1} + T_{ia}} \qquad (6)$$

Here T_{i2} is the number of ‘educators’, meaning teachers or education management staff in schools, employed permanently in the province, who were based in school i in both 2005 and 2012. T_{i1} is the number of educators who were based in the school in 2005, and T_{ia} is the number of new educators who arrived in the school, in the sense that they worked there in 2012 but not in 2005. The stability indicator s carries a value between 0.00 (maximum staff turnover) and 1.00 (no staff turnover). The mean indicator values for three key switching groups appear in Table 6. This section will focus on specific groups of switching schools, in particular those groups producing significant coefficients in the foregoing analysis. Combining groups into a ‘to better’ category is not done, as one would expect policy interventions to be fairly specific to the receiving province.

Table 6 indicates that the switching groups experienced greater staffing stability than either non-switching schools in the entire remainder of the sending province, or the subset of these schools found near the switching schools (these are the sub-sets discussed above). Clearly, performance improvements in the switching schools were not achieved through exceptional staffing changes.

Did switching schools experience a higher probability of a change in the school principal, an event which might have turned schools around? The payroll data analysed are a bit inconsistent when it comes to identifying who the school principal is in each year, which limits the possibility of precise measures. The indications from the data were that virtually no changes to the school principal amongst the 151 switching schools occurred, and that the probability of a principal change was greater amongst non-switching schools across the country.
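Equation (6) can be computed directly from two payroll extracts. A sketch follows, assuming each extract has been reduced to a mapping from school to the set of educator identifiers on that year's payroll; these structures and names are our own, not the authors' code.

```python
def staffing_stability(staff_2005_by_school, staff_2012_by_school):
    """Equation (6): stayers / (2005 educators + new arrivals), per school."""
    stability = {}
    for school, staff_2005 in staff_2005_by_school.items():
        staff_2012 = staff_2012_by_school.get(school, set())
        stayers = len(staff_2005 & staff_2012)      # T_i2: present in both 2005 and 2012
        arrivals = len(staff_2012 - staff_2005)     # T_ia: present in 2012 but not 2005
        stability[school] = stayers / (len(staff_2005) + arrivals)  # T_i1 = 2005 staff
    return stability
```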
Table 6: Descriptive Statistics for Indicator of Staffing Stability

Group | Sending province—all | Sending province—just closest | Switching group
LP > MP | 0.511 (0.002) | 0.502 (0.008) | 0.661 (0.007)
MP > LP | 0.362 (0.003) | 0.442 (0.011) | 0.695 (0.011)
NW > GP | 0.398 (0.003) | 0.418 (0.007) | 0.524 (0.01)

Note: Values in brackets are standard errors.

In order to test what incentives or physical resources made a difference to human behaviour and learning in the moving schools, we obtained school- and year-level data for 16 variables which conceivably captured important policy-induced changes. For many possible effects, data do not exist. For instance, there are no data on the instructions issued by departmental offices to schools, and on how schools respond to these instructions. Even phenomena for which data exist in many other countries, such as student and teacher attendance and access to books in schools, are not covered by any South African national database. We were thus severely limited by what data were available. Despite this, we were able to come up with 16 variables which seemed worth testing.

From the Snap school census data, we extracted the following seven variables: total school enrolment; total educators (this would include teachers and schools-based managers); the pupil–teacher ratio (from the previous two variables); the ratio of educators to schools-based support staff; whether any educators are paid privately (using funds raised through fees); the percentage of educators paid privately; and the percentage of educators who are women. From the examinations data seven variables were derived: the number of students in the grade 12 group; the number of students in the grade 12 mathematics class; the percentage of females in this class; the percentage of students in this class who are black African; the average age of the students in this class; the percentage of grade 12 students taking mathematics; and the number of grade 12 subjects examined. From the payroll data we obtained the gender of the school principal, where it was clear who the principal was. Finally, one variable used a combination of sources: the ratio of grade 12 students to grade 10 students two years previously.

All but two of the variables had values specific to each of the nine years. The school principal’s gender was available for 2005 and 2012 only, while the number of grade 12 subjects examined was available for just 2005 and 2015. For each of these two variables, the initial value was used for 2005–2007, and the subsequent value for 2008–2013. We tested the 16 time-variant school-level variables in two ways.
Firstly, we tested whether the annual change in the variable for three of the five moving groups (the three in Table 6) differed from the annual change seen in the respective sending provinces. In this step, we were interested not in correlations with the dependent variable on mathematics performance, E, but simply in whether moving schools displayed exceptional trends with respect to the explanatory variables, something which would provide suggestive evidence of causal factors. Secondly, for each of the 16 variables, we ran a few panel regressions, of the kind shown in Table 3, to test the conditional correlation of the variable with the dependent variable of Table 3 (performance at the 95th percentile with non-selected students assigned a mark of zero). What emerged as a particularly interesting variable was the ratio of educators to schools-based support staff, so we discuss this first. Slopes appear in Table 7 below only where the coefficient β2 in equation (7) was significant at least at the 10% level. In this equation, X is the policy variable of interest, T is the dummy variable indicating inclusion in the switching group, i indexes schools, t indexes years and P is the period variable. For each of the 16 variables, the regression (with school fixed effects and bootstrap standard errors) was run nine times: once for each combination of three periods and three moving groups. The three periods were the whole 2005 to 2013 range, 2005 to 2007 (the pre-treatment years) and 2008 to 2013 (the treatment years). For each moving group, only observations from that group and its sending province were included. A brief code sketch of this set-up follows Table 7.

X_{it} = \hat{\lambda} + \hat{\beta}_{1} P_{it} + \hat{\beta}_{2} PT_{it} + \hat{u}_{it} \qquad (7)

Table 7: Staffing Details

Columns: variable; group; mean across all years 2005–2013 (S, M, R); slope β2 from equation (7) for 2005–2013 (S, M), 2005–2007 (S, M) and 2008–2013 (S, M).

PT ratio        LP > MP   28     29     28      −0.79  −0.20  −0.17  −0.12  −0.42
Enrolment       LP > MP   557    676    802     −3.7   8.4
Ed. staff       LP > MP   20     23     28      0.38   0.43
ES ratio        LP > MP   9      8      6       −0.68  −0.43  −0.24  −0.70
Gr. 12 group    MP > LP   48     41     36      −2.04  2.37   −3.45  −3.12
Gr. 12 math.    MP > LP   26     27     22      −2.10  1.87   −2.55  −2.12
Enrolment       MP > LP   802    631    557     −8.0   −20.4  −12.0  −29.0  −33.2
Gr. 12/Gr. 10   MP > LP   0.55   0.51   0.56    −0.02  0.02   −0.01  −0.03
PT ratio        NW > GP   26     28     29      −0.42  −1.68  1.61   0.16
Enrolment       NW > GP   668    917    1172    −6.5   33.0
Ed. staff       NW > GP   26     33     43      0.11   0.64   0.74   0.40
ES ratio        NW > GP   9      8      4       −1.65  0.22   −1.18

Note: ‘S’ is sending province (including moving schools), ‘M’ is moving schools, ‘R’ is receiving province (excluding moving schools). Each row in the last six columns draws from three separate regressions based on equation (7). This table presents results from only a selection of all the regressions run.
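The following is a minimal sketch of how such a trend test might be run for one policy variable and one moving group, under the assumption of a long-format dataset with hypothetical column names ('school', 'year', 'X', 'in_moving_group'). Clustered standard errors are used here as a simple stand-in for the bootstrapped standard errors used in the paper.

```python
import pandas as pd
import statsmodels.formula.api as smf

def trend_test(df: pd.DataFrame, start_year: int, end_year: int):
    """Equation (7): regress policy variable X on period and period x group,
    with school fixed effects, for one moving group and its sending province.

    df is assumed to hold one row per school-year, with hypothetical columns
    'school', 'year', 'X' and 'in_moving_group' (1 for switching schools).
    """
    sub = df[(df["year"] >= start_year) & (df["year"] <= end_year)].copy()
    sub["period"] = sub["year"] - start_year                        # P
    sub["period_x_group"] = sub["period"] * sub["in_moving_group"]  # P x T
    model = smf.ols("X ~ period + period_x_group + C(school)", data=sub)
    return model.fit(cov_type="cluster", cov_kwds={"groups": sub["school"]})
```

The coefficient on period_x_group corresponds to β2, the group-specific deviation from the sending province’s trend.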
Table 7 indicates that, in the case of the LP > MP group, the ratio of educators to support staff (‘ES ratio’) changed to a statistically significant degree over the entire period, relative to the control group of the sending province LP as a whole. In the sending province, the ratio declined by 0.68 a year, but the decline for LP > MP schools was steeper, by an additional 0.43. Educators in these schools were thus enjoying an exceptionally large improvement in access to support staff. The annual mean values for this variable for LP > MP schools as a whole ranged between 12.1 and 13.1 between 2005 and 2008, but then fell precipitously to a range of 4.6–6.7 for the years 2009–2013 (the mean across all years was 8, as shown in Table 7). For non-switching schools in the sending province, there was a smaller yet substantial shift, from a mean of 11.0 for the years 2005–2008 to 7.5 for 2009–2013. Analysis of the original data reveals that the large shifts in the switching group were due to better availability of support staff, not fewer educators. In particular, moving schools gained more access to administrative support staff, though there were also improvements in other categories such as cleaners and security. In 2007, only 6 of the 83 LP > MP schools had administrative support staff, and 35 had no support staff of any kind. By 2012, these figures had become 75 schools with administrative staff and only two with no support staff. Moreover, the average number of administrative support staff per school, where this human resource existed, had risen from 1.3 to 1.7. For the 13 MP > LP schools moving in the opposite direction, the number of schools with administrative support staff declined from 11 to 10 between 2007 and 2012.
Within the limited data available, improvements in access to administrative staff were, in the case of the LP > MP schools, by far the most striking positive change. This change would have allowed teachers to focus more on teaching and less on administration, and is likely to have affected the general administration of the school positively. However, the absence of data on other phenomena, and the literature on how change occurs in schools, should caution against reading too much into the administrative staff change. While important in and of itself, this change is also likely to be indicative of the general ability of the receiving province MP to govern and resource schools in better ways than the sending province LP. A slope for the ‘ES ratio’ that was significantly different from that of the control group was also found for the NW > GP ‘treatment group’. Here the most striking change was not in the number of schools with any administrative support staff (this increased from 20 to 28 of the 29 schools between 2007 and 2012), but in the average number of such staff per school where such staff existed (this rose from 1.2 to 2.7). Other indicators according to which the switching schools stood out are reported in Table 7 (as are selected indicators without significant slopes for the moving schools). LP > MP schools experienced a relative decline in the pupil-teacher ratio, though for neither the numerator (enrolments) nor the denominator (educator staff) did a significant slope emerge. The MP > LP trends suggest a process of shrinkage not seen in the sending province in general. For instance, during the 2008 to 2013 ‘treatment period’ (in fact a ‘mistreatment’ period, considering this was a move to a worse province), the group-specific slopes for the number of grade 12 students, the number of grade 12 mathematics candidates, overall enrolment in the school and the ratio of grade 12 students to earlier grade 10 students are all significantly negative. This would be consistent with the hypothesis that the worsening governance context for these schools disincentivised staff and students, and led to more dropping out amongst the latter. Mathematics performance appears sensitive to changes in the ratio of educators to support staff, but only to a statistically significant degree for province-switching schools experiencing large shifts in the ratio. This can be seen in Table 8 below. In Model E, which uses all observations from schools which started off in 2005 in the province LP, the ‘ES ratio’ for all schools and years does not produce a significant coefficient, whilst the ‘ES ratio’ interacted with membership of the LP > MP switching group does. The coefficient is negative, indicating that the move to a lower ratio (as more support staff joined the school) was associated with better performance. The absence of a similarly significant coefficient on ‘ES ratio’ in the control group, despite the fact that even the control group experienced substantial improvements in the availability of support staff (as discussed earlier), strengthens the conclusion that it was not improvements in this indicator on its own that made a difference to the performance of switching schools, but rather the manner in which this change formed part of a wider package of reforms for the schools.
Table 8: Regression Outputs with Policy Variables

Dependent variable: mark at the 95th percentile, with non-selected students (based on earlier grade 10 enrolment) assigned a mark of zero.

                         E: LP & LP > MP      F: NW & NW > GP
Constant                 39.6*** (0.57)       39.8*** (1.47)
Is to better             4.482*** (1.56)      5.978*** (2.17)
ES ratio                 0.026 (0.02)         0.032 (0.04)
Int. ES ratio & G        −0.175* (0.10)       −0.187* (0.10)
N                        6,387                2,137
Number of schools        1,025                265
R2 overall               0.119                0.070

Note: G is a dummy taking the value 1 for the moving group, regardless of period. ‘Is to better’ only assumes non-zero values for the 2008–2013 period. Not reported here are single-year dummies, used in both models, with 2013 being the reference year. Both models make use of bootstrap estimation of standard errors, which appear in parentheses. *** indicates that the estimate is significant at the 1% level of significance, ** at the 5% level, and * at the 10% level.

The coefficient on ‘ES ratio’ interacted with G (the dummy for the switching group) remains negative, even in the presence of the ‘Is to better’ treatment dummy, which in the case of Model E takes a value of 1 for LP > MP schools from 2008 onwards. Model F displays a very similar sensitivity of performance to support staff in the case of NW > GP schools. However, for both models the significance of the ‘ES ratio’ terms disappears if one inserts the interaction of ‘Is to better’ with period. Any improvement beyond the basic linear trend during the treatment period is thus not explained by access to support staff. Other policy variables were tested using the approaches applied to the ‘ES ratio’. None of these yielded significant coefficients worth reporting. Essentially, the smallness of the treatment samples and the paucity of data on likely causal factors preclude firm findings on causality via the econometric route.
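For concreteness, a school fixed-effects specification of the Model E and F kind could be set up roughly as follows. The column names are hypothetical, the variable definitions follow the note to Table 8, and clustered standard errors again stand in for the paper’s bootstrapped ones; this is a sketch rather than the authors’ exact implementation.

```python
import pandas as pd
import statsmodels.formula.api as smf

def table8_style_model(df: pd.DataFrame):
    """Regress the performance measure E on the treatment dummy, the ES ratio
    and the ES ratio interacted with group membership, with school fixed
    effects and year dummies.

    df is assumed to hold one row per school-year for one sending province and
    its switching group, with hypothetical columns 'school', 'year', 'E',
    'es_ratio', 'in_group' (G) and 'treated_year' (1 for 2008-2013).
    """
    df = df.copy()
    df["is_to_better"] = df["in_group"] * df["treated_year"]
    formula = (
        "E ~ is_to_better + es_ratio + es_ratio:in_group"
        " + C(year) + C(school)"  # year dummies and school fixed effects (the paper uses 2013 as reference year)
    )
    model = smf.ols(formula, data=df)
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["school"]})
```

The coefficient on es_ratio:in_group corresponds to the ‘Int. ES ratio & G’ row of Table 8.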
The trends for the indicators on the percentage of females and of black Africans in the mathematics class, and the average age within this class, were not significantly and substantively different in the three groups of switching schools, relative to their sending provinces. These are indicators whose values could change due to policy shifts, or due to factors outside the control of the system. The data suggest that neither of these two possibilities was realised.

4. Interpreting the results in the context of additional information

4.1 Additional province characteristics

The School Monitoring Survey, conducted in 2011 by the national education department, offers additional insights into the quality factors behind the changes seen in switching schools. This survey involved interviews and physical inspections in a nationally representative sample of around 2,000 schools, and yielded data which are fairly rare in developing country schooling systems. Provincial values for three official indicators which draw from the survey data are reflected in the first three columns of Table 9 below.

Table 9: Additional Indicators of Provincial Effectiveness

            Value in sending province                                    Change when switching
            District support   Access to books   Curriculum coverage    TIMSS     District support   Access to books   Curriculum coverage
EC > KN     52                 63                34                     0.159     8                  −27               6
LP > MP     60                 42                67                     0.190     5                  3                 30
MP > LP     65                 45                97                     −0.190    −5                 −3                −30
NW > GP     55                 56                34                     0.292     12                 5                 59
NW > NC     55                 56                34                     0.247     13                 −18               −6
Std. dev.   9                  11                25

Source: South Africa: Department of Basic Education, 2013: 16, 37, 44. The TIMSS values are reproduced from Table 2 above. The standard deviations in the last row are calculated across all nine provincial indicator values.

The district support indicator reflects the percentage of responses from school principals which said ‘satisfactory’, where each principal could express opinions on up to 21 monitoring and support functions performed by the district (as explained earlier, education district offices are an integral part of the provincial education administration). The access to books indicator reflects the level of access to mathematics textbooks in grade 9 classrooms, as seen by fieldworkers who visited classrooms.
The curriculum coverage indicator reflects a fieldworker’s evaluation of the mathematics writing book of one well-performing grade 9 student from each of the sampled schools which offered grade 9; the indicator drew on data covering the volume of work in the student’s book. The relevance of the last two indicators is strengthened by the fact that, of the 151 switching schools, only 29 do not offer grade 9 (14 of these were originally in Eastern Cape). The three school ‘migrations’ producing the least ambiguous results, in the sense that they fall inside the ‘correct’ quadrant of the earlier Figure 2, all experienced changes in the three indicators which are consistent with the earlier findings, and with the TIMSS-based indicator of provincial effectiveness. For example, NW > GP schools moved to a province where the indicator of curriculum coverage was far higher than in the old province, the difference being more than two standard deviations (measured across all nine provincial values). This suggests a far better culture of schooling in the new province, with more time-on-task in classrooms and greater teacher accountability. For schools switching between LP and MP, the same curriculum coverage indicator explains much of the difference between the two provinces. In fact, MP displayed the best value for this indicator of all nine provinces. However, even for the district support and access to books indicators, across-province differences are consistent with the grade 12 examinations changes seen earlier, at least for NW > GP, LP > MP and MP > LP. Much of the literature on school financing points to differences in public spending playing little or no role in producing better education, beyond a basic level of per student spending (Glewwe et al., 2011: 4). Indeed, an examination of provincial spending patterns reveals nothing to suggest that changes in funding levels played a role in the improvements seen in province-switching schools. Per student funding remained roughly similar across provinces during the years 2005–2013. The NW > NC and LP > MP groups of schools moved to provinces spending just 5% more, whilst EC > KN and NW > GP schools moved to provinces spending slightly less (South Africa: National Treasury, 2009: 38; Kruger and Rawle, 2012: 33). How provinces spend their money appears to matter more than the amounts spent. In earlier work by the authors (reference removed for blinded peer review), we report on analysis which found that Gauteng (GP) had pursued, not just in the case of the province-switching schools but across all secondary schools, a tacit strategy of reducing the proportion and absolute numbers of grade 12 students enrolled in mathematics. Such a strategy is controversial, and many policymakers and researchers would regard it as damaging for national development. Indeed, South Africa’s national development plan laments dwindling participation in mathematics in grade 12 (South Africa: National Planning Commission, 2012: 317). The problem with this logic is that it ignores the fact that the percentage of mathematics students who acquire the skills needed for mathematically oriented university programmes has been very low. Gauteng’s reduction in the number of mathematics students per school is perhaps indicative of an understanding amongst planners that consolidating mathematics in the school through smaller classes is better than expanding these classes, if the desired outcome is more university-ready mathematics students.
Gauteng has attempted to make senior managers in the education sector more accountable through better use of performance targets. One aspect of this is the increasing use of fixed-term contracts, as opposed to permanent tenure, for senior managers in Gauteng’s education administration. Our own analysis of the payroll data revealed that in each province except Gauteng, the percentage of the top-paid one hundred public servants employed on a permanent basis was at least 90% throughout the period 2005 to 2014 (counting only the education sector). In Gauteng, however, this percentage dropped steadily, from 95% in 2005 to just below 60% in 2014. Conversations with Gauteng officials indicate that employing new senior managers on a contract basis, generally for terms of around four years, has been a deliberate strategy aimed at making the organogram more responsive to changing circumstances, and at improving the incentives for senior managers to perform well. Even if moving to Gauteng did not mean an increase in per student spending, interviews we were able to conduct with Gauteng officials suggested that physical resourcing did play a role in improving performance. Additional education resources such as textbooks, videos of science experiments and equipment for practical exercises in technical subjects reportedly helped schools moving to Gauteng improve.

4.2 The speed of the improvements

An obvious question is how the results found in the current paper compare against substantial improvements seen elsewhere. Put differently, how remarkable is the impact of a new administration? Improvements are generally compared across education systems by expressing changes in mean student test scores in terms of standard deviations. However, the central findings of the current paper refer to improvements at the school level, and with respect to performance at the 95th percentile. Using the metrics of the current paper, the improvement between the pre-treatment period and 2013, the last treated year analysed, comes to 0.30 of a standard deviation. This is the difference between the mean of the pre-treatment years 2005–2007 and the 2013 estimate seen in Figure 3 (a gain of 5.9), divided by the standard deviation of the performance measure (the mark at the 95th percentile) across schools (19.5). TIMSS grade 9 mathematics data from 2011 and 2015, for 35 countries, were used to obtain a conversion factor. A gain by a country of one standard deviation between the two years, in the conventional measure based on student-level means, was found to be associated with a gain of 1.9 in the measure we are interested in (the mean across schools of performance at the 95th percentile). We can thus say that the gain we find of 0.30 converts to approximately 0.16 of a student-level standard deviation, the more conventional measure. This is a noteworthy gain, achieved over six years, meaning around 0.03 student-level standard deviations a year. It is around half the speed of the improvement experienced by Brazil in its PISA2 mathematics results over the nine years 2000 to 2009: Brazil’s improvement was 0.53 Brazilian standard deviations, or 0.06 a year. Brazil has arguably displayed stronger and more consistent gains in an international testing programme than any other country.
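To make the conversion explicit, the arithmetic implied by the figures above is:

\frac{5.9}{19.5} \approx 0.30 \text{ (school-level SDs)}, \qquad \frac{0.30}{1.9} \approx 0.16 \text{ (student-level SDs)}, \qquad \frac{0.16}{6 \text{ years}} \approx 0.03 \text{ per year.}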
Importantly, the South African improvement for the switching schools would be additional to system-wide improvements, which are substantial according to South Africa’s grade 9 TIMSS mathematics scores for the 2002–2015 period. Improvements seen in project-type intervention programmes tend to be considerably larger than, say, Brazil’s PISA improvement. Such improvements can reach 0.15 of a standard deviation across students, possibly achieved in a single year (McEwan, 2015). However, improvements of this magnitude are not seen in whole schooling systems. Given that the provincial change phenomenon studied in this paper was not a purposively designed intervention programme, comparisons to system-wide improvement trends seem more relevant.

5. Conclusion

The paper has used examinations data spanning nine years, together with the fact that administrative boundaries in South Africa changed, to create a quasi-experiment examining the possible impact of a different administration, within the same country and general policy environment, on student performance in mathematics at the secondary level. The analysis concludes that the administration a school falls under matters for performance. The improvement for schools moving to a better province was considerable. The size of the annual gains sustained over six years, at 0.03 student-level standard deviations per year, is around half that of the best-improving countries in the international testing programmes. Many of the administrative strategies which seem to have played a role are somewhat predictable: better monitoring and support by the administration, and a strong focus on ensuring that schools have the educational materials they need. Importantly, higher per student spending appears not to be a major explanatory factor. Yet within fairly constant per student spending parameters, some provincial administrations paid more attention to providing administrative support staff to schools, and this does seem to be a factor contributing to better results. Specifically, many switching schools whose mathematics results improved experienced a shift from having no such staff to having one or two per school. A possible factor which could easily have been overlooked, because it is not directly observable within schools, is the strategy of making senior managers in the administration more accountable for their actions, partly by relying less on permanent tenure and more on fixed-term contracts for these managers. This strategy is followed by one of the provincial administrations associated with improvements in schools. Examinations data, as opposed to data from standardised tests, are not easy to use for the analysis of trends and of cause and effect. Yet as shown above, the task is not impossible. In fact, examinations data may be the best available option for studying within-country dynamics, given the relatively high frequency of examinations and the absence of sample size limitations. Two matters which must be controlled for when using examinations data, and which were controlled for in the current analysis, are the relatively weak comparability of examination scores over time and selection effects in the form of students dropping out prior to the examination. The fact that an indicator which gauged grade 12 mathematics performance at the 95th percentile, relative to grade 10 enrolments two years previously, produced particularly robust results is interesting.
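As an illustration, and assuming a student-level examinations extract and a school census count with hypothetical field names, the indicator just described could be computed along the following lines (the exact percentile interpolation used in the paper is not specified, so this is a sketch rather than the authors’ implementation):

```python
import numpy as np
import pandas as pd

def school_p95(maths_marks: pd.Series, gr10_enrolment_two_years_prior: int) -> float:
    """Mark at the 95th percentile for one school and year, with students who
    did not reach the grade 12 mathematics class assigned a mark of zero.

    maths_marks: grade 12 mathematics marks for the school's candidates
    (hypothetical extract from the examinations data).
    gr10_enrolment_two_years_prior: school census count used as the estimate
    of the cohort in the absence of dropping out.
    """
    n_non_selected = max(gr10_enrolment_two_years_prior - len(maths_marks), 0)
    padded = np.concatenate([maths_marks.to_numpy(dtype=float), np.zeros(n_non_selected)])
    return float(np.percentile(padded, 95))
```

For a school with 40 mathematics candidates but 100 grade 10 students two years earlier, the 95th percentile is thus taken over 100 values, 60 of them zero, which is how dropping out is penalised in the measure.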
Indicators such as these may not be the simplest to calculate, yet they should arguably be used to a greater extent in, for instance, school accountability programmes. The quasi-experiment created by historical circumstances has allowed for an unusual focus on the administrative layer existing above schools, as opposed to interventions dealing with specific inputs such as teacher training, educational materials or accountability tools. Focussing on the latter is obviously important, but so is understanding what general characteristics of public sector management and leadership lend themselves to good decision-making with respect to education interventions.

Footnotes

1. Trends in International Mathematics and Science Study.
2. Programme for International Student Assessment.

References

Cogneau D., Moradi A. (2014) ‘Borders that divide: Education and religion in Ghana and Togo since colonial times’, Journal of Economic History, 74(3): 694–729.
Glewwe P., Hanushek E., Humpage S., Ravina R. (2011) School Resources and Educational Outcomes in Developing Countries: A Review of the Literature from 1990 to 2010. Washington: NBER.
Gustafsson M. (2016) Understanding Trends in High-Level Achievement in Grade 12 Mathematics and Physical Science. Pretoria: Department of Basic Education.
Gustafsson M., Taylor S. (2016) Treating Schools to a New Administration: Evidence from South Africa of the Impact of Better Practices in the System-Level Administration of Schools. Stellenbosch: University of Stellenbosch.
Hahn Y., Wang L., Yang H. (2015) Does Greater School Autonomy Make a Difference? Evidence from a Randomized Natural Experiment in South Korea. Oxford: RISE.
Häkkinen I., Kirjavainen T., Uusitalo R. (2003) ‘School resources and school achievement revisited: New evidence from panel data’, Economics of Education Review, 22: 329–35.
Imbens G. W., Wooldridge J. M. (2009) ‘Recent developments in the econometrics of program evaluation’, Journal of Economic Literature, 47(1): 5–86.
Kruger J., Rawle G. (2012) Public Expenditure Analysis for the Basic Education Sector in South Africa [unpublished report]. Pretoria: Oxford Policy Management.
McEwan P. (2015) ‘Improving learning in primary schools of developing countries: A meta-analysis of randomized experiments’, Review of Educational Research, 85(3): 353–94.
Pritchett L., Beatty A. (2012) The Negative Consequences of Overambitious Curricula in Developing Countries. Washington: Center for Global Development.
Pritchett L., Woolcock M., Andrews M. (2012) ‘Looking like a state: Techniques of persistent failure in state capability for implementation’, Journal of Development Studies, 49(1): 1–18.
Schlotter M., Schwerdt G., Woessman L. (2011) ‘Econometric methods for causal evaluation of education policies and practices’, Education Economics, 19(2): 109–37.
South Africa: Department of Basic Education (2013) Detailed Indicator Report for Basic Education Sector. Pretoria.
South Africa: National Planning Commission (2012) National Development Plan 2030: Our Future—Make It Work. Pretoria.
South Africa: National Treasury (2009) Provincial Budgets and Expenditure Review 2005/06–2011/12. Pretoria.
UNESCO (2014) Education for All Global Monitoring Report 2013/4: Teaching and Learning: Achieving Quality Education for All. Paris.

Author notes

This paper would have been impossible without the collaboration and advice from colleagues in South Africa’s Department of Basic Education. Both authors are currently based at the Department.

© The Author(s) 2018. Published by Oxford University Press on behalf of the Centre for the Study of African Economies, all rights reserved. For Permissions, please email: journals.permissions@oup.com.
Abstract A large body of literature focusses on how efficient practices in national and sub-national administrations can improve the welfare of citizens. Yet it is difficult to demonstrate this effect empirically, partly because citizens are not randomly assigned to different administrations and standard impact evaluation techniques are thus not viable. Within education, randomised trials that attribute educational improvement to specific interventions have been influential. Yet critics have argued that these studies, by focussing on single interventions, fail to prove their validity in a context of multiple interventions and, above all, weak system governance. The current paper takes advantage of provincial boundary changes occurring in South Africa, where provinces manage schools, to measure the effects of better administration, using a quasi-experimental approach. Changing to a more effective province was found to improve the mathematics performance of secondary school students, in national examinations, by a magnitude that is half of that seen in the fastest improving countries in international testing programmes. The replacement of teachers or school principals does not explain the improvement, though the addition of administrative support staff does emerge as a likely contributing factor. Greater efficiency, and not additional funding, appears to account for most of the change. It seems noteworthy that one of the provinces found to be relatively effective makes use of fixed-term contracts for senior managers in the administration, as opposed to permanent tenure, to enhance organisational capacity. 1. Introduction Identifying the causes behind improvements in schooling is inherently difficult, even when good data are available, and often such data are not available. The primary contribution of this paper is to provide causal estimates of the effect of better public administration by exploiting a natural experiment in which schools were ‘treated’ to a new administration. We find a substantial positive impact of a better provincial government administration, with the performance gap between treated schools and the counterfactual growing over time. A further contribution made by this paper is to unpack, to a limited extent, what it is about better administrations that makes a difference to school performance. Neither changes in overall funding, nor changes in the educator personnel at affected schools, played a significant role, though better access to schools-based administrative support staff appears to have made a difference. Mostly, improvements seem to have been achieved through better accountability mechanisms facing officials, better curriculum support to schools, and a prioritisation of specific learning support materials. Analysts are often warned against the use of traditional examinations data to gauge systemic improvements in education (UNESCO, 2014: 6). This paper confirms that while such data are difficult to use, they can yield robust findings if one exercises sufficient caution. This is important given how scarce the ideal of standardised test data can be in developing countries. Finally, this paper illustrates the potential use of quasi-experimental methods using spatial dynamics. The specific context which this paper takes advantage of is the limited redrawing of South Africa’s provincial boundaries in 2005. For 710 schools across the country, this meant a new provincial education administration. 
The boundary changes did not occur for any reason linked to education, yet they had profound implications for many of the schools affected since provincial governments play a central role in the administration of schools. Random assignment of students or schools to governance systems is extremely rare. This partly explains why there is so little hard evidence on the impact of governance on education, although one suspects governance does play a large role. Two research questions stand out. Does the governance system under which one falls make a difference to educational outcomes? If so, what elements in the system make the difference? The only study we know of which, like ours, makes the assumption that the redrawing of political boundaries constitutes a natural experiment with sufficient elements of randomisation, and then examines impacts on education, is that of Cogneau and Moradi (2014), who focus on West Africa in the early 20th century. They conclude that who governs did make a difference to literacy levels, and attribute the difference to more generous and cost-effective school funding provided by British colonial authorities, relative to French ones. Like us, they make use of difference-in-differences models to answer the first research question, but also draw from a wider range of sources to address the policy issues (the second question). The value of answering the first question, on whether governance matters, and by how much, should not be under-estimated. Such answers can in themselves help to shift the emphasis in the policy debates towards what occurs in the administration. This is a shift advocated by, for instance, Pritchett et al. (2012), who argue that in developing countries too much ‘mimicry’ of administrative functionality, sometimes facilitated by donor countries, has diverted attention away from administrative reform. Apart from Cogneau and Moradi, the only other study we found which used a ‘natural experiment’ to estimate the effects of different packages of education policies is that of Hahn et al. (2015). This study exploits an unusual system in Seoul, South Korea, whereby students are assigned randomly to different sub-systems of secondary schools within the city, the one featuring more school autonomy than the other. Which sub-system one is assigned to is found to impact on results in standardised tests. Differences are attributed to ‘autonomy in personnel decisions, together with strong incentives for principals and teachers to perform’ (Hahn et al., 2015: 30). These policy conclusions draw in part from the interaction between, on the one hand, time-variant and school-level variables relating to the employment of teachers and, on the other hand, a dummy representing the sub-system, within a difference-in-differences regression. Our methods moreover resemble those of Häkkinen et al. (2003), who used a panel of examinations and other administrative data to conclude that spending cuts during a 1990s recession in Finland did not influence educational outcomes. Section 2 discusses the various datasets we use, and introduces the institutional elements of South Africa’s schooling system, and the 2005 boundary changes. Section 3 presents the econometric analysis. Section 4 discusses the findings in the context of additional information. Section 5 concludes. The current paper builds on an earlier working paper by the authors (Gustafsson and Taylor, 2016). 2. 
The data and institutional background 2.1 The examinations data South Africa’s grade 12 examinations, written in around 5,600 schools, have offered for many decades the only reliable measure of school performance. Consequently, much attention has focussed on grade 12 indicators, in particular ‘pass rates’, the percentage of examination candidates successfully obtaining a national certificate or surpassing minimum thresholds in individual subjects. Around 50% of youths have obtained the grade 12 certificate in recent years. Roughly a further 20% of youths participate in grade 12 but do not obtain the certificate, while the bulk of the remaining 30% of youths do not reach grade 12 or any equivalent level of education outside a school. Both public and some private schools participate in the national public examinations. The examinations approach changed between 2007 and 2008. Subjects were redesigned, and a distinction between standard grade and higher grade examination papers across all subjects was removed. This paper focuses on improvements in mathematics, a subject that is widely taken, by around half of students, across virtually all schools. Apart from being the subject taken by most schools, it is also the subject most likely to produce consistent school rankings between 1 year and the next, when using school means. The mathematics examination papers are national, and security procedures established over several decades ensure that the ‘leaking’ of papers before the time at which all students sit for examination is extremely rare. After examinations are written, scripts are sent to provincial marking centres, which are monitored by the national authorities. Poor student performance, in particular in mathematics, is widely acknowledged as a hurdle to skills development and economic growth in South Africa. Typically, a mark of at least 60 or 70 out of 100 is a requirement for entry into under-graduate university programmes requiring strong mathematical competencies, for instance engineering. In 2013, a mark of 62 was obtained by the 95th percentile of mathematics students, while the mean for these students was as low as 35. The basic pass mark is 30. We had at our disposal student records of the grade 12 examination results, for the years 2005–2013. These data were obtained from the national Department of Basic Education (DBE), where both of the authors are employed. Though public access for researchers to school-level examinations data, through the University of Cape Town’s DataFirst portal, is now possible, student-level data are not widely available, in part due to privacy concerns and weak capacity in the administration to anonymise data. In the case of the 2005 to 2007 data, we used overlaps between participation in mathematics and in physical science higher grade, in order to put all mathematics students, whether higher or standard grade students, on a common scale. The standard of the examinations, it could be argued, is set too high, as seen in the low mean. Pritchett and Beatty (2012) have argued that overly demanding examinations are a common feature in developing countries, and go on to say that ‘Paradoxically, learning could go faster if curricula and teachers were to slow down’. The fact that the examinations are demanding is likely to create floor effects, or weaknesses in the ability of the examinations to differentiate between relatively weak students. 
For this reason, and because high-level achievement is important for development, much of our analysis focusses on data collapsed to the school level, and specifically each school’s 95th percentile mark. Marks in traditional examinations systems are never fully equivalent across years, despite mark adjustments by authorities (including the South African ones) to enhance across-year comparability (the marks in our data were post-adjustment marks). This is a problem inherent to examinations which lack secure anchor items (common questions kept secret). We control for this comparability problem by using year dummies in our analysis. Moreover, our analysis rests strongly on the assumption that provinces apply similar marking standards within each year, and that national controls in this regard work well. What we do not report on in the paper is an analysis of relatively well-performing schools with stable demographics, similar to what appears in Gustafsson (2016). The ranking of these schools did not shift over time according to province, a finding which seemed to confirm that provincial marking standards were sufficiently constant. 2.2 Province-switching Seven provinces of the nine South African provinces saw their boundaries change in 2005 (see list in Table 1 note). A total of 710 schools, 158 of which offer grade 12, experienced a change in provincial administration. The distribution of the 158 schools is shown in Figure 1. The five largest province-to-province ‘migrations’—the five rows of Table 1—account for 151 of the 158 schools, and it is these 151 we consider province-switching schools in all our models in Section 3. Table 1: Count of Schools Sending province (before change) Switching group Receiving province (before change) EC > KN 736 15 1254 LP > MP 1052 83 321 MP > LP 321 13 1052 NW > GP 278 29 445 NW > NC 278 11 82 Total 151 Sending province (before change) Switching group Receiving province (before change) EC > KN 736 15 1254 LP > MP 1052 83 321 MP > LP 321 13 1052 NW > GP 278 29 445 NW > NC 278 11 82 Total 151 Note: Numbers reflect schools considered in models A to D of Table 3. The seven provinces involved in the boundary changes are: Eastern Cape (EC), Gauteng (GP), KwaZulu-Natal (KN), Limpopo (LP), Mpumalanga (MP), Northern Cape (NC) and North West (NW). Table 1: Count of Schools Sending province (before change) Switching group Receiving province (before change) EC > KN 736 15 1254 LP > MP 1052 83 321 MP > LP 321 13 1052 NW > GP 278 29 445 NW > NC 278 11 82 Total 151 Sending province (before change) Switching group Receiving province (before change) EC > KN 736 15 1254 LP > MP 1052 83 321 MP > LP 321 13 1052 NW > GP 278 29 445 NW > NC 278 11 82 Total 151 Note: Numbers reflect schools considered in models A to D of Table 3. The seven provinces involved in the boundary changes are: Eastern Cape (EC), Gauteng (GP), KwaZulu-Natal (KN), Limpopo (LP), Mpumalanga (MP), Northern Cape (NC) and North West (NW). Figure 1: View largeDownload slide Schools experiencing a province change after 2005. Note: Provincial boundaries are those created by the 2005 boundary changes. Schools which moved from a neighbouring province are marked by large points. Schools which moved from Limpopo to Mpumalanga are represented by large grey points, to make them distinct from schools which moved in the opposite direction, from Mpumalanga to Limpopo. Small points represent non-switching schools participating in the grade 12 examinations. 
Figure 1: View largeDownload slide Schools experiencing a province change after 2005. Note: Provincial boundaries are those created by the 2005 boundary changes. Schools which moved from a neighbouring province are marked by large points. Schools which moved from Limpopo to Mpumalanga are represented by large grey points, to make them distinct from schools which moved in the opposite direction, from Mpumalanga to Limpopo. Small points represent non-switching schools participating in the grade 12 examinations. All but 2 of the 151 had grade 12 groups which were between 90% and 100% black African (the two exceptions were in the MP > LP switching group). The schools are thus interesting in terms of understanding educational improvements for the most historically disadvantaged segments of the South African population. The boundary changes occurred to ensure that no municipality straddled two provinces. Municipalities do not play a role in the administration of schools. This is the responsibility of the provincial government. Reporting directly to the provincial authorities are a number of ‘education districts’, whose boundaries often coincide with those of the municipalities, though institutionally they are independent of each other. Provincial education departments are funded by the provincial department of finance. Though legislation changing the provincial boundaries was passed in December 2005, shortly after the 2005 examinations had been written, the transfer of schools to their new provincial administrations was not immediate. The official transfer for all affected provinces appears to have occurred in April 2007, which would have been a few months into the 2007 academic year, starting in January 2007. The 2008 academic year would therefore have been the first such year in which a new province would have been in full administrative control of the transferred school. The fact that from December 2005 schools knew that the transfer was imminent could have influenced behaviour in the schools from as early as the 2006 school year. For instance, school principals may have felt invigorated or dejected by the fact that they were being transferred to what was perceived as a better or worse province. 2.3 TIMSS data and measures of provincial effectiveness TIMSS1 secondary school data for 2002, 2011 and 2015 were used to gain a sense of whether a move from one province to another could be considered an improvement or a deterioration for a school. This was obviously important for understanding the nature of the ‘treatment’. Even roughly objective measures of the effectiveness of any administration, national or sub-national, are hard to come by. An advantage with the TIMSS data is that they allow one to control for the socio-economic status of students. Values for δ in equation (1) were derived through a series of simple education production functions. Four regressions, one for each of the four pair of provinces involved in the inter-provincial movement, were run. Data from the three years were pooled. ei=βˆ0+βˆ1si+β2si2+δˆD+uˆi (1) Here e is the TIMSS mathematics score of student i converted to a z-score, and s is the socio-economic status of student i calculated using a principal components analysis that drew from around 10 household items declared in the student’s background questionnaire (the number varied slightly by year). s was also converted to a z-score. Table 2 below reports the values of δ and its standard error, where the p value was no more than 0.1. 
The regressions used weights for students adjusted in a manner that gave an equal weight to each of the three years. Clustering by school was taken into account throughout. The pooled data thus represents a time period 2002 to 2015, which encompasses the 2005 to 2013 period from which the examinations data come. Table 2: Conditional Inter-Provincial Quality Differences (δ) Using TIMSS 2002 2011 2015 Pooled EC > KN 0.203 (0.114) 0.159 (0.088) LP > MP 0.342 (0.118) 0.178 (0.101) 0.190 (0.095) NW > GP 0.258 (0.108) 0.491 (0.125) 0.292 (0.091) NW > NC 0.407 (0.164) 0.247 (0.115) 2002 2011 2015 Pooled EC > KN 0.203 (0.114) 0.159 (0.088) LP > MP 0.342 (0.118) 0.178 (0.101) 0.190 (0.095) NW > GP 0.258 (0.108) 0.491 (0.125) 0.292 (0.091) NW > NC 0.407 (0.164) 0.247 (0.115) Source: Own calculations using (for 2011 and 2015) data available at https://timssandpirls.bc.edu. For 2002, grade 9 data were obtained from the Human Sciences Research Council. Thus all 3 years use grade 9 data. Table 2: Conditional Inter-Provincial Quality Differences (δ) Using TIMSS 2002 2011 2015 Pooled EC > KN 0.203 (0.114) 0.159 (0.088) LP > MP 0.342 (0.118) 0.178 (0.101) 0.190 (0.095) NW > GP 0.258 (0.108) 0.491 (0.125) 0.292 (0.091) NW > NC 0.407 (0.164) 0.247 (0.115) 2002 2011 2015 Pooled EC > KN 0.203 (0.114) 0.159 (0.088) LP > MP 0.342 (0.118) 0.178 (0.101) 0.190 (0.095) NW > GP 0.258 (0.108) 0.491 (0.125) 0.292 (0.091) NW > NC 0.407 (0.164) 0.247 (0.115) Source: Own calculations using (for 2011 and 2015) data available at https://timssandpirls.bc.edu. For 2002, grade 9 data were obtained from the Human Sciences Research Council. Thus all 3 years use grade 9 data. All values of δ are positive, indicating that for these four movements, the move was to a better province. Only the MP > LP movement emerges as a move to a worse province—here the value for δ would be the negative of δ for LP > MP. It is not entirely accidental that moves to better provinces feature so prominently. The town of Khutsong was supposed to move from Gauteng to North West, but protests against this, which was seen as a move to a province with worse service delivery, eventually stopped the boundary demarcation. It was the only instance of a reversal to the changes announced in 2005. 2.4 Other administrative data Apart from the examinations data, two other administrative data sources were used: school census data (specifically the so-called Snap Survey) and payroll data. The school census data were used, above all, to provide estimates of the cohort of students in each year who would have been in grade 12, had there been no selection effects in the form of dropping out. These selection effects, which may change by school over time, play a large role in influencing each school’s relative performance in the examinations. Dropping out is not random. Academically struggling students are more likely to drop out, and different schools, and provinces, employ different methods to reduce, or even encourage, dropping out. There is in fact a strong incentive for schools to allow weaker students to drop out, as this improves examination pass rates. Grade 10 enrolment 2 years prior to the grade 12 examination event was considered the optimal estimate of who would have reached the examination in the absence of dropping out. This estimate has serious drawbacks mostly as grade repetition rates are high in grade 10, with considerable variation across schools. The ideal would have been the number of 15-year-olds 2 years prior to the examination. 
Around 98% of the population remains in school until the year they turn fifteen. However, age data in the school census data are missing for a considerable number of schools. Payroll data were used mainly to gauge whether province-switching schools experienced exceptional changes in their staffing as a result of the switch. 3. Analysis of student performance and province-switching 3.1 The estimation model Equations for three estimation models, which serve as our point of departure, appear below. Equation (2) is a basic difference-in-differences (DiD) model with panel data. For this model, we have collapsed our student data to the level of the school, meaning we have a panel with schools linked across years. E is the performance of school i in year t. To begin with, E is the score of the student at the 95th percentile, relative to all students who took or might have taken mathematics, meaning the school’s grade 10 enrolment 2 years previously. In calculating the 95th percentile, it was assumed that students who did not reach the mathematics class, calculated as grade 12 mathematics students minus earlier enrolment in grade 10, would have performed poorly in mathematics. These non-mathematics students were assigned a mark of zero in our data. We also explore other versions of E, specifically the mean and statistics where selection effects are not taken into account, in Section 3.3. The variables S in brackets refer to dummy 0–1 variables for each of the schools i, except for one (school i = 0). Each school, except for one, thus carries its own intercept λ. This means that unobserved and time-invariant phenomena influencing the general level of performance of each school are controlled for. T is a 0–1 dummy variable indicating which school in which year was subject to the treatment. T would thus be 1 for schools in the four switching categories other than MP > LP, and for the ‘treatment’ years 2008 to 2013. Finally, eight year dummies D for the periods 1–8, or 2006–2013, are entered, making the model a two-way fixed effects model (school and year are fixed). In terms of the difference-in-differences method, the first difference is controlled for by the school fixed effects λ. The second difference, or the extent to which the difference between the treated and control schools changed, is captured in β1 (Imbens and Wooldridge, 2009: 70; Schlotter et al. 2011: 127). A positive treatment effect would be reflected in a positive value for β1. Eit=λˆ0+(λˆ1Si=1,+…+λˆnSi=n)+βˆ1Tit+βˆ2Dt=1+…+βˆ9Dt=8+uˆit (2) We assume that the selection of treated schools, though not strictly random since they had to be close to borders that moved, can be considered random for the purpose of causal inference. Put differently, we assume that these schools were not on a distinct performance trajectory even before the move. We test this assumption below, both by looking at trends in the pre-treatment years, so 2005 to 2007, and by testing the trends for schools which were geographically close to the moving schools in the ‘sending’ province but did not move. The problem with equation (2) in the context of several treatment years, and we have treatments extending across 6 years, 2008–2013, is that whilst β1 may tell us that on average treated schools performed better, it will not provide information on the slope of E over the years 2008 and 2013. 
We are interested in a slope, and would expect a positive one where the move is to a more effective province, given that effects are likely to take time, largely because grade 12 results reflect progress in earlier grades. To gauge the slope, we need equation (3) below. Here the treatment variable of interest is the interaction term PT, meaning the interaction between period and the 0–1 treatment dummy. P carries value 0 for 2008, the first treatment year (T on its own takes on a value of 1 for 2008, and later years). If the treated schools display a slope for E which is more positive (or less negative) than non-treated schools, β2 will be positive. Eit=λˆ0+(λˆ1Si=1,+…+λˆnSi=n)+βˆ1Tit+βˆ2PTit+βˆ3Dt=1+…+βˆ10Dt=8+uˆit (3) One further adaptation is needed. Equation (3) is appropriate if we have one treatment group, being all schools moving to a better province, and consider as irrelevant the sending province from which each school comes. However, if we want to test each group of switching schools as a separate treatment group, with its own distinct treatment experience, we should not only consider more than one treatment group, we should also consider more than one control group. This is if we want to avoid running four separate models, one for each combination of sending province and group of schools switching to a better province. (For now we ignore the group which moved to a worse province.) Equation (4) illustrates the required model. Here four treatment variables, T1–T4, are employed, but also three control variables, K1–K3, representing the three sending provinces EC, LP and NW. Each K is a dummy taking on a value of 1 for a specific sending province, and if the year is in the treatment period, in other words 2008–2013. We interact the dummies with period as defined previously. Using this approach, we for instance consider the LP > MP group of schools a subset of all schools which started off in LP. Eit=λˆ0+(λˆ1Si=1,+…+λˆnSi=n)+αˆ1K1it+αˆ2PK1it+…+αˆ5K3it+αˆ6PK3it+βˆ1T1it+βˆ2PT1it+…+βˆ7T4it+βˆ8PT4it+βˆ2Dt=1+…+βˆ12Dt=8+uˆit (4) 3.2 Panel model with school fixed effects Table 3 reports on the results of four key models, all using as the dependent variable the school’s performance at the 95th percentile, with non-selected students (based on the earlier grade 10 enrolment) being assigned a score of zero. In Model A, being in one of the four moving groups involving a move to a better province meant a value of 1 was assigned to the variable ‘Is to better’ in the case of observations of the treatment years, 2008 to 2013. ‘Is to better’ is thus the treatment dummy T of equation (2). A similar ‘Is to worse’ dummy for the only group involving a move to a worse province, MP > LP, was also created. Both dummy variables produce statistically significant coefficients of the expected sign, and of similar absolute magnitudes. Moving to a better (worse) province is associated with a gain (loss) of between 3 and 4 marks with respect to E. That this should emerge so clearly is remarkable, given that ‘to better’ schools were spread across four ‘migrations’. One might expect a single ‘migration’ to display gains in the case of a ‘receiving province’ which was particularly good at fixing problems experienced in the ‘sending province’, but a general positive effect of moving to a better province seems striking. 
Table 3: Regression Outputs for School Fixed Effects Panel Model
Dependent variable: mark at 95th percentile with non-selected students (based on earlier grade 10) assigned a mark of zero

Variable | A | B | C | D
Constant | 41.2*** (0.27) | 40.0*** (0.31) | 40.0*** (0.00) | 40.0*** (0.32)
Is to better (B) | 3.480*** (0.69) | −1.494 (0.94) | |
Int. B & Period | | 1.317*** (0.26) | |
Is to worse (W) | −3.745* (2.23) | −8.380*** (2.45) | |
Int. W & Period | | −0.049 (0.70) | |
Is EC > KN | | | 2.781* (1.41) |
Int. EC > KN & Period | | | −1.488** (0.38) |
Is LP > MP | | | −1.362 (2.40) |
Int. LP > MP & Period | | | 1.663*** (0.84) |
Is MP > LP | | | −8.380*** (2.79) |
Int. MP > LP & Period | | | −0.049 (0.58) |
Is NW > GP | | | −3.052 (2.63) |
Int. NW > GP & Period | | | 1.882*** (0.63) |
Is NW > NC | | | −5.521** (0.60) |
Int. NW > NC & Period | | | 1.490** (0.30) |
Provincial diff. (δ) | | | | 1.643 (3.71)
Int. δ & Period | | | | 6.011*** (0.98)
Is EC | | −3.755*** (0.46) | −3.785*** (0.36) | −3.764*** (0.35)
Int. EC & Period | | 0.528*** (0.12) | 0.585*** (0.10) | 0.535*** (0.10)
Is LP | | −0.511 (0.40) | −0.549 (0.55) | −0.549 (0.39)
Int. LP & Period | | 0.953*** (0.15) | 0.953*** (0.14) | 0.997*** (0.14)
Is MP | | 2.736*** (0.54) | 2.736*** (0.52) | 2.407*** (0.58)
Int. MP & Period | | 0.886*** (0.12) | 0.860*** (0.14) | 0.900*** (0.09)
Is NW | | 0.464 (0.69) | 0.712 (0.10) | 0.371 (0.65)
Int. NW & Period | | −0.291* (0.18) | −0.356*** (1.46) | −0.343 (0.22)
N | 42,392 | 42,392 | 42,392 | 42,392
Number of schools | 4,711 | 4,711 | 4,711 | 4,711
R2 overall | 0.087 | 0.089 | 0.089 | 0.089

Note: 'Period' here makes the years 2005–2008 zero, 2009 value 1, 2010 value 2, and so on. All listed variables would be zero for the years 2005–2007. Not reported here are single-year dummies, used in all models, with 2013 being the reference year (because of the exceptionally low values for E in the starting year 2005, this year is not the reference). All models make use of bootstrap estimation of standard errors, which appear in parentheses. Here and in tables that follow *** indicates that the estimate is significant at the 1% level of significance, ** at the 5% level and * at the 10% level.
Model B essentially implements equation (4), though still with only one positive treatment group and one negative treatment group.
The interaction between 'Is to better' and period (where 2008 is zero, 2009 is one, and so on) produces a coefficient that is statistically significant at the 1% level. For each year that passes in the treatment period, 1.3 marks are added to E. The results obtained if one enters each year separately for just the treatment group are explored below. The negative coefficient on the 'Is to worse' dummy is now much larger, mostly because it is now relative to the new dummy 'Is MP', the sending province for the 'Is to worse' group.

In Model C, which implements equation (4) fully, the performance slope during the treatment period is examined at the level of the five switching groups. Three of the four groups moving to better provinces display the expected positive slope, significant at least at the 10% level. The exception is EC > KN, where a positive coefficient on the dummy appears together with a negative (and significant) coefficient for the period interaction. The MP > LP group displays the same results as in Model B: a statistically significant negative coefficient on the 0–1 dummy, without a statistically significant slope.

Finally, Model D considers the effectiveness gap between the sending and receiving provinces as a non-binary treatment variable representing the 'dosage' of the treatment. Values from the last column of Table 2 were used for δ, with MP > LP schools taking the negative of the LP > MP value. The coefficient on the interaction with period indicates that a 'step-up' (step-down) of one standard deviation of performance in TIMSS, associated with the move to a new province, comes with an annual gain (loss) of 6 marks at the 95th percentile of the mathematics examination.

The relationship between the step-up and the actual annual gain is illustrated in Figure 2 below. The confidence intervals (at the 95% level) for the step-up (horizontal lines) draw from Table 2. The estimated gain by 2013 (midpoints of the vertical lines) was calculated using the coefficients for both the dummy and the dummy–period interaction in Model C, in the case of each of the five groups. The confidence intervals represented by the vertical lines use the standard errors around the predicted means produced when Model C as a whole is calculated. Clearly one cannot say much about the differences between the four step-ups. Their confidence intervals overlap to a large degree. Yet the general pattern is telling. Three groups moving to better provinces display confidence intervals, against both axes, which are mostly within the 'correct' quadrant, namely the top-right one, representing a step-up associated with actual improvements. The move to a worse province for MP > LP puts this group in the correct bottom-left quadrant, with a deterioration according to the vertical axis (minus 8.6) which is not too different in absolute terms from the improvement of 7.0 experienced by the group moving in the opposite direction, LP > MP. Only EC > KN produces an anomalous result: a loss in performance, relative to the sending province, though the move was to a better performing province.

Figure 2: Gain by 2013 and size of δ. Note: Vertical and horizontal lines represent the point estimate for the coefficient with a confidence interval at the 95% level.
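As a concrete illustration of the structure of Model C, the following minimal sketch shows how the group-specific treatment dummies and period interactions of equation (4) might be constructed, reusing the hypothetical panel from the earlier sketch; the column names 'group' and 'province_2005' are assumptions rather than those of our own data.

```python
# Minimal sketch of the dummies behind Model C / equation (4), reusing the hypothetical
# panel df from the earlier sketch, with assumed columns 'group' (e.g. 'LP>MP', missing
# for non-switching schools) and 'province_2005' (province a school belonged to in 2005).
treat_years = df["year"].between(2008, 2013)
df["P"] = (df["year"] - 2008).clip(lower=0)  # period: 0 up to 2008, 1 in 2009, 2 in 2010, ...

# Sending-province controls K and their period interactions
for prov in ["EC", "LP", "NW"]:
    df[f"K_{prov}"] = ((df["province_2005"] == prov) & treat_years).astype(int)
    df[f"PK_{prov}"] = df[f"K_{prov}"] * df["P"]

# Group-specific treatment dummies and period interactions
for grp in ["EC>KN", "LP>MP", "MP>LP", "NW>GP", "NW>NC"]:
    name = grp.replace(">", "_")
    df[f"T_{name}"] = ((df["group"] == grp) & treat_years).astype(int)
    df[f"PT_{name}"] = df[f"T_{name}"] * df["P"]
# These columns can then be entered alongside the school and year fixed effects, as in
# the equation (2) sketch above.
```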
The fact that there were movements between LP and MP in both directions, and that these are associated with opposite effects of roughly equivalent magnitudes, is particularly important for our conclusions. This points strongly, independently of any use of TIMSS-based measures of province effectiveness, to significant quality differences between provinces, and to significant and fairly rapid impacts of these quality differences on incoming schools.

The descriptive statistics in Table 4 illustrate the unconditional trends with respect to the dependent variable. The fact that all slopes are positive indicates that gains and losses must be understood in terms of faster and slower positive change. The general upward trend could be due to changing standards, occurring for instance with the introduction of the new examination system in 2008, but TIMSS has also pointed to a general and real improvement over this period in secondary schools. Of note is the fact that four of the five switching groups, all except for NW > GP, display means that are below those of both the sending and receiving provinces. In this sense, the switching schools are not representative of their sending provinces. Also of note is the fact that LP > MP and NW > GP, the two largest groups of the five, display slopes which are not just steeper than those of their sending provinces, but also steeper than those of their receiving provinces, suggesting these groups could catch up to the higher means of their new provinces.

Table 4: Descriptive Statistics for 95th Percentile of all Potential Students

Country (all schools): Mean 36, S.d. 19, Slope 1.8

Group | Sending province (Mean, S.d., Slope) | Moving group (Mean, S.d., Slope) | Receiving province (Mean, S.d., Slope)
EC > KN | 31, 18, 1.4 | 30, 14, 0.7 | 37, 17, 2.1
LP > MP | 32, 17, 2.2 | 27, 15, 3.3 | 36, 19, 2.6
MP > LP | 36, 19, 2.6 | 31, 20, 1.3 | 32, 17, 2.2
NW > GP | 34, 20, 1.6 | 35, 17, 2.6 | 43, 22, 1.4
NW > NC | 34, 20, 1.6 | 23, 12, 1.9 | 37, 25, 0.4

Note: The statistics all describe the dependent variable of models A to D, namely the mark at the 95th percentile relative to earlier grade 10 enrolment. The slope and overall mean are calculated using the year-group means for all the years 2005–2013. The slope is the annual slope across the year-group means.
Regressions represented by equation (5) below were run for sub-samples of the observations: just the years 2005–2007, and just one province at a time. School fixed effects (not shown in the equation) and the bootstrapping of standard errors were applied. The aim was to detect whether the trajectory of schools about to move was different from that of other schools in the province. Here T is a 0–1 dummy indicating schools which will move, and P is period (carrying values 0, 1 and 2 for 2005, 2006 and 2007). Any pre-2008 group-specific trend could suggest a group was on a trajectory that would be sustained after the switch, which could change the interpretation of the results in Table 4. The only statistically significant (at the 10% level) coefficient emerging for β2 was that for the LP > MP group, the value being a negative 1.5, meaning this group was improving more slowly than the rest of LP. This obviously removes the possibility that the large improvements seen by this group from 2008 were part of a trend which preceded the change in province.

E_{it} = \hat{\lambda} + \hat{\beta}_1 P_{it} + \hat{\beta}_2 P T_{it} + \hat{u}_{it}   (5)

In order to test whether trends in the switching groups were attributable to broader trends in the wider geographical areas of these groups, an additional control was introduced in the form of non-switching schools near the switching schools, within the sending province. This was done for the two improving groups with the clearest trends, LP > MP and NW > GP, and for the one worsening group, MP > LP. These were also groups for which a reasonable number of nearby schools existed. Nearby schools from the sending province were added, starting with the school closest to any of the switching schools, until the number of schools in the new control group equalled the number of schools in the treatment, or switching, group. This occurred upon reaching a distance of 41 km for schools near NW > GP, 99 km for LP > MP and 9 km for MP > LP.

Dummies for the three control groups, and interactions between the dummies and period, were added to Model C, along the lines of what had been done for the switching groups. Of the six new coefficients, none of the interactions with period were significant (p values of 0.573 or higher), and just one of the coefficients on the dummies emerged as significant. This was the dummy for schools near the LP > MP group, where the coefficient was −4.1 (p of 0.001). This indicates that these nearby schools experienced a deterioration in their results, relative to the rest of LP, while schools which moved to MP saw relatively strong annual improvements. The possibility of wider geographical trends unrelated to the border changes being an alternative explanation is thus not supported by the data. The possibility that better students from non-switching LP schools started attending the switching LP > MP schools in the hope of receiving a better education is also not supported by the data; this would otherwise have been a possible explanation for the performance decline in nearby schools. The group of 83 non-switching schools near the LP > MP schools are in fact mostly far from the switching schools: only two of the 83 were within 50 km, with the remaining 81 being in the range of 50–99 km. Such distances would largely preclude substantial 'voting with one's feet', in particular considering that the area in question is relatively poor. (A sketch of how such a distance-based control group might be assembled appears below.)
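The following is a minimal sketch of the distance-based selection of control schools, under the assumption of a school-level table with coordinates; the column names and the haversine approximation are illustrative, not a description of our exact procedure.

```python
# Minimal sketch of assembling a distance-based control group of non-switching schools,
# assuming a hypothetical DataFrame 'schools' with one row per school and columns
# 'lat', 'lon', 'moving' (1 = switching school) and 'province_2005'.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between points given in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, [lat1, lon1, lat2, lon2])
    a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

movers = schools[(schools.moving == 1) & (schools.province_2005 == "LP")]
stayers = schools[(schools.moving == 0) & (schools.province_2005 == "LP")].copy()

# Distance from each non-switching school to its nearest switching school
stayers["dist_km"] = stayers.apply(
    lambda r: haversine_km(r.lat, r.lon, movers["lat"].to_numpy(), movers["lon"].to_numpy()).min(),
    axis=1,
)
# Take the closest non-switching schools until the control group is as large as the
# treatment group, mirroring the matching-by-count rule described in the text
control = stayers.nsmallest(len(movers), "dist_km")
```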
To obtain a more precise idea of improvements over time among schools moving to a better province, Model A was adapted. To the eight general year dummies were added another eight single-year dummies, with 2005 serving as the reference year, which carried a value of 1 only for schools moving to a better province. The coefficients, and their confidence intervals, for the additional year dummies are illustrated in Figure 3 below. Bootstrapping was applied. In aggregate, for the 138 schools moving to a better province, most of the improvement had been achieved by 2011, implying an initial period of relatively intense improvement over the years 2008–2011. The dip in 2008 is noteworthy, and a bit puzzling. It is possible that the switch to the new province brought about initial disruptions to the schooling process. However, this disruption was clearly overcome after a year, with 2009 displaying the best outputs since 2005.

Figure 3: Year-specific gains for 'to better' schools. Note: Vertical lines represent the point estimate for the coefficient with 95% confidence intervals. Values along the vertical axis are changes in mathematics marks relative to the 2005 reference year.

3.3 Alternative dependent variables and student-level models

The coefficients shown in Table 5 below are from models similar to B and C in Table 3, but with the complexity of interactions with period removed. They are thus the coefficients for dummy variables taking the value 1 for the group in question if the year is in the range 2008–2013. Each row of Table 5 draws from two regressions, one to obtain the first five columns, and another for the final column. The overall objective was to examine the impact of using the mean instead of the 95th percentile, of removing observations with zero representing non-selected students, and of running the analysis with student observations as opposed to school observations. (A sketch of how these alternative dependent variables can be derived from the student-level data appears after Table 5.)

Table 5: Alternative Dependent Variables and Level

Level E Scope | MP > LP | EC > KN | LP > MP | NW > GP | NW > NC | To better
School p95 All −8.55*** 4.42*** 3.59* 3.10***
School p95 Grade −3.44** 5.71***
School p95 Class 7.78***
School Mean All 0.64** 0.41**
School Mean Grade −1.85*** 1.60*** −0.87***
School Mean Class −3.86*** 4.87*** −3.97** −1.73***
Student Mean All −2.16***
Student Mean Grade −4.92*** −3.21*** 1.68** −1.05***
Student Mean Class −3.60*** −4.56*** 3.23*** −5.07*** −1.52**

Note: The student-level models used around 7.9, 4.0 and 2.3 million observations for the 'All', 'Grade' and 'Class' models. For the student-level models, standard errors consider clustering by schools. *** indicates that the estimate is significant at the 1% level of significance, ** at the 5% level, and * at the 10% level.
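The following is a minimal sketch of how the alternative dependent variables of Table 5 could be derived from student-level examinations data; the DataFrame and column names are hypothetical, and the calculation is simplified relative to our own.

```python
# Minimal sketch of the alternative dependent variables in Table 5, assuming a
# hypothetical student-level DataFrame 'students' with columns 'school', 'year',
# 'math_mark' (NaN if mathematics was not taken) and 'gr10_enrol' (the school's
# grade 10 enrolment two years earlier, repeated on every row of the school-year).
import numpy as np
import pandas as pd

def school_outcomes(g):
    math = g["math_mark"].dropna()            # 'Class': mathematics candidates only
    grade = g["math_mark"].fillna(0)          # 'Grade': whole grade 12 group, zero if no mathematics
    n_missing = max(int(g["gr10_enrol"].iloc[0]) - len(g), 0)  # dropped out before grade 12
    all_marks = np.concatenate([grade.to_numpy(), np.zeros(n_missing)])  # 'All': relative to grade 10
    return pd.Series({
        "p95_all": np.percentile(all_marks, 95),
        "mean_all": all_marks.mean(),
        "mean_grade": grade.mean(),
        "mean_class": math.mean() if len(math) else np.nan,
    })

school_panel = students.groupby(["school", "year"]).apply(school_outcomes).reset_index()
```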
The variations to the model specification do bring about important differences in the results. The last column suggests that if the mean mark is the dependent variable, moving to a better province carries a positive and significant coefficient only if one controls for all students who drop out, or who do not reach the grade 12 mathematics class. If one controls only for those who reach grade 12 but do not take mathematics (row 'Grade'), or if one considers only those in the mathematics class, all significant coefficients come out negative. If one uses performance at the 95th percentile, however, moving to a better province never produces a negative coefficient, regardless of the 'Scope' of students considered.

Turning to the group-specific statistics of the middle columns, for the NW > GP group the coefficients are positive (wherever a significant coefficient emerges), no matter how one deals with non-selected students, whether one uses school or student observations, and regardless of which of the two dependent variables E one chooses. The NW > GP results thus seem consistent with what one might expect, and with the analysis of the previous section. The coefficients in the MP > LP column are all negative, which is also a consistent result, given that this is a move to a worse province.

The counter-intuitive negative coefficients in the last column are driven by negative coefficients for LP > MP, a group comprising more than half of the schools moving to a better province. Why would moving to a better province improve performance at the 95th percentile, while mostly lowering the mean, in a relative sense? (Where one controls for all non-selected students, and uses the mean, the expected positive coefficient for LP > MP does emerge; see the value 0.64.) Examination of the mean for the mathematics class, and the mean taking into account all non-selected students, along the lines of the descriptive statistics of Table 4, confirms that LP > MP experienced a steeper positive improvement than non-switching LP schools with respect to the latter indicator (the mean over all potential students), but a less steep improvement with respect to the former (the mean for the mathematics class).
Crucially, while LP > MP schools saw no change in the percentage of students making it into the mathematics class (relative to earlier grade 10 enrolment), this indicator saw a considerable decline for the rest of the sending province (an annual slope of −1.1 percentage points). Even the receiving province of MP was seeing a similar decline. One can conclude that schools other than LP > MP schools experienced an artificial additional 'improvement' in the mean for the mathematics class because they kept weaker students out of the class. All this underscores the importance of controlling for selection effects when analysing trends in examination results.

3.4 Policy-related explanatory variables

So far, there has been no discussion of what in the province-switching experience led to educational improvement. With respect to human resources, two distinct possibilities exist, and differentiating between the two is important for policy purposes. On the one hand, it is possible that the new provincial administrations changed the effectiveness of incoming schools by inserting new and better education staff. Alternatively, by changing incentive structures, or providing certain physical resources, the new provincial administration could have changed the behaviour of inherited staff in positive ways, without changing the composition of this staff. This distinction has policy implications insofar as the latter possibility would support more strongly the argument that change can be brought about simply through changes in incentive structures (or physical resourcing), in other words through a policy reform that is more widely replicable.

Payroll data from 2005 and 2012 were used to calculate a staffing stability indicator:

s_i = \frac{T_{i2}}{T_{i1} + T_{ia}}   (6)

Here T_{i2} is the number of 'educators', meaning teachers or education management staff in schools, employed permanently in the province, who were based in school i in both 2005 and 2012. T_{i1} is the number of educators who were based in the school in 2005, and T_{ia} is the number of new educators who arrived in the school, in the sense that they worked there in 2012 but not in 2005. The stability indicator s carries a value between 0.00 (maximum staff turnover) and 1.00 (no staff turnover). The mean indicator values for three key switching groups appear in Table 6. This section focusses on specific groups of switching schools, in particular those groups producing significant coefficients in the foregoing analysis. Combining groups into a 'to better' category is not done here, as one would expect policy interventions to be fairly specific to the receiving province.

Table 6 indicates that the switching groups experienced greater staffing stability than either non-switching schools in the entire remainder of the sending province, or the subset of these schools found near the switching schools (these are the sub-sets discussed above). Clearly, performance improvements in the switching schools were not achieved through exceptional staffing changes.

Did switching schools experience a higher probability of a change in the school principal, an event which might have turned schools around? The payroll data analysed are somewhat inconsistent when it comes to identifying who the school principal is in each year, which limits the possibility of precise measures. The indications from the data were that virtually no changes of school principal occurred amongst the 151 switching schools, and that the probability of a principal change was greater amongst non-switching schools across the country.
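Before turning to Table 6, the following minimal sketch illustrates how the stability indicator of equation (6) can be computed from two payroll extracts; the file and column names are hypothetical.

```python
# Minimal sketch of the staffing stability indicator in equation (6), assuming two
# hypothetical payroll extracts with columns 'school_id' and 'educator_id'.
import pandas as pd

pay05 = pd.read_csv("payroll_2005.csv")  # hypothetical file names
pay12 = pd.read_csv("payroll_2012.csv")

def stability(school_id):
    e05 = set(pay05.loc[pay05.school_id == school_id, "educator_id"])
    e12 = set(pay12.loc[pay12.school_id == school_id, "educator_id"])
    stayers = len(e05 & e12)      # T_i2: educators at the school in both 2005 and 2012
    t1 = len(e05)                 # T_i1: educators at the school in 2005
    arrivals = len(e12 - e05)     # T_ia: educators who arrived by 2012
    return stayers / (t1 + arrivals) if (t1 + arrivals) else float("nan")
```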
Table 6: Descriptive Statistics for Indicator of Staffing Stability

Group | Sending province—all | Sending province—just closest | Switching group
LP > MP | 0.511 (0.002) | 0.502 (0.008) | 0.661 (0.007)
MP > LP | 0.362 (0.003) | 0.442 (0.011) | 0.695 (0.011)
NW > GP | 0.398 (0.003) | 0.418 (0.007) | 0.524 (0.01)

Note: Values in brackets are standard errors.

In order to test what incentives or physical resources made a difference to human behaviour and learning in the moving schools, we obtained school- and year-level data for 16 variables which conceivably captured important policy-induced changes. For many possible effects, data do not exist. For instance, there are no data on the instructions issued by departmental offices to schools, nor on how schools respond to these instructions. Even phenomena for which data exist in many other countries are not covered by any South African national database, including student and teacher attendance, and access to books in schools. We were thus severely limited by what data were available. Despite this, we were able to come up with 16 variables which seemed worth testing.

From the Snap school census data, we extracted the following seven variables: total school enrolment; total educators (this would include teachers and schools-based managers); the pupil-teacher ratio (from the previous two variables); the ratio of educators to schools-based support staff; whether any educators are paid privately (using funds raised through fees); the percentage of educators paid privately; and the percentage of educators who are women. From the examinations data seven variables were derived: the number of students in the grade 12 group; the number of students in the grade 12 mathematics class; the percentage of females in this class; the percentage of students in this class who are black African; the average age of the students in this class; the percentage of grade 12 students taking mathematics; and the number of grade 12 subjects examined. From the payroll data we obtained the gender of the school principal, where it was clear who the principal was. Finally, one variable used a combination of sources: the ratio of grade 12 students to grade 10 students two years previously.

All but two of the variables had values specific to each of the nine years. The school principal's gender was available for 2005 and 2012 only, while the number of grade 12 subjects examined was available for just 2005 and 2015. For each of these two variables, the initial value was used for 2005–2007, and the subsequent value for 2008–2013. We tested the 16 time-variant school-level variables in two ways.
Firstly, we tested whether the annual change in the variable for three of the five moving groups (the three from Table 6) was different from the annual change seen in the respective sending provinces. In this step, we were interested not in examining correlations with the dependent variable on mathematics performance, E, but simply in examining whether moving schools displayed exceptional trends with respect to the explanatory variables, something which would provide suggestive evidence of causal factors. Secondly, for each of the 16 variables, we ran a few panel regressions, of the kind shown in Table 3, to test the conditional correlation of the variable with the dependent variable of Table 3 (performance at the 95th percentile with non-selected students assigned a mark of zero).

What emerged as a particularly interesting variable was the ratio of educators to schools-based support staff, so we discuss this first. The slopes in Table 7 below are shown only where the coefficient β2 in equation (7) was significant at least at the 10% level. In this equation, X is the policy variable of interest, T is the dummy variable indicating inclusion in the switching group, i indexes schools, t indexes years, and P is the period variable defined previously. For each of the 16 variables, the regression (with school fixed effects and bootstrapping) was run three by three, so nine, times: for three periods (the whole 2005 to 2013 range; 2005 to 2007, the pre-treatment years; and 2008 to 2013, the treatment years), and for each of the three moving groups, with only observations from the moving group and its sending province included. (A sketch of this battery of regressions appears after Table 7.)

X_{it} = \hat{\lambda} + \hat{\beta}_1 P_{it} + \hat{\beta}_2 P T_{it} + \hat{u}_{it}   (7)

Table 7: Staffing Details

Variable | Group | Mean across all years (2005–2013): S, M, R | Slope from equation (7): 2005–2013 (S, M); 2005–2007 (S, M); 2008–2013 (S, M)
PT ratio LP > MP 28 29 28 −0.79 −0.20 −0.17 −0.12 −0.42
Enrolment LP > MP 557 676 802 −3.7 8.4
Ed. staff LP > MP 20 23 28 0.38 0.43
ES ratio LP > MP 9 8 6 −0.68 −0.43 −0.24 −0.70
Gr. 12 group MP > LP 48 41 36 −2.04 2.37 −3.45 −3.12
Gr. 12 math. MP > LP 26 27 22 −2.10 1.87 −2.55 −2.12
Enrolment MP > LP 802 631 557 −8.0 −20.4 −12.0 −29.0 −33.2
Gr. 12/Gr. 10 MP > LP 0.55 0.51 0.56 −0.02 0.02 −0.01 −0.03
PT ratio NW > GP 26 28 29 −0.42 −1.68 1.61 0.16
Enrolment NW > GP 668 917 1172 −6.5 33.0
Ed. staff NW > GP 26 33 43 0.11 0.64 0.74 0.40
ES ratio NW > GP 9 8 4 −1.65 0.22 −1.18

Note: 'S' is sending province (including moving schools), 'M' is moving schools, 'R' is receiving province (excluding moving schools). Each row in the last six columns draws from three separate regressions based on equation (7). This table presents results from only a selection of all the regressions run.
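The following is a minimal sketch of this battery of regressions for a single policy variable and a single migration; the column names are assumptions, and clustered standard errors again stand in for the bootstrapping used in the paper.

```python
# Minimal sketch of the equation (7) regressions for one policy variable and one
# migration (LP > MP versus the rest of LP), reusing the hypothetical panel df with
# assumed columns 'X' (the policy variable), 'P' (period), 'moving' (1 = switching
# school) and 'province_2005'. The same loop would be repeated per variable and group.
import statsmodels.formula.api as smf

windows = {"2005-2013": (2005, 2013), "2005-2007": (2005, 2007), "2008-2013": (2008, 2013)}
for label, (y0, y1) in windows.items():
    sub = df[df["year"].between(y0, y1) & (df["province_2005"] == "LP")]
    m = smf.ols("X ~ P + P:moving + C(school)", data=sub).fit(
        cov_type="cluster", cov_kwds={"groups": sub["school"]}
    )
    print(label, m.params["P:moving"], m.pvalues["P:moving"])  # beta_2 and its p value
```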
Table 7 indicates that in the case of the LP > MP group, the ratio of educators to support staff, 'ES ratio', changed in the group, relative to the control group of the sending province LP as a whole, to a statistically significant degree over the entire period. In the sending province, the ratio declined by 0.68 a year, but the decline for LP > MP schools was steeper, by an additional 0.43. Educators in these schools were thus enjoying an exceptionally large improvement in access to support staff. The annual mean values for this variable for LP > MP schools as a whole ranged between 12.1 and 13.1 between 2005 and 2008, but then fell precipitously to a range of 4.6–6.7 for the years 2009–2013 (the mean across all years was 8, as shown in Table 7). For non-switching schools in the sending province, there was a smaller yet substantial shift, from a mean of 11.0 for the years 2005–2008, to 7.5 for 2009–2013.

Analysis of the original data reveals that the large shifts in the switching group were due to a better availability of support staff, not fewer educators. In particular, moving schools gained more access to administrative support staff, though there were also improvements in other categories such as cleaners and security. In 2007, only 6 of the 83 LP > MP schools had administrative support staff, and 35 had no support staff of any kind. By 2012, these figures had become 75 schools with administrative staff and only two with no support staff. Moreover, the average number of administrative support staff per school, where this human resource existed, had risen from 1.3 to 1.7. For the 13 MP > LP schools moving in the opposite direction, the number of schools with administrative support staff declined from 11 to 10 between 2007 and 2012.
In the limited data we had available, improvements in access to administrative staff were, in the case of the LP > MP schools, by far the most striking positive change. This change would have allowed teachers to focus more on teaching and less on administration, and is likely to have affected the general administration of the school positively. However, the absence of data on other phenomena, and the literature on how change occurs in schools, should caution against reading too much into the administrative staff change. While important in and of itself, this change is also likely to be indicative of the general ability of the receiving province MP to govern and resource schools in better ways than the sending province LP.

A slope for the 'ES ratio' that was significantly different from that of the control group was also found for the NW > GP 'treatment group'. Here the most striking change was not in the number of schools with any administrative support staff, which increased from 20 to 28 between 2007 and 2012 (among the 29 schools), but in the average number of such staff per school where such staff existed, which rose from 1.2 to 2.7.

Other indicators according to which the switching schools stood out are reported in Table 7 (as well as selected indicators without significant slopes for the moving schools). LP > MP schools experienced a relative decline in the pupil-teacher ratio, though neither the numerator (enrolments) nor the denominator (educator staff) displayed a significant slope on its own. The MP > LP trends suggest a process of shrinkage not seen in the sending province in general. For instance, during the 2008 to 2013 'treatment period' (actually a 'mistreatment' period, considering this was a move to a worse province), the group-specific slopes for the number of grade 12 students, the number of grade 12 mathematics candidates, overall enrolment in the school and the ratio of grade 12 students to earlier grade 10 students are all significantly negative. This would be consistent with the hypothesis that the worsening governance context for these schools disincentivised staff and students, and led to more dropping out amongst the latter.

Mathematics performance appears sensitive to changes in the ratio of educators to support staff, but only to a statistically significant degree for province-switching schools experiencing large shifts in the ratio. This can be seen in Table 8 below. In Model E, which uses all observations from schools which started off in 2005 in the province LP, the 'ES ratio' for all schools and years does not produce a significant coefficient, whilst 'ES ratio' values for just the LP > MP switching schools do. The coefficient is negative, indicating that the move to a lower ratio (as more support staff joined the school) was associated with better performance. The absence of a similarly significant coefficient on 'ES ratio' in the control group, despite the fact that even the control group experienced substantial improvements in the availability of support staff (as discussed earlier), strengthens the conclusion that it was not improvements in this indicator on its own that made a difference to the performance of switching schools, but rather the manner in which this change formed part of a wider package of reforms for the schools.
Table 8: Regression Outputs with Policy Variables
Dependent variable: mark at 95th percentile with non-selected students (based on earlier grade 10) assigned a mark of zero

Variable | E (LP & LP > MP) | F (NW & NW > GP)
Constant | 39.6*** (0.57) | 39.8*** (1.47)
Is to better | 4.482*** (1.56) | 5.978*** (2.17)
ES ratio | 0.026 (0.02) | 0.032 (0.04)
Int. ES ratio & G | −0.175* (0.10) | −0.187* (0.10)
N | 6,387 | 2,137
Number of schools | 1,025 | 265
R2 overall | 0.119 | 0.070

Note: G is a dummy carrying the value 1 for schools in the moving group, regardless of period. 'Is to better' only assumes non-zero values for the 2008–2013 period. Not reported here are single-year dummies, used in both models, with 2013 being the reference year. All models make use of bootstrap estimation of standard errors, which appear in parentheses. *** indicates that the estimate is significant at the 1% level of significance, ** at the 5% level, and * at the 10% level.

The coefficient on 'ES ratio' interacted with G (the dummy for the switching group) remains negative, even in the presence of the 'Is to better' treatment dummy, which in the case of Model E means a value of 1 for LP > MP schools from 2008 onwards. Model F displays a very similar sensitivity of performance to support staff in the case of NW > GP schools. However, for both models the coefficient on 'ES ratio' loses its significance if one inserts the interaction of 'Is to better' with period. Any improvement beyond the basic linear trend during the treatment period is thus not explained by access to support staff.

Other policy variables were tested using the approaches applied to the 'ES ratio'. None of these other variables yielded significant coefficients worth reporting on. Essentially, the smallness of the treatment samples and the paucity of data on likely causal factors preclude any firm findings on causality via the econometric route.
The trends for the indicators on the percentage of females and of black Africans in the mathematics class, and the average age within this class, were not significantly and substantively different in the three groups of switching schools, relative to their sending provinces. These are indicators whose values could change due to policy shifts, or due to factors outside the control of the system. The data suggest that neither of these two possibilities was realised.

4. Interpreting the results in the context of additional information

4.1 Additional province characteristics

The School Monitoring Survey, conducted in 2011 by the national education department, offers additional insights into quality factors behind the changes seen in switching schools. This survey involved interviews and physical inspections in a nationally representative sample of around 2,000 schools. The result was data of a kind that is fairly rare in developing country schooling systems. Provincial values for three official indicators which draw from the survey data are reflected in the first three columns of Table 9 below.

Table 9: Additional Indicators of Provincial Effectiveness
(Columns 2–4 give the value in the sending province; columns 5–8 give the change when switching.)

Group | District support | Access to books | Curriculum coverage | TIMSS | District support | Access to books | Curriculum coverage
EC > KN | 52 | 63 | 34 | 0.159 | 8 | −27 | 6
LP > MP | 60 | 42 | 67 | 0.190 | 5 | 3 | 30
MP > LP | 65 | 45 | 97 | −0.190 | −5 | −3 | −30
NW > GP | 55 | 56 | 34 | 0.292 | 12 | 5 | 59
NW > NC | 55 | 56 | 34 | 0.247 | 13 | −18 | −6
Std. dev. | 9 | 11 | 25 | | | |

Source: South Africa: Department of Basic Education, 2013: 16, 37, 44. The TIMSS values are reproduced from Table 2 above. The standard deviations in the last row are calculated across all nine provincial indicator values.

The district support indicator reflects the percentage of responses from school principals which said 'satisfactory', where each principal could express opinions on up to 21 monitoring and support functions performed by the district (as explained earlier, education district offices are an integral part of the provincial education administration). The access to books indicator reflects the level of access to mathematics textbooks in grade 9 classrooms, as seen by fieldworkers who visited classrooms.
The curriculum coverage indicator reflects a fieldworker's evaluation of the mathematics writing book of one well-performing grade 9 student from each of the sampled schools which offered grade 9. The indicator drew from data covering the volume of work in the student's book. The relevance of the last two indicators is strengthened by the fact that of the 151 switching schools, only 29 do not offer grade 9 (14 of these were originally in Eastern Cape).

The three school 'migrations' producing the least ambiguous results, in the sense that they fall inside the 'correct' quadrant of the earlier Figure 2, all experienced changes in the three indicators which are consistent with the earlier findings, and with the TIMSS-based indicator of province effectiveness. For example, NW > GP schools moved to a province where the indicator of curriculum coverage was far higher than in the old province, the difference being a whole two standard deviations (measured across all nine provincial values). This suggests a far better culture of schooling in the new province, with more time-on-task in classrooms and greater teacher accountability. For schools switching between LP and MP, the same curriculum coverage indicator explains much of the difference between the two provinces. In fact, MP displayed the best value for this indicator of all nine provinces. However, even for the district support and access to books indicators, across-province differences are consistent with the grade 12 examination changes seen earlier, at least for NW > GP, LP > MP and MP > LP.

Much of the literature on school financing points to differences in public spending playing little or no role in producing better education, beyond a basic level of per student spending (Glewwe, Hanushek, Humpage and Ravina, 2011: 4). Indeed, an examination of provincial spending patterns reveals nothing to suggest that changes in funding levels played a role in the improvements seen in province-switching schools. Per student funding remained roughly similar across provinces during the years 2005–2013. The NW > NC and LP > MP groups of schools moved to provinces spending just 5% more, whilst EC > KN and NW > GP schools moved to provinces spending slightly less (South Africa: National Treasury, 2009: 38; Kruger and Rawle, 2012: 33). How provinces spend their money appears to matter more than the amounts spent.

In a separate paper (reference removed for blinded peer-reviewing), we report on analysis which found that Gauteng (GP) had pursued, not just in the case of the province-switching schools but across all secondary schools, a tacit strategy of reducing the proportion and absolute numbers of grade 12 students enrolled in mathematics. Such a strategy is a controversial one which many policymakers and researchers would regard as damaging for national development. In fact, South Africa's national development plan laments dwindling participation in mathematics in grade 12 (South Africa: National Planning Commission, 2012: 317). The problem with this logic is that it ignores the fact that the percentage of mathematics students who acquire the skills in this subject needed for mathematically oriented university programmes has been very low. Gauteng's reduction in the number of mathematics students per school is perhaps indicative of an understanding amongst planners that consolidating mathematics in the school through smaller classes is better than expanding these classes, if the desired outcome is more university-ready mathematics students.
Gauteng has attempted to make senior managers in the education sector more accountable, through better use of performance targets. One aspect of this is the increasing use of fixed-term contracts, as opposed to permanent tenure, for senior managers in Gauteng's education administration. Our own analysis of the payroll data revealed that in each province except Gauteng, the percentage of the top paid one hundred public servants employed on a permanent basis was at least 90% during the period 2005 to 2014 (counting only the education sector). In Gauteng, however, this percentage dropped steadily, from 95% in 2005 to just below 60% in 2014. Conversations with Gauteng officials indicate that employing new senior managers on a contract basis, generally for terms of around four years, has been a deliberate strategy aimed at making the organogram more responsive to changing circumstances, and at improving the incentives for senior managers to perform well.

Even if moving to Gauteng did not mean an increase in per student spending, interviews we were able to conduct with Gauteng officials suggested that physical resourcing did play a role in improving performance. Additional education resources such as textbooks, videos of science experiments and equipment for practical exercises in technical subjects reportedly helped schools moving to Gauteng improve.

4.2 The speed of the improvements

An obvious question is how the results found in the current paper compare against substantial improvements seen elsewhere. Put differently, how remarkable is the impact of a new administration? Comparisons of improvements across education systems are generally done by comparing changes in student test score means in terms of standard deviations. However, the central findings of the current paper refer to improvements at the school level, and with respect to performance at the 95th percentile. Using the metrics of the current paper, the improvement between the pre-treatment period and 2013, the last treated year analysed, comes to 0.30 of a standard deviation. This is the difference between the mean of the pre-treatment years 2005–2007 and the 2013 estimate seen in Figure 3 (a gain of 5.9), divided by the standard deviation of the performance measure (mark at the 95th percentile) across schools (19.5).

TIMSS grade 9 mathematics data, from 2011 and 2015, for 35 countries, were used to obtain a conversion factor. A gain, by a country, of one standard deviation across the 2 years with respect to a conventional measure of student-level means expressed as a standard deviation was found to be associated with a gain of 1.9 in the measure we are interested in (the mean across school-level performance at the 95th percentile). We can thus say that the gain we find of 0.30 converts to approximately 0.16 of a student-level standard deviation, which is the more conventional measure. This is a noteworthy gain, achieved over six years, meaning around 0.03 student-level standard deviations a year. It is around half of the speed of the improvement experienced by Brazil in its PISA² mathematics results over the nine years 2000 to 2009: Brazil's improvement was 0.53 Brazilian standard deviations, or 0.06 a year. Brazil has arguably displayed stronger and more consistent gains in an international testing programme than any other country.
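The arithmetic behind this conversion is summarised in the following short calculation, using only the values reported above.

```python
# Worked arithmetic behind the effect-size conversion in Section 4.2, using the values
# reported in the text.
gain_2013 = 5.9      # gain in marks at the 95th percentile by 2013 (Figure 3)
school_sd = 19.5     # standard deviation of the school-level performance measure
conversion = 1.9     # school-level SD gain associated with one student-level SD gain (TIMSS)
years = 6            # treatment years 2008-2013

school_effect = gain_2013 / school_sd        # about 0.30 school-level SDs
student_effect = school_effect / conversion  # about 0.16 student-level SDs
print(round(school_effect, 2), round(student_effect, 2), round(student_effect / years, 3))
# roughly 0.3, 0.16 and 0.027 (i.e. about 0.03 student-level SDs per year)
```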
Importantly, the South African improvement for the switching schools would be additional to system-wide improvements, which are substantial according to South Africa's grade 9 TIMSS mathematics scores for the 2002–2015 period. Improvements seen in project-type intervention programmes tend to be considerably larger than, say, Brazil's PISA improvement. Such improvements can reach 0.15 of a standard deviation across students, achieved possibly in one year (McEwan, 2015). However, improvements of this magnitude are not seen in whole schooling systems. Given that the provincial change phenomenon studied in this paper was not a purposively designed intervention programme, comparisons to system-wide improvement trends seem more relevant.

5. Conclusion

The paper has used examinations data spanning nine years, plus the fact that administrative boundaries in South Africa changed, to create a quasi-experiment examining the possible impact of a different administration, within the same country and general policy environment, on student performance in mathematics at the secondary level. The analysis concludes that what administration a school falls under matters for performance. The improvement for schools moving to a better province was considerable. The size of the annual gains sustained over 6 years, at 0.03 student-level standard deviations per year, is around half that of the best improver countries in international testing programmes.

Many of the administrative strategies which seem to have played a role are somewhat predictable: better monitoring and support by the administration, and a strong focus on ensuring that schools have the educational materials they need. Importantly, higher per student spending appears not to be a major explanatory factor. Yet within fairly constant per student spending parameters, some provincial administrations paid more attention to providing administrative support staff to schools, and this does seem to be a factor contributing to better results. Specifically, many switching schools whose mathematics results improved experienced a shift from having no such staff to having one or two per school. A possible factor which could easily have been overlooked, because it is not directly observable within schools, is the strategy of making senior managers in the administration more accountable for their actions, partly by relying less on permanent tenure and more on fixed-term contracts amongst these managers. This strategy is followed by one of the provincial administrations associated with improvements in schools.

Examinations data, as opposed to data from standardised tests, are not easy to use for the analysis of trends and of cause and effect. Yet as shown above, the task is not necessarily impossible. In fact, examinations data may be the best available option for studying within-country dynamics, given the relatively high frequency of examinations and the absence of sample size limitations. Two matters which must be controlled for when using examinations data, and which were controlled for in the current analysis, are the relatively weak comparability of examination scores over time and selection effects in the form of students dropping out prior to the examination. The fact that an indicator which gauged grade 12 mathematics performance at the 95th percentile, relative to grade 10 enrolments 2 years previously, produced particularly robust results is interesting.
Indicators such as these may not be the simplest to calculate, yet they should arguably be used to a greater extent in, for instance, school accountability programmes.

The quasi-experiment created by historical circumstances has allowed for an unusual focus on the administrative layer above schools, as opposed to interventions dealing with specific inputs such as teacher training, educational materials or accountability tools. Focussing on the latter is obviously important, but so is understanding which general characteristics of public sector management and leadership lend themselves to good decision-making with respect to education interventions.

Footnotes

1. Trends in International Mathematics and Science Study.
2. Programme for International Student Assessment.

References

Cogneau, D. and Moradi, A. (2014) 'Borders that divide: Education and religion in Ghana and Togo since colonial times', Journal of Economic History, 74(3): 694–729.
Glewwe, P., Hanushek, E., Humpage, S. and Ravina, R. (2011) School Resources and Educational Outcomes in Developing Countries: A Review of the Literature from 1990 to 2010. Washington: NBER.
Gustafsson, M. (2016) Understanding Trends in High-Level Achievement in Grade 12 Mathematics and Physical Science. Pretoria: Department of Basic Education.
Gustafsson, M. and Taylor, S. (2016) Treating Schools to a New Administration: Evidence from South Africa of the Impact of Better Practices in the System-Level Administration of Schools. Stellenbosch: University of Stellenbosch.
Hahn, Y., Wang, L. and Yang, H. (2015) Does Greater School Autonomy Make a Difference? Evidence from a Randomized Natural Experiment in South Korea. Oxford: RISE.
Häkkinen, I., Kirjavainen, T. and Uusitalo, R. (2003) 'School resources and school achievement revisited: New evidence from panel data', Economics of Education Review, 22: 329–35.
Imbens, G. W. and Wooldridge, J. M. (2009) 'Recent developments in the econometrics of program evaluation', Journal of Economic Literature, 47(1): 5–86.
Kruger, J. and Rawle, G. (2012) Public Expenditure Analysis for the Basic Education Sector in South Africa [unpublished report]. Pretoria: Oxford Policy Management.
McEwan, P. (2015) 'Improving learning in primary schools of developing countries: A meta-analysis of randomized experiments', Review of Educational Research, 85(3): 353–94.
Pritchett, L. and Beatty, A. (2012) The Negative Consequences of Overambitious Curricula in Developing Countries. Washington: Center for Global Development.
Pritchett, L., Woolcock, M. and Andrews, M. (2012) 'Looking like a state: Techniques of persistent failure in state capability for implementation', Journal of Development Studies, 49(1): 1–18.
Schlotter, M., Schwerdt, G. and Woessman, L. (2011) 'Econometric methods for causal evaluation of education policies and practices', Education Economics, 19(2): 109–37.
South Africa: Department of Basic Education (2013) Detailed Indicator Report for Basic Education Sector. Pretoria.
South Africa: National Planning Commission (2012) National Development Plan 2030: Our Future—Make It Work. Pretoria.
South Africa: National Treasury (2009) Provincial Budgets and Expenditure Review 2005/06–2011/12. Pretoria.
UNESCO (2014) Education for All Global Monitoring Report 2013/4: Teaching and Learning: Achieving Quality Education for All. Paris.

Author notes

This paper would have been impossible without the collaboration and advice from colleagues in South Africa's Department of Basic Education. Both authors are currently based at the Department.

© The Author(s) 2018. Published by Oxford University Press on behalf of the Centre for the Study of African Economies. All rights reserved. For permissions, please email: journals.permissions@oup.com. This article is published and distributed under the terms of the Oxford University Press Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model).
