Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
A Capacity Management Plan (CMP) is one of the strategies that the Department for Education employs to support government schools experiencing increased enrolment demand. Increased enrolment demand arises from several factors, including changes to the demographic profile of the local community and increases in local housing development. The purpose of a CMP is to help a school return to, or maintain, a sustainable enrolment level and to help children attend their local school. The CMP outlines the enrolment criteria relevant to each school. The Department for Education has been implementing CMPs since 2009, and over time they have generally supported schools to manage enrolment demand within their enrolment capacity. Further strategies include the provision of additional accommodation, implementation of a school zone, or planning for new educational facilities. CMPs are approved by the Minister and published in the South Australian Government Gazette. Refer to the Capacity Management Plan webpage for more information and links to the CMPs published in the South Australian Government Gazette.
This dataset shows secondary and primary school locations, where the catchment area intersects with Stirling's Planning Policy Area, along with current and future estimated capacities. Note: a complete dataset, showing all of Stirling Council's schools, will also be made available within the Open Data platform.
Open Government Licence 3.0: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/
License information was derived automatically
This data shows Actual and Projected Pupil Numbers and Capacity of Schools by local authority district. 'Actual' figures provided are based on October school census data for the relevant year of intake.
For more information please see the annual School Organisation Plan published by Lincolnshire County Council.
This data is updated annually. Data source: Lincolnshire County Council School Organisation Planning Team. For any enquiries about this publication, contact schoolorganisation@lincolnshire.gov.uk.
The annual school capacity survey 2019 provides national and local authority level information as at 1 May 2019 on the numbers of:
It also provides pupil number forecasts up to the:
Additional tables, on estimated places needed at the national, local authority and planning area level, and capacity in school sixth forms, are also provided.
School Capacity
Simone Cardin-Stewart
Pupil Place Planning team
Email: SCAP.PPP@education.gov.uk
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The dataset provides the number of schools by municipality and year, tracking changes in educational infrastructure over time. It includes data from multiple years, allowing for trend analysis and policy evaluation. The dataset is useful for education planners, researchers, and policymakers to assess school distribution and capacity planning.
U.S. Government Works: https://www.usa.gov/government-works
License information was derived automatically
School capacity and enrollment from 2000 to current school year.
The 2021 Camden Annual School Places Planning report incorporates all underlying demographic data, including: existing provision and capacity; actual registered births and fertility; the latest GLA resident forecasts for births and their relationship to school rolls; GLA resident population and school roll forecasts; and the additional pupils associated with new housing developments. The analysis is used to help us make informed decisions about the future organisation of school places at primary and secondary level. This year, the Council has also included a review of its planning of places for pupils with special educational needs and disabilities.
Our understanding of the current demographic pressures facing schools has also informed the development of Camden’s new Education Strategy. It will be the purpose of the strategy, and our continued school organisation work, to ensure that our school system is sustainable and stable, offering the best outcomes for our residents.
Recent school place reports have identified significant changes and a high level of volatility in demography at a national and local level. Over the last five years there has been a significant reduction in demand for pupil places within Camden. Forecasts for this year have been drawn up at a time of unprecedented change and challenge for families in Camden. Specifically, the effects of the COVID-19 pandemic have fundamentally altered people’s lives, and their impacts are significantly reducing the level of demand for places anticipated within the current forecasts. These factors create a significant degree of uncertainty and a less stable environment in which to plan ahead. The GLA modelling aims to account for these as best it can, but relies on assumptions about future trends that can only be accurately assessed over time. We consider that it is too early to be definitive about the medium-term impact of the new forecast figures on school rolls, given the hugely disruptive impact of Covid. We will need to consider carefully current admission numbers and next year’s school roll projections before we can come to any firm conclusions beyond the difficult action the Council has already taken in removing available school places. The impact on pupil rolls of the recent arrival in Camden of significant numbers of children and young people, including asylum seekers and Afghan nationals, is also not yet known.
Ensuring Camden has the right number of school places is both the Council’s statutory responsibility and aligned with our Camden 2025 priorities. Preventing schools from becoming financially vulnerable, and thus subject to unplanned change, helps maintain strong, safe, and open communities. Good and outstanding schools promote independent healthy lives and support strong growth and jobs.
The attached zipped file includes:
• 2021 Camden Annual School Places Planning Report
• Appendix A: General (with Tables 1-5)
• Appendix B: Primary (with Tables 1-7)
• Appendix C: Secondary (with Tables 1-6)
• Appendix D: Additional (with Figures 1-6)
• Appendix E: Latest housing development trajectory and estimated child yield
• Appendix F: Glossary of school places planning abbreviations and report references
• Appendix G: A narrative description and guide for interpreting the forecasts and data provided
• Appendix H: Special Educational Needs and Disabilities analysis
Open Government Licence 3.0: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/
License information was derived automatically
Every year, the authority reviews its school places planning projections. This report highlights current capacity and pressures within the borough in 2015, projected future demand, and any likely impact of changes in neighbouring boroughs. It is linked to the Camden Plan aim of ‘investing in our communities to ensure sustainable neighbourhoods’ and fulfils the statutory duty for local authorities to ensure there are sufficient school places.
Open Government Licence: http://reference.data.gov.uk/id/open-government-licence
Every year, the authority reviews its school places planning projections for the primary and secondary sectors. This latest report highlights current capacity and pressures within the borough in 2016, projected future demand, and any likely impact of changes in neighbouring boroughs. It is linked to the Camden Plan aim of ‘investing in our communities to ensure sustainable neighbourhoods’ and fulfils the statutory duty for local authorities to ensure there are sufficient school places. The compressed zip file comprises the main report and appendices A–E; this report replaces reports and appendices from previous years due to updated data feeds from 2016.
https://www.icpsr.umich.edu/web/ICPSR/studies/3525/terms
The 1970 Census School District Data Tape (SDDT) User's Guide was designed to complement the 1970 Census User's Guide prepared by the United States Census Bureau. The School District Data Tape (SDDT) created by the National Center for Education Statistics is a recompilation of the 1970 Census Fourth Count Population data, providing data tables for each school district in the country with 300 or more students. The preparation of the School District Data Tape required three major steps: (1) overlaying school district boundaries on census maps, (2) creating a geo-reference tape indicating the percent of each census area falling within each school district, and (3) merging the geo-reference tape with the 1970 Census Fourth Count Population Files A (Traced Areas) and B (Minor Civil Divisions). Some of the major uses of the School District Data Tape include: allocation of federal funds, desegregation planning, bilingual and minority special education planning, preschool and child care planning, facility planning, redistricting, urban-suburban-rural analyses, mobility analysis, social and economic inequality among school districts, and school children profiles. In addition to these uses, most state education agencies will find data by school district of value in allocating federal and state aid to school districts and in the evaluation of the inequality of property taxes as a basis for financing elementary and secondary education. The School District Data Tape matches, as closely as possible, the format of the Fourth Count (Population) Summary tapes supplied by the Census Bureau.
2023 - 2024 Elementary option school 'geo-zone' boundaries for Seattle Public Schools.
Students who want to attend an option school need to apply. No one is assigned automatically to an option school. Each option school has a geographic priority area (geo zone). The geographic zone tiebreaker is for applicants to an option school who live within a defined area in proximity to the school. Please note that living within the geographic zone does not guarantee assignment to the requested option school, but gives a priority for admission after siblings. Geo Zones may change from year to year as a tool for capacity management.
For questions, please contact enrollmentplanning@seattleschools.org
Templates for the submission of data for use in the GLA School Roll Projection Service
Notes on completing the Actual Roll templates:
Subscribers should use the templates included on this page to provide roll data for individual schools. This data should be split by national curriculum year and gender. Data for primary and secondary schools should be provided in the separate templates. Data for special schools and pupil referral units should be excluded.
Please do not change the names or positions of the column headers, or change the file format of the template.
Planning Area: This is used for aggregation of projections into planning area level projections. Where possible, it is recommended that the same naming convention for planning areas is used each year to facilitate the comparison of results between successive rounds of projections. If a school is not in a planning area then this field should be left blank.
DfE Number: Please enter the school’s DfE number either with or without the borough identifier.
School Name: Where a school has both a primary and secondary phase the school name should be consistent across both roll files.
Notes on completing the Ward to Planning Area Matrix:
These matrices are used to apportion ward-level population data to education planning areas. Estimated and projected population by education planning area are produced as reporting outputs, and serve to provide context to and assist with quality assurance of the roll projections themselves.
It is only necessary to provide a new matrix in the event of changes to existing planning areas.
The matrices are not used directly in the production of the school roll projections, which are instead based on relationships between pupils' ward of residence and school attended that have been derived from anonymised extracts of records from the National Pupil Database.
The templates are pre-populated with default identifiers for planning areas (PA1 to PA15). If necessary, these should be changed to be consistent with the planning area names used in the submitted Actual Roll templates.
Each cell in the body of the matrix represents the proportion of the population for the ward in the cell's row that should be allocated to the planning area in the cell's column. This proportion should be a value between 0 and 1, with blank cells being treated as zeroes.
Both primary and (if applicable) secondary school planning areas should be included in the same template. If only primary planning areas are included, the values of the cells in each row should sum to 1. If secondary planning areas are also included, they should sum to 2.
There is no single "correct" way to assign ward populations to planning areas in the template - different local authorities draw up planning area boundaries using different methods and criteria, and the method used to apportion the population here is inherently crude. In practice, rough approximations are sufficient for the purpose to which they are applied.
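As a rough illustration of the apportionment rules above, the sketch below (Python, assuming the matrix has been read into a pandas DataFrame with wards as rows and planning-area identifiers such as PA1 to PA15 as columns; the function and variable names are hypothetical and not part of the GLA service) validates the proportions and applies them to ward-level population figures.

```python
import pandas as pd

def apportion_to_planning_areas(matrix: pd.DataFrame,
                                ward_population: pd.Series,
                                has_secondary: bool = False) -> pd.Series:
    """Apportion ward-level population to planning areas.

    Rows of `matrix` are wards and columns are planning areas; each cell is
    the share of that ward's population allocated to that planning area.
    Blank cells are treated as zeroes, as described above.
    """
    m = matrix.fillna(0.0)

    # Every cell must be a proportion between 0 and 1.
    if ((m < 0) | (m > 1)).any().any():
        raise ValueError("All cells must be between 0 and 1")

    # Row totals: 1 if only primary planning areas are present,
    # 2 if secondary planning areas are included as well.
    expected_total = 2.0 if has_secondary else 1.0
    row_totals = m.sum(axis=1)
    bad = row_totals[~row_totals.round(3).eq(expected_total)]
    if not bad.empty:
        raise ValueError(f"Rows not summing to {expected_total}: {list(bad.index)}")

    # Multiply each ward's population by its row of proportions and sum down
    # the columns to get population by planning area. (In practice, primary
    # and secondary columns would be aggregated separately by phase.)
    return m.mul(ward_population, axis=0).sum(axis=0)
```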
This dataset shows primary school catchment areas within Stirling Council's Planning Policy area. Data has been provided by Education Services, July 2024, and includes capacity and current and projected pupil numbers.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This impact evaluation was designed to evaluate the Whole School Development (WSD) program, a comprehensive school management and capacity building program in The Gambia. WSD provided a grant and management training to principals, teachers, and community representatives in a set of schools. In order to separate the impact of the capacity building component from the grant, a second intervention group received the grant but did not receive the training. These two interventions were compared to a control group that received neither the grant nor the training. Each of 273 Gambian primary schools was randomized to one of the three groups. A grant of US$500 was given to all the schools in the WSD and grant-only groups after a school development plan was presented. The schools were required to spend the funds on activities pertaining broadly to learning and teaching. This study is part of the World Bank's broader Africa Program for Education Impact Evaluation. The Gambia Bureau of Statistics, under the supervision of the research team, collected the data for this study. The baseline data was collected in 2008 at the onset of the study, the first round of follow-up data was collected in 2009, the second round of follow-up data was collected in 2010, and the endline data was collected in 2011. The endline survey is documented here. All other rounds of this impact evaluation are published in the Microdata Library.
The dashboard project collects new data in each country using three new instruments: a School Survey, a Policy Survey, and a Survey of Public Officials. Data collection involves school visits, classroom observations, legislative reviews, teacher and student assessments, and interviews with teachers, principals, and public officials. In addition, the project draws on some existing data sources to complement the new data it collects. A major objective of the GEPD project was to develop focused, cost-effective instruments and data-collection procedures, so that the dashboard can be inexpensive enough to be applied (and re-applied) in many countries. The team achieved this by streamlining and simplifying existing instruments, and thereby reducing the time required for data collection and training of enumerators.
National
Schools, teachers, students, public officials
Sample survey data [ssd]
The aim of the Global Education Policy Dashboard school survey is to produce nationally representative estimates that can detect changes in the indicators over time at a minimum power of 80% and with a 0.05 significance level. We also wish to detect differences by urban/rural location.
For our school survey, we will employ a two-stage random sample design. In the first stage, a sample of typically around 200 schools, based on local conditions, is drawn, chosen in advance by Bank staff. In the second stage, a sample of teachers and students is drawn in the field to answer questions from our survey modules. A total of 10 teachers will be sampled for absenteeism. Five teachers will be interviewed and given a content knowledge exam. Three 1st grade students will be assessed at random, and a classroom of 4th grade students will be assessed at random. Stratification will be based on the school's urban/rural classification and on region. When stratifying by region, we will work with our partners within the country to make sure we include all relevant geographical divisions.
For our Survey of Public Officials, we will sample a total of 200 public officials. Roughly 60 officials are typically surveyed at the federal level, while 140 officials will be surveyed at the regional/district level. For selection of officials at the regional and district level, we will employ a cluster sampling strategy, where roughly 10 regional offices (or whatever the secondary administrative unit is called) are chosen at random from among the regions in which schools were sampled. Among these 10 regions, we also typically select around 10 districts (tertiary administrative level units) from among the districts in which schools were sampled. The result of this sampling approach is that for 10 clusters we will have links from the school to the district office to the regional office to the central office. Within the regions/districts, five or six officials will be sampled, including the head of organization, the HR director, two division directors from finance and planning, and one or two randomly selected professional employees from among the finance, planning, and one other service-related department chosen at random. At the federal level, we will interview the HR director, finance director, planning director, and three randomly selected service-focused departments. In addition to the directors of each of these departments, a sample of 9 professional employees will be chosen in each department at random on the day of the interview.
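A minimal sketch of the second-stage, within-school selection described above, assuming simple lists of teacher and pupil identifiers; the function name and structure are illustrative only and do not reproduce the project's actual field protocols.

```python
import random

def sample_school_respondents(teacher_roster, grade1_students, seed=None):
    """Illustrative stage-2 selection for one surveyed school: 10 teachers for
    the absenteeism check, 5 of those for the interview and content-knowledge
    exam, and 3 grade 1 pupils for direct assessment (the grade 4 assessment
    uses one intact classroom, chosen separately)."""
    rng = random.Random(seed)

    absenteeism = rng.sample(teacher_roster, k=min(10, len(teacher_roster)))
    interviewed = rng.sample(absenteeism, k=min(5, len(absenteeism)))
    grade1_assessed = rng.sample(grade1_students, k=min(3, len(grade1_students)))

    return {
        "absenteeism_check": absenteeism,
        "interview_and_assessment": interviewed,
        "grade1_direct_assessment": grade1_assessed,
    }
```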
The sample for the Global Education Policy Dashboard in SLE was based in part on a previous sample of 260 schools which were part of an early EGRA study. Details from the sampling for that study are quoted below. An additional booster sample of 40 schools was chosen to be representative of smaller schools of less than 30 learners.
EGRA Details:
"The sampling frame began with the 2019 Annual School Census (ASC) list of primary schools as provided by UNICEF/MBSSE where the sample of 260 schools for this study were obtained from an initial list of 7,154 primary schools. Only schools that meet a pre-defined selection criteria were eligible for sampling.
To achieve the recommended sample size of 10 learners per grade, schools that had an enrolment of at least 30 learners in Grade 2 in 2019 were considered. To achieve a high level of confidence in the findings and generate enough data for analysis, the selection criteria only considered schools that:
• had an enrolment of at least 30 learners in grade 1; and
• had an active grade 4 in 2019 (enrolment not zero)
The sample was taken from a population of 4,597 primary schools that met the eligibility criteria above, representing 64.3% of all the 7,154 primary schools in Sierra Leone (as per the 2019 school census). Schools with higher numbers of learners were purposefully selected to ensure the sample size could be met in each site.
As a result, a sample of 260 schools were drawn using proportional to size allocation with simple random sampling without replacement in each stratum. In the population, there were 16 districts and five school ownership categories (community, government, mission/religious, private and others). A total of 63 strata were made by forming combinations of the 16 districts and school ownership categories. In each stratum, a sample size was computed proportional to the total population and samples were drawn randomly without replacement. Drawing from other EGRA/EGMA studies conducted by Montrose in the past, a backup sample of up to 78 schools (30% of the sample population) with which enumerator teams can replace sample schools was also be drawn.
In the distribution of sampled schools by ownership, majority of the sampled schools are owned by mission/religious group (62.7%, n=163) followed by the government owned schools at 18.5% (n=48). Additionally, in school distribution by district, majority of the sampled schools (54%) were found in Bo, Kambia, Kenema, Kono, Port Loko and Kailahun districts. Refer to annex 9. for details on the population and sample distribution by district."
Because of the restriction that at least 30 learners were available in Grade 2, we chose to add an additional 40 schools to the sample from among smaller schools, with between 3 and 30 grade 2 students. The objective of this supplement was to make the sample more nationally representative, as the restriction reduced the sampling frame for the EGRA/EGMA sample by over 1,500 schools from 7,154 to 4,597.
The 40 schools were chosen in a manner consistent with the original set of EGRA/EGMA schools. The 16 districts formed the strata. In each stratum, the number of schools selected were proportional to the total population of the stratum, and within stratum schools were chosen with probability proportional to size.
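The selection approach described above (allocation across strata in proportion to their size, then probability-proportional-to-size selection within each stratum) can be sketched as follows. This is an illustrative implementation using systematic PPS with hypothetical field names, not the sampling code used for the study.

```python
import random
from collections import defaultdict

def allocate_proportionally(frame, total_sample):
    """Allocate the total sample across strata in proportion to the number of
    schools in each stratum, using largest-remainder rounding."""
    counts = defaultdict(int)
    for school in frame:
        counts[school["stratum"]] += 1
    n = len(frame)
    raw = {s: total_sample * c / n for s, c in counts.items()}
    alloc = {s: int(v) for s, v in raw.items()}
    # Give any remaining units to the strata with the largest fractional remainders.
    leftover = total_sample - sum(alloc.values())
    for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:leftover]:
        alloc[s] += 1
    return alloc

def systematic_pps(schools, k, size_key="enrolment", seed=None):
    """Select k schools with probability proportional to size using systematic
    PPS: cumulate sizes, pick a random start, then step at a fixed interval.
    (Very large schools can be hit more than once unless treated as certainties.)"""
    rng = random.Random(seed)
    total = sum(s[size_key] for s in schools)
    step = total / k
    targets = [rng.uniform(0, step) + i * step for i in range(k)]
    chosen, cum, idx = [], 0.0, 0
    for school in schools:
        cum += school[size_key]
        while idx < k and targets[idx] <= cum:
            chosen.append(school)
            idx += 1
    return chosen
```

For example, the booster could be sketched as `alloc = allocate_proportionally(frame, 40)` followed by `systematic_pps(...)` within each district stratum.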
Computer Assisted Personal Interview [capi]
More information pertaining to each of the three instruments can be found below: - School Survey: The School Survey collects data primarily on practices (the quality of service delivery in schools), but also on some de facto policy indicators. It consists of streamlined versions of existing instruments—including Service Delivery Surveys on teachers and inputs/infrastructure, Teach on pedagogical practice, Global Early Child Development Database (GECDD) on school readiness of young children, and the Development World Management Survey (DWMS) on management quality—together with new questions to fill gaps in those instruments. Though the number of modules is similar to the full version of the Service Delivery Indicators (SDI) Survey, the number of items and the complexity of the questions within each module is significantly lower. The School Survey includes 8 short modules: School Information, Teacher Presence, Teacher Survey, Classroom Observation, Teacher Assessment, Early Learner Direct Assessment, School Management Survey, and 4th-grade Student Assessment. For a team of two enumerators, it takes on average about 4 hours to collect all information in a given school. For more information, refer to the Frequently Asked Questions.
2023 - 2024 option high school 'geo-zone' boundaries for Seattle Public Schools.
Students who want to attend an option school need to apply. No one is assigned automatically to an option school. Each option school has a geographic priority area (geo zone). The geographic zone tiebreaker is for applicants to an option school who live within a defined area in proximity to the school. Please note that living within the geographic zone does not guarantee assignment to the requested option school, but gives a priority for admission after siblings. Geo Zones may change from year to year as a tool for capacity management.
https://www.seattleschools.org/departments/enrollment-planning/
For questions, please contact enrollmentplanning@seattleschools.org
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This layer was developed by the Research & Analytics Group of the Atlanta Regional Commission, using data from the U.S. Census Bureau’s American Community Survey 5-year estimates for 2013-2017, to show counts and percentages for school enrollment by education level by Neighborhood Planning Units S, T, and V in the Atlanta region.
The user should note that American Community Survey data represent estimates derived from a surveyed sample of the population, which creates some level of uncertainty, as opposed to an exact measure of the entire population (the full census count is only conducted once every 10 years and does not cover as many detailed characteristics of the population). Therefore, any measure reported by ACS should not be taken as an exact number – this is why a corresponding margin of error (MOE) is also given for ACS measures. The size of the MOE relative to its corresponding estimate value provides an indication of confidence in the accuracy of each estimate. Each MOE is expressed in the same units as its corresponding measure; for example, if the estimate value is expressed as a number, then its MOE will also be a number; if the estimate value is expressed as a percent, then its MOE will also be a percent.
The user should also note that for relatively small geographic areas, such as census tracts shown here, ACS only releases combined 5-year estimates, meaning these estimates represent rolling averages of survey results that were collected over a 5-year span (in this case 2013-2017). Therefore, these data do not represent any one specific point in time or even one specific year. For geographic areas with larger populations, 3-year and 1-year estimates are also available.
For further explanation of ACS estimates and margin of error, visit Census ACS website.
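Because ACS margins of error are published at the 90 percent confidence level, a quick reliability check can be derived from an estimate and its MOE. The sketch below is illustrative only and uses made-up numbers.

```python
def acs_reliability(estimate: float, moe: float) -> dict:
    """Convert a published ACS margin of error (90% confidence level) into an
    approximate standard error and coefficient of variation, which gives a
    quick read on how reliable an estimate is."""
    z90 = 1.645                      # z-score for the ACS 90% confidence level
    se = moe / z90                   # approximate standard error
    cv = (se / estimate) * 100 if estimate else float("inf")
    return {"standard_error": se, "cv_percent": cv}

# Example with made-up numbers: an estimate of 1,200 enrolled pupils
# with a published MOE of +/-150 gives a CV of roughly 7.6%.
print(acs_reliability(1200, 150))
```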
Naming conventions:
Prefixes:
None: Count
p: Percent
r: Rate
m: Median
a: Mean (average)
t: Aggregate (total)
ch: Change in absolute terms (value in t2 - value in t1)
pch: Percent change ((value in t2 - value in t1) / value in t1)
chp: Change in percent (percent in t2 - percent in t1)
Suffixes:
None: Change over two periods
_e: Estimate from most recent ACS
_m: Margin of Error from most recent ACS
_00: Decennial 2000
Attributes:
SumLevel: Summary level of geographic unit (e.g., County, Tract, NSA, NPU, DSNI, SuperDistrict, etc.)
GEOID: Census tract Federal Information Processing Series (FIPS) code
NAME: Name of geographic unit
Planning_Region: Planning region designation for ARC purposes
Acres: Total area within the tract (in acres)
SqMi: Total area within the tract (in square miles)
County: County identifier (combination of Federal Information Processing Series (FIPS) codes for state and county)
CountyName: County name
Pop3P_e: # Population ages 3 and over, 2017
Pop3P_m: # Population ages 3 and over, 2017 (MOE)
InSchool_e: # Population 3 years and over enrolled in school, 2017
InSchool_m: # Population 3 years and over enrolled in school, 2017 (MOE)
InPreSchool_e: # Enrolled in nursery school, preschool, 2017
InPreSchool_m: # Enrolled in nursery school, preschool, 2017 (MOE)
pInPreSchool_e: % Enrolled in nursery school, preschool, 2017
pInPreSchool_m: % Enrolled in nursery school, preschool, 2017 (MOE)
InKindergarten_e: # Enrolled in kindergarten, 2017
InKindergarten_m: # Enrolled in kindergarten, 2017 (MOE)
pInKindergarten_e: % Enrolled in kindergarten, 2017
pInKindergarten_m: % Enrolled in kindergarten, 2017 (MOE)
InElementary_e: # Enrolled in elementary school (grades 1-8), 2017
InElementary_m: # Enrolled in elementary school (grades 1-8), 2017 (MOE)
pInElementary_e: % Enrolled in elementary school (grades 1-8), 2017
pInElementary_m: % Enrolled in elementary school (grades 1-8), 2017 (MOE)
InHS_e: # Enrolled in high school (grades 9-12), 2017
InHS_m: # Enrolled in high school (grades 9-12), 2017 (MOE)
pInHS_e: % Enrolled in high school (grades 9-12), 2017
pInHS_m: % Enrolled in high school (grades 9-12), 2017 (MOE)
InCollegeGradSch_e: # Enrolled in college or graduate school, 2017
InCollegeGradSch_m: # Enrolled in college or graduate school, 2017 (MOE)
last_edited_date: Last date the feature was edited by ARC
Source: U.S. Census Bureau, Atlanta Regional Commission
Date: 2013-2017
For additional information, please visit the Census ACS website.
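As an illustration of how the suffixed count fields above can be combined, the sketch below derives a percent and an approximate margin of error for that percent using the Census Bureau's standard approximation for the MOE of a proportion; the input values are made up and the helper name is hypothetical.

```python
import math

def derived_percent(row: dict, num: str, den: str) -> tuple:
    """Compute a derived percent (e.g. pInPreSchool_e) and its MOE from count
    fields that follow the naming convention above, using the Census Bureau's
    approximation for the MOE of a proportion."""
    x, x_moe = row[f"{num}_e"], row[f"{num}_m"]
    y, y_moe = row[f"{den}_e"], row[f"{den}_m"]
    p = x / y
    radicand = x_moe**2 - (p**2) * (y_moe**2)
    if radicand < 0:
        # Fall back to the ratio formula when the proportion formula fails.
        radicand = x_moe**2 + (p**2) * (y_moe**2)
    return p * 100, math.sqrt(radicand) / y * 100

# Illustrative only: made-up tract values, not real ACS figures.
tract = {"InPreSchool_e": 120, "InPreSchool_m": 40,
         "Pop3P_e": 2500, "Pop3P_m": 180}
pct, pct_moe = derived_percent(tract, "InPreSchool", "Pop3P")
print(f"pInPreSchool_e ~= {pct:.1f}%, pInPreSchool_m ~= +/-{pct_moe:.1f}%")
```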
The main objective of the interventions supported by this impact evaluation is to strengthen linkages between communities and schools to improve education outcomes. Rigorous evidence generated from the research will provide valuable information to Pakistani policy makers, donors and development practitioners on the ways in which school based management reforms can be strengthened in low-governance environments like Sindh, Pakistan. The findings of this research are valuable for the ongoing dialogue with the GoSindh on school based management, one of the critical reform areas supported under the Second Sindh Education Sector Program (SEP-II).
The impact evaluation is a component of the World Bank's ongoing technical and advisory support to the Government of Sindh for improving the quality and performance of government primary schools as part of its medium-term, multi-pronged Sindh Education Sector Reform Program (SERP-II). An important subprogram under SERP and SERP-II has been the revitalization of school management committees (SMCs) in government schools, with the provision of annual school improvement grants and basic guidelines on SMCs' rights, roles and responsibilities across Sindh province. An area of concern in these early efforts has been poor or dissipating community interest and engagement. The interventions piloted in select districts of rural Sindh were designed by the World Bank in partnership with the Reform Support Unit, which is the implementation arm of the Education and Literacy Department of GoSindh. The aim of these interventions was to explore concrete ways to elicit meaningful and sustained local community engagement in improving education outcomes.
Both the baseline survey and the interventions were implemented in three pilot districts in 2012 and 2013. The core intervention being evaluated is community engagement to revitalize SMCs under two distinct mechanisms: 1) a community-level meeting to engage the community in a dialogue for school improvement via SMCs; 2) a virtual network of community members to engage in a similar dialogue supported through text messages on mobile phones.
The first intervention arm makes use of an existing social platform, enabling community members to participate in traditional meetings to acquire information and engage the community in dialogue and discussion on school-related issues. The second arm has created an innovative virtual platform through which registered community members receive school-related information, anonymously send text messages about these issues and receive a summary of key observations or issues twice every month.
The baseline survey, documented here, was implemented in January 2012 - January 2013. There is no midline survey for this study. The endline survey will start in January 2015.
Mirpur Khas, Mitiari and Sanghar districts in Sindh province.
The unit of randomization for the intervention is a village.
Administered questionnaires have the following units of analysis: individuals (teachers, students, parents), households, schools, and communities.
All primary schools and rural households in Mirpur Khas, Mitiari and Sanghar districts in Sindh province.
Sample survey data [ssd]
The districts chosen for the study were based on district ranks in terms of school density (from Administrative School Census (ASC) data) and school participation rates (from the Pakistan Social and Living Standards Measurement Survey (PSLM)). One district each was chosen from the low, middle and top category to make an overall representative sample of rural Sindh. By this method, the final districts selected were Mirpur Khas, Mitiari and Sanghar. Using the ASC data in terms of number of schools, Mitiari was ranked the third smallest district, Mirpur Khas was ranked at number twelve (middle rank) and Sanghar at number eighteen (top rank). Using the PSLM education indicators (proportion of adults who ever attended school and school participation rate of primary-age children), Sanghar ranked at the top, followed by Mitiari (median) and lastly by Mirpur Khas.
The Administrative School Census (ASC) data is collected by the Government of Sindh every year to provide an updated list of primary schools in all districts of Sindh. The census data for 2010-2011 was used to randomly draw 300 villages within our sample districts. However, because of poor quality of administrative census data, researchers conducted a census listing of all households and also mapped all primary schools in these 300 villages to set the population frame for the study.
The school sampling strategy was primarily to target all primary schools in the main settlement that were either open on the day of visit or closed for a period of less than one year. In addition, 15% of the remaining schools in these villages were also surveyed to capture spillover effects. For villages with no school in the main settlement, all schools located out of the main settlement were surveyed. For villages that did not meet these criteria, all schools were sampled even if the school was closed for more than one year. 4 villages had to be dropped because no school was found in village-level mapping of primary schools.
The household sampling strategy for each village was to randomly select 20 households from the main settlement and 8 households from the peripheral settlements conditional on the household having at least one child of school going age (5-16 years). From this list, the first 16 households were to be surveyed and in case the head of the household was not available, the household was substituted from the list of four buffer households. For the peripheral settlement, any 4 out of the 8 households were surveyed. In addition, household questionnaires were also administered to all SMC members from the target schools, approximately 4 households in a village.
Overall, at the school level, 514 school, 454 head teacher, 409 teacher and 4,573 student questionnaires were administered. At the household level, 6,505 head of household, 6,503 spouse, 5,281 child and 901 school management committee questionnaires were administered.
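A minimal sketch of the household selection rule described above (20 main-settlement draws with the first 16 as the target sample and 4 as buffers, plus 4 of 8 peripheral-settlement draws), assuming lists of eligible household identifiers; names are illustrative and this is not the study's actual sampling code.

```python
import random

def select_village_households(main_eligible, peripheral_eligible, seed=None):
    """Illustrative selection for one village: draw 20 eligible households
    (at least one child aged 5-16) from the main settlement, treating the
    first 16 as the target sample and the last 4 as buffer replacements,
    plus 8 from peripheral settlements of which any 4 are surveyed."""
    rng = random.Random(seed)
    main_draw = rng.sample(main_eligible, k=min(20, len(main_eligible)))
    target, buffers = main_draw[:16], main_draw[16:]

    peripheral_draw = rng.sample(peripheral_eligible, k=min(8, len(peripheral_eligible)))
    peripheral_target = peripheral_draw[:4]

    return {"main_target": target, "main_buffers": buffers,
            "peripheral_target": peripheral_target}
```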
Face-to-face [f2f]
School Surveys
Detailed data on school-level variables such as enrollment, attendance, teacher on-task, facilities, school committees, funding and expenditure were collected through a set of four questionnaires: School Observation, Teacher Roster, Head Teacher and Teacher Questionnaire. In addition, a list of School Management Committees (SMC) members was enumerated at the school-level for household surveys.
School Observation Questionnaire
School questionnaire consisted of five sections and was based on the observation of the enumerator about school building, facilities, hygiene conditions, on-going classroom practices and teacher activities. The questionnaire also required the enumerator to record school GPS coordinates and school visit details.
Head Teacher Questionnaire
The Head Teacher questionnaire comprised two parts: information based on the head teacher’s knowledge and information based on official school records. The first part gathered data on the respondent’s personal and professional background as well as his knowledge of students, school facilities and the SMC. The second part collected official school details on the school improvement plan, enrollment, attendance, fees, SMC funds and expenditures.
School Teacher Questionnaire
The Teacher questionnaire consisted of nine sections and was administered to all teachers present in the school. It gathered the personal and professional information of the teacher as well as his perceptions of SMC functionality, student learning and returns to education.
Teacher Roster Questionnaire
The Teacher Roster collected information on teachers currently teaching in the school and those who had left or transferred over the previous two years. The survey recorded teacher information on attendance, contact number, gender, contract type, pay scale and class taught. For teachers who had left, it also covered the reasons for leaving the school. The information for the roster was provided by the head teacher or the most senior teacher in the school.
Household Surveys
The baseline survey also covered households to gather information on demographic and socioeconomic characteristics, parent choices about children's schools, parent engagement with the school's SMC, and adult perceptions of returns to schooling and quality of learning, through four sets of questionnaires: Household Roster, Household Head Questionnaire, Spouse of Head Questionnaire and SMC Member Questionnaire.
Household Roster Questionnaire
The household roster questionnaire collected information about the gender, age, marital status, education and job status of all members of the household. The roster was filled in by the head of the household; in his absence, it was completed by other members, who were required to state their relationship to the head.
Head of the Household Questionnaire
The head of the household questionnaire consisted of fifteen sections and collected detailed information on family members, education, consumption patterns, business details, household expenditures and incomes. It also recorded information about the respondent’s aspirations, awareness of the SMC, trust in the education system, and perceptions about returns to education and the quality of learning in the respective school.
Questionnaire for Female
The Education Quality Improvement Programme in Tanzania (EQUIP-T) is a large, four-year Department for International Development (DFID) funded programme. It targets some of the most educationally disadvantaged regions in Tanzania to increase the quality of primary education and improve pupil learning outcomes, in particular for girls. EQUIP-T covers seven regions in Tanzania and has five components: 1) enhanced professional capacity and performance of teachers; 2) enhanced school leadership and management skills; 3) strengthened systems that support the district and regional management of education; 4) strengthened community participation and demand for accountability; and 5) strengthened learning and dissemination of results. Together, changes in these five outputs are intended to reduce constraints on pupil learning and thereby contribute to better-quality education (outcome) and ultimately improved pupil learning (impact).
The independent impact evaluation (IE) of EQUIP-T conducted by Oxford Policy Management Ltd (OPM) is a four-year study funded by DFID. It covers five of the seven programme regions (the two regions that will join EQUIP-T in a later phase are not included) and the first four EQUIP-T components (see above). The IE uses a mixed methods approach where qualitative and quantitative methods are integrated. The baseline approach consists of three main parts to allow the IE to: 1) capture the situation prior to the start of EQUIP-T so that changes can be measured during the follow-up data collection rounds; impact attributable to the programme assessed and mechanisms for programme impact explored; 2) develop an expanded programme theory of change to help inform possible programme adjustments; and 3) provide an assessment of the education situation in some of the most educationally disadvantaged regions in Tanzania to the Government and other education stakeholders.
This approach includes:
Standard two lesson observations in Kiswahili and mathematics.
Qualitative fieldwork in nine research sites that overlap with a sub-set of the quantitative survey schools, in 2014, 2016 and 2018, consisting of key informant interviews (KIIs) and focus group discussions (FGDs) with head teachers, teachers, pupils, parents, school committee (SC) members, region, district and ward education officials and EQUIP-T programme staff; and
A mapping of causal mechanisms, and assessment of the strength of assumptions underpinning the programme theory of change using qualitative and quantitative IE baseline data as well as national and international evidence.
The data and documentation contained in the World Bank Microdata Catalog are those from the EQUIP-T IE quantitative baseline survey conducted in 2014. For information on the qualitative research findings see OPM. 2015b. EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume II: Methods and Technical Annexes.
The survey is representative of the 17 EQUIP-T programme treatment districts. The survey is NOT representative of the eight control districts. For more details see the section on Representativeness and OPM. 2015. EQUIP-Tanzania Impact Evaluation: Final Baseline Technical Report, Volume I: Results and Discussion and OPM. 2015. EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume II: Methods and Technical Annexes.
-Dodoma Region: Bahi DC, Chamwino DC, Kongwa DC, Mpwapwa DC -Kigoma Region: Kakonko DC, Kibondo DC -Shinyanga Region: Kishapu DC, Shinyanga DC -Simiyu Region: Bariadi DC, Bariadi TC, Itilima DC, Maswa DC, Meatu DC -Tabora Region: Igunga DC, Nzega DC, Sikonge DC, Uyui DC
-Arusha Region: Ngorongoro DC
-Mwanza Region: Misungwi DC
-Pwani Region: Rufiji DC
-Rukwa Region: Nkasi DC
-Ruvuma Region: Tunduru DC
-Singida Region: Ikungi DC, Singida DC
-Tanga Region: Kilindi DC
Sample survey data [ssd]
Because the EQUIP-T regions and districts were purposively selected (see OPM. 2015. EQUIP-Tanzania Impact Evaluation: Final Baseline Technical Report, Volume I: Results and Discussion.), the IE sampling strategy used propensity score matching (PSM) to: (i) match eligible control districts to the pre-selected and eligible EQUIP-T districts (see below), and (ii) match schools from the control districts to a sample of randomly sampled treatment schools in the treatment districts. The same schools will be surveyed for each round of the IE (panel of schools) and standard 3 pupils will be interviewed at each round of the survey (no pupil panel).
Eligible control and treatment districts were those not participating in any other education programme or project that may confound the measurement of EQUIP-T impact. To generate the list of eligible control and treatment districts, all districts that are contaminated because of other education programmes or projects or may be affected by programme spill-over were excluded as follows:
-All districts located in Lindi and Mara regions as these are part of the EQUIP-T programme, but the impact evaluation does not cover these two regions; -Districts that will receive partial EQUIP-T programme treatment or will be subject to potential EQUIP-T programme spill-overs; -Districts that are receiving other education programmes/projects that aim to influence the same outcomes as the EQUIP-T programme and would confound measurement of EQUIP-T impact; -Districts that were part of pre-test 1 (two districts); and -Districts that were part of pre-test 2 (one district).
To be able to select an appropriate sample of pupils and teachers within schools and districts, the sampling frame consisted of information at three levels:
-District level; -School level; and -Within school level.
The sampling frame data at the district and school levels was compiled from the following sources: the 2002 and 2012 Tanzania Population Censuses, Education Management Information System (EMIS) data from the Ministry of Education and Vocational Training (MoEVT) and the Prime Minister's Office for Regional and Local Government (PMO-RALG), and the UWEZO 2011 student learning assessment survey. For within school level sampling, the frames were constructed upon arrival at the selected schools and was used to sample pupils and teachers on the day of the school visit.
Because the treatment districts were known, the first step was to find sufficiently similar control districts that could serve as the counterfactual. PSM was used to match eligible control districts to the pre-selected, eligible treatment districts using the following matching variables: Population density, proportion of male headed households, household size, number of children per household, proportion of households that speak an ethnic language at home, and district level averages for household assets, infrastructure, education spending, parental education, school remoteness, pupil learning levels and pupil drop out.
In the second stage, schools in the treatment districts were selected using stratified systematic random sampling. The schools were selected using a probability proportional to size approach, where the measure of school size was the standard two enrolment of pupils. This means that schools with more pupils had a higher probability of being selected into the sample. To obtain a representative sample of programme treatment schools, the sample was implicitly stratified along four dimensions:
-Districts; -PSLE scores for Kiswahili; -PSLE scores for mathematics; and -Total number of teachers per school.
As in stage one, a non-random PSM approach was used to match eligible control schools to the sample of treatment schools. The matching variables were similar to the ones used as stratification criteria: Standard two enrolment, PSLE scores for Kiswahili and mathematics, and the total number of teachers per school.
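A minimal sketch of the kind of nearest-neighbour propensity score matching described in this stage, using a logistic propensity model over the stated matching variables; it assumes scikit-learn and NumPy, the data are synthetic, and it is not the evaluators' actual matching procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def match_control_schools(treated_X, control_X):
    """Nearest-neighbour propensity score matching: fit a logistic model of
    treatment status on the matching variables, then pair each treated school
    with the unmatched control school closest in propensity score."""
    X = np.vstack([treated_X, control_X])
    y = np.concatenate([np.ones(len(treated_X)), np.zeros(len(control_X))])

    scores = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    treated_scores = scores[: len(treated_X)]
    control_scores = scores[len(treated_X):]

    matches, used = {}, set()
    for i, ts in enumerate(treated_scores):
        # Choose the closest control school not already matched (1:1 without replacement).
        order = np.argsort(np.abs(control_scores - ts))
        j = next(int(k) for k in order if int(k) not in used)
        used.add(j)
        matches[i] = j
    return matches

# Hypothetical usage: rows are schools, columns are the matching variables
# (standard two enrolment, PSLE Kiswahili, PSLE mathematics, teachers per school).
rng = np.random.default_rng(0)
treated = rng.normal(size=(100, 4))
controls = rng.normal(size=(300, 4))
pairs = match_control_schools(treated, controls)
```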
The midline and endline surveys will be conducted for the same schools as the baseline survey (a panel of schools). However, the IE will not have a panel of pupils as a pupil only attends standard three once (unless repeating). Thus, the IE will have a repeated cross-section of pupils in a panel of schools.
Stage 4: Selection of pupils and teachers within schools
Successful free school proposal forms approved by the Department for Education during the wave 13 application round.
The free school application form includes the following information: