Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
A Capacity Management Plan (CMP) is one of the strategies the Department for Education employs to support government schools experiencing increased enrolment demand. Increased demand arises from several factors, including changes to the demographic profile of the local community and increases in local housing development. The purpose of a CMP is to help a school return to, or maintain, a sustainable enrolment level and to help children attend their local school. The CMP outlines the enrolment criteria relevant to each school. The Department for Education has been implementing CMPs since 2009, and over time they have generally supported schools in managing enrolment demand within their enrolment capacity. Further strategies include the provision of additional accommodation, implementation of a school zone, and planning for new educational facilities. CMPs are approved by the Minister and published in the South Australian Government Gazette. Refer to the Capacity Management Plan webpage for more information and links to the CMPs published in the South Australian Government Gazette.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
As competition for jobs intensifies in the contemporary social climate, the difficulty recent graduates face in adjusting to society has become a pressing issue. The social and industry adaptability of college vocational education, and students' career development skills, are profoundly influenced by the direction in which colleges, universities, and majors develop. Accordingly, 616 students and 7 instructors from three institutions in Southwest China serve as the research subjects for this study. The objective is to examine social support for colleges and universities in Southwest China, as well as their adaptation to society, to establish the link between the two, and to test the mediating significance of psychological capital. In addition, analysis of the instructor interview questionnaires reveals students' actual needs in adjusting to working life. The findings indicate that: (1) social support for colleges and universities significantly and positively predicts social adaptation; (2) psychological capital plays an important mediating role between social support for colleges and universities and their adaptation to society; and (3) more attention and resources should be devoted to career preparation for students. Schools, as the primary institution, should build a variety of relationships with social enterprises; the steps taken to facilitate students' integration into society will also help schools establish their reputation. This enhances a school's capacity for comprehensive management, thus promoting a virtuous growth cycle for both school and society.
The School Attendance Boundaries Survey (SABS) was an experimental survey conducted by the U.S. Department of Education's (ED) National Center for Education Statistics (NCES), with assistance from the U.S. Census Bureau, to collect school attendance boundaries for regular schools in the 50 states and the District of Columbia. Attendance boundaries, sometimes known as school catchment areas, define the geographic extent served by a local school for the purpose of student assignments. School district administrators create attendance areas to help organize and plan district-wide services, and districts may adjust individual school boundaries to help balance the physical capacity of local schools with changes in the local school-age population. The SABS collection includes boundaries for more than 70,000 schools in over 12,000 school districts throughout the U.S. All information contained in this file is in the public domain. Data users are advised to review NCES program documentation and feature class metadata to understand the limitations and appropriate use of these data.
Education Quality Improvement Programme in Tanzania (EQUIP-T) is a Government of Tanzania programme, funded by the UK Department for International Development (DFID), which seeks to improve the quality of primary education, especially for girls, in seven regions of Tanzania. It focuses on strengthening the professional capacity and performance of teachers, school leadership and management, the systems which support district management of education, and community participation in education.
The independent Impact Evaluation (IE) of EQUIP-T is a four-year study funded by the United Kingdom Department for International Development (DFID). It is designed to: i) generate evidence on the impact of EQUIP-T on primary pupil learning outcomes, including any differential effects for boys and girls; ii) examine perceptions of effectiveness of different EQUIP-T components; iii) provide evidence on the fiscal affordability of scaling up EQUIP-T post-2018; and iv) communicate evidence generated by the impact evaluation to policy-makers and key education stakeholders.
The research priorities for the midline IE are captured in a comprehensive midline evaluation matrix (see Annex B in the 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume I: Results and Discussion' under Reports and policy notes). The matrix sets out evaluation questions linked to the programme theory of change and identifies the sources of evidence to answer each question: the quantitative survey, the qualitative research, or both. It asks questions related to the expected results at each stage along the results chain (from the receipt of inputs to delivery of outputs, and contributions to outcomes and impact) under each of the programme's components. The aim is to establish: (i) whether changes have happened as expected; (ii) why they happened or did not happen (i.e. whether key assumptions in the theory of change hold or not); (iii) whether there are any important unanticipated changes; and (iv) what links there are between the components in driving changes.
The main IE research areas are: - Impact of EQUIP-T on standard 3 pupil learning in Kiswahili and mathematics. - Impact of EQUIP-T on teacher absence from school and from classrooms. - Impact of EQUIP-T on selected aspects of school leadership and management.
The IE uses a mixed methods approach that includes: - A quantitative survey of 100 government primary schools in 17 programme treatment districts and 100 schools in 8 control districts in 2014, 2016 and 2018 covering: - Standard three pupils and their parents/caregivers; - Teachers who teach standards 1-3 Kiswahili; - Teachers who teach standards 1-3 mathematics; - Teachers who teach standards 4-7 mathematics; - Head teachers; and - Standard two lesson observations in Kiswahili and mathematics.
The midline data available in the World Bank Microdata Catalog are from the EQUIP-T IE quantitative midline survey conducted in 2016. For the qualitative research findings and methods see 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume I: Results and Discussion' and 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume II: Methods and Supplementary Evidence' under Reports and policy notes.
The survey is representative of the 17 EQUIP-T programme treatment districts. The survey is NOT representative of the 8 control districts. For more details see the section on Representativeness in 'EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume I: Results and Discussion' and 'EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume II: Methods and Technical Annexes' under Reports.
The 17 treatment districts are: - Dodoma Region: Bahi DC, Chamwino DC, Kongwa DC, Mpwapwa DC - Kigoma Region: Kakonko DC, Kibondo DC - Shinyanga Region: Kishapu DC, Shinyanga DC - Simiyu Region: Bariadi DC, Bariadi TC, Itilima DC, Maswa DC, Meatu DC - Tabora Region: Igunga DC, Nzega DC, Sikonge DC, Uyui DC
The 8 control districts are:
- Arusha Region: Ngorongoro DC
- Mwanza Region: Misungwi DC
- Pwani Region: Rufiji DC
- Rukwa Region: Nkasi DC
- Ruvuma Region: Tunduru DC
- Singida Region: Ikungi DC, Singida DC
- Tanga Region: Kilindi DC
Sample survey data [ssd]
Because the EQUIP-T regions and districts were purposively selected (see 'EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume I: Results and Discussion' under Reports and policy notes), the IE sampling strategy used propensity score matching (PSM) to: (i) match eligible control districts to the pre-selected, eligible EQUIP-T districts (see below), and (ii) match schools from the control districts to a randomly sampled set of treatment schools in the treatment districts. The same schools are surveyed in each round of the IE (a panel of schools), and standard 3 pupils are interviewed afresh in each round of the survey (no pupil panel).
Identifying districts eligible for matching
Eligible control and treatment districts were those not participating in any other education programme or project that may confound the measurement of EQUIP-T impact. To generate the list of eligible control and treatment districts, all districts that are contaminated because of other education programmes or projects or may be affected by programme spill-over were excluded as follows:
Sampling frame
To be able to select an appropriate sample of pupils and teachers within schools and districts, the sampling frame consisted of information at three levels:
The sampling frame data at the district and school levels were compiled from the following sources: the 2002 and 2012 Tanzania Population Censuses, Education Management Information System (EMIS) data from the Ministry of Education and Vocational Training (MoEVT) and the Prime Minister's Office for Regional and Local Government (PMO-RALG), and the UWEZO 2011 student learning assessment survey. For within-school sampling, the frames were constructed upon arrival at the selected schools and were used to sample pupils and teachers on the day of the school visit.
Sampling stages
Stage 1: Selection of control districts
Because the treatment districts were known, the first step was to find sufficiently similar control districts that could serve as the counterfactual. PSM was used to match eligible control districts to the pre-selected, eligible treatment districts using the following matching variables: population density, proportion of male-headed households, household size, number of children per household, proportion of households that speak an ethnic language at home, and district-level averages for household assets, infrastructure, education spending, parental education, school remoteness, pupil learning levels, and pupil dropout.
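The district-matching step can be sketched as greedy nearest-neighbour matching on estimated propensity scores. This is a common PSM variant only; the report does not specify the exact matching algorithm, and the scores and district pairings below are purely illustrative (in practice the scores would come from a regression of treatment status on the matching variables listed above).

```python
# Greedy one-to-one nearest-neighbour matching on propensity scores.
# A hedged sketch of the PSM step; scores here are made up for illustration.

def match_on_propensity(treated, controls, caliper=0.1):
    """Match each treated unit to the nearest unmatched control.

    treated, controls: dicts mapping unit id -> propensity score.
    Returns a dict of treated id -> control id; a treated unit is left
    unmatched when no remaining control falls within the caliper.
    """
    available = dict(controls)
    pairs = {}
    # Match treated units in increasing order of score.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs[t_id] = c_id
            del available[c_id]  # matching without replacement
    return pairs

# Illustrative, invented scores for a few of the named districts.
treated = {"Bahi": 0.62, "Kongwa": 0.55, "Uyui": 0.71}
controls = {"Ngorongoro": 0.58, "Misungwi": 0.66, "Rufiji": 0.30, "Ikungi": 0.74}
print(match_on_propensity(treated, controls))
```

The caliper guards against poor matches: a treated district with no control of comparable score simply goes unmatched rather than being paired with a dissimilar one.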
Stage 2: Selection of treatment schools
In the second stage, schools in the treatment districts were selected using stratified systematic random sampling. The schools were selected with probability proportional to size, where the measure of school size was the school's standard two enrolment. This means that schools with more pupils had a higher probability of being selected into the sample. To obtain a representative sample of programme treatment schools, the sample was implicitly stratified along four dimensions:
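The probability-proportional-to-size selection can be sketched as systematic PPS sampling over a frame sorted by the stratification variables. This is a hedged illustration of the standard technique, not the survey team's exact code; school identifiers and enrolment counts are hypothetical.

```python
# Systematic probability-proportional-to-size (PPS) sampling, where the
# size measure stands in for standard two enrolment. Illustrative only.

def pps_systematic(frame, n, start):
    """Select n units with probability proportional to size.

    frame: list of (unit_id, size) pairs, assumed already sorted by the
    implicit stratification variables; start: a number in [0, 1) standing
    in for the random start draw.
    """
    total = sum(size for _, size in frame)
    interval = total / n
    # Evenly spaced selection points along the cumulative size scale.
    targets = [(start + k) * interval for k in range(n)]
    chosen, cumulative, i = [], 0, 0
    for unit_id, size in frame:
        cumulative += size
        while i < n and targets[i] < cumulative:
            chosen.append(unit_id)
            i += 1
    return chosen

# Hypothetical schools with standard two enrolments.
frame = [("A", 120), ("B", 40), ("C", 80), ("D", 160), ("E", 100)]
print(pps_systematic(frame, n=2, start=0.5))
```

Sorting the frame by the stratification variables before drawing the systematic sample is what makes the stratification "implicit": the equal-interval selection points then spread the sample across strata in proportion to their total enrolment.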
Stage 3: Selection of control schools
As in stage one, a non-random PSM approach was used to match eligible control schools to the sample of treatment schools. The matching variables were similar to those used as stratification criteria: standard two enrolment, PSLE scores for Kiswahili and mathematics, and the total number of teachers per school.
The midline survey was conducted for the same schools as the baseline survey (a panel of schools) and the endline survey in 2018 will cover the same sample of schools. However, the IE does not have a panel of pupils as a pupil only attends standard three once (unless repeating). Thus, the IE sample is a repeated cross-section of pupils in a panel of schools.
Stage 4: Selection of pupils and teachers within schools
Pupils and teachers were sampled within schools using systematic random sampling based on school registers. The within-school sampling was assisted by selection tables generated automatically within the computer-assisted survey instruments.
Per school, 15 standard 3 pupils were sampled. For the teacher development needs assessment (TDNA), in the sample treatment schools, up to three teachers of standards 1 to 3 Kiswahili, up to three teachers of standards 1 to 3 mathematics; and up to three
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Background: Ten percent of the school-aged population have speech, language, and communication needs (SLCN) that impact access to the curriculum. Successful implementation of classroom-based SLCN interventions can reduce barriers to learning, thereby improving educational outcomes for this vulnerable population. The challenges of implementing innovations in educational settings are well documented, yet few studies have addressed such considerations when developing and piloting universal-level SLCN interventions for use in Irish schools.
Methods: A qualitative exploratory study was undertaken to establish the acceptability, feasibility, and appropriateness of a universal-level SLCN intervention. An advisory panel of teachers (n = 8) and children with SLCN (n = 2) were engaged as co-researchers in the study. The Communication Supporting Classrooms Observation Tool, developed as part of the Better Communication Project in the UK, was trialled across a diverse sample of school settings (n = 5). Semi-structured interviews were conducted with school practitioners and school leaders, and a deductive content analysis was undertaken using the domains of the Consolidated Framework for Implementation Research.
Discussion: The observation tool was viewed as acceptable, with suggested additions. Integrating use of the tool within existing data-informed school self-evaluation processes aimed at supporting school improvement was noted as a potential means of supporting implementation. A knowledge gap in relation to school-based models of support for SLCN was identified, which may negatively impact implementation. An implementation strategy targeting coherence, cognitive engagement, and contextual integration is indicated if the tool is to be normalised into routine practice in Irish classrooms. Implementation needs appeared to vary at the school level.
Conclusions: The importance of early-stage exploration to guide implementation planning when developing and testing universal-level interventions for SLCN in schools is highlighted. Engaging an advisory panel provides important insights to guide implementation decisions. Findings suggest an adaptive design is required when planning implementation studies targeting classroom settings.
This dataset shows primary school catchment areas within Stirling Council's planning policy area. Data were provided by Education Services in July 2024 and include capacity, current, and projected pupil numbers.
The dashboard project collects new data in each country using three new instruments: a School Survey, a Policy Survey, and a Survey of Public Officials. Data collection involves school visits, classroom observations, legislative reviews, teacher and student assessments, and interviews with teachers, principals, and public officials. In addition, the project draws on some existing data sources to complement the new data it collects. A major objective of the GEPD project was to develop focused, cost-effective instruments and data-collection procedures, so that the dashboard can be inexpensive enough to be applied (and re-applied) in many countries. The team achieved this by streamlining and simplifying existing instruments, and thereby reducing the time required for data collection and training of enumerators.
National
Schools, teachers, students, public officials
Sample survey data [ssd]
The aim of the Global Education Policy Dashboard school survey is to produce nationally representative estimates that can detect changes in the indicators over time at a minimum power of 80% and a 0.05 significance level. We also wish to detect differences by urban/rural location. For the school survey, we employ a two-stage random sample design: in the first stage, a sample of typically around 200 schools (depending on local conditions) is drawn, chosen in advance by Bank staff; in the second stage, a sample of teachers and students is drawn in the field to answer questions from the survey modules. A total of 10 teachers are sampled for absenteeism; five teachers are interviewed and given a content knowledge exam; three 1st-grade students are assessed at random; and a classroom of 4th-grade students is assessed at random. Stratification is based on the school's urban/rural classification and on region. When stratifying by region, we work with our partners within the country to make sure all relevant geographical divisions are included.
For the Survey of Public Officials, we sample a total of 200 public officials. Roughly 60 officials are typically surveyed at the federal level, while 140 are surveyed at the regional/district level. For selection of officials at the regional and district levels, we employ a cluster sampling strategy, in which roughly 10 regional offices (or the equivalent secondary administrative unit) are chosen at random from among the regions in which schools were sampled. From these 10 regions, we then typically select around 10 districts (tertiary administrative units) from among the districts in which schools were sampled. The result of this sampling approach is that, for 10 clusters, we have links from the school to the district office to the regional office to the central office.
Within the regions/districts, five or six officials are sampled, including the head of the organization, the HR director, two division directors from finance and planning, and one or two professional employees selected at random from among the finance and planning departments and one other service-related department chosen at random. At the federal level, we interview the HR director, finance director, planning director, and the directors of three randomly selected service-focused departments. In addition to the directors of each of these departments, a sample of 9 professional employees is chosen in each department at random on the day of the interview.
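The 80% power and 0.05 significance targets stated above correspond to a standard minimum-detectable-effect calculation. The sketch below is a textbook approximation, assuming a two-sided z-test on a standardized mean difference with an optional design-effect inflation for clustered sampling; it is not the GEPD team's actual calculation.

```python
from statistics import NormalDist
import math

def required_sample_per_arm(effect_sd, power=0.80, alpha=0.05, deff=1.0):
    """Sample size per arm to detect a standardized effect of effect_sd
    (difference in means divided by the SD) with a two-sided z-test,
    inflated by a design effect for clustered sampling.
    A textbook approximation, not the GEPD team's calculation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_beta = z.inv_cdf(power)           # quantile for the target power
    n_srs = 2 * ((z_alpha + z_beta) / effect_sd) ** 2
    return math.ceil(n_srs * deff)

# e.g. a 0.25 SD effect under simple random sampling:
print(required_sample_per_arm(0.25))
```

The design effect matters here because both surveys cluster respondents (students within schools, officials within regional and district offices), which reduces the effective sample size relative to simple random sampling.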
The sample for the Global Education Policy Dashboard in Sierra Leone (SLE) was based in part on a previous sample of 260 schools from an earlier EGRA study. Details of the sampling for that study are quoted below. An additional booster sample of 40 schools was chosen to be representative of smaller schools with fewer than 30 learners.
EGRA Details:
"The sampling frame began with the 2019 Annual School Census (ASC) list of primary schools as provided by UNICEF/MBSSE where the sample of 260 schools for this study were obtained from an initial list of 7,154 primary schools. Only schools that meet a pre-defined selection criteria were eligible for sampling.
To achieve the recommended sample size of 10 learners per grade, schools that had an enrolment of at least 30 learners in Grade 2 in 2019 were considered. To achieve a high level of confidence in the findings and generate enough data for analysis, the selection criteria only considered schools that: • had an enrolment of at least 30 learners in grade 1; and • had an active grade 4 in 2019 (enrolment not zero)
The sample was taken from a population of 4,597 primary schools that met the eligibility criteria above, representing 64.3% of all the 7,154 primary schools in Sierra Leone (as per the 2019 school census). Schools with higher numbers of learners were purposefully selected to ensure the sample size could be met in each site.
As a result, a sample of 260 schools was drawn using proportional-to-size allocation with simple random sampling without replacement in each stratum. In the population, there were 16 districts and five school ownership categories (community, government, mission/religious, private and others). A total of 63 strata were formed from combinations of the 16 districts and school ownership categories. In each stratum, a sample size was computed proportional to the total population and samples were drawn randomly without replacement. Drawing from other EGRA/EGMA studies conducted by Montrose in the past, a backup sample of up to 78 schools (30% of the sample population) with which enumerator teams can replace sample schools was also drawn.
In the distribution of sampled schools by ownership, majority of the sampled schools are owned by mission/religious group (62.7%, n=163) followed by the government owned schools at 18.5% (n=48). Additionally, in school distribution by district, majority of the sampled schools (54%) were found in Bo, Kambia, Kenema, Kono, Port Loko and Kailahun districts. Refer to annex 9. for details on the population and sample distribution by district."
Because of the restriction that at least 30 learners were available in Grade 2, we chose to add an additional 40 schools to the sample from among smaller schools, with between 3 and 30 grade 2 students. The objective of this supplement was to make the sample more nationally representative, as the restriction reduced the sampling frame for the EGRA/EGMA sample by over 1,500 schools from 7,154 to 4,597.
The 40 schools were chosen in a manner consistent with the original set of EGRA/EGMA schools. The 16 districts formed the strata. In each stratum, the number of schools selected were proportional to the total population of the stratum, and within stratum schools were chosen with probability proportional to size.
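The stratum-proportional allocation of the booster sample can be sketched as proportional allocation with largest-remainder rounding, so that the per-stratum counts sum exactly to the 40-school target. The rounding rule is an assumption, and the district names and eligible-school counts below are illustrative, not the actual frame.

```python
# Proportional allocation across strata with largest-remainder rounding.
# A hedged sketch; district names and school counts are invented.

def proportional_allocation(stratum_sizes, total_sample):
    """stratum_sizes: dict of stratum -> number of eligible schools.
    Returns a dict of stratum -> sample size, summing to total_sample."""
    population = sum(stratum_sizes.values())
    quotas = {s: total_sample * n / population for s, n in stratum_sizes.items()}
    alloc = {s: int(q) for s, q in quotas.items()}  # floor each quota
    shortfall = total_sample - sum(alloc.values())
    # Hand the remaining units to the strata with the largest fractional parts.
    for s in sorted(quotas, key=lambda s: quotas[s] - alloc[s], reverse=True)[:shortfall]:
        alloc[s] += 1
    return alloc

strata = {"Bo": 300, "Kambia": 150, "Kenema": 250, "Kono": 100}
print(proportional_allocation(strata, total_sample=40))
```

Within each stratum, the allocated number of schools would then be drawn with probability proportional to size, as described above.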
Computer Assisted Personal Interview [capi]
More information pertaining to each of the three instruments can be found below: - School Survey: The School Survey collects data primarily on practices (the quality of service delivery in schools), but also on some de facto policy indicators. It consists of streamlined versions of existing instruments—including Service Delivery Surveys on teachers and inputs/infrastructure, Teach on pedagogical practice, Global Early Child Development Database (GECDD) on school readiness of young children, and the Development World Management Survey (DWMS) on management quality—together with new questions to fill gaps in those instruments. Though the number of modules is similar to the full version of the Service Delivery Indicators (SDI) Survey, the number of items and the complexity of the questions within each module is significantly lower. The School Survey includes 8 short modules: School Information, Teacher Presence, Teacher Survey, Classroom Observation, Teacher Assessment, Early Learner Direct Assessment, School Management Survey, and 4th-grade Student Assessment. For a team of two enumerators, it takes on average about 4 hours to collect all information in a given school. For more information, refer to the Frequently Asked Questions.
The overall aim of the USAID/SA basic education program is to improve primary grade reading outcomes by building teacher effectiveness and strengthening classroom and school management. This is being accomplished through support to innovative, local interventions that have a demonstrated capacity for scale-up. The main USAID/SA program is the School Capacity and Innovation Program (SCIP), which also leverages significant private sector resources, amplifying the impact of USAID’s investment in the South African education system. SCIP is co-funded by The ELMA Foundation and J.P. Morgan and designed in collaboration with the South African Department of Basic Education. SCIP supports local South African models or interventions that work directly with teachers and school management teams in innovative ways in order to improve their practice as instructional leaders and managers. SCIP is aligned to the USAID Global Education Strategy (2011–2015) which supports interventions to improve learning outcomes with a focus on primary grade reading as a measure of performance. In addition to seeking initiatives that demonstrate innovation and impact, sustainability and scalability are key components of the SCIP program. The Strengthening Teaching of Early Language and Literacy (STELLAR) Program improves the language and literacy skills of Grade R children from disadvantaged communities in South Africa by training and supporting Grade R teachers. Grade R (also called the Reception Year) is the year of schooling before Grade 1.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Education Quality Improvement Programme in Tanzania (EQUIP-T) is a six-year (2014-20) Government of Tanzania programme, funded by the United Kingdom Department for International Development (DFID), which seeks to improve the quality of primary education and to improve pupil learning outcomes, especially for girls. The programme focuses on strengthening professional capacity and performance of teachers, school leadership and management, systems which support district management of education, and community participation in education. Initially, the programme was intended to run for four years, with activities targeted at seven of the most educationally disadvantaged regions in Tanzania. In 2017 the programme was extended for a further two years; the extension introduced some new sub-components to the seven regions and a reduced package of interventions to two new regions. The independent Impact Evaluation (IE) of EQUIP-T is a five-year study funded by DFID. It is designed to: i) generate evidence on the impact of EQUIP-T on primary pupil learning outcomes, including any differential effects for boys and girls; ii) examine perceptions of effectiveness of different EQUIP-T components; iii) provide evidence on the fiscal affordability of scaling up EQUIP-T post-endline; and iv) communicate evidence generated by the impact evaluation to policy-makers and key education stakeholders. The evaluation uses a quasi-experimental approach to quantitative estimation of impact that combines propensity score matching (PSM) with difference-in-differences (DID). The research priorities for the quantitative endline IE are captured in a comprehensive endline evaluation matrix (see Annex C in the 'EQUIP-Tanzania Impact Evaluation. Endline Quantitative Technical Report, Volume I: Results and Discussion' under Reports and policy notes). The matrix sets out evaluation questions linked to the programme theory of change.
It asks questions related to the expected results at each stage along the results chain (from the receipt of inputs to delivery of outputs, and contributions to outcomes and impact) under each of the programme's components. The aim is to establish: (i) whether changes have happened as expected; (ii) why they happened or did not happen (i.e. whether key assumptions in the theory of change hold or not); (iii) whether there are any important unanticipated changes; and (iv) what links there are between the components in driving changes.
The main IE research areas are: - Impact of EQUIP-T on standard 3 pupil learning in Kiswahili and mathematics. - Impact of EQUIP-T on teacher absence from school and from classrooms. - Impact of EQUIP-T on selected aspects of school leadership and management.
The IE uses a mixed methods approach that includes: - A quantitative survey of 100 government primary schools in 17 programme treatment districts and 100 schools in 8 control districts in 2014, 2016 and 2018 covering: standard three pupils and their parents/caregivers; teachers who teach standards 1-3 Kiswahili; teachers who teach standards 1-3 mathematics; schools; head teachers; and standard two lesson observations in Kiswahili and mathematics. - Qualitative fieldwork in a few treatment schools that overlap with a sub-set of the quantitative survey schools, in 2014, 2016 and 2019, consisting of key informant interviews (KIIs) and focus group discussions (FGDs) with head teachers, teachers, pupils, parents, school committee (SC) members, PTP members, region, district and ward education officials, and EQUIP-T programme staff.
The endline data available in the World Bank Microdata Catalog are from the EQUIP-T IE quantitative endline survey conducted in 2018. The endline qualitative research will take place in mid-2019, with results available in early 2020.
The overall aim of the USAID/SA basic education program is to improve primary grade reading outcomes by building teacher effectiveness and strengthening classroom and school management. This is being accomplished through support to innovative, local interventions that have a demonstrated capacity for scale-up. The main USAID/SA program is the School Capacity and Innovation Program (SCIP), which also leverages significant private sector resources, amplifying the impact of USAID’s investment in the South African education system. SCIP is co-funded by The ELMA Foundation and J.P. Morgan and designed in collaboration with the South African Department of Basic Education. SCIP supports local South African models or interventions that work directly with teachers and school management teams in innovative ways in order to improve their practice as instructional leaders and managers. SCIP is aligned to the USAID Global Education Strategy (2011–2015) which supports interventions to improve learning outcomes with a focus on primary grade reading as a measure of performance. In addition to seeking initiatives that demonstrate innovation and impact, sustainability and scalability are key components of the SCIP program. The goal of the kaMhinga Literacy Project is to demonstrate that the combination of teacher training and community-based teacher support can sustainably achieve primary grade reading levels at a 60% learner literacy level. This will be done through activities aimed at developing the capacity of teachers. Two assessments are reported per year: a baseline assessment completed in February and a final assessment completed in November. To date, a total of three assessments have been reported: Baseline February 2013, Final November 2013, and Baseline February 2014.
‘Local authorities seeking proposers’ contains details of all local authorities seeking proposers to establish a new academy or free school.
It includes the:
‘Section 6A approved and under consideration schools’ contains details of:
It includes the:
Read the free school presumption guidance for further information about the process for establishing new schools.
Education Quality Improvement Programme in Tanzania (EQUIP-T) is a Government of Tanzania programme, funded by UK DFID, which seeks to improve the quality of primary education, especially for girls, in seven regions of Tanzania. It focuses on strengthening professional capacity and performance of teachers, school leadership and management, systems which support district management of education, and community participation in education. The independent Impact Evaluation (IE) of EQUIP-T is a four-year study funded by the United Kingdom Department for International Development (DFID). It is designed to: i) generate evidence on the impact of EQUIP-T on primary pupil learning outcomes, including any differential effects for boys and girls; ii) examine perceptions of effectiveness of different EQUIP-T components; iii) provide evidence on the fiscal affordability of scaling up EQUIP-T post-2018; and iv) communicate evidence generated by the impact evaluation to policy-makers and key education stakeholders. The research priorities for the midline IE are captured in a comprehensive midline evaluation matrix (see Annex B in the 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume I: Results and Discussion' under Reports and policy notes). The matrix sets out evaluation questions linked to the programme theory of change, and identifies sources of evidence to answer each question: either the quantitative survey or qualitative research, or both. It asks questions related to the expected results at each stage along the results chain (from the receipt of inputs to delivery of outputs, and contributions to outcomes and impact) under each of the programme's components. The aim is to establish: (i) whether changes have happened as expected; (ii) why they happened or did not happen (i.e.
whether key assumptions in the theory of change hold or not); (iii) whether there are any important unanticipated changes; and (iv) what links there are between the components in driving changes. The main IE research areas are: Impact of EQUIP-T on standard 3 pupil learning in Kiswahili and mathematics. Impact of EQUIP-T on teacher absence from school and from classrooms. Impact of EQUIP-T on selected aspects of school leadership and management. The IE uses a mixed methods approach that includes: A quantitative survey of 100 government primary schools in 17 programme treatment districts and 100 schools in 8 control districts in 2014, 2016 and 2018 covering: Standard three pupils and their parents/caregivers; Teachers who teach standards 1-3 Kiswahili; Teachers who teach standards 1-3 mathematics; Teachers who teach standards 4-7 mathematics; Head teachers; and Standard two lesson observations in Kiswahili and mathematics. Qualitative fieldwork in nine research sites that overlap with a sub-set of the quantitative survey schools, in 2014, 2016 and 2018, consisting of key informant interviews (KIIs) and focus group discussions (FGDs) with head teachers, teachers, pupils, parents, school committee (SC) members, region, district and ward education officials and EQUIP-T programme staff. The midline data available in the World Bank Microdata Catalog are from the EQUIP-T IE quantitative midline survey conducted in 2016. For the qualitative research findings and methods see 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume I: Results and Discussion' and 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume II: Methods and Supplementary Evidence' under Reports and policy notes.
The overall aim of the USAID/SA basic education program is to improve primary grade reading outcomes by building teacher effectiveness and strengthening classroom and school management. This is being accomplished through support to innovative, local interventions that have a demonstrated capacity for scale-up. The main USAID/SA program is the School Capacity and Innovation Program (SCIP), which also leverages significant private sector resources, amplifying the impact of USAID’s investment in the South African education system. SCIP is co-funded by The ELMA Foundation and J.P. Morgan and designed in collaboration with the South African Department of Basic Education. SCIP supports local South African models or interventions that work directly with teachers and school management teams in innovative ways in order to improve their practice as instructional leaders and managers. SCIP is aligned to the USAID Global Education Strategy (2011–2015) which supports interventions to improve learning outcomes with a focus on primary grade reading as a measure of performance. In addition to seeking initiatives that demonstrate innovation and impact, sustainability and scalability are key components of the SCIP program. The Teacher Assessment Resources for Monitoring and Improving Instruction for Foundation Phase (TARMII-FP) will provide teachers with a computer-based assessment tool that will help teachers to more effectively address individual student learning needs in literacy. TARMII-FP is implemented by the Human Sciences Research Council and is co-funded by USAID, the ELMA Foundation, and J.P. Morgan Chase Foundation, with non-financial support from the South African Department of Basic Education. This $1.5 million project, part of the SCIP, is designed to improve primary grade reading outcomes by building teacher effectiveness and strengthening classroom and school management. 
Running from July 2012 to June 2015, TARMII-FP will enable teachers to draw upon a database of thousands of reading activities and test items to generate assessments and homework exercises tailored for their students. The tool will allow teachers to record and analyze student results.
https://www.cognitivemarketresearch.com/privacy-policy
According to Cognitive Market Research, The Global K 12 International Schools market size is USD 7.8 billion in 2023 and will expand at a compound annual growth rate (CAGR) of 9.00% from 2023 to 2030.
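The projection above is a direct application of the standard CAGR compounding formula; a minimal sketch using only the two figures the report quotes (the function name is ours, for illustration):

```python
def project_market_size(base: float, cagr: float, years: int) -> float:
    """Compound a base value at a constant annual growth rate."""
    return base * (1 + cagr) ** years

# Report figures: USD 7.8 billion in 2023, growing at a 9.00% CAGR to 2030.
size_2030 = project_market_size(7.8, 0.09, 7)   # roughly USD 14.3 billion
```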
The demand for K-12 International Schools is rising due to the growing demand for international education.
Demand for English language international schools remains higher in the K 12 International Schools market.
The International Baccalaureate category held the highest K 12 International Schools market revenue share in 2023.
The North American K 12 International Schools market will continue to lead, whereas the European K 12 International Schools market will experience the most substantial growth through 2030.
Increase in Government Initiatives to Provide Viable Market Output
Increasing government initiatives are propelling the growth of the K 12 International Schools market. Policymakers worldwide are increasingly recognizing the importance of quality education and are implementing initiatives to enhance the accessibility and standards of K12 education. Financial support, curriculum development, and infrastructure improvements are common focus areas. These initiatives aim to foster a conducive learning environment, boost student outcomes, and prepare future generations for a rapidly evolving global landscape. As governments actively invest in education, the K12 international schools market benefits from a supportive regulatory framework and increased resources, fostering growth and innovation in the sector.
For instance, Saudi Arabia announced the 'Madrasati' e-learning platform in 2020. In 2021, the Online Learning Consortium (OLC) ranked it among the seven top global e-learning platforms.
(Source: www.arabnews.com/node/1918431/saudi-arabia)
Growing Demand for International Education to Propel Market Growth
The growth of international education has significantly impacted the K 12 International Schools market. Parents increasingly recognize the importance of a globally oriented curriculum that fosters cultural awareness and equips students with a competitive edge in the global job market. International schools offer diverse and comprehensive learning experiences, often incorporating internationally recognized curricula such as the International Baccalaureate (IB) or Cambridge Assessment International Education. This demand is further fueled by the rise in expatriate populations, the desire for English language proficiency, and the aspiration for a well-rounded education beyond traditional academic metrics. As a result, the K12 international schools sector is witnessing sustained growth to meet the evolving educational preferences of a globally-minded generation.
For instance, on October 26, 2022, US-headquartered investment house Safanad and international education platform Global School Management announced an initial investment of $200 million in the Middle East to acquire education assets, significantly expanding their portfolio of K-12 schools in the region.
(Source: safanad.com/posts_news/safanad-and-global-school-management-plan-investments-of-us200-million-in-mena-education-sector/)
Rising Demand for Online Education Fuels the Market
Key Dynamics of the K 12 International Schools Market
Key Drivers of the K 12 International Schools Market
Rising Expatriate Population and Global Mobility: As international professionals move for employment, the demand for high-quality, globally recognized educational systems such as IB, Cambridge, and American curricula is increasing. K–12 international schools fulfill this need by providing consistent academic standards and cultural inclusivity, establishing themselves as the preferred option for expatriate families throughout Asia, the Middle East, and Europe.
Growing Demand from Local Elite and Affluent Families: In addition to expatriate families, local high-income households are progressively opting for international schools that offer English-medium instruction, global curricula, and enhanced pathways to prestigious universities overseas. This trend is particularly notable in emerging economies where international education is perceived as a gateway to global opportunities and future success.
Increased Investment from Private Equity and EdTech Players: The K–12 international school sector is drawing significant private equity investment due t...
The Population and Housing Census (PHC), like the previous censuses, was a national exercise; it was the country's sixth completed census. The information presented in this census report was extracted from the abstracts prepared by the enumerators immediately after completion of the 2013 census count. The PHC covered characteristics such as population size, sex composition, density and household size at local government area and district levels. This report was followed by detailed basic reports that gave information related to demographic, environmental, communication, agricultural and other socio-economic characteristics of the population and housing units. The provisional population estimates indicated that the population of The Gambia has grown steadily since the commencement of complete censuses in 1963, rising from under a third of a million persons in 1963 to 1.4 million in 2003 and 1.9 million in 2013. The PHC enumeration was successfully conducted from April 8th to 28th 2013. The census was carried out under the legal framework of the Statistical Act 2005, which empowered the Gambia Bureau of Statistics (GBoS) to conduct a population census in 2013 and every ten years thereafter. The specific objectives of the PHC included:
National
Census/enumeration data [cen]
The preliminary results of the 2013 Population and Housing Census show that 1,882,450 persons were enumerated in The Gambia. The pilot census covered a sample of 40 Enumeration Areas (EAs) selected across the country using statistical techniques. In all, 40 enumerators and 8 supervisors were selected after the 2013 training.
Face-to-face [f2f]
The PHC comprised a set of survey instruments. These were the following questionnaires: 1. Form A Household Questionnaire - Part 1 2. Form B Group Quarters and Floating Population Questionnaire - Part 1 3. Form C Building & Compound Particulars
In 1991 the International Institute for Educational Planning (IIEP) and a number of Ministries of Education in Southern and Eastern Africa began to work together in order to address training and research needs in Education. The focus for this work was on establishing long-term strategies for building the capacity of educational planners to monitor and evaluate the quality of their basic education systems. The first two educational policy research projects undertaken by SACMEQ (widely known as "SACMEQ I" and "SACMEQ II") were designed to provide detailed information that could be used to guide planning decisions aimed at improving the quality of education in primary school systems.
During 1995-1998 seven Ministries of Education participated in the SACMEQ I Project. The SACMEQ II Project commenced in 1998 and the surveys of schools, involving 14 Ministries of Education, took place between 2000 and 2004. The survey was undertaken in schools in Botswana, Kenya, Lesotho, Malawi, Mauritius, Mozambique, Namibia, Seychelles, South Africa, Swaziland, Tanzania, Uganda, Zambia and Zanzibar.
Moving from the SACMEQ I Project (covering around 1100 schools and 20,000 pupils) to the SACMEQ II Project (covering around 2500 schools and 45,000 pupils) resulted in a major increase in the scale and complexity of SACMEQ's research and training programmes.
SACMEQ's mission is to: a) Expand opportunities for educational planners to gain the technical skills required to monitor and evaluate the quality of their education systems; and b) Generate information that can be used by decision-makers to plan and improve the quality of education.
National coverage
The target population for SACMEQ's Initial Project was defined as "all pupils at the Grade 6 level in 1995 who were attending registered government or non-government schools". Grade 6 was chosen because it was the grade level where the basics of reading literacy were expected to have been acquired.
Sample survey data [ssd]
Sampling Sample designs in the field of education are usually prepared amid a network of competing constraints. These designs need to adhere to established survey sampling theory and, at the same time, give due recognition to the financial, administrative, and socio-political settings in which they are to be applied. The "best" sample design for a particular project is one that provides levels of sampling accuracy that are acceptable in terms of the main aims of the project, while simultaneously limiting cost, logistic, and procedural demands to manageable levels. The major constraints that were established prior to the preparation of the sample designs for the SACMEQ II Project have been listed below.
Target Population: The target population definitions should focus on Grade 6 pupils attending registered mainstream government or non-government schools. In addition, the defined target population should be constructed by excluding no more than 5 percent of pupils from the desired target population.
Bias Control: The sampling should conform to the accepted rules of scientific probability sampling. That is, the members of the defined target population should have a known and non-zero probability of selection into the sample so that any potential for bias in sample estimates due to variations from "epsem sampling" (equal probability of selection method) may be addressed through the use of appropriate sampling weights.
Sampling Errors: The sample estimates for the main criterion variables should conform to the sampling accuracy requirements set down by the International Association for the Evaluation of Educational Achievement. That is, the standard error of sampling for the pupil tests should be of a magnitude that is equal to, or smaller than, what would be achieved by employing a simple random sample of 400 pupils.
Response Rates: Each SACMEQ country should aim to achieve an overall response rate for pupils of 80 percent. This figure was based on the wish to achieve or exceed a response rate of 90 percent for schools and a response rate of 90 percent for pupils within schools.
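The 80 percent overall target is simply the product of the two 90 percent component targets, rounded down to a whole-number floor; a minimal sketch:

```python
def overall_response_rate(school_rate: float, within_school_rate: float) -> float:
    """Overall pupil response rate: the school-level response rate times the
    pupil response rate within responding schools."""
    return school_rate * within_school_rate

# The two 90% targets combine to 0.81, just above the 80% overall floor.
rate = overall_response_rate(0.90, 0.90)
```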
The Specification of the Target Population The target population for both the SACMEQ I and SACMEQ II Projects was focussed on the Grade 6 level for three main reasons.
First, Grade 6 identified a point near the end of primary schooling where school participation rates were reasonably high for most of the seven countries that participated in the SACMEQ I data collection during 1995-1997, and also reasonably high for most of the fourteen countries that participated in the SACMEQ II collection during 2000-2002. For this reason, Grade 6 represented a point that was suitable for making an assessment of the contribution of primary schooling towards the literacy and numeracy levels of a broad cross-section of society.
Second, the NRCs considered that testing pupils at grade levels lower than Grade 6 was problematic - because in some SACMEQ countries the lower grades were too close to the transition point between the use of local and national languages by teachers in the classroom. This transition point generally occurred at around Grade 3 level - but in some rural areas of some countries it was thought to be as high as Grade 4 level.
Third, the NRCs were of the opinion that the collection of home background information from pupils at grade levels lower than Grade 6 was likely to lack validity for certain key "explanatory" variables. For example, the NRCs felt that children at lower grade levels did not know how many years of education that their parents had received, and they also had difficulty in accurately describing the socioeconomic environment of their own homes (for example, the number of books at home).
Note: Details of sampling design procedures are presented in the "Malawi Working Report".
Face-to-face [f2f]
The data collection for SACMEQ’s Initial Project took place in October 1995 and involved the administration of questionnaires to pupils, teachers, and school heads. The pupil questionnaire contained questions about the pupils’ home backgrounds and their school life; the teacher questionnaire asked about classrooms, teaching practices, working conditions, and teacher housing; and the school head questionnaire collected information about teachers, enrolments, buildings, facilities, and management. A reading literacy test was also given to the pupils. The test was based on items that were selected after a trial-testing programme had been completed.
Data Checking and Data Entry Data preparation commenced soon after the main data collection was completed. The NRCs had to organize the safe return of all materials to the Ministry of Education where the data collection instruments could be checked, entered into computers, and then "cleaned" to remove errors prior to data analysis. The data-checking involved the "hand editing" of data collection instruments by a team of trained staff. They were required to check that: (i) all questionnaires, tests, and forms had arrived back from the sample schools, (ii) the identification numbers on all instruments were complete and accurate, and (iii) certain logical linkages between questions made sense (for example, the two questions to school heads concerning "Do you have a school library?" and "How many books do you have in your school library?").
The next step was the entry of data into computers using the WINDEM software. A team of 5-10 staff normally undertook this work. In some cases the data were "double entered" in order to monitor accuracy. The numbers of keystrokes required to enter one copy of each data collection instrument were as follows: pupil questionnaire: 150; pupil reading test: 85; pupil mathematics test: 65; teacher questionnaire: 587; teacher reading test: 51; teacher mathematics test: 43; school head questionnaire: 319; school form: 58; and pupil name form: 51.
There was a great deal of variation in the delivery dates for the initial versions of the computer-stored SACMEQ II data files. This occurred because of different testing dates and also because of different amounts of time required to complete entry of data into computers.
Data Cleaning The NRCs received written instructions and follow-up support from IIEP staff in the basic steps of data cleaning using the WINDEM software. This permitted the NRCs to (i) identify major errors in the sequence of identification numbers, (ii) cross-check identification numbers across files (for example, to ensure that all pupils were linked with their own reading and mathematics teachers), (iii) ensure that all schools listed on the original sampling frame also had valid data collection instruments and vice-versa, (iv) check for "wild codes" that occurred when some variables had values that fell outside pre-specified reasonable limits, and (v) validate that variables used as linkage devices in later file merges were available and accurate.
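Two of the checks listed above, wild codes and cross-file identification linkage, can be sketched in a few lines. The record layout, field names, and valid range below are hypothetical illustrations, not the actual WINDEM file formats:

```python
def find_wild_codes(records, field, lo, hi):
    """Return IDs of records whose value for `field` falls outside [lo, hi]."""
    return [r["id"] for r in records if not (lo <= r[field] <= hi)]

def unmatched_pupils(pupils, teachers):
    """Return IDs of pupils whose teacher ID has no match in the teacher file."""
    teacher_ids = {t["id"] for t in teachers}
    return [p["id"] for p in pupils if p["teacher_id"] not in teacher_ids]

# Hypothetical records for illustration only.
pupils = [
    {"id": 1, "age": 12, "teacher_id": 101},
    {"id": 2, "age": 99, "teacher_id": 999},   # wild age code and dangling link
]
teachers = [{"id": 101}]
```

Running both checks on the toy data flags pupil 2 twice: once for an out-of-range age and once for a teacher ID that appears in no teacher record.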
A second phase of data preparation directed efforts towards the identification and correction of "wild codes" (which refer to data values that fall outside credible limits), and "inconsistencies" (which refer to different responses to the same, or related, questions). There were also some errors in the identification codes for teachers that needed to be corrected before data could be merged.
During 2002 a supplementary training programme was prepared and delivered to all countries via the Internet. This training led each SACMEQ Research Team
In 1991 the International Institute for Educational Planning (IIEP) and a number of Ministries of Education in Southern and Eastern Africa began to work together in order to address training and research needs in Education. The focus for this work was on establishing long-term strategies for building the capacity of educational planners to monitor and evaluate the quality of their basic education systems. The first two educational policy research projects undertaken by SACMEQ (widely known as "SACMEQ I" and "SACMEQ II") were designed to provide detailed information that could be used to guide planning decisions aimed at improving the quality of education in primary school systems.
During 1995-1998 seven Ministries of Education participated in the SACMEQ I Project. The SACMEQ II Project commenced in 1998 and the surveys of schools, involving 14 Ministries of Education, took place between 2000 and 2004. The survey was undertaken in schools in Botswana, Kenya, Lesotho, Malawi, Mauritius, Mozambique, Namibia, Seychelles, South Africa, Swaziland, Tanzania, Uganda, Zambia and Zanzibar.
Moving from the SACMEQ I Project (covering around 1100 schools and 20,000 pupils) to the SACMEQ II Project (covering around 2500 schools and 45,000 pupils) resulted in a major increase in the scale and complexity of SACMEQ's research and training programmes.
SACMEQ's mission is to: a) Expand opportunities for educational planners to gain the technical skills required to monitor and evaluate the quality of their education systems; and b) Generate information that can be used by decision-makers to plan and improve the quality of education.
The survey covered Mainland Tanzania.
The target population for SACMEQ's Initial Project was defined as "all pupils at the Grade 6 level in 1995 who were attending registered government or non-government schools". Grade 6 was chosen because it was the grade level where the basics of reading literacy were expected to have been acquired.
Sample survey data [ssd]
The sample designs used in the SACMEQ II Project were selected so as to meet the standards set down by the International Association for the Evaluation of Educational Achievement. These standards required that sample estimates of important pupil population parameters should have sampling accuracy that was at least equivalent to a simple random sample of 400 pupils (thereby guaranteeing 95 percent confidence limits for sample means of plus or minus one tenth of a pupil standard deviation unit).
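The "plus or minus one tenth of a standard deviation" figure follows from the usual standard-error formula for a simple random sample; a quick check:

```python
import math

# For a simple random sample of n pupils, the standard error of the mean,
# expressed in pupil standard deviation units, is 1/sqrt(n).
n = 400
se_in_sd_units = 1 / math.sqrt(n)          # 0.05 sd units
ci95_half_width = 1.96 * se_in_sd_units    # 0.098, about one tenth of an sd
```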
Some Constraints on Sample Design Sample designs in the field of education are usually prepared amid a network of competing constraints. These designs need to adhere to established survey sampling theory and, at the same time, give due recognition to the financial, administrative, and socio-political settings in which they are to be applied. The "best" sample design for a particular project is one that provides levels of sampling accuracy that are acceptable in terms of the main aims of the project, while simultaneously limiting cost, logistic, and procedural demands to manageable levels. The major constraints that were established prior to the preparation of the sample designs for the SACMEQ II Project have been listed below.
Target Population: The target population definitions should focus on Grade 6 pupils attending registered mainstream government or non-government schools. In addition, the defined target population should be constructed by excluding no more than 5 percent of pupils from the desired target population.
Bias Control: The sampling should conform to the accepted rules of scientific probability sampling. That is, the members of the defined target population should have a known and non-zero probability of selection into the sample so that any potential for bias in sample estimates due to variations from "epsem sampling" (equal probability of selection method) may be addressed through the use of appropriate sampling weights (Kish, 1965).
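As an illustration of how such weights arise, consider a two-stage design that selects schools with probability proportional to size (PPS) and then a fixed number of pupils per school. The school and sample sizes below are hypothetical, not SACMEQ's actual figures:

```python
def selection_probability(school_size, total_pupils, n_schools, pupils_per_school):
    """Two-stage probability: PPS school selection times within-school selection."""
    p_school = n_schools * school_size / total_pupils   # PPS, ignoring stratification
    p_pupil = pupils_per_school / school_size           # simple random within school
    return p_school * p_pupil

def sampling_weight(p):
    """The weight is the reciprocal of the selection probability."""
    return 1.0 / p

# Hypothetical figures: 185 sample schools, 20 pupils per school,
# 100,000 pupils on the frame.
p = selection_probability(school_size=80, total_pupils=100_000,
                          n_schools=185, pupils_per_school=20)
w = sampling_weight(p)
```

Note that with a fixed within-school take, the school-size terms cancel, so every pupil has the same selection probability (the "epsem" property referred to above); when probabilities do vary, the reciprocal weights restore unbiased estimates.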
Sampling Errors: The sample estimates for the main criterion variables should conform to the sampling accuracy requirements set down by the International Association for the Evaluation of Educational Achievement (Ross, 1991). That is, the standard error of sampling for the pupil tests should be of a magnitude that is equal to, or smaller than, what would be achieved by employing a simple random sample of 400 pupils (Ross, 1985).
Response Rates: Each SACMEQ country should aim to achieve an overall response rate for pupils of 80 percent. This figure was based on the wish to achieve or exceed a response rate of 90 percent for schools and a response rate of 90 percent for pupils within schools.
Administrative and Financial Costs: The number of schools selected in each country should recognize limitations in the administrative and financial resources available for data collection.
Other Constraints: The number of pupils selected to participate in the data collection in each selected school should be set at a level that will maximize the validity of the within-school data collection for the pupil reading and mathematics tests.
The Specification of the Target Population For Tanzania, the desired target population was all pupils enrolled in Standard 6 in the ninth month of the school year (i.e., in the first week of December 2000). A decision was made to exclude pupils in special schools and those in schools which had fewer than 20 Standard 6 pupils, which led to the establishment of the defined target population.
In Tanzania there were 10,786 schools with 529,296 Standard 6 pupils. The excluded population was 17,942 pupils from 1,270 schools, which was 3.3 percent of all pupils. The defined population from which a sample had to be drawn consisted of 511,354 pupils from 9,516 schools.
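The frame arithmetic quoted above can be reproduced directly from the two pupil counts:

```python
# Figures quoted for the Tanzania sampling frame.
total_pupils = 529_296       # desired target population
excluded_pupils = 17_942     # special schools and schools with < 20 pupils

defined_population = total_pupils - excluded_pupils   # 511,354 pupils
excluded_share = excluded_pupils / total_pupils       # just under 0.034
```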
Note: Detailed descriptions of the sample design, sample selection, and sample evaluation procedures have been presented in the "Tanzania Working Report".
Face-to-face [f2f]
The data collection for SACMEQ’s Initial Project took place in October 1995 and involved the administration of questionnaires to pupils, teachers, and school heads. The pupil questionnaire contained questions about the pupils’ home backgrounds and their school life; the teacher questionnaire asked about classrooms, teaching practices, working conditions, and teacher housing; and the school head questionnaire collected information about teachers, enrolments, buildings, facilities, and management. A reading literacy test was also given to the pupils. The test was based on items that were selected after a trial-testing programme had been completed.
Data Entry and Data Cleaning Six persons from the National Examination Council of Tanzania (NECTA) and MOEC were appointed and trained in the use of WINDEM, a special data entry package used in SACMEQ. NECTA and MOEC computers were used for data entry and data cleaning. The process was facilitated by written instructions and follow-up support from IIEP staff, provided mainly via the internet, and permitted the NRCs to: (i) identify major errors in the sequence of identification numbers, (ii) cross-check identification numbers across files (for example, to ensure that all pupils were linked with their own reading and mathematics teachers), (iii) ensure that all schools listed on the original sampling frame also had valid data collection instruments and vice-versa, (iv) check for "wild codes" that occurred when some variables had values that fell outside pre-specified reasonable limits, and (v) validate that variables used as linkage devices in later file merges were available and accurate.
The volume of information to be entered and cleaned from the code sheets was immense despite the user-friendliness of the software, so perseverance and experience of keyboard operation were required. The following counts give a sense of the volume of data entered. The data collection instruments contained the following numbers of items to be coded: school form: 58; pupil name form: 51; pupil questionnaire: 150; pupil reading test: 85; pupil mathematics test: 65; teacher questionnaire: 587; teacher reading test: 51; teacher mathematics test: 43; and school head questionnaire: 319. All the data entered were sent to the IIEP for checking in order to ensure that there were no errors such as inconsistencies or wild values. The IIEP then sent the data back to Tanzania for cleaning, after which the Ministry sent them back to IIEP for further checks. This process continued until the data were completely clean, and it took 21 months to complete (March 2001 to November 2002).
Response rates for pupils and schools respectively were 77% and 98%.
The sample designs employed in the SACMEQ Projects departed markedly from the usual "textbook model" of simple random sampling. This departure demanded that special steps be taken in order to calculate "sampling errors" (that is, measures of the stability of sample estimates of population characteristics).
In 1991 the International Institute for Educational Planning (IIEP) and a number of Ministries of Education in Southern and Eastern Africa began to work together in order to address training and research needs in Education. The focus for this work was on establishing long-term strategies for building the capacity of educational planners to monitor and evaluate the quality of their basic education systems. The first two educational policy research projects undertaken by SACMEQ (widely known as "SACMEQ I" and "SACMEQ II") were designed to provide detailed information that could be used to guide planning decisions aimed at improving the quality of education in primary school systems.
During 1995-1998 seven Ministries of Education participated in the SACMEQ I Project. The SACMEQ II Project commenced in 1998 and the surveys of schools, involving 14 Ministries of Education, took place between 2000 and 2004. The survey was undertaken in schools in Botswana, Kenya, Lesotho, Malawi, Mauritius, Mozambique, Namibia, Seychelles, South Africa, Swaziland, Tanzania, Uganda, Zambia and Zanzibar.
Moving from the SACMEQ I Project (covering around 1,100 schools and 20,000 pupils) to the SACMEQ II Project (covering around 2,500 schools and 45,000 pupils) resulted in a major increase in the scale and complexity of SACMEQ's research and training programmes.
SACMEQ's mission is to: a) Expand opportunities for educational planners to gain the technical skills required to monitor and evaluate the quality of their education systems; and b) Generate information that can be used by decision-makers to plan and improve the quality of education.
National coverage
The target population for SACMEQ's Initial Project was defined as "all pupils at the Grade 6 level in 1995 who were attending registered government or non-government schools". Grade 6 was chosen because it was the grade level where the basics of reading literacy were expected to have been acquired.
Sample survey data [ssd]
The sample designs used in the SACMEQ II Project were selected so as to meet the standards set down by the International Association for the Evaluation of Educational Achievement. These standards required that sample estimates of important pupil population parameters should have sampling accuracy that was at least equivalent to a simple random sample of 400 pupils (thereby guaranteeing 95 percent confidence limits for sample means of plus or minus one tenth of a pupil standard deviation unit).
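The arithmetic behind this standard can be sketched directly: for a simple random sample of 400 pupils, the standard error of the mean is one twentieth of a pupil standard deviation, and the 95 percent confidence half-width is therefore about one tenth of a standard deviation. A minimal illustration (with the pupil standard deviation normalized to 1):

```python
import math

# Standard error of the mean for a simple random sample of n pupils,
# expressed in pupil standard-deviation units (sigma normalized to 1).
n = 400
se = 1 / math.sqrt(n)       # 0.05 standard deviation units
half_width = 1.96 * se      # 95% confidence half-width, approx. 0.098

print(round(se, 3), round(half_width, 3))
```

This is why an "effective sample size" of 400 pupils guarantees 95 percent confidence limits of roughly plus or minus one tenth of a standard deviation.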
Some Constraints on Sample Design Sample designs in the field of education are usually prepared amid a network of competing constraints. These designs need to adhere to established survey sampling theory and, at the same time, give due recognition to the financial, administrative, and socio-political settings in which they are to be applied. The "best" sample design for a particular project is one that provides levels of sampling accuracy that are acceptable in terms of the main aims of the project, while simultaneously limiting cost, logistic, and procedural demands to manageable levels. The major constraints that were established prior to the preparation of the sample designs for the SACMEQ II Project have been listed below.
Target Population: The target population definitions should focus on Grade 6 pupils attending registered mainstream government or non-government schools. In addition, the defined target population should be constructed by excluding no more than 5 percent of pupils from the desired target population.
Bias Control: The sampling should conform to the accepted rules of scientific probability sampling. That is, the members of the defined target population should have a known and non-zero probability of selection into the sample so that any potential for bias in sample estimates due to variations from "epsem sampling" (equal probability of selection method) may be addressed through the use of appropriate sampling weights (Kish, 1965).
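The weighting logic described here can be illustrated with a toy calculation. The selection probabilities below are invented for illustration only: a pupil's overall probability of selection is the product of the stage-wise probabilities, and the sampling weight used to address departures from epsem sampling is its inverse.

```python
# Hypothetical two-stage selection probabilities for one pupil:
# the school's inclusion probability under PPS, then the pupil's
# selection probability within the school under SRS.
p_school = 0.10            # assumed PPS inclusion probability of the school
p_pupil_in_school = 0.25   # assumed within-school selection probability

p_selection = p_school * p_pupil_in_school
weight = 1 / p_selection   # inverse-probability sampling weight

print(weight)  # 40.0
```

Pupils with lower selection probabilities receive proportionally larger weights, so that weighted sample estimates remain unbiased for the defined target population.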
Sampling Errors: The sample estimates for the main criterion variables should conform to the sampling accuracy requirements set down by the International Association for the Evaluation of Educational Achievement (Ross, 1991). That is, the standard error of sampling for the pupil tests should be of a magnitude that is equal to, or smaller than, what would be achieved by employing a simple random sample of 400 pupils (Ross, 1985).
Response Rates: Each SACMEQ country should aim to achieve an overall response rate for pupils of 80 percent. This figure was based on the wish to achieve or exceed a response rate of 90 percent for schools and a response rate of 90 percent for pupils within schools (since 0.90 x 0.90 = 0.81, or roughly 80 percent overall).
Administrative and Financial Costs: The number of schools selected in each country should recognize limitations in the administrative and financial resources available for data collection.
Other Constraints: The number of pupils selected to participate in the data collection in each selected school should be set at a level that will maximize validity of the within-school data collection for the pupil reading and mathematics tests.
Note: Detailed descriptions of the sample design, sample selection, and sample evaluation procedures have been presented in the "Swaziland Working Report".
Face-to-face [f2f]
The data collection for SACMEQ’s Initial Project took place in October 1995 and involved the administration of questionnaires to pupils, teachers, and school heads. The pupil questionnaire contained questions about the pupils’ home backgrounds and their school life; the teacher questionnaire asked about classrooms, teaching practices, working conditions, and teacher housing; and the school head questionnaire collected information about teachers, enrolments, buildings, facilities, and management. A reading literacy test was also given to the pupils. The test was based on items that were selected after a trial-testing programme had been completed.
Data Checking and Data Entry Data preparation commenced soon after the main data collection was completed. The NRCs had to organize the safe return of all materials to the Ministry of Education where the data collection instruments could be checked, entered into computers, and then "cleaned" to remove errors prior to data analysis. The data-checking involved the "hand editing" of data collection instruments by a team of trained staff. They were required to check that: (i) all questionnaires, tests, and forms had arrived back from the sample schools, (ii) the identification numbers on all instruments were complete and accurate, and (iii) certain logical linkages between questions made sense (for example, the two questions to school heads concerning "Do you have a school library?" and "How many books do you have in your school library?").
Data Cleaning The NRCs received written instructions and follow-up support from IIEP staff in the basic steps of data cleaning using the WINDEM software. This permitted the NRCs to (i) identify major errors in the sequence of identification numbers, (ii) cross-check identification numbers across files (for example, to ensure that all pupils were linked with their own reading and mathematics teachers), (iii) ensure that all schools listed on the original sampling frame also had valid data collection instruments and vice-versa, (iv) check for "wild codes" that occurred when some variables had values that fell outside pre-specified reasonable limits, and (v) validate that variables used as linkage devices in later file merges were available and accurate.
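Two of the checks listed above, flagging "wild codes" and cross-checking pupil-teacher linkages, can be sketched as follows. This is an illustrative sketch in plain Python (not the WINDEM software), and the record values, field names, and age limits are invented:

```python
# Invented pupil records: each pupil should link to a known teacher,
# and each value should fall within pre-specified reasonable limits.
pupil_records = [
    {"pupil_id": 1, "teacher_id": 10, "age_years": 12},
    {"pupil_id": 2, "teacher_id": 99, "age_years": 13},  # unknown teacher id
    {"pupil_id": 3, "teacher_id": 11, "age_years": 45},  # value outside limits
]
known_teacher_ids = {10, 11}
AGE_LIMITS = (8, 20)  # assumed plausible age range for a Grade 6 pupil

# "Wild code" check: values outside the pre-specified limits.
wild = [r["pupil_id"] for r in pupil_records
        if not (AGE_LIMITS[0] <= r["age_years"] <= AGE_LIMITS[1])]

# Linkage check: every pupil must reference a teacher on file.
unlinked = [r["pupil_id"] for r in pupil_records
            if r["teacher_id"] not in known_teacher_ids]

print(wild, unlinked)  # [3] [2]
```

Records flagged by either check would be sent back for correction against the original paper instruments rather than silently altered.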
A second phase of data preparation directed efforts towards the identification and correction of "wild codes" (which refer to data values that fall outside credible limits) and "inconsistencies" (which refer to different responses to the same, or related, questions). There were also some errors in the identification codes for teachers that needed to be corrected before data could be merged.
During 2002 a supplementary training programme was prepared and delivered to all countries via the Internet. This training led each SACMEQ Research Team step-by-step through the required data cleaning procedures - with the NRCs supervising "hands-on" data cleaning activities and IIEP staff occasionally using advanced software systems to validate the quality of the work involved in each data-cleaning step.
This resulted in a "cyclical" process whereby data files were cleaned by the NRC and then emailed to the IIEP for checking and then emailed back to the NRC for further cleaning.
Response rates for pupils and schools respectively were 92% and 99%.
The sample designs employed in the SACMEQ Projects departed markedly from the usual "textbook model" of simple random sampling. This departure demanded that special steps be taken in order to calculate "sampling errors" (that is, measures of the stability of sample estimates of population characteristics).
In the report (Swaziland Working Report) a brief overview of various aspects of the general concept of "sampling error" has been presented. This has included a discussion of notions of "design effect", "the effective sample size", and the "Jackknife procedure" for estimating sampling errors.
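The two related notions of "design effect" and "effective sample size" can be shown with a small worked example. The variances below are invented for illustration: the design effect (deff) is the ratio of the sampling variance of an estimate under the complex cluster design to the variance that a simple random sample of the same size would have achieved, and the effective sample size is the actual sample size deflated by deff.

```python
# Invented figures for illustration only.
n = 2000               # assumed number of pupils in the cluster sample
var_complex = 0.0020   # assumed variance of the mean under the cluster design
var_srs = 0.0005       # assumed variance under SRS of the same size

deff = var_complex / var_srs   # design effect
n_effective = n / deff         # SRS size with equivalent accuracy

print(deff, n_effective)
```

In this invented case a cluster sample of 2,000 pupils delivers only the accuracy of a simple random sample of 500, which is why SACMEQ's sampling-error calculations could not rely on the textbook SRS formulas and used the Jackknife procedure instead.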
In 1991 the International Institute for Educational Planning (IIEP) and a number of Ministries of Education in Southern and Eastern Africa began to work together in order to address training and research needs in Education. The focus for this work was on establishing long-term strategies for building the capacity of educational planners to monitor and evaluate the quality of their basic education systems. The first two educational policy research projects undertaken by SACMEQ (widely known as "SACMEQ I" and "SACMEQ II") were designed to provide detailed information that could be used to guide planning decisions aimed at improving the quality of education in primary school systems.
During 1995-1998 seven Ministries of Education participated in the SACMEQ I Project. The SACMEQ II Project commenced in 1998 and the surveys of schools, involving 14 Ministries of Education, took place between 2000 and 2004. The survey was undertaken in schools in Botswana, Kenya, Lesotho, Malawi, Mauritius, Mozambique, Namibia, Seychelles, South Africa, Swaziland, Tanzania, Uganda, Zambia and Zanzibar.
Moving from the SACMEQ I Project (covering around 1,100 schools and 20,000 pupils) to the SACMEQ II Project (covering around 2,500 schools and 45,000 pupils) resulted in a major increase in the scale and complexity of SACMEQ's research and training programmes.
SACMEQ's mission is to: a) Expand opportunities for educational planners to gain the technical skills required to monitor and evaluate the quality of their education systems; and b) Generate information that can be used by decision-makers to plan and improve the quality of education.
National coverage
The target population for SACMEQ's Initial Project was defined as "all pupils at the Grade 6 level in 1995 who were attending registered government or non-government schools". Grade 6 was chosen because it was the grade level where the basics of reading literacy were expected to have been acquired.
Sample survey data [ssd]
Two-stage sampling procedures were used. In the first stage, schools were selected with probability proportional to size (PPS); in the second stage, a simple random sample of pupils was drawn within each selected school.
Note: Details of sampling design procedures are presented in the "Seychelles Working Report".
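The two-stage design described above can be sketched in a few lines. The school names and enrolments here are invented for illustration; stage one uses systematic PPS selection on the cumulated enrolments, and stage two draws a simple random sample of pupils within each selected school:

```python
import random

random.seed(0)  # deterministic for illustration

# Invented school list: school name -> Grade 6 enrolment.
schools = {"A": 120, "B": 300, "C": 80, "D": 500}
n_schools, pupils_per_school = 2, 20

# Stage 1: systematic PPS selection on the cumulated enrolment scale.
total = sum(schools.values())
interval = total / n_schools
start = random.uniform(0, interval)
targets = [start + i * interval for i in range(n_schools)]

selected = []
for t in targets:
    cum = 0
    for name, size in schools.items():
        cum += size
        if t <= cum:
            selected.append(name)
            break

# Stage 2: simple random sample of pupils within each selected school.
sample = {s: random.sample(range(1, schools[s] + 1), pupils_per_school)
          for s in selected}

print(selected)
```

Larger schools are more likely to be hit by a systematic target, which is what gives every pupil a known, calculable probability of selection across the two stages.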
Face-to-face [f2f]
The data collection for SACMEQ’s Initial Project took place in October 1995 and involved the administration of questionnaires to pupils, teachers, and school heads. The pupil questionnaire contained questions about the pupils’ home backgrounds and their school life; the teacher questionnaire asked about classrooms, teaching practices, working conditions, and teacher housing; and the school head questionnaire collected information about teachers, enrolments, buildings, facilities, and management. A reading literacy test was also given to the pupils. The test was based on items that were selected after a trial-testing programme had been completed.
Data Checking and Data Entry Data preparation commenced soon after the main data collection was completed. The NRCs had to organize the safe return of all materials to the Ministry of Education where the data collection instruments could be checked, entered into computers, and then "cleaned" to remove errors prior to data analysis. The data-checking involved the "hand editing" of data collection instruments by a team of trained staff. They were required to check that: (i) all questionnaires, tests, and forms had arrived back from the sample schools, (ii) the identification numbers on all instruments were complete and accurate, and (iii) certain logical linkages between questions made sense (for example, the two questions to school heads concerning "Do you have a school library?" and "How many books do you have in your school library?").
The next step was the entry of data into computers using the WINDEM software. A team of 5-10 staff normally undertook this work. In some cases the data were "double entered" in order to monitor accuracy.
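The "double entry" check amounts to a field-by-field comparison of two independently keyed versions of the same instrument, with any mismatch flagged for verification against the paper copy. A minimal sketch, with invented field names and values:

```python
# Two independently keyed versions of the same (invented) instrument.
entry_1 = {"pupil_id": 501, "q01": 3, "q02": 1, "q03": 4}
entry_2 = {"pupil_id": 501, "q01": 3, "q02": 2, "q03": 4}

# Flag every field where the two keyings disagree.
discrepancies = {field: (entry_1[field], entry_2[field])
                 for field in entry_1
                 if entry_1[field] != entry_2[field]}

print(discrepancies)  # {'q02': (1, 2)}
```

A flagged field indicates a keystroke error in at least one of the two entries, so the original paper instrument is consulted to decide which value is correct.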
The numbers of keystrokes required to enter one copy of each data collection instrument were as follows: pupil questionnaire: 150; pupil reading test: 85; pupil mathematics test: 65; teacher questionnaire: 587; teacher reading test: 51; teacher mathematics test: 43; school head questionnaire: 319; school form: 58; and pupil name form: 51.
Data Cleaning The NRCs received written instructions and follow-up support from IIEP staff in the basic steps of data cleaning using the WINDEM software. This permitted the NRCs to (i) identify major errors in the sequence of identification numbers, (ii) cross-check identification numbers across files (for example, to ensure that all pupils were linked with their own reading and mathematics teachers), (iii) ensure that all schools listed on the original sampling frame also had valid data collection instruments and vice-versa, (iv) check for "wild codes" that occurred when some variables had values that fell outside pre-specified reasonable limits, and (v) validate that variables used as linkage devices in later file merges were available and accurate.
A second phase of data preparation directed efforts towards the identification and correction of "wild codes" (which refer to data values that fall outside credible limits) and "inconsistencies" (which refer to different responses to the same, or related, questions). There were also some errors in the identification codes for teachers that needed to be corrected before data could be merged.
During 2002 a supplementary training programme was prepared and delivered to all countries via the Internet. This training led each SACMEQ Research Team step-by-step through the required data cleaning procedures - with the NRCs supervising "hands-on" data cleaning activities and IIEP staff occasionally using advanced software systems to validate the quality of the work involved in each data-cleaning step.
This resulted in a "cyclical" process whereby data files were cleaned by the NRC and then emailed to the IIEP for checking and then emailed back to the NRC for further cleaning.
Nine cycles of this process were required to complete the data cleaning, which took nine months in total.
Response rates for pupils and schools respectively were 96% and 100%.
The sample designs employed in the SACMEQ Projects departed markedly from the usual "textbook model" of simple random sampling. This departure demanded that special steps be taken in order to calculate "sampling errors" (that is, measures of the stability of sample estimates of population characteristics).
In the report (Seychelles Working Report) a brief overview of various aspects of the general concept of "sampling error" has been presented. This has included a discussion of notions of "design effect", "the effective sample size", and the "Jackknife procedure" for estimating sampling errors.