21 datasets found
  1. Secondary school performance tables in England: 2012 to 2013

    • gov.uk
    Updated Jan 23, 2014
    Cite
    Department for Education (2014). Secondary school performance tables in England: 2012 to 2013 [Dataset]. https://www.gov.uk/government/statistics/secondary-school-performance-tables-in-england-2012-to-2013
    Dataset updated
    Jan 23, 2014
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Area covered
    England
    Description

    The secondary school performance tables show:

    • attainment results for pupils at the end of key stage 4

    • key stage 2 to 4 progress measures in English and mathematics

    • how the performance of deprived pupils compares against other pupils in the school

    • any differences in the performance of low-attaining pupils, high-attaining pupils, and pupils performing at expected levels

    Additional data on schools is available, including information on the expenditure of each maintained school open for the full 2012 to 2013 financial year. The expenditure data shows spend-per-pupil statistics for a wide range of expenditure categories, including:

    • funding and income
    • education staff spend
    • learning resources and curriculum spend

    The school spend data also includes:

    • information about the school (such as the proportion of pupils in the school eligible for free school meals)
    • headline key stage 4 performance data
    • comparisons against the local authority and national averages
    • the numbers of teachers, teaching assistants and other school staff

    It also provides:

    • the pupil-to-teacher ratio
    • the mean gross salary of full-time teachers
    • information on the characteristics of the pupils attending the school
    • pupil absence data

    for each school.

    Performance tables: http://www.education.gov.uk/schools/performance/

    Attainment statistics team

    Email: Attainment.STATISTICS@education.gov.uk

    Telephone: Raffaele Sasso 07469 413 581

  2. School and college performance tables: 2012 to 2013

    • gov.uk
    Updated Jan 23, 2014
    Cite
    Department for Education (2014). School and college performance tables: 2012 to 2013 [Dataset]. https://www.gov.uk/government/statistics/school-and-college-performance-tables-in-england-2012-to-2013
    Dataset updated
    Jan 23, 2014
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Description

    These performance tables provide information on the attainment of students of sixth-form age in local secondary schools and further education sector colleges in the academic year 2012 to 2013. They also show how these results compare with other schools and colleges in the local authority area and in England as a whole.

    The tables report the results of 16- to 18-year-old students at the end of advanced level study in the 2012 to 2013 academic year. All schools and colleges in a local authority area are listed in alphabetical order, including:

    • maintained secondary schools
    • academies
    • free schools
    • independent schools
    • further education colleges with students aged 16 to 18

    Special schools that have chosen to be included are also listed, as are any sixth-form centres or consortia that operate in an area.

    This year, the performance indicators are separated into three cohorts:

    • A level
    • academic
    • vocational

    To be included in a cohort, a student needs to have taken at least one substantial qualification in one or more of the qualification types. Students following programmes of mixed qualification types may belong to more than one cohort, therefore full-time equivalent (FTE) figures are provided alongside student numbers. FTE figures take account of the proportion of time a student spends in each cohort based on the size of the qualification.
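
    To make the FTE weighting concrete, here is a minimal Python sketch (illustrative only; the helper and the qualification sizes are assumptions, not the Department's published methodology) of apportioning one student's full-time-equivalent contribution across cohorts by qualification size:

    # Illustrative only: split one student's FTE across cohorts in
    # proportion to the relative size of each qualification taken.
    def fte_by_cohort(qualifications):
        """qualifications: list of (cohort, size) pairs for one student,
        where size is a relative measure of qualification size (e.g.
        guided learning hours). Returns the FTE contribution per cohort."""
        total = sum(size for _, size in qualifications)
        fte = {}
        for cohort, size in qualifications:
            fte[cohort] = fte.get(cohort, 0.0) + size / total
        return fte

    # A student taking one A level and one vocational qualification of equal
    # size counts as 0.5 FTE in each cohort while appearing in both headcounts.
    print(fte_by_cohort([("A level", 180), ("vocational", 180)]))
    # {'A level': 0.5, 'vocational': 0.5}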

    Performance tables: http://www.education.gov.uk/schools/performance/index.html

    Joanna Edgell and Moira Nelson

    0370 000 2288

    attainment.statistics@education.gsi.gov.uk

  3. Secondary school performance tables in England: 2011 to 2012

    • gov.uk
    Updated Jan 24, 2013
    Cite
    Department for Education (2013). Secondary school performance tables in England: 2011 to 2012 [Dataset]. https://www.gov.uk/government/statistics/secondary-school-performance-tables-in-england-key-stage-4-academic-year-2011-to-2012
    Dataset updated
    Jan 24, 2013
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Area covered
    England
    Description

    Reference Id: SFR03/2013

    Publication type: Performance tables

    Local authority data: LA data

    Region: England

    Release date: 24 January 2013

    Coverage status: Final/provisional

    Publication status: Recently updated

    The secondary school performance tables show:

    • attainment results for pupils at the end of key stage 4
    • key stage 2 to 4 progress measures in English and mathematics
    • information showing how the performance of deprived pupils compares against other pupils in the school
    • information which highlights any differences in the performance of low-attaining pupils, high-attaining pupils, and pupils performing at expected levels

    Additional data on schools will be made available, which includes information on the expenditure of each maintained school open for the full 2011 to 2012 financial year. The expenditure data will take the form of spend-per-pupil statistics for a wide range of categories, including: funding and income, education staff spend and learning resources and curriculum spend.

    The school-spend data will also contain information about the school (such as the proportion of pupils in the school eligible for free school meals), headline key stage 4 performance data and comparisons against the local authority and national averages, the numbers of teachers, teaching assistants and other school staff.

    It also provides the pupil-to-teacher ratio and the mean gross salary of full-time teachers, information on the characteristics of the pupils attending the school and pupil absence data for each school.

    Performance tables: http://www.education.gov.uk/schools/performance/index.html

    Richard Baker - Attainment Statistics Team
    0114 274 2118

    attainment.statistics@education.gsi.gov.uk

  4. Primary school performance tables: 2013

    • gov.uk
    Updated Dec 12, 2013
    Cite
    Department for Education (2013). Primary school performance tables: 2013 [Dataset]. https://www.gov.uk/government/statistics/2013-primary-school-performance-tables
    Dataset updated
    Dec 12, 2013
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Description

    The primary school performance tables provide information on the achievements of pupils in primary schools and on how they compare with other schools in the local authority (LA) area and in England as a whole.

    The data can be viewed and downloaded from the performance tables section of the Department for Education website: http://www.education.gov.uk/schools/performance/index.html

    The tables show:

    • results from the KS2 tests in reading, mathematics and grammar, punctuation and spelling

    • KS2 teacher assessments in English, reading, writing, mathematics and science

    • KS1-2 progress measures in reading, writing and mathematics

    • KS1-2 value added

    They also include key measures for sub-groups of pupils in each school, including disadvantaged pupils, low, middle and high attaining pupils, boys, girls, pupils with English as an additional language and pupils who have been in the school throughout the whole of years 5 and 6 (non-mobile pupils).

    Additional school level data is also available including:

    • information on the expenditure of each maintained school open for the full 2012 to 2013 financial year

    • the numbers of teachers, teaching assistants and other school staff, the pupil teacher ratio and the mean gross salary of full-time teachers

    • information on the characteristics of the pupils attending the school

    • pupil absence data for each school

    • Ofsted ratings

    Primary attainment statistics team

    Email: primary.attainment@education.gov.uk

    Telephone: Gemma Coleman 020 7783 8239

  5. Children in Care and Adoption Performance Tables

    • data.wu.ac.at
    • data.europa.eu
    html, xls
    Updated Sep 8, 2014
    Cite
    Department for Education (2014). Children in Care and Adoption Performance Tables [Dataset]. https://data.wu.ac.at/schema/data_gov_uk/MWNhYTgyZTAtN2ZjNC00N2FmLWJlNDYtMzMzZDU5ZDg0ZjI3
    Dataset updated
    Sep 8, 2014
    Dataset provided by
    Department for Education
    License

    Open Government Licence 3.0 (http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/)
    License information was derived automatically

    Description

    The Children in Care and Adoption Performance Tables show, against 15 key indicators, how each local authority is performing. The data we have used is already available. We want the tables to help generate debate, discussion and, above all, action. We will update the tables as new data become available, and we will talk to local authorities and other partners about how the tables can be developed and extended.

    A table is provided for each of the indicators, mainly based on a three-year rolling average. Where available, data are provided for 2011, 2012 and 2013. The only exceptions are the indicators on absence from school and on schools performing below the floor target, for which only two years and one year of data, respectively, are available.
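
    As a rough illustration of a three-year rolling average of the kind described above (a minimal pandas sketch with made-up values, not data from the tables):

    import pandas as pd

    # Each rolling figure is the mean of that year and the two preceding years.
    annual = pd.Series({2009: 12.0, 2010: 13.5, 2011: 11.8, 2012: 12.6, 2013: 13.1})
    three_year_rolling = annual.rolling(window=3).mean()
    print(three_year_rolling.loc[[2011, 2012, 2013]])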

  6. Growth Academic Performance Indicator (API)

    • redivis.com
    • stanford.redivis.com
    Updated Aug 3, 2025
    Cite
    Stanford Center for Population Health Sciences (2025). Growth Academic Performance Indicator (API) [Dataset]. https://redivis.com/datasets/kxa3-bbw2dknma
    Dataset updated
    Aug 3, 2025
    Dataset authored and provided by
    Stanford Center for Population Health Sciences
    Time period covered
    Mar 21, 2012 - Mar 21, 2013
    Description

    The system is on a two-year cycle that gives a "base" score for the first year and a "growth" score in the second year. The Base API, which is usually released in the spring (for example, 2013), comes from the previous spring's test scores (2012). The Growth API, released in October (2013), comes from 2013 spring test scores.
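
    A minimal sketch of that reporting cycle (illustrative only; api_cycle is a hypothetical helper, not part of the dataset):

    # For spring test scores in a given year, the Growth API is released in
    # October of that year and the Base API the following spring.
    def api_cycle(test_year):
        return {
            "growth_api_release": f"October {test_year}",
            "base_api_release": f"Spring {test_year + 1}",
        }

    print(api_cycle(2012))
    # {'growth_api_release': 'October 2012', 'base_api_release': 'Spring 2013'}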

  7. National curriculum assessments: key stage 2, 2012 (revised)

    • gov.uk
    Updated Dec 13, 2012
    Cite
    Department for Education (2012). National curriculum assessments: key stage 2, 2012 (revised) [Dataset]. https://www.gov.uk/government/statistics/national-curriculum-assessments-at-key-stage-2-in-england-academic-year-2011-to-2012
    Dataset updated
    Dec 13, 2012
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Description

    This statistical first release (SFR) provides revised key stage 2 national curriculum assessment results for pupils in schools in England at national and local authority level for the academic year 2011 to 2012.

    Information on attainment is also broken down by different pupil characteristics, specifically:

    • gender
    • ethnicity
    • first language
    • free school meal eligibility
    • disadvantage
    • special educational need
    • income deprivation affecting children index

    This SFR also provides the updated percentage of pupils making expected progress in each of English and mathematics between key stage 1 and key stage 2, and updated data relating to impact indicator 3.7, the attainment gap at age 11 between pupils eligible for free school meals and the rest.

    The revised figures are based on data checked by schools prior to publication in the primary school performance tables. The figures contained within this publication combine this revised data with the information gathered through the school census in January 2012.

    Figures in this SFR update provisional figures released in September in SFR 19/2012.

    The key points from this release are:

    • The percentage of pupils achieving the expected level, level 4 or above, in the 2012 key stage 2 reading and mathematics tests in all schools increased between 2011 and 2012.
    • This has led to an increase in the percentage of pupils in state-funded schools making expected progress in mathematics.
    • High-performing groups were Chinese pupils, pupils not known to be eligible for free school meals and pupils with no identified special educational needs.

    In the academic year 2011 to 2012, due to local area free school meal initiatives, there was both under- and over-recording of free school meal eligibility in some local authorities. The impact of these mis-recordings on national figures is considered negligible. 2012 figures will be corrected in the December 2013 KS2 release.

    School level data is available in the primary school performance tables: http://www.education.gov.uk/schools/performance/

    Karen Attew
    0207 7838455

    attainment.statistics@education.gsi.gov.uk

  8. Education Sector Support Programme 2012 - Nigeria

    • catalog.ihsn.org
    Updated Jun 26, 2017
    Cite
    Stuart Cameron (2017). Education Sector Support Programme 2012 - Nigeria [Dataset]. https://catalog.ihsn.org/index.php/catalog/6933
    Dataset updated
    Jun 26, 2017
    Dataset authored and provided by
    Stuart Cameron
    Time period covered
    2012
    Area covered
    Nigeria
    Description

    Abstract

    In July 2012, representative stratified samples of public primary schools, head teachers, teachers and pupils were surveyed in the six Nigerian states where the DFID/UKaid-funded Education Sector Support Programme in Nigeria works.

    The ESSPIN Composite Survey (CS) process serves two main functions: periodically assessing the effects of ESSPIN's integrated School Improvement Programme (SIP), and reporting on selected indicators of the quality of education in the six ESSPIN-supported states. The CS addresses five Output pillars of the SIP, namely teacher competence, head teacher effectiveness, school development planning, school based management committee functionality and inclusive practices in schools. It also provides estimates of one Outcome indicator-school quality; and one Impact indicator-pupil learning achievement. The CS is wide-ranging but not exhaustive: it complements other ESSPIN/state monitoring and evaluation processes in areas such as institutional development, school enrolments and infrastructure. It brings together into a single exercise baseline surveys that were conducted by ESSPIN in 2010, hence 'composite' survey.

    Four data collection methods were used to complete ten questionnaires: interviews, record schedules, observation and oral/written tests. The total sample covered 595 schools/head teachers/SBMCs, 2,975 teachers and 9,520 pupils. Enumerators drawn from State School Improvement Teams and education officials were trained and then mobilised to collect the data over a six week period, with field supervision by NPC and ESSPIN. Data entry, cleaning and checking took longer than intended due to several technical problems. Each indicator of education quality was underpinned by a variety of objectively observable criteria. Estimates (values drawn from a sample to describe the population as a whole) are shown within 95% confidence intervals. In the case of Kano (and to a lesser extent Kaduna) some values are insufficiently precise to include in programme-wide aggregates. Mean estimates for ESSPIN-supported schools and non-ESSPIN supported schools are compared, and said to be significantly different at the 0.05 level (ie, where there is at least a 95% probability that the values for Phase 1 and Control Schools are actually different from one another). For certain numeracy measures, a comparison of the difference between 2010 and 2012 values for Phase 1 and Control Schools is possible. In most cases, such 'difference in differences' calculations will have to wait until the CS is repeated in 2014 and beyond. Although those CS 2012 results which show a significant difference between Phase 1 and Control Schools cannot necessarily be ascribed to 'the ESSPIN effect' (since other characteristics of schools in those categories could actually determine the difference), in the absence of evidence for an alternative cause it is reasonable to suppose that ESSPIN interventions are having the intended effect. This is particularly true of the Output and Outcome indicators but less likely with respect to Impact (children's learning outcomes) at this stage in the programme. The basis of allocation of schools to Phase 1 in each state is reported, to aid critical consideration of any selection bias.

    Geographic coverage

    Six Nigerian states - Enugu, Jigawa, Kaduna, Kano, Kwara, and Lagos

    Analysis unit

    School; Pupil; Teacher

    Universe

    Schools in the six ESSPIN states - Enugu, Jigawa, Kaduna, Kano, Kwara and Lagos

    Kind of data

    Sample survey data [ssd]

    Sampling procedure

    This section outlines the sampling strategy and target sample sizes for each unit of observation for the 2012 ESSPIN composite survey conducted in the six focus states: Enugu, Jigawa, Kaduna, Kano, Kwara and Lagos.

    1. Aim of sampling design

    The analysis requires estimation of several indicators for each of the units of observation and, where the 2010 MLA data and documentation allow it, attribution of any observed changes in the outputs and outcomes of interest over time to the corresponding ESSPIN programme interventions. The sample of units was therefore selected using rigorous scientific procedures so that selection probabilities are known.

    In each of the six focus states, the intended sample for the 2012 CS was 105 primary schools, except in Enugu, where phase 2 schools had not been identified at the time of the survey and the intended sample was 70 schools. This gives a total sample size of 595 schools. In each school, the head teacher (N~595), five other teachers who had received ESSPIN-sponsored training (N~2,975) and five other teachers who had not received such training (N~2,975) were expected to be interviewed, except where a sample school had fewer than five teachers of either category, in which case all such teachers were interviewed. Four primary 2 pupils were to be assessed in literacy and four primary 2 pupils in numeracy in each school, and similarly for primary 4 pupils (N~9,520).
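
    The intended totals follow directly from the per-school targets; a quick arithmetic check (illustrative only):

    schools = 105 * 5 + 70          # five states at 105 schools each, Enugu at 70
    head_teachers = schools         # one head teacher per school
    esspin_trained = 5 * schools    # up to five ESSPIN-trained teachers per school
    non_esspin = 5 * schools        # up to five non-ESSPIN-trained teachers per school
    pupils = (4 + 4) * 2 * schools  # 4 literacy + 4 numeracy pupils in each of grades 2 and 4

    print(schools, esspin_trained, non_esspin, pupils)  # 595 2975 2975 9520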

    2. Construction of sampling frame

    The school sample frame was constructed using information on school ESSPIN and 2010 MLA survey participation and school size from the Education Management Information System (EMIS). To enable the planned analyses a multi-stage sampling design was used as shown in Figure A.1 in the CS1 report.

    The lines connecting the units of observation in Figure A.1 represent sampling stages. The six survey states were pre-determined, as the ESSPIN programme operates in these states. In each focus state, public primary schools were selected (first stage), and then within each sample school, teachers and grade 2 and grade 4 pupils were selected (second stage). The first sampling stage was stratified to allow the observation of a minimum number of units in each stratum of analytical importance, such as ESSPIN phase 1, ESSPIN phase 2, and control (no ESSPIN intervention) schools. The total intended sample across the six states was 595 public primary schools.

    3. Drawing of the samples for the baseline survey

    Selection of schools

    The major sampling strata (hereafter denoted with the subscript h) are the schools' participation in the ESSPIN programme: ESSPIN phase 1 schools, ESSPIN phase 2 schools, and control (no ESSPIN intervention) schools in each of the six states with the exception of Enugu, where there are no phase 2 schools. Each of the major strata is divided into two sub-strata, respectively composed of the schools selected and not selected for the 2010 MLA survey.

    2010 MLA schools were selected in one of two ways depending on the total number of 2010 MLA schools in the 2010 MLA school sub-strata. If there were more than 17 MLA schools, 17 were selected using systematic equal probability sampling and if there were fewer than 17 MLA schools, all were selected with certainty.
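
    A minimal Python sketch of systematic equal probability sampling as described above (an assumed implementation with a hypothetical school list, not the survey team's code):

    import random

    def systematic_sample(frame, n):
        """Select n units from the list `frame` with equal probability,
        using a fixed sampling interval and a random start."""
        interval = len(frame) / n
        start = random.random() * interval  # random start in [0, interval)
        return [frame[int(start + i * interval)] for i in range(n)]

    mla_sub_stratum = [f"school_{i:03d}" for i in range(1, 41)]  # hypothetical 40-school sub-stratum
    print(systematic_sample(mla_sub_stratum, 17))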

    The reason for using systematic equal probability sampling was that this method had been used to select the school sample for the 2010 MLA survey, combined with the need for a minimum number of 2010 MLA schools to be contained within the 2012 sample in order to enable analysis over time of any changes in pupil learning as measured by the MLA.

    Selection of teachers

    The head teacher was interviewed in all sample schools. Five ESSPIN-trained and five non-ESSPIN-trained teachers were selected in each sample school using simple random sampling. The teacher sampling was conducted in schools by the enumerators who used a special form and random number tables.

    The teacher and pupil sampling was conducted in the field. The sampling selections delegated to the enumerators were conducted as a part of interviewing processes that had broader objectives. For this reason the selection processes were not supported by stand-alone forms but were instead integrated with the survey questionnaires and used as follows for pupils (the same procedure was used for teacher sampling):

    • First, the enumerator used the school's pupil register to write pupil codes next to each pupil name starting with 1 for the first pupil listed up until the last pupil listed, which provided the largest pupil code.
    • Second, the interviewer wrote down the largest pupil code in a box on the questionnaire.
    • Third, the interviewer scanned the provided random number table according to the instructions provided to find the pupil codes of the eligible pupils to be selected.

    Selection of pupils

    Four grade 2 pupils and four grade 4 pupils were selected for the literacy and numeracy assessments respectively in each sample school using simple random sampling. The pupil sampling was conducted in schools by the enumerators, who used a special form and random number tables, as for the teacher sampling.
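
    The register-and-random-number procedure described above amounts to simple random sampling over register codes; a hypothetical re-creation (the names and the use of random.sample in place of a printed random number table are illustrative):

    import random

    def select_pupils(register, n):
        """register: pupil names in register order. Assign 1-based codes,
        then draw n codes at random (standing in for a random number table)."""
        codes = {i + 1: name for i, name in enumerate(register)}
        largest_code = len(register)
        chosen = random.sample(range(1, largest_code + 1), n)
        return [(code, codes[code]) for code in sorted(chosen)]

    grade2_register = ["Amina", "Bashir", "Chidi", "Dada", "Efe", "Funke", "Gozie", "Hauwa"]
    print(select_pupils(grade2_register, 4))  # e.g. four grade 2 pupils for the literacy test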


    Panel component

    CS1 forms the baseline survey, with the aim of visiting the same schools during future rounds.

    Sampling deviation

    One

  9. Additional file 1 of Admission criteria and academic performance in medical...

    • springernature.figshare.com
    xlsx
    Updated Aug 13, 2024
    Cite
    Ahmad Tamimi; Mariam Hassuneh; Iskandar Tamimi; Malik Juweid; Dana Shibli; Batool AlMasri; Faleh Tamimi (2024). Additional file 1 of Admission criteria and academic performance in medical school [Dataset]. http://doi.org/10.6084/m9.figshare.26584200.v1
    Dataset updated
    Aug 13, 2024
    Dataset provided by
    figshare
    Authors
    Ahmad Tamimi; Mariam Hassuneh; Iskandar Tamimi; Malik Juweid; Dana Shibli; Batool AlMasri; Faleh Tamimi
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    Additional file 1: Supplementary Table 1. Characteristics of students accepted in the academic year 2012-2013.

  10. Formal Technical Education 2009-2013, Independent Impact Evaluation - El...

    • catalog.ihsn.org
    • datacatalog.ihsn.org
    Updated Jan 19, 2021
    Cite
    Mathematica Policy Research (2021). Formal Technical Education 2009-2013, Independent Impact Evaluation - El Salvador [Dataset]. https://catalog.ihsn.org/catalog/9471
    Dataset updated
    Jan 19, 2021
    Dataset provided by
    Mathematica (http://www.mathematica.org/)
    Authors
    Mathematica Policy Research
    Time period covered
    2009 - 2013
    Area covered
    El Salvador
    Description

    Abstract

    With a budget of nearly $20 million, the Formal Technical Education Sub-Activity was designed to strengthen technical and vocational educational institutions in the Northern Zone of El Salvador. By improving schools and offering scholarships, the sub-activity financed efforts to increase youths' access to high-quality technical education in the region, thus increasing their achievement levels, secondary (and post-secondary) school graduation rates, and prospects for gainful employment. By 2012, the Formal Technical Education Sub-Activity was scheduled to invest $3.8 million in scholarships for students enrolled in secondary and post-secondary technical schools in the Northern Zone. According to preliminary budgets, the sub-activity would also provide $9 million to improve 20 technical secondary schools in the Northern Zone with infrastructure investments and additional teacher training programs. In addition, the sub-activity was scheduled to invest $7 million to strengthen ITCHA, an existing post-secondary institute in the Northern Zone.

    In conducting the evaluation of the Formal Technical Education Sub-Activity-which includes secondary and post-secondary school improvements and scholarships-Mathematica will address the following research questions regarding Sub-Activity investments from 2009 to 2012:

    1. Program design/implementation. How were the secondary school strengthening and scholarship programs, and the ITCHA strengthening program designed and implemented? Did implementation meet original targets and expectations? Why or why not?

    2. Description of participants. What are the characteristics (age, gender, initial household income, etc.) of scholarship recipients? What are the basic characteristics of secondary school and ITCHA students?

    3. Impact/Results. What is the impact of FOMILENIO's strengthening secondary school program on students' education and labor market outcomes, including secondary school enrollment, grade completion, graduation, and further education, employment, and income? What is the impact of the offer of scholarships in some programs within strengthened schools on student educational and labor outcomes? Did ITCHA graduates obtain jobs and experience increased income following graduation? Did ITCHA students who graduated from secondary school MEGATEC programs have better academic and labor market outcomes than students who did not attend secondary school MEGATEC programs?

    4. Impacts/Results by key target subgroups. Were impacts/results different for girls versus boys? What types of participants experienced positive impacts?

    5. Explanation for impact findings and results. What was the ex-post statistical power, and can this explain the lack of impacts (in cases where no impacts are found)? What aspects of implementation could explain the impacts/results? If impacts/results were different for girls versus boys, why?

    6. Sustainability. Are secondary school improvements and scholarships being maintained? Are ITCHA improvements being maintained? Are they likely to be maintained in the medium to long term?

    To answer all research questions regarding the design, implementation, and sustainability of the strengthening efforts and scholarships (Topics 1, 2, 5, and 6), Mathematica will use a mixed-methods evaluation design that uses qualitative and quantitative methods (see Table III.1). With this approach, researchers will use qualitative methods-namely, qualitative interview data and programmatic reports-to help understand processes and activities, provide information on setting or context, and communicate the perspectives and experiences of key participants through direct quotes. In addition, Mathematica will use quantitative information on program outputs and costs, participant characteristics, and budget outlays to summarize the intervention, describe its participants, and analyze the sustainability of its original investments.

    To answer research questions regarding impacts and results (3, 4, and 5), Mathematica will use a variety of designs. To determine the impact of secondary school scholarships, researchers designed and implemented a random assignment design, by which some eligible applicants were randomly selected to receive scholarships. To determine the impact of secondary school strengthening investments, Mathematica designed and implemented a matched comparison group approach using propensity score methods, by which students at the 20 strengthened schools are compared to students at 20 similar non-strengthened schools. Finally, to measure key results of the ITCHA intervention-including graduation and employment rates-researchers used a mixed-methods approach that featured a follow-up survey of ITCHA students.

    All impact and results analyses rely on in-person surveys, including panel surveys of scholarship applicants, cross-sectional baseline and follow-up surveys of secondary school students, and a follow-up survey of ITCHA students.

    Geographic coverage

    The Northern Zone of El Salvador

    Analysis unit

    Individuals, Schools

    Kind of data

    Sample survey data [ssd]

    Research instrument

    This study features student-level questionnaires. Two primary versions of these questionnaires were developed--one for secondary school students and one for post-secondary school students. The secondary school questionnaire was administered to students who applied for scholarships and students who attended the 40 secondary schools in the secondary school evaluation. The post-secondary school questionnaire was administered to students of the Chalatenango Technical Institute (ITCHA). Both questionnaires asked students (or former students) about enrollment, academic performance, progression and graduation, as well as their employment and income in the previous 12 months.

  11. 2012 National Assessment of Information Acquisition and Processing in...

    • rdr.kuleuven.be
    pdf +2
    Updated Dec 12, 2023
    Cite
    Centre of Educational Effectiveness and Evaluation; Rianne Janssen (2023). 2012 National Assessment of Information Acquisition and Processing in Primary Education in Flanders [Dataset]. http://doi.org/10.48804/MKGDAP
    Available download formats: csv (text/comma-separated-values), pdf, txt
    Dataset updated
    Dec 12, 2023
    Dataset provided by
    KU Leuven RDR
    Authors
    Centre of Educational Effectiveness and Evaluation; Rianne Janssen
    License

    Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0) (https://creativecommons.org/licenses/by-nc-nd/4.0/)
    License information was derived automatically

    Area covered
    Flanders
    Dataset funded by
    Departement Onderwijs en Vorming, Vlaams Ministerie van Onderwijs en Vorming
    Description

    In 2012 a national assessment was carried out, commissioned by the Flemish Government, to assess whether pupils have sufficient skills to acquire and process information at the end of primary education. Data were collected through two written tests: a test on working with tables and graphs and a test on working with plans and drawings. A representative sample of 2383 pupils from 92 Flemish primary schools participated. In addition, the extent to which pupils have mastered certain ICT competencies was assessed through an ICT performance assessment; a subsample of six pupils from each school took part in this performance assessment.

    The tests were supplemented with background questionnaires for pupils, parents, teachers and ICT coordinators. The pupil and parent questionnaires collected, among other things, information on socio-economic background, diagnoses of disabilities and circumstances that may stimulate learning. The teacher questionnaire collected, among other things, information on their qualifications, teaching experience and the use of educational materials and ICT during lessons. The ICT coordinator questionnaire asked about the school's ICT policy and infrastructure. Response rates were almost 100% for the pupil questionnaire, 93% for the parent questionnaire, 95% for the teacher questionnaire and 93% for the ICT coordinator questionnaire.

  12. Formal Technical Education 2009-2013 - El Salvador

    • catalog.ihsn.org
    Updated Mar 29, 2019
    Cite
    Mathematica Policy Research (2019). Formal Technical Education 2009-2013 - El Salvador [Dataset]. https://catalog.ihsn.org/index.php/catalog/6219
    Dataset updated
    Mar 29, 2019
    Dataset authored and provided by
    Mathematica Policy Research
    Time period covered
    2009 - 2013
    Description

    Abstract

    With a budget of nearly $20 million, the Formal Technical Education Sub-Activity was designed to strengthen technical and vocational educational institutions in the Northern Zone of El Salvador. By improving schools and offering scholarships, the sub-activity financed efforts to increase youths' access to high-quality technical education in the region, thus increasing their achievement levels, secondary (and post-secondary) school graduation rates, and prospects for gainful employment. By 2012, the Formal Technical Education Sub-Activity was scheduled to invest $3.8 million in scholarships for students enrolled in secondary and post-secondary technical schools in the Northern Zone. According to preliminary budgets, the sub-activity would also provide $9 million to improve 20 technical secondary schools in the Northern Zone with infrastructure investments and additional teacher training programs. In addition, the sub-activity was scheduled to invest $7 million to strengthen ITCHA, an existing post-secondary institute in the Northern Zone.

    In conducting the evaluation of the Formal Technical Education Sub-Activity-which includes secondary and post-secondary school improvements and scholarships-Mathematica will address the following research questions regarding Sub-Activity investments from 2009 to 2012:

    1. Program design/implementation. How were the secondary school strengthening and scholarship programs, and the ITCHA strengthening program designed and implemented? Did implementation meet original targets and expectations? Why or why not?

    2. Description of participants. What are the characteristics (age, gender, initial household income, etc.) of scholarship recipients? What are the basic characteristics of secondary school and ITCHA students?

    3. Impact/Results. What is the impact of FOMILENIO's strengthening secondary school program on students' education and labor market outcomes, including secondary school enrollment, grade completion, graduation, and further education, employment, and income? What is the impact of the offer of scholarships in some programs within strengthened schools on student educational and labor outcomes? Did ITCHA graduates obtain jobs and experience increased income following graduation? Did ITCHA students who graduated from secondary school MEGATEC programs have better academic and labor market outcomes than students who did not attend secondary school MEGATEC programs?

    4. Impacts/Results by key target subgroups. Were impacts/results different for girls versus boys? What types of participants experienced positive impacts?

    5. Explanation for impact findings and results. What was the ex-post statistical power, and can this explain the lack of impacts (in cases where no impacts are found)? What aspects of implementation could explain the impacts/results? If impacts/results were different for girls versus boys, why?

    6. Sustainability. Are secondary school improvements and scholarships being maintained? Are ITCHA improvements being maintained? Are they likely to be maintained in the medium to long term?

    To answer all research questions regarding the design, implementation, and sustainability of the strengthening efforts and scholarships (Topics 1, 2, 5, and 6), Mathematica will use a mixed-methods evaluation design that uses qualitative and quantitative methods (see Table III.1). With this approach, researchers will use qualitative methods-namely, qualitative interview data and programmatic reports-to help understand processes and activities, provide information on setting or context, and communicate the perspectives and experiences of key participants through direct quotes. In addition, Mathematica will use quantitative information on program outputs and costs, participant characteristics, and budget outlays to summarize the intervention, describe its participants, and analyze the sustainability of its original investments.

    To answer research questions regarding impacts and results (3, 4, and 5), Mathematica will use a variety of designs. To determine the impact of secondary school scholarships, researchers designed and implemented a random assignment design, by which some eligible applicants were randomly selected to receive scholarships. To determine the impact of secondary school strengthening investments, Mathematica designed and implemented a matched comparison group approach using propensity score methods, by which students at the 20 strengthened schools are compared to students at 20 similar non-strengthened schools. Finally, to measure key results of the ITCHA intervention-including graduation and employment rates-researchers used a mixed-methods approach that featured a follow-up survey of ITCHA students.

    All impact and results analyses rely on in-person surveys, including panel surveys of scholarship applicants, cross-sectional baseline and follow-up surveys of secondary school students, and a follow-up survey of ITCHA students.

    Geographic coverage

    The Northern Zone of El Salvador

    Analysis unit

    Individuals, Schools

    Kind of data

    Sample survey data [ssd]

    Research instrument

    This study features student-level questionnaires. Two primary versions of these questionnaires were developed--one for secondary school students and one for post-secondary school students. The secondary school questionnaire was administered to students who applied for scholarships and students who attended the 40 secondary schools in the secondary school evaluation. The post-secondary school questionnaire was administered to students of the Chalatenango Technical Institute (ITCHA). Both questionnaires asked students (or former students) about enrollment, academic performance, progression and graduation, as well as their employment and income in the previous 12 months.

  13. Education Quality Improvement Programme Impact Evaluation Midline Survey...

    • microdata.worldbank.org
    • catalog.ihsn.org
    Updated Dec 2, 2021
    Cite
    Oxford Policy Management Ltd (2021). Education Quality Improvement Programme Impact Evaluation Midline Survey 2016 - Tanzania [Dataset]. https://microdata.worldbank.org/index.php/catalog/2838
    Dataset updated
    Dec 2, 2021
    Dataset authored and provided by
    Oxford Policy Management Ltd
    Time period covered
    2016
    Area covered
    Tanzania
    Description

    Abstract

    The Education Quality Improvement Programme in Tanzania (EQUIP-T) is a Government of Tanzania programme, funded by UK DFID, which seeks to improve the quality of primary education, especially for girls, in seven regions of Tanzania. It focuses on strengthening the professional capacity and performance of teachers, school leadership and management, the systems which support district management of education, and community participation in education.

    The independent Impact Evaluation (IE) of EQUIP-T is a four-year study funded by the United Kingdom Department for International Development (DFID). It is designed to: i) generate evidence on the impact of EQUIP-T on primary pupil learning outcomes, including any differential effects for boys and girls; ii) examine perceptions of effectiveness of different EQUIP-T components; iii) provide evidence on the fiscal affordability of scaling up EQUIP-T post-2018; and iv) communicate evidence generated by the impact evaluation to policy-makers and key education stakeholders.

    The research priorities for the midline IE are captured in a comprehensive midline evaluation matrix (see Annex B in the 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume I: Results and Discussion' under Reports and policy notes). The matrix sets out evaluation questions linked to the programme theory of change, and identifies sources of evidence to answer each question-either the quantitative survey or qualitative research, or both. It asks questions related to the expected results at each stage along the results chain (from the receipt of inputs to delivery of outputs, and contributions to outcomes and impact) under each of the programme's components. The aim is to establish: (i) whether changes have happened as expected; (ii) why they happened or did not happen (i.e. whether key assumptions in the theory of change hold or not); (iii) whether there are any important unanticipated changes; and (iv) what links there are between the components in driving changes.

    The main IE research areas are:

    • Impact of EQUIP-T on standard 3 pupil learning in Kiswahili and mathematics.
    • Impact of EQUIP-T on teacher absence from school and from classrooms.
    • Impact of EQUIP-T on selected aspects of school leadership and management.

    The IE uses a mixed methods approach that includes:

    • A quantitative survey of 100 government primary schools in 17 programme treatment districts and 100 schools in 8 control districts in 2014, 2016 and 2018 covering:
    - Standard three pupils and their parents/caregivers;
    - Teachers who teach standards 1-3 Kiswahili;
    - Teachers who teach standards 1-3 mathematics;
    - Teachers who teach standards 4-7 mathematics;
    - Head teachers; and
    - Standard two lesson observations in Kiswahili and mathematics.

    • Qualitative fieldwork in nine research sites that overlap with a sub-set of the quantitative survey schools, in 2014, 2016 and 2018, consisting of key informant interviews (KIIs) and focus group discussions (FGDs) with head teachers, teachers, pupils, parents, school committee (SC) members, region, district and ward education officials and EQUIP-T programme staff.

    The midline data available in the World Bank Microdata Catalog are from the EQUIP-T IE quantitative midline survey conducted in 2016. For the qualitative research findings and methods see 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume I: Results and Discussion' and 'EQUIP-Tanzania Impact Evaluation. Midline Technical Report, Volume II: Methods and Supplementary Evidence' under Reports and policy notes.

    Geographic coverage

    The survey is representative of the 17 EQUIP-T programme treatment districts. The survey is NOT representative of the 8 control districts. For more details see the section on Representativeness in 'EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume I: Results and Discussion' and 'EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume II: Methods and Technical Annexes' under Reports.

    The 17 treatment districts are:

    • Dodoma Region: Bahi DC, Chamwino DC, Kongwa DC, Mpwapwa DC
    • Kigoma Region: Kakonko DC, Kibondo DC
    • Shinyanga Region: Kishapu DC, Shinyanga DC
    • Simiyu Region: Bariadi DC, Bariadi TC, Itilima DC, Maswa DC, Meatu DC
    • Tabora Region: Igunga DC, Nzega DC, Sikonge DC, Uyui DC

    The 8 control districts are:

    • Arusha Region: Ngorongoro DC
    • Mwanza Region: Misungwi DC
    • Pwani Region: Rufiji DC
    • Rukwa Region: Nkasi DC
    • Ruvuma Region: Tunduru DC
    • Singida Region: Ikungi DC, Singida DC
    • Tanga Region: Kilindi DC

    Analysis unit

    • School
    • Teacher
    • Pupil
    • Lesson (not sampled)

    Kind of data

    Sample survey data [ssd]

    Sampling procedure

    Because the EQUIP-T regions and districts were purposively selected (see 'EQUIP-Tanzania Impact Evaluation. Final Baseline Technical Report, Volume I: Results and Discussion' under Reports and policy notes), the IE sampling strategy used propensity score matching (PSM) to: (i) match eligible control districts to the pre-selected and eligible EQUIP-T districts (see below), and (ii) match schools from the control districts to a sample of randomly sampled treatment schools in the treatment districts. The same schools are surveyed for each round of the IE (panel of schools) and standard 3 pupils will be interviewed at each round of the survey (no pupil panel).

    Identifying districts eligible for matching

    Eligible control and treatment districts were those not participating in any other education programme or project that may confound the measurement of EQUIP-T impact. To generate the list of eligible control and treatment districts, all districts that are contaminated because of other education programmes or projects or may be affected by programme spill-over were excluded as follows:

    • All districts located in Lindi and Mara regions as these are part of the EQUIP-T programme but implementation started later in these two regions (the IE does not cover these two regions);
    • Districts that will receive partial EQUIP-T programme treatment or will be subject to potential EQUIP-T programme spillovers;
    • Districts that are receiving other education programmes/projects that aim to influence the same outcomes as the EQUIP-T programme and would confound measurement of EQUIP-T impact;
    • Districts that were part of pre-test 1 (two districts); and
    • Districts that were part of pre-test 2 (one district).

    Sampling frame

    To be able to select an appropriate sample of pupils and teachers within schools and districts, the sampling frame consisted of information at three levels:

    • District;
    • School; and
    • Within school.

    The sampling frame data at the district and school levels was compiled from the following sources: the 2002 and 2012 Tanzania Population Censuses, Education Management Information System (EMIS) data from the Ministry of Education and Vocational Training (MoEVT) and the Prime Minister's Office for Regional and Local Government (PMO-RALG), and the UWEZO 2011 student learning assessment survey. For within-school sampling, the frames were constructed upon arrival at the selected schools and were used to sample pupils and teachers on the day of the school visit.

    Sampling stages

    Stage 1: Selection of control districts

    Because the treatment districts were known, the first step was to find sufficiently similar control districts that could serve as the counterfactual. PSM was used to match eligible control districts to the pre-selected, eligible treatment districts using the following matching variables: population density, proportion of male-headed households, household size, number of children per household, proportion of households that speak an ethnic language at home, and district-level averages for household assets, infrastructure, education spending, parental education, school remoteness, pupil learning levels and pupil dropout.
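
    A minimal sketch of this district-level propensity score matching step (assuming a pandas DataFrame with a treated flag and the matching variables; column names are illustrative, and nearest-neighbour matching with replacement may differ from the evaluation team's exact procedure):

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    covariates = ["pop_density", "male_headed_hh_share", "household_size",
                  "children_per_hh", "ethnic_language_share"]

    def match_control_districts(districts: pd.DataFrame) -> pd.DataFrame:
        """Fit a propensity model and pair each treated district with the
        control district whose propensity score is closest."""
        model = LogisticRegression(max_iter=1000)
        model.fit(districts[covariates], districts["treated"])
        scored = districts.assign(pscore=model.predict_proba(districts[covariates])[:, 1])
        treated = scored[scored["treated"] == 1]
        controls = scored[scored["treated"] == 0]
        pairs = [{"treated_district": idx,
                  "matched_control": (controls["pscore"] - row["pscore"]).abs().idxmin()}
                 for idx, row in treated.iterrows()]
        return pd.DataFrame(pairs)

    # Tiny synthetic example: random covariates, first ten districts treated.
    rng = np.random.default_rng(0)
    demo = pd.DataFrame(rng.random((30, len(covariates))), columns=covariates)
    demo["treated"] = [1] * 10 + [0] * 20
    print(match_control_districts(demo))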

    Stage 2: Selection of treatment schools

    In the second stage, schools in the treatment districts were selected using stratified systematic random sampling. The schools were selected using a probability-proportional-to-size (PPS) approach, where the measure of school size was the standard two enrolment of pupils; schools with more pupils therefore had a higher probability of being selected into the sample (a sketch of PPS selection follows the list below). To obtain a representative sample of programme treatment schools, the sample was implicitly stratified along four dimensions:

    • District;
    • PSLE scores for Kiswahili;
    • PSLE scores for mathematics; and
    • Total number of teachers per school.
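
    A minimal sketch of probability-proportional-to-size systematic selection (an assumed implementation with a hypothetical frame; very large schools can be drawn more than once, a standard property of PPS that real designs handle explicitly):

    import random

    def pps_systematic_sample(schools, n):
        """schools: list of (school_id, std2_enrolment) pairs. Returns n picks
        with selection probability proportional to enrolment."""
        total = sum(size for _, size in schools)
        interval = total / n
        start = random.random() * interval
        targets = [start + i * interval for i in range(n)]
        sample, cumulative, t = [], 0.0, 0
        for school_id, size in schools:
            cumulative += size
            while t < n and targets[t] < cumulative:
                sample.append(school_id)
                t += 1
        return sample

    frame = [(f"school_{i:03d}", random.randint(20, 120)) for i in range(1, 201)]  # hypothetical frame
    print(pps_systematic_sample(frame, 10))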

    Stage 3: Selection of control schools

    As in stage one, a non-random PSM approach was used to match eligible control schools to the sample of treatment schools. The matching variables were similar to the ones used as stratification criteria: standard two enrolment, PSLE scores for Kiswahili and mathematics, and the total number of teachers per school.

    The midline survey was conducted for the same schools as the baseline survey (a panel of schools) and the endline survey in 2018 will cover the same sample of schools. However, the IE does not have a panel of pupils as a pupil only attends standard three once (unless repeating). Thus, the IE sample is a repeated cross-section of pupils in a panel of schools.

    Stage 4: Selection of pupils and teachers within schools

    Pupils and teachers were sampled within schools using systematic random sampling based on school registers. The within-school sampling was assisted by selection tables automatically generated within the computer assisted survey instruments.

    Per school, 15 standard 3 pupils were sampled. For the teacher development needs assessment (TDNA), in the sample treatment schools, up to three teachers of standards 1 to 3 Kiswahili, up to three teachers of standards 1 to 3 mathematics; and up to three

  14. A level and other level 3 results: 2012 to 2013 (revised)

    • gov.uk
    Updated Jul 17, 2014
    Cite
    Department for Education (2014). A level and other level 3 results: 2012 to 2013 (revised) [Dataset]. https://www.gov.uk/government/statistics/a-level-and-other-level-3-results-england-2012-to-2013-revised
    Dataset updated
    Jul 17, 2014
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Description

    This statistical first release (SFR) provides revised information on the overall achievements of young people in advanced level examinations (i.e. A levels and other level 3 qualifications). We published provisional figures for the 2012 to 2013 academic year in October 2013. The figures were checked by schools and colleges, and this SFR includes the revised figures.

    We have also published 16 to 18 performance tables for 2013: http://www.education.gov.uk/schools/performance/

    Contact details

    Joanna Edgell and Moira Nelson

    0370 000 2288

    Attainment.STATISTICS@education.gov.uk

  15. WCG Socio-Economic Dashboard 8: Education Barometer

    • wcg-opendataportal-westerncapegov.hub.arcgis.com
    Updated Jan 11, 2023
    Cite
    Western Cape Government Living Atlas (2023). WCG Socio-Economic Dashboard 8: Education Barometer [Dataset]. https://wcg-opendataportal-westerncapegov.hub.arcgis.com/items/b4f7a2940cbe43a09ef0a487a98a4eac
    Dataset updated
    Jan 11, 2023
    Dataset authored and provided by
    Western Cape Government Living Atlas
    Description

    Data is sourced from various education resources, transformed into a BI format, quality assured, and consumed by a dashboard created in Power BI. Four reports exist for this dashboard:

    1. Matric Pass Rates
    • Matric pass rates (2016-2019); filter by year and province - province and gender - count of wrote and passed, % pass average and total pass rate
    • Matric pass rates Western Cape (2016-2020); filter by year and province - province and gender - count of wrote and passed, % pass average and total pass rate
    • Percentage of high schools attaining a 60% and higher pass (2016-2020); filter by year and region (province) - region (province) - count of schools, count of schools with a 60% pass rate and higher, and actual pass rate (percentage)
    • Percentage of high schools attaining a 60% and higher pass in Western Cape (2016-2020); filter by year - count of schools, count of schools with a 60% pass rate and higher, and actual pass rate (percentage)

    2. Tertiary Education
    • Eligibility Bachelor Degree (2012-2018); filter by year and region - region, count B-degree passes, count grade 12 who wrote exams, % B-degree passes
    • Eligibility Bachelor Degree in Western Cape (2012-2018); filter by year - year, count B-degree passes, count grade 12 who wrote exams, % B-degree passes
    • University admission eligibility rate (2015-2018); filter by year - year, count B-degree passes, count grade 12 who wrote exams, % B-degree passes
    • Percentage 25 years plus educational attainment (2018-2020); filter by year - percentage 25 years and older tertiary education attainment by year; percentage 25 years and older educational attainment by year and level (primary, secondary, NSC/Grade 12, tertiary), province
    • Percentage 25 years plus educational attainment Western Cape (2010-2020); filter by year - percentage 25 years and older tertiary education attainment by year; percentage 25 years and older educational attainment by year and level (primary, secondary, NSC/Grade 12, tertiary), province
    • Percentage 20 years plus tertiary education in the Western Cape (2017-2020); filter by year - percentage 20 years and older educational attainment by year and level (primary, secondary, NSC/Grade 12, tertiary, other)

    3. Subject Pass Rate
    • Mathematics and Physical Science pass rate (2012-2018); filter by year and region - pass rate for Mathematics and Physical Science in Grade 12 by region, with pass rates of >30%, and of >30% and >40%, for maths literacy, maths, and physical science
    • Mathematics and Physical Science pass rate in Western Cape (2012-2018); filter by year and region - pass rate for Mathematics and Physical Science in Grade 12 by year, with pass rates of >30%, and of >30% and >40%, for maths literacy, maths, and physical science
    • Percentage Language and Mathematics systemic test pass rate for Grades 3, 6 and 9 (2013-2019); filter by year, grade and measure (all, language, maths) - pass rate for Language and Mathematics by year, and pass rates for Grades 3, 6 and 9 language and maths

    4. Learners
    • School sport participation per 100K in WC (2014/15-2016/17); filter by year - count of participants by year (high or primary school); primary and high school - count of participants, educators/volunteers trained to assist, neighbouring school participants
    • Learner retention rate in WC (2004-2020); filter by year - retention rate Grade 10 to 12 by year; retention count Grade 1 to 12 by year; retention rate by year for Grade 10 to 11 and Grade 11 to 12
    • Count of learners in no-fee schools (2012/13-2020/21); filter by year - count of learners in no-fee public ordinary schools by year; count of no-fee learners, count of learners, % of learners benefitting from no-fee schools by year

    Publication date: 20 January 2023

    Lineage: data from various education resources used to create dynamic dashboards reflecting the outcome indicators as in the Outcome Indicator Release:
    • Percentage of Grade 3 learners in the Western Cape achieving a pass rate for Mathematics systemic tests; Language systemic tests
    • Percentage of Grade 6 learners in the Western Cape achieving a pass rate for Mathematics systemic tests; Language systemic tests
    • Percentage of Grade 9 learners in the Western Cape achieving a pass rate for Mathematics systemic tests; Language systemic tests
    • Matric pass rate achieved
    • Matric pass rate achieved in the Western Cape
    • Percentage of high schools attaining a 60% or higher pass rate for the matric examinations
    • Percentage of high schools attaining a 60% or higher pass rate for the matric examinations in the Western Cape
    • Mathematics pass rate; Physical Science pass rate
    • Mathematics pass rate; Physical Science pass rate in the Western Cape
    • University admission eligibility rate for learners completing Grade 12
    • University admission eligibility rate for learners completing Grade 12 in the Western Cape
    • Learner retention rate between Grades 8 and 12, and Grades 10 and 12
    • Percentage of learners who complete Grade 12 out of learners who entered Grade 10 two years prior in the Western Cape
    • The number of learners in no-fee schools or benefitting from fee exemption
    • Percentage of population (aged 25 years and older) who have completed a tertiary qualification
    • Percentage of population (aged 25 and older) who have completed a tertiary qualification in the Western Cape
    • Percentage of population (aged 20 and older) who have completed matric or equivalent in the Western Cape; Grade 7 or equivalent (literacy rate)

    Data sources: WCED Annual Report 2018/19-2019/20; WCED APP 2018/19-2019/20; WCED Media Release January 2020 and March 2022; NSC Examination Report 2017-2021; DBE; table reproduced from WCED Annual Performance Plan 2022/23 (retention rates are own calculations based on the table); GHS 2016-2020.
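    The headline measures in these reports are simple ratios over counts: a pass rate divides learners who passed by learners who wrote, and a retention rate compares a Grade 12 cohort with the Grade 10 cohort two years earlier. The following is a minimal sketch of how such a figure could be reproduced from a flat extract, assuming hypothetical column names (year, province, gender, wrote, passed) and invented numbers rather than the actual WCG schema or values.

        import pandas as pd

        # Hypothetical flat extract of matric results; column names and figures are
        # illustrative assumptions, not the schema or values of the WCG source tables.
        results = pd.DataFrame({
            "year":     [2019, 2019, 2019, 2019],
            "province": ["Western Cape", "Western Cape", "Gauteng", "Gauteng"],
            "gender":   ["F", "M", "F", "M"],
            "wrote":    [27_000, 24_000, 52_000, 48_000],
            "passed":   [22_500, 19_400, 42_100, 37_900],
        })

        # Matric pass rate by year and province: total passed / total wrote, as a percentage.
        pass_rate = (
            results.groupby(["year", "province"], as_index=False)[["wrote", "passed"]]
            .sum()
            .assign(pass_rate=lambda d: 100 * d["passed"] / d["wrote"])
        )
        print(pass_rate)

    Filtering the frame by year, province or gender before aggregating mirrors the dashboard's filter controls.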

  16. GCSE entries and results (pupils in Year 11/pupils aged 15) by subject group

    • statswales.gov.wales
    json
    Updated Dec 2024
    + more versions
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    (2024). GCSE entries and results (pupils in Year 11/pupils aged 15) by subject group [Dataset]. https://statswales.gov.wales/Catalogue/Education-and-Skills/Schools-and-Teachers/Examinations-and-Assessments/Key-Stage-4/gcseentriesandresultspupilsaged15only-by-subjectgroup
    Explore at:
    json (available download formats)
    Dataset updated
    Dec 2024
    Description

    This table covers data published in the Welsh Government's annual "Examination Results" release. It provides information on the number of GCSE entries into each subject group and the percentage of those entries achieving each GCSE grade. For more information see the Weblinks. Note that this year, the definition of this table has changed. This table now includes entries taken in previous years, and discounted exams are excluded. This is so that the table is consistent with the rest of the key performance indicators. Figures should be treated with caution - it is possible for pupils to have entered more than one exam within a small number of subject groups.
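    The catalogue lists JSON as the available download format for this table. Below is a minimal sketch of pulling it into a DataFrame; the endpoint URL and the value/odata.nextLink response keys are assumptions based on a common open-data pattern and should be checked against the StatsWales catalogue entry linked above.

        import pandas as pd
        import requests

        # Hypothetical endpoint: take the real JSON URL for this table from the
        # StatsWales catalogue page linked above.
        URL = "https://open.statswales.gov.wales/en-gb/dataset/examplecode"

        rows, url = [], URL
        while url:
            payload = requests.get(url, timeout=30).json()
            rows.extend(payload.get("value", []))   # results array (assumed key)
            url = payload.get("odata.nextLink")     # follow pagination if present (assumed key)

        gcse = pd.DataFrame(rows)
        print(gcse.head())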

  17. GCSE and equivalent results: 2013 to 2014 (revised)

    • gov.uk
    Updated Mar 12, 2015
    + more versions
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Department for Education (2015). GCSE and equivalent results: 2013 to 2014 (revised) [Dataset]. https://www.gov.uk/government/statistics/revised-gcse-and-equivalent-results-in-england-2013-to-2014
    Explore at:
    Dataset updated
    Mar 12, 2015
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Description

    Information on the achievements of young people in GCSE examinations and other qualifications.

    This typically covers pupils who started the academic year aged 15.

    The information is taken from data collated for the 2014 secondary school performance tables.

    Attainment statistics team

    Email: Attainment.STATISTICS@education.gov.uk

    Telephone: Raffaele Sasso 07469 413 581

  18. Income and expenditure in academies in England: 2011 to 2012

    • gov.uk
    Updated Oct 29, 2013
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Department for Education (2013). Income and expenditure in academies in England: 2011 to 2012 [Dataset]. https://www.gov.uk/government/statistics/income-and-expenditure-in-academies-in-england-academic-year-2011-to-2012
    Explore at:
    Dataset updated
    Oct 29, 2013
    Dataset provided by
    GOV.UKhttp://gov.uk/
    Authors
    Department for Education
    Area covered
    England
    Description

    We are publishing academies’ income and expenditure data for the second time, but this is the first statistical first release (SFR) to cover the income and expenditure of academies in England. It has been produced in response to the Department for Education’s commitment to publish academy trusts’ financial data in a form that is comparable with the published data for local authority (LA) maintained schools - consistent financial reporting (CFR).

    Alongside this SFR, the academic year 2011 to 2012 data has also been published in Excel format, as was done last year, but now with improved benchmarking capability to make it possible for academies to benchmark themselves against each other. We are also publishing the raw data file so that people can carry out further analysis themselves. Topline attainment indicators from the 2012 performance tables have been included in these tables. They are: the percentage of pupils achieving level 4 or above in both English and mathematics at key stage 2 and the percentage of pupils achieving 5+ A* to C GCSEs (or equivalent), including English and maths GCSEs.

    The SFR presents information on the income and expenditure of academies in England, using data from the benchmarking section of the academic year 2011 to 2012 accounts returns, completed by each academy trust for the period ending 31 August 2012 (generally the academic year September 2011 to August 2012). Included in the publication for the first time is information on the income and expenditure of the first free schools, which opened in September 2011.

    Throughout this release, we have used the term ‘academy’ to mean ‘academy trust’, which is defined to include the following entities:

    • sponsored academies
    • converter academies
    • free schools
    • university technical colleges
    • city technology colleges
    • special academies
    • studio schools

    There has been considerable progress in aligning the benchmarking return (accounts return) dataset with the LA-maintained schools data (CFR). However, the two are still not directly comparable, for a number of reasons: academies receive additional funding to reflect their wider responsibilities, and the CFR relates to funding allocated and spent within a standard financial year (April to March), whereas academies, and the accounts return, work to a financial and academic year of September to August.

    All schools and academies work to achieve the best outcomes for their pupils and must use their resources effectively to do this. By publishing academies’ spend data alongside attainment data and other contextual information, we want to help academies to see whether they are delivering value for money and to equip parents with the information they need to ask questions of schools. We want to encourage people - and the academies themselves - to look at their spending, including how it compares with that of other academies, so that they can ask questions about spending decisions and identify areas where there is scope to improve value for money.

    To make meaningful comparisons between academies, it is important to consider the percentage of children eligible for free school meals, the type of academy (including whether it is a primary or secondary academy) and whether it is in London or not. This is because all these factors will affect how much an academy spends.
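    A like-for-like comparison along those lines can be sketched as follows, assuming a hypothetical per-academy extract with columns for phase, a London flag, the percentage of pupils eligible for free school meals and per-pupil spend; none of these names or figures are taken from the published workbook.

        import pandas as pd

        # Hypothetical per-academy extract; names and figures are illustrative only.
        academies = pd.DataFrame({
            "academy":         ["A", "B", "C", "D", "E", "F"],
            "phase":           ["secondary"] * 6,
            "in_london":       [True, True, False, False, False, False],
            "fsm_pct":         [38.0, 41.5, 12.3, 14.8, 36.9, 11.0],
            "spend_per_pupil": [6900, 7200, 5400, 5600, 6300, 5200],
        })

        # Band FSM eligibility so each academy is compared with broadly similar intakes.
        academies["fsm_band"] = pd.cut(
            academies["fsm_pct"], bins=[0, 20, 35, 100], labels=["low", "medium", "high"]
        )

        # Median spend per pupil within each phase / London / FSM-band peer group.
        peer_median = (
            academies.groupby(["phase", "in_london", "fsm_band"], observed=True)["spend_per_pupil"]
            .median()
            .rename("peer_median_spend")
        )

        # Attach each academy's peer-group benchmark alongside its own spend.
        print(academies.join(peer_median, on=["phase", "in_london", "fsm_band"]))

    Comparing spend_per_pupil with peer_median_spend then gives a like-for-like comparison rather than a raw ranking across all academies.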

    This publication was updated in October 2013 to include data from academy trusts that did not provide the Education Funding Agency (EFA) with their benchmarking return (accounts return) in time for inclusion in the original publication.

    The 4 files attached are: the statistical first release, an Excel workbook holding all of the academies’ income and expenditure data together with the raw data file, a user guide and a pre-release access list.

    Academies financial benchmarking team

    Email: finance.statistics@education.gov.uk

  19. Neighbourhood statistics in England: academic year 2011 to 2012

    • gov.uk
    Updated Jun 20, 2013
    + more versions
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Department for Education (2013). Neighbourhood statistics in England: academic year 2011 to 2012 [Dataset]. https://www.gov.uk/government/statistics/neighbourhood-statistics-small-area-pupil-attainment-and-absence-by-pupil-characteristics-in-england-academic-year-2010-to-2011
    Explore at:
    Dataset updated
    Jun 20, 2013
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Area covered
    England
    Description

    The tables provide 2012 information on pupil residency-based small area pupil attainment (early years foundation stage profile (EYFSP) and key stages 1, 2, 4 and 5) broken down by gender, free school meal (FSM) eligibility (key stages 2 and 4 only) and ethnicity (key stages 2 and 4 only). The tables also provide information on pupil residency-based pupil absence broken down by gender for the 2011 to 2012 academic year.

    The key points from the latest release are:

    • at EYFSP, girls consistently perform better than boys in almost all LADs

    • at key stage 1, key stage 2, key stage 4 and key stage 5, girls also outperform boys in all but a small number of LADs

    • overall, all other pupils perform significantly better than FSM pupils in all areas of the country; this performance gap can be seen at both key stage 2 and key stage 4

    • Chinese pupils continue to have the highest levels of attainment in all regions for key stage 2 and key stage 4

    • black pupils have some of the lowest levels of attainment across the country at these 2 key stages

    • levels of persistent absence among pupils in secondary schools are generally higher among pupils living in the north of the country than those living in the south

    • there is smaller variation across the regions for pupils in primary schools, with Yorkshire and the Humber having the highest level of persistent absence and the South East the lowest

    • in all schools, pupils in Yorkshire and the Humber have the highest level of unauthorised absence (1.1% of half days missed), but not the highest level of overall absence, which is found in the North East (5.3% of half days missed)

    • pupils in London have the lowest level of overall absence (4.8% of half days missed) but a relatively high level of unauthorised absence (1% of half days missed) compared to pupils in other regions

    Download formats: www.neighbourhood.statistics.gov.uk (http://www.neighbourhood.statistics.gov.uk/dissemination/LeadHome.do?m=0&s=1371649586709&enc=1&nsjs=true&nsck=false&nswid=1276)

    Additional information on maps above:

    • map: early years foundation stage profile by local authority district - percentage of pupils in all schools and early years’ settings achieving a good level of development by local authority district (of pupil residence), 2012

    • map: early years foundation stage profile by middle layer super output area - percentage of pupils in all schools and early years’ settings achieving a good level of development by middle layer super output area (of pupil residence), 2012

    • map: key stage 1 average point score by local authority district - average point score of pupils in maintained schools by local authority district (of pupil residence), 2012

    • map: key stage 1 average point score by middle layer super output area - average point score of pupils in maintained schools by middle layer super output area (of pupil residence), 2012

    • map: key stage 2 by local authority district - percentage of pupils in maintained schools achieving level 4 or above in reading, writing and maths combined by local authority district (of pupil residence), 2012

    • map: key stage 2 by middle layer super output area - percentage of pupils in maintained schools achieving level 4 or above in reading, writing and maths combined by middle layer super output area (of pupil residence), 2012

    • map: key stage 4 - 5+ GCSEs at grades A* to C including English and mathematics by local authority district - percentage of pupils in maintained schools achieving 5 A* to C grades at GCSE or equivalent including English and mathematics GCSEs by local authority district (of pupil residence), 2012

    • map: key stage 4 - 5+ GCSEs at grades A* to C including English and mathematics by middle layer super output area - percentage of pupils in maintained schools achieving 5 A* to C grades at GCSE or equivalent including English and mathematics GCSEs by middle layer super output area (of pupil residence), 2012

    • map: key stage 4 English Baccalaureate by local authority district - percentage of pupils in maintained schools achieving the English Baccalaureate by local authority district (of pupil residence), 2012

    • map: key stage 5 - 2 or more passes by local authority district - percentage of students achieving 2 or more passes of A level equivalent size in maintained schools and further education sector colleges by local authority district (of student residence), 2012

    • map: key stage 5 avera

  20. National curriculum assessments: key stage 2, 2012 (provisional)

    • gov.uk
    Updated Sep 20, 2012
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Department for Education (2012). National curriculum assessments: key stage 2, 2012 (provisional) [Dataset]. https://www.gov.uk/government/statistics/national-curriculum-assessments-at-key-stage-2-in-england-2012
    Explore at:
    Dataset updated
    Sep 20, 2012
    Dataset provided by
    GOV.UK (http://gov.uk/)
    Authors
    Department for Education
    Description

    Reference Id: SFR19/2012

    Publication Type: Statistical First Release

    Publication data: Local Authority data

    Local Authority data: LA data

    Region: England

    Release Date: 20 September 2012

    Coverage status: Provisional

    Publication Status: Published

    This statistical first release (SFR) provides information on provisional key stage 2 national curriculum writing and science sample test results for pupils (typically aged 11) in schools in England in 2012. This SFR also provides provisional figures on expected progress between key stage 1 (typically age 7) and key stage 2. This release provides information at national, regional and local authority level.

    Two former SFRs, “Interim results for key stage 2 and 3 national curriculum assessments in England” and “Interim percentage of pupils making expected progress in English and mathematics between key stage 1 and key stage 2 in England” have been combined to produce this SFR, enabling a more comprehensive and coherent evaluation of pupils’ achievements at key stage 2 to be presented.

    This release will not include statistics based on key stage 3 core teacher assessment data. The KS3 teacher assessment data will be published in October alongside GCSE and equivalent examination results to create a single statistical release for secondary school attainment.

    There are significant changes to key stage 2 assessment arrangements for 2012 that affect this release:

    • Lord Bew’s review recommended that writing composition should be subject only to summative teacher assessment. It is no longer a requirement for all schools to administer a writing test and submit it for external marking. As a result, measures based on writing teacher assessment have been introduced for the first time.
    • A representative sample of schools (approximately 1,500) was required to administer the writing tests and submit them for external marking. This data is used to publish an estimate of national attainment in the writing test.
    • A measure of overall attainment in English is published, based on reading test and writing teacher assessment results.
    • Level 6 tests in reading and mathematics were made available to schools this year, alongside an external marking service, and we have incorporated these results into this release.

    Since 2010, KS2 science tests for the whole cohort have been discontinued. Schools are still required to provide teacher assessments, which are reported in this release. To continue to monitor national standards in science at the end of KS2, externally marked science sampling tests were introduced. These results are only valid at national level and are reported in this statistical release.
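    Because the writing tests and the KS2 science tests are now sat by a sample of schools rather than the full cohort, the published national figures for these measures are estimates derived from that sample. The sketch below illustrates the basic idea with a simple pooled proportion over invented school-level counts; the department's actual estimation and weighting methodology is more involved and is not reproduced here.

        # Illustrative only: pooled estimate of national attainment from a sample of schools.
        # Each tuple is (pupils assessed, pupils achieving level 4 or above); counts are invented.
        sampled_schools = [
            (60, 48),
            (45, 31),
            (90, 70),
            (30, 22),
        ]

        assessed = sum(n for n, _ in sampled_schools)
        achieved = sum(k for _, k in sampled_schools)

        # Estimated national percentage at level 4 or above in the sampled test.
        estimate = 100 * achieved / assessed
        print(f"Estimated national attainment: {estimate:.1f}% at level 4 or above")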

    Emma Sass - Attainment Statistics Team

    Telephone: 0207 340 8357

    attainment.statistics@education.gsi.gov.uk
