Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Open Science in (Higher) Education – data of the February 2017 survey
This data set contains:
Survey structure
The survey includes 24 questions, and its structure can be separated into five major themes: material used in courses (5 questions), OER awareness, usage and development (6), collaborative tools used in courses (2), assessment and participation options (5), and demographics (4). The last two questions are an open-text question about general issues on the topics and individual open education experiences, and a request to provide the respondent’s e-mail address for follow-up questions. The online survey was created with Limesurvey[1]. Several questions include filters, i.e. these questions were only shown if a participant chose a specific answer beforehand ([n/a] in the Excel file, [.] in SPSS).
Demographic questions
Demographic questions asked about the respondent’s current position, discipline, year of birth, and gender. The classification of research disciplines was adapted from the general disciplines at German higher education institutions. As we wanted a broad classification, we summarised several disciplines and arrived at the following list, including the option “other” for respondents who did not identify with the proposed classification:
The job position classification was likewise chosen according to common positions in Germany, including positions with teaching responsibility at higher education institutions. Here, we also included the option “other” for respondents who did not identify with the proposed classification:
We chose a free-text (numerical) field for the respondent’s year of birth because we did not want to pre-classify respondents into age intervals. This leaves open different analyses of the answers and possible correlations with respondents’ age. A question about country was left out, as the survey was designed for academics in Germany.
Remark on OER question
Data from earlier surveys revealed that academics are often confused about the proper definition of OER[2]. Some seem to understand OER as any free resources, or refer only to open source software (Allen & Seaman, 2016, p. 11). Allen and Seaman (2016) decided to give a broad explanation of OER, avoiding details so as not to tempt participants into claiming awareness. Giving an explanation thus risks introducing a bias. We decided not to give an explanation, but to keep the question simple. We assume that someone either knows about OER or does not. Respondents who had not heard the term before probably do not use OER (at least not consciously) or create them.
Data collection
The target group of the survey was academics at German institutions of higher education, mainly universities and universities of applied sciences. To reach them, we sent the survey to various internal and external institutional mailing lists and via personal contacts. The lists included discipline-based lists, lists from higher education and higher education didactics communities, as well as lists from open science and OER communities. Additionally, personal e-mails were sent to presidents and contact persons of those communities, and Twitter was used to spread the survey.
The survey was online from February 6 to March 3, 2017; e-mails were mainly sent at the beginning and around the midpoint of that period.
Data clearance
We received 360 responses, of which Limesurvey counted 208 as complete and 152 as incomplete. Two responses were marked as incomplete but turned out, on inspection, to be complete, and we added them to the complete responses. Thus, this data set includes 210 complete responses. Of the 150 incomplete responses, 58 respondents did not answer the first question and 40 discontinued after the first question. The data show a steady decline in answers; we did not detect any single survey question with a strikingly high dropout rate. Incomplete responses were deleted and are not included in this data set.
For data privacy reasons, we deleted seven variables automatically assigned by Limesurvey: submitdate, lastpage, startlanguage, startdate, datestamp, ipaddr, refurl. We also deleted the answers to question No. 24 (e-mail address).
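The clearance steps described above can be sketched in plain Python. The privacy field names are the seven Limesurvey variables listed in the text; the "complete" flag and the "email" key (standing in for question No. 24) are illustrative assumptions, not the actual export schema:

```python
# Fields removed for privacy: the seven Limesurvey variables named in the
# text, plus "email" as a stand-in for question No. 24 (assumed name).
PRIVACY_FIELDS = {"submitdate", "lastpage", "startlanguage", "startdate",
                  "datestamp", "ipaddr", "refurl", "email"}

def clear_responses(responses):
    """Keep only complete responses and strip privacy-sensitive fields."""
    cleared = []
    for r in responses:
        if not r.get("complete"):   # drop incomplete responses
            continue
        cleared.append({k: v for k, v in r.items()
                        if k not in PRIVACY_FIELDS})
    return cleared
```

The same steps would typically be done in pandas or SPSS; the sketch only illustrates the order of operations (filter first, then drop columns).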
References
Allen, E., & Seaman, J. (2016). Opening the Textbook: Educational Resources in U.S. Higher Education, 2015-16.
First results of the survey are presented in the poster:
Heck, Tamara, Blümel, Ina, Heller, Lambert, Mazarakis, Athanasios, Peters, Isabella, Scherp, Ansgar, & Weisel, Luzian. (2017). Survey: Open Science in Higher Education. Zenodo. http://doi.org/10.5281/zenodo.400561
Contact:
Open Science in (Higher) Education working group, see http://www.leibniz-science20.de/forschung/projekte/laufende-projekte/open-science-in-higher-education/.
[1] https://www.limesurvey.org
[2] The survey question about the awareness of OER gave a broad explanation, avoiding details to not tempt the participant to claim “aware”.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Population: Education Level: College & Higher: Shanghai data was reported at 3.541 Person th in 2021. This records a decrease from the previous number of 8,424.214 Person th for 2020. Population: Education Level: College & Higher: Shanghai data is updated yearly, averaging 3.959 Person th from Dec 1982 (Median) to 2021, with 26 observations. The data reached an all-time high of 8,424.214 Person th in 2020 and a record low of 1.273 Person th in 1997. Population: Education Level: College & Higher: Shanghai data remains in active status in CEIC and is reported by the National Bureau of Statistics. The data is categorized under China Premium Database’s Socio-Demographic – Table CN.GA: Population: Sample Survey: Level of Education: By Region.
Description: The Educational Attainment Thematic Report is compiled using data from the Labour Force Survey (LFS), a household survey which replaced the Quarterly National Household Survey (QNHS) at the beginning of Q3 2017. The LFS is the official source of quarterly labour force estimates for Ireland, including the official rates of employment and unemployment. Questions on educational attainment are included in the core LFS questionnaire each quarter. The Educational Attainment Thematic Report presents the LFS data for adults between 18 and 64 years old with differing levels of educational attainment based on these questions. This data provides a summary of the annual results of educational attainment levels across the regional geographies in Ireland.
Geography available in RDM: State, Regional Assembly and Strategic Planning Area (SPA).
Source: CSO Educational Attainment Thematic Report
Weblink: https://www.cso.ie/en/releasesandpublications/ep/p-eda/educationalattainmentthematicreport2021/
Date of last source data update: February 2025
Update Schedule: Annual
Open Database License (ODbL) v1.0: https://www.opendatacommons.org/licenses/odbl/1.0/
License information was derived automatically
Overall educational attainment measures the highest level of education attained by a given individual: for example, an individual counted in the percentage of the measured population with a master’s or professional degree can be assumed to also have a bachelor’s degree and a high school diploma, but they are not counted in the population percentages for those two categories. Overall educational attainment is the broadest education indicator available, providing information about the measured county population as a whole.
Only members of the population aged 25 and older are included in these educational attainment estimates, sourced from the U.S. Census Bureau American Community Survey (ACS).
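The exclusive counting rule described above (each person counted only once, at their highest attained level) can be sketched as follows; the category labels and their ordering are illustrative and do not reproduce the ACS table structure:

```python
# Categories ordered from lowest to highest; labels are placeholders,
# not the ACS table S1501 layout.
LEVELS = ["high_school", "bachelors", "masters_or_professional"]

def highest_attainment_counts(people):
    """people: list of sets of attained credentials -> exclusive counts.

    Someone with a master's also holds a bachelor's and a high school
    diploma, but is counted only in the master's category.
    """
    counts = {level: 0 for level in LEVELS}
    for attained in people:
        for level in reversed(LEVELS):   # scan from highest to lowest
            if level in attained:
                counts[level] += 1       # count once, at the highest level
                break
    return counts
```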
Champaign County has high educational attainment: over 48 percent of the county's population aged 25 or older has a bachelor's degree or a graduate or professional degree as their highest level of education. In comparison, the percentage of the population aged 25 or older with a bachelor's degree in 2023 was 21.8% (+/-0.1) in the United States and 22.8% (+/-0.2) in Illinois. The percentage with a graduate or professional degree in 2022 was 14.3% (+/-0.1) in the U.S. and 15.5% (+/-0.2) in Illinois.
Educational attainment data was sourced from the U.S. Census Bureau’s American Community Survey 1-Year Estimates, which are released annually.
As with any datasets that are estimates rather than exact counts, it is important to take into account the margins of error (listed in the column beside each figure) when drawing conclusions from the data.
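A common way to act on this advice, following the Census Bureau's published approximation for comparing two ACS estimates, is to treat the margin of error of a difference as the square root of the sum of the squared MOEs; a difference smaller than that combined MOE is not statistically meaningful:

```python
import math

def difference_is_significant(est1, moe1, est2, moe2):
    """Approximate test: is the gap between two estimates larger than
    the combined margin of error sqrt(moe1^2 + moe2^2)?"""
    moe_diff = math.sqrt(moe1 ** 2 + moe2 ** 2)
    return abs(est1 - est2) > moe_diff
```

For example, the Illinois and U.S. bachelor's-degree percentages quoted above (22.8% +/-0.2 vs. 21.8% +/-0.1) differ by more than the combined MOE, so that gap is statistically meaningful under this approximation.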
Due to the impact of the COVID-19 pandemic, instead of providing the standard 1-year data products, the Census Bureau released experimental estimates from the 1-year data in 2020. This includes a limited number of data tables for the nation, states, and the District of Columbia. The Census Bureau states that the 2020 ACS 1-year experimental tables use an experimental estimation methodology and should not be compared with other ACS data. For these reasons, and because data is not available for Champaign County, no data for 2020 is included in this Indicator.
For interested data users, the 2020 ACS 1-Year Experimental data release includes a dataset on Educational Attainment for the Population 25 Years and Over.
Sources:
- U.S. Census Bureau; American Community Survey, 2023 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using data.census.gov; (16 October 2024).
- U.S. Census Bureau; American Community Survey, 2022 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using data.census.gov; (29 September 2023).
- U.S. Census Bureau; American Community Survey, 2021 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using data.census.gov; (6 October 2022).
- U.S. Census Bureau; American Community Survey, 2019 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using data.census.gov; (4 June 2021).
- U.S. Census Bureau; American Community Survey, 2018 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using data.census.gov; (4 June 2021).
- U.S. Census Bureau; American Community Survey, 2017 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (13 September 2018).
- U.S. Census Bureau; American Community Survey, 2016 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (13 September 2018).
- U.S. Census Bureau; American Community Survey, 2015 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (19 September 2016).
- U.S. Census Bureau; American Community Survey, 2014 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2013 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2012 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2011 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2010 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2009 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2008 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2007 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2006 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
- U.S. Census Bureau; American Community Survey, 2005 American Community Survey 1-Year Estimates, Table S1501; generated by CCRPC staff; using American FactFinder; (16 March 2016).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Most social surveys collect data on respondents’ educational attainment. Current measurement practice involves a closed question with country-specific response options, which are needed because of the differences between educational systems. However, these are quite difficult to compare across countries. This is a challenge for both migrant and international surveys. Therefore, a measurement tool for educational attainment that was initially developed for German migrant surveys in the CAMCES project (Schneider, Briceno-Rosas, Herzing, et al. 2018; Schneider, Briceno-Rosas, Ortmanns, et al. 2018) was extended in the SERISS-project in work package 8, Task 8.3. In deliverable D8.8, we provide a database of educational qualifications and levels for 100 countries, including the definition of a search tree interface to facilitate the navigation of categories for respondents in computer-assisted surveys. All country-specific categories are linked to 3-digit codes of UNESCO's International Standard Classification of Education 2011 for Educational Attainment (ISCED-A), as well as to the education coding scheme used in the European Social Survey (ESS), "edulvlb". A live search of the database via two different interfaces, a search box (for a limited set of countries) and a search tree (for all countries), is available at the surveycodings website at https://surveycodings.org/articles/codings/levels-of-education. The search box and search tree can be implemented in survey questionnaires and thereby be used for respondents’ self-classification in computer-assisted surveys. The live search feature can also be used for post-coding open answers in already collected data.
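The kind of lookup the surveycodings database enables can be sketched as a simple mapping from a country-specific qualification to its ISCED-A and edulvlb codes. The entries and code values below are illustrative placeholders, not values taken from the actual database:

```python
# Placeholder mapping: (country, qualification label) -> coding-scheme codes.
# The specific codes shown are assumptions for illustration only; the real
# database at surveycodings.org covers 100 countries with verified codes.
QUALIFICATIONS = {
    ("DE", "Bachelor"): {"isced_a": "660", "edulvlb": "610"},
    ("DE", "Meister"):  {"isced_a": "650", "edulvlb": "520"},
}

def code_qualification(country, label):
    """Return the coding-scheme codes for a qualification, or None if the
    label is not in the database (a candidate for post-coding by hand)."""
    return QUALIFICATIONS.get((country, label))
```

In the real tool, the search box and search tree guide respondents to the right country-specific category, and the ISCED-A/edulvlb codes are attached automatically.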
https://www.icpsr.umich.edu/web/ICPSR/studies/2213/terms
The consolidated (CN) survey form was used for the first time in 1990-1991 to collect information from a subset of the 10,500 postsecondary institutions in the Integrated Postsecondary Education Data System (IPEDS) universe. IPEDS collects information on such topics as institutional characteristics, enrollments, completions, finance, staff, and libraries. All schools in the IPEDS universe were asked to complete an institutional characteristics form. Approximately 3,600 institutions of higher education (i.e., those that are accredited at the college level by an agency recognized by the Secretary, United States Department of Education) plus another 400 nonaccredited schools that grant a bachelor's, master's, doctoral, or first-professional degree were asked to complete the full complement of IPEDS surveys. Of the remaining 6,500 postsecondary schools that were eligible to receive the Consolidated (CN) survey form, 2,998 were sent the form, of which 2,472 responded. The following data were requested from institutions using the CN survey form: (1) fall enrollment for 1990, by racial/ethnic category and sex of student, (2) completions for the 1989-1990 academic year, by field of study and award level and by racial/ethnic category and sex of recipient, (3) financial statistics for fiscal year 1990, including revenue/tuition fees, expenditures/scholarship, and other expenditures, and (4) selected data on libraries, covering total FTE staff, operating expenses, and total circulation transactions. The single CN form was substituted for the four more detailed IPEDS surveys that were sent to the accredited/degree-granting institutions. Data for the CN respondent institutions appear both in the CN file and in the individual IPEDS data files for the fall enrollment, completions, finance, and libraries surveys.
https://www.icpsr.umich.edu/web/ICPSR/studies/7634/terms
This data collection contains information gathered in the Survey of Income and Education (SIE) conducted in April-July 1976 by the Census Bureau for the United States Department of Health, Education, and Welfare (HEW). Although national estimates of the number of children in poverty were available each year from the Census Bureau's Current Population Survey (CPS), those estimates were not statistically reliable on a state-by-state basis. In enacting the Educational Amendments of 1974, Congress mandated that HEW conduct a survey to obtain reliable state-by-state data on the numbers of school-age children in local areas with family incomes below the federal poverty level. This was the statistic that determined the amount of grant a local educational agency was entitled to under Title 1, Elementary and Secondary Education Act of 1965. (Such funds were distributed by HEW's Office of Education.) The SIE was the survey created to fulfill that mandate. Its questions include those used in the Current Population Survey regarding current employment, past work experience, and income. Additional questions covering school enrollment, disability, health insurance, bilingualism, food stamp recipiency, assets, and housing costs enabled the study of the poverty concept and of program effectiveness in reaching target groups. Basic household information also was recorded, including tenure of unit (a determination of whether the occupants of the living quarters owned, rented, or occupied the unit without rent), type of unit, household language, and for each member of the household: age, sex, race, ethnicity, marital history, and education.
The Service Delivery Indicators (SDI) are a set of health and education indicators that examine the effort and ability of staff and the availability of key inputs and resources that contribute to a functioning school or health facility. The indicators are standardized allowing comparison between and within countries over time.
The Education SDIs include teacher effort, teacher knowledge and ability, and the availability of key inputs (for example, textbooks, basic teaching equipment, and infrastructure such as blackboards and toilets). The indicators provide a snapshot of the learning environment and the key resources necessary for students to learn.
Kenya's Service Delivery Indicators Education Survey was implemented in May-July 2012 by the Economic Policy Research Center and Kimetrica, in close coordination with the World Bank SDI team. The data were collected from a stratified random sample of 239 public and 67 private schools to provide a representative snapshot of the learning environment in both public and private schools. The survey assessed the knowledge of 1,679 primary school teachers, surveyed 2,960 teachers for an absenteeism study, and observed 306 grade 4 lessons. In addition, learning outcomes were measured for almost 3,000 grade 4 students.
National
Schools, teachers, students.
All primary schools
Sample survey data [ssd]
The sampling strategy for SDI surveys is designed to attain indicators that are accurate and representative at the national level, as this allows for proper cross-country (i.e. international benchmarking) and across-time comparisons, when applicable. In addition, other levels of representativeness are sought to allow for further disaggregation (rural/urban areas, public/private facilities, subregions, etc.) during the analysis stage.
The sampling strategy for SDI surveys follows a multistage sampling approach. The main units of analysis are facilities (schools and health centers) and providers (health and education workers: teachers, doctors, nurses, facility managers, etc.). In the case of education, SDI surveys also aim to produce accurate information on grade four pupils’ performance through a student assessment. The multistage sampling approach makes sampling procedures more practical by dividing the selection of large populations of sampling units in a step-by-step fashion. After defining the sampling frame and categorizing it by stratum, a first stage selection of sampling units is carried out independently within each stratum. Often, the primary sampling units (PSU) for this stage are cluster locations (e.g. districts, communities, counties, neighborhoods, etc.) which are randomly drawn within each stratum with a probability proportional to the size (PPS) of the cluster (measured by the location’s number of facilities, providers or pupils). Once locations are selected, a second stage takes place by randomly selecting facilities within location (either with equal probability or with PPS) as secondary sampling units. At a third stage, a fixed number of health and education workers and pupils are randomly selected within facilities to provide information for the different questionnaire modules.
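The three selection stages described above can be condensed into a sketch like the following. The data structures, sample sizes, and the with-replacement PPS draw are simplifying assumptions for illustration; the actual SDI designs use systematic PPS selection without replacement:

```python
import random

def pps_draw(units, sizes, k, rng):
    """Draw k units with probability proportional to size (with replacement,
    for simplicity; real designs use systematic PPS without replacement)."""
    return rng.choices(units, weights=sizes, k=k)

def multistage_sample(strata, n_locations, n_facilities, n_providers, rng=None):
    rng = rng or random.Random(0)
    sample = []
    for stratum in strata:                        # selection runs per stratum
        # Stage 1: PPS draw of cluster locations, size = number of facilities.
        locs = pps_draw(stratum["locations"],
                        [len(l["facilities"]) for l in stratum["locations"]],
                        n_locations, rng)
        for loc in locs:
            # Stage 2: equal-probability draw of facilities within location.
            facs = rng.sample(loc["facilities"],
                              min(n_facilities, len(loc["facilities"])))
            for fac in facs:
                # Stage 3: fixed number of providers within each facility.
                providers = rng.sample(fac["providers"],
                                       min(n_providers, len(fac["providers"])))
                sample.append((fac["id"], providers))
    return sample
```

Student selection for the grade-four assessment would follow the same stage-3 pattern within sampled schools.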
Detailed information about the specific sampling process conducted for the 2012 Kenya Education SDI is available in the SDI Country Report (“SDI-Report-Kenya”) included as part of the documentation that accompanies these datasets.
Face-to-face [f2f]
The SDI Education Survey Questionnaire consists of six modules:
Module 1: School Information - Administered to the head of the school to collect information on school type, facilities, school governance, pupil numbers, and school hours. It includes direct observations of school infrastructure by enumerators.
Module 2a: Teacher Absence and Information - Administered to the headteacher and individual teachers to obtain a list of all school teachers, to measure teacher absence, and to collect information on teacher characteristics.
Module 2b: Teacher Absence and Information - Unannounced visit to the school to assess the absence rate.
Module 3: School Finances - Administered to the headteacher to collect information on school finances (this data is unharmonized).
Module 4: Classroom Observation - An observation module to assess teaching activities and classroom conditions.
Module 5: Pupil Assessment - A test of pupils to have a measure of pupil learning outcomes in mathematics and language in grade four. The test is carried out orally and one-on-one with each student by the enumerator.
Module 6: Teacher Assessment - A test of teachers covering mathematics and language subject knowledge and teaching skills.
Data entry was done using CSPro; quality control was performed in Stata.
At the national level, anticipated standard errors of 1.6 percentage points for absenteeism and 4.4 percentage points for pupil literacy were calculated. At the county level, anticipated standard errors of 3.1 percentage points for absenteeism and 9.0 percentage points for literacy were estimated.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Population: Education Level: Primary: Guizhou data was reported at 12.761 Person th in 2023. This records an increase from the previous number of 12.469 Person th for 2022. Population: Education Level: Primary: Guizhou data is updated yearly, averaging 14.117 Person th from Dec 1982 (Median) to 2023, with 30 observations. The data reached an all-time high of 15,352.997 Person th in 2000 and a record low of 8.758 Person th in 2014. Population: Education Level: Primary: Guizhou data remains in active status in CEIC and is reported by the National Bureau of Statistics. The data is categorized under China Premium Database’s Socio-Demographic – Table CN.GA: Population: Sample Survey: Level of Education: By Region.
The Service Delivery Indicators (SDI) are a set of health and education indicators that examine the effort and ability of staff and the availability of key inputs and resources that contribute to a functioning school or health facility. The indicators are standardized allowing comparison between and within countries over time.
The Education SDIs include teacher effort, teacher knowledge and ability, and the availability of key inputs (for example, textbooks, basic teaching equipment, and infrastructure such as blackboards and toilets). The indicators provide a snapshot of the learning environment and the key resources necessary for students to learn.
Nigeria Service Delivery Indicators Education Survey was implemented in 2013 by the World Bank and the Research Triangle Institute International. The survey implementation was preceded by consultations with stakeholders in Nigeria to adapt instruments to the country context while maintaining comparability across countries. In addition, the implementation was done with close collaboration with the Universal Basic Education Commission, and in close coordination with the relevant state authorities (i.e. State Ministries of Education, and the State Universal Education Boards where they existed). Data was collected from primary schools in four states (Anambra, Bauchi, Ekiti, and Niger) using personal interviews and provider assessments. A total of 760 randomly selected public and private schools (190 per state) were surveyed, with 2,435 and 5,754 teachers assessed for knowledge and effort respectively. The sample was selected to make the survey representative at the State level, allowing for disaggregation by provider type (private/public) and location (rural/urban).
Four states: Anambra, Bauchi, Ekiti, and Niger.
Schools, teachers, students.
All primary schools.
Sample survey data [ssd]
The sampling strategy was designed to produce state-representative estimates, with the target of estimating a proportion to an absolute error of three percentage points for a proportion of 0.5 (the highest-variance case) at 95 percent confidence per state (an equal sample size was used for each state).
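The design target above corresponds to the standard sample-size formula for estimating a proportion, n = z^2 * p * (1 - p) / e^2. A sketch with the stated parameters:

```python
import math

def sample_size(p=0.5, e=0.03, z=1.96):
    """Required sample size to estimate a proportion p with absolute
    error e at the confidence level implied by z (1.96 for 95 percent)."""
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)
```

With p = 0.5, e = 0.03 and z = 1.96 this gives 1,068 individuals before any design-effect or finite-population adjustment; the 190 schools sampled per state presumably reflect the cluster design layered on top of such a target rather than this per-individual figure directly.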
The strata were constructed according to ownership, urban/rural, and socioeconomic poverty status. The allocation was made in proportion to size for each sub-stratum within public and private. Within strata, simple random sampling was used. Finally, replacement schools were preselected, with a predetermined replacement order within strata.
A total of 190 schools were sampled from each of the four states (Anambra, Bauchi, Ekiti, and Niger).
The target population is all public primary-level school children. Since parts of the school questionnaire were administered to teachers and pupils at the grade four level, all public schools with at least one grade four class formed the sampling frame. The sample frame was created using the list of public schools from UBEC (Universal Basic Education Commission) and private schools from states.
None.
Face-to-face [f2f]
The SDI Education Survey Questionnaire consists of six modules:
Module 1: School Information - Administered to the head of the school to collect information on school type, facilities, school governance, pupil numbers, and school hours. It includes direct observations of school infrastructure by enumerators.
Module 2a: Teacher Absence and Information - Administered to the headteacher and individual teachers to obtain a list of all schoolteachers, to measure teacher absence and to collect information on teacher characteristics (this module was not included in this dataset).
Module 2b: Teacher Absence and Information - Unannounced visit to the school to assess absence rate.
Module 3: School Finances - Administered to the headteacher to collect information on school finances (this data is unharmonized).
Module 4: Classroom Observation - An observation module to assess teaching activities and classroom conditions.
Module 5: Pupil Assessment - A test of pupils to have a measure of pupil learning outcomes in mathematics and language in grade four.
Module 6: Teacher Assessment - A test of teachers covering mathematics and language subject knowledge and teaching skills.
Data entry was done using CSPro; quality control was performed in Stata.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
SID23 - Highest Level of Education of individuals aged 25-59 years. Published by Central Statistics Office. Available under the license Creative Commons Attribution 4.0 (CC-BY-4.0).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
SID22 - Highest Level of Education Attained by Either Parent. Published by Central Statistics Office. Available under the license Creative Commons Attribution 4.0 (CC-BY-4.0).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Population: Education Level: Primary: Ningxia data was reported at 1.959 Person th in 2022. This records a decrease from the previous number of 1.991 Person th for 2021. Population: Education Level: Primary: Ningxia data is updated yearly, averaging 1.673 Person th from Dec 1982 (Median) to 2022, with 29 observations. The data reached an all-time high of 1,880.672 Person th in 2020 and a record low of 1.336 Person th in 2019. Population: Education Level: Primary: Ningxia data remains in active status in CEIC and is reported by the National Bureau of Statistics. The data is categorized under China Premium Database’s Socio-Demographic – Table CN.GA: Population: Sample Survey: Level of Education: By Region.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Population: Education Level: City: Senior Middle: Female data was reported at 53.248 Person th in 2023. This records an increase from the previous number of 52.622 Person th for 2022. Population: Education Level: City: Senior Middle: Female data is updated yearly, averaging 39.957 Person th from Dec 1997 (Median) to 2023, with 27 observations. The data reached an all-time high of 54,438.780 Person th in 2020 and a record low of 27.660 Person th in 1999. Population: Education Level: City: Senior Middle: Female data remains in active status in CEIC and is reported by the National Bureau of Statistics. The data is categorized under China Premium Database’s Socio-Demographic – Table CN.GA: Population: Sample Survey: Level of Education.
Abstract copyright UK Data Service and data collection copyright owner.
The purpose of this study was to investigate the factors which influence young people in their demand for higher education in its various forms - at universities, colleges of education (teacher training colleges), polytechnics and colleges of further education. Six of these eight surveys form the main study, which was carried out on (a) the schools and the fifth-formers and sixth-formers in them, and (b) the colleges of further education and their home students studying 'A' level subjects full-time.
The material from the young people includes that given by them at two stages: first from the main survey, which took place before they sat GCE examinations and before the results of higher education applications were available, and secondly from the follow-up survey, after the results of the GCE examinations were known and the young people had already embarked on courses for the following session. For the fifth- and sixth-form surveys (67001, 67002 and 68005) the form teachers' broad assessment of ability (three categories), examination prospects, and higher education and career aspirations are also incorporated. For the schools, the main survey was carried out in the spring term 1967 with the follow-up in the autumn. The equivalent dates in the colleges of further education were May 1967 and January 1968.
(The remaining two surveys are subsidiary to the project: 66023 is the pilot stage of the main-survey part of 68004, i.e. home students studying 'A' levels full-time in the further education colleges, whilst 67005, fifth-formers in the fast stream in schools, comprises a sub-set of material from the main fifth-form survey for an enlarged sample of those pupils in schools with fast streams.)
The six surveys in the main study are interlinked, with information from the school or college complementing that from the pupil or student. In addition there is standardisation - as far as was practicable - between sections of the questionnaires used for the fifth-formers, lower and upper sixth-formers and students in further education (e.g. general background). The contents of the questionnaires for the upper sixth-formers and further education students corresponded particularly closely. Copies of all reports on the surveys are held in the Library of the Royal Statistical Society. They mainly deal with specific aspects of the data, e.g. 'Subject commitments and the demand for higher education', G. A. Barnard and M. D. McCreath (1970), Journal of the Royal Statistical Society Series A (General) 133 (3), 358-408, and 'Report of the surveys of full-time 'A' level students (home) in colleges of further education', M. D. McCreath (1970). All available material is listed in the most recent report, written in 1972: Factors influencing choice of higher education: surveys carried out by Margaret D. McCreath under the direction of Professor G. A. Barnard, Department of Mathematics, University of Essex. This 1972 report includes data from both the school and further education surveys. Its extensive tables are based on the following variables: social class, expectations about leaving school and reasons for doing so, source of the most useful discussion on what to do after school, family experience of higher education, 'O' and 'A' level attempts and passes, knowledge of higher education entry requirements and with whom these were discussed, as well as intended and actual destinations in higher education.
The technical note on the sample design by Judith Doherty was published in 1970 as Appendix 1 of Volume 1 of the Schools Council Sixth-Form Survey, Sixth-Form Pupils and Teachers. Details of the response rates are given in the 1972 report mentioned above.
This survey covered advice on careers and education after 'A' level: whether there is a formal system for the dissemination of such advice, and whether this was instigated to meet a demand from students. A record was made of the literature held on careers and higher education establishments and how this was made available. Information gathered included: where the majority of students go - or expect to go - after 'A' levels (according to sex and subject cohort); how and when they are advised by the college about closing dates for courses beyond 'A' level; advice students seem to need on which university course to include on the UCCA form; whether the Youth Employment Officer or Careers Advisory Officer is the main official source of advice for students; and whether the principal or a representative of any college department with full-time 'A' level students has been invited to a university conference of headmasters and headmistresses.
General information includes: position of respondent on college staff; official classification of college by type; whether the college receives any part of its funds from a source other than...
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The State Education Contextual Data Resource (S-ECDR) is a historical dataset that compiles state-level indicators of public education systems in the United States from 1919/20 through 1973/74. The dataset includes measures related to public school financing, teacher characteristics, school and classroom contexts, and segregation and desegregation in the U.S. South. Data were drawn from four historical sources: the Biennial Surveys of Education, the Statistics of State School Systems, a 1967 Southern Education Reporting Service report, and U.S. Census Abstracts. The dataset was created to support research on how early-life education contexts influence long-term outcomes in adulthood, particularly for cohorts who attended school during a period of significant expansion in U.S. public education. S-ECDR includes indicators that enable comparisons of state-level education investment, teacher workforce composition, and access to education across time and geographic region. The resource is designed to facilitate linkage to individual-level surveys containing state and year identifiers, enabling analysis of how historical education environments shaped later-life well-being.
The Service Delivery Indicators (SDI) are a set of health and education indicators that examine the effort and ability of staff and the availability of key inputs and resources that contribute to a functioning school or health facility. The indicators are standardized allowing comparison between and within countries over time.
The Education SDIs include teacher effort, teacher knowledge and ability, and the availability of key inputs (for example, textbooks, basic teaching equipment, and infrastructure such as blackboards and toilets). The indicators provide a snapshot of the learning environment and the key resources necessary for students to learn.
The Madagascar Service Delivery Indicators Education Survey was implemented from April 2016 (enumerator training and pre-testing of the instruments) to May-June 2016 (fieldwork and data collection) by CAETIC Development, a strong local think-tank and survey firm. The sampling strategy was designed by INSTAT, the national institute for statistics. Information was collected from 473 primary schools, 2,130 teachers (for the skills assessment), 2,475 teachers (for the absence rate), and 3,960 pupils across Madagascar. The survey also collected basic information on all 3,049 teachers or staff who teach in the 473 primary schools visited or are non-teaching directors.
National
Schools, teachers, students.
Sample survey data [ssd]
A two-stage sampling method was adopted. First, within each stratum, schools were chosen from the selected councils. Once at a selected school, the enumerator selected teachers and pupils depending on the structure of the classrooms.
The schools were chosen using probability proportional to size (PPS), where size was the number of standard-two pupils as recorded in the 2014 EMIS database. Because schools were selected with PPS, each standard-four pupil within a stratum had an equal probability of having his or her school selected.
Finally, within each school, up to 10 standard-four pupils and 10 teachers were selected. Pupils were randomly selected from the grade-four pupil body, whereas for teachers there were two different procedures, one for measuring the absence rate and one for assessing knowledge. For the absence rate, 10 teachers were randomly selected from the teachers’ roster, and the whereabouts of those teachers were ascertained in a surprise return visit. For the knowledge assessment, all teachers currently teaching grade four, or who had taught grade three the previous school year, were included in the sample; a random selection of teachers in upper grades was then added to top up the sample. These procedures implied that pupils across strata, as well as teachers across strata and within a school (for the knowledge assessment), did not all have the same probability of selection. It was therefore necessary to compute weights for reporting the survey results.
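The first-stage PPS selection and the resulting design weights can be sketched as follows. This is a minimal illustration of systematic PPS sampling, not the SDI's actual weighting code; the function and variable names are assumptions.

```python
import random

def pps_systematic(sizes, n, seed=0):
    """Systematic PPS draw: select n schools with probability
    proportional to their pupil counts (illustrative sketch)."""
    rng = random.Random(seed)
    total = sum(sizes)
    step = total / n
    point = rng.uniform(0, step)       # random start in [0, step)
    chosen, cum = [], 0.0
    for i, s in enumerate(sizes):
        while cum <= point < cum + s:  # selection point falls in unit i
            chosen.append(i)
            point += step
        cum += s
    return chosen

def design_weight(size, total, n):
    """Inverse of the first-stage inclusion probability n * size / total."""
    return total / (n * size)
```

Inverting the inclusion probability in this way is what makes survey estimates representative despite unequal selection probabilities across strata.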
The sampling strategy for the SDI in Madagascar was designed by INSTAT, the national statistics office.
Face-to-face [f2f]
The SDI Education Survey Questionnaire consists of six modules:
Module 1: School Information - Administered to the head of the school to collect information about school type, facilities, school governance, pupil numbers, and school hours. Includes direct observations of school infrastructure by enumerators.
Module 2a: Teacher Absence and Information - Administered to headteacher and individual teachers to obtain a list of all school teachers, to measure teacher absence, and to collect information about teacher characteristics.
Module 2b: Teacher Absence and Information - Unannounced visit to the school to assess absence rate.
Module 3: School Finances - Administered to the headteacher to collect information about school finances (this data is unharmonized).
Module 4: Classroom Observation - An observation module to assess teaching activities and classroom conditions.
Module 5: Pupil Assessment - A test of pupils to measure learning outcomes in mathematics and language in grade four.
Module 6: Teacher Assessment - A test of teachers covering mathematics and language subject knowledge and teaching skills.
Data quality control was performed in Stata.
Madagascar had low school enrollment rates: only 60% of urban children and 12% of rural children completed primary school (World Bank, 2002). To improve enrollment and completion rates as well as the quality of education, the Government of Madagascar substantially increased investment in the education sector. It committed itself to the Education For All (EFA) initiative, started to fully subsidize tuition fees through the so-called "caisse ecole", and provided school kits for all students in public primary schools. The Government also raised the districts' budgets for school material and began distributing free textbooks to schools.
This study investigated the different resource flows in the financing of the public primary education sector in Madagascar.
The survey was conducted in two rounds. The first round was carried out in October-November 2006 and the second round in April-May 2007. The study was implemented using stratified random sampling. Data from more than 200 schools in 28 districts was analyzed.
A Public Expenditure Tracking Survey among Madagascar health care facilities and workers was conducted at the same time as the PETS in education.
Provinces: Antananarivo, Fianarantsoa, Toamasina, Mahajanga, Toliara and Antsiranana.
Sample survey data [ssd]
The study was conducted using stratified random sampling.
The stratified sample was set up to be representative at the national level. Madagascar has 22 regions and 111 districts, and at least one district was visited in each region; two districts were selected in each of the six largest regions, so 28 districts were visited in total. The districts were selected at random, giving greater (less) weight to districts with more (fewer) public primary schools. In each district, three communes were randomly selected, again giving greater weight to communes with more schools. Within each commune, three public primary schools were randomly selected. By ranking schools from large to small and ensuring that a school was picked from each tercile, a representative sample of school sizes was obtained.
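The tercile step can be sketched as follows; this is an illustrative Python sketch, not the study's actual selection code, and the "size" field name is an assumption.

```python
import random

def pick_one_per_tercile(schools, seed=0):
    """Rank schools by size, split the ranking into three terciles,
    and draw one school from each, so that large, medium and small
    schools are all represented in the commune's sample."""
    rng = random.Random(seed)
    ranked = sorted(schools, key=lambda s: s["size"], reverse=True)
    k = len(ranked) // 3
    terciles = [ranked[:k], ranked[k:2 * k], ranked[2 * k:]]
    return [rng.choice(t) for t in terciles if t]
```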
In order to track different resource flows from the decentralized district facility levels to the schools, surveys were organized at district and school levels. At the local level, the directors of the education facility as well as teachers were interviewed independently. To ensure compatibility, the surveys at district and school levels were held at the same time.
In total, 252 schools were visited. Because some schools were closed during the first round, the second round, or both, the researchers ended up with reliable panel data on 229 schools. The second round of data collection faced greater challenges, so its results are less robust: codes for some schools, health centers and personnel did not match between rounds; many entries on school equipment and other line items were left blank; and in more cases the collected information on certain budgets did not add up to the totals reported by the enumerators.
Overall, 54 schools were visited in the province of Antananarivo, 63 in Fianarantsoa, 36 in Toamasina, 45 in Mahajanga, 36 in Toliara and 18 in Antsiranana.
Face-to-face [f2f]
The following survey instruments are available:
Detailed information about data editing procedures is available in "Data Cleaning Guide for PETS/QSDS Surveys" in external resources.
STATA cleaning do-files and data quality reports can also be found in external resources.
https://www.icpsr.umich.edu/web/ICPSR/studies/2081/terms
This study consists of data on earned degrees and other awards conferred by institutions of higher education in the United States and its outlying areas. Part of the Higher Education General Information Survey (HEGIS) Series, this survey provides complete data on earned degrees for the nation, the states, and individual institutions, which are widely used by planners and researchers. Data are provided for professional degrees, baccalaureate and higher degrees, and subbaccalaureate degrees awarded. Additional data specify number of degrees granted by level of degree, institutional control and type, academic disciplines and specialty, student enrollment, and state. Demographic items specify sex and race of recipients.
https://fred.stlouisfed.org/legal/#copyright-public-domain
Graph and download economic data for Unemployment Level - Bachelor's Degree and Higher, 45 to 54 years (CGRAU4554) from Jan 2000 to Jun 2025 about 45 to 54 years, tertiary schooling, education, household survey, unemployment, and USA.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Open Science in (Higher) Education – data of the February 2017 survey
This data set contains:
Survey structure
The survey includes 24 questions, and its structure can be divided into five major themes: material used in courses (5), OER awareness, usage and development (6), collaborative tools used in courses (2), assessment and participation options (5), and demographics (4). The last two questions are an open-text question about general issues on the topics and singular open education experiences, and a request to forward the respondent’s e-mail address for further questioning. The online survey was created with Limesurvey[1]. Several questions include filters, i.e. these questions were only shown if a participant chose a specific answer beforehand ([n/a] in the Excel file, [.] in SPSS).
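When analysing the exported responses, such filter codes need to be normalised so that skipped questions are treated as missing rather than as answer categories. A minimal sketch, using the two codes named above (the helper itself is hypothetical):

```python
# Filter placeholders used in the exports: "[n/a]" (Excel), "[.]" (SPSS).
FILTER_CODES = {"[n/a]", "[.]"}

def recode_filtered(record):
    """Return a copy of one survey response in which filter
    placeholders are replaced by None (i.e. missing)."""
    return {question: (None if answer in FILTER_CODES else answer)
            for question, answer in record.items()}
```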
Demographic questions
Demographic questions asked about the current position, discipline, year of birth and gender. The classification of research disciplines was adapted to the general disciplines at German higher education institutions. As we wanted a broad classification, we summarised several disciplines and came up with the following list, including the option “other” for respondents who did not feel comfortable with the proposed classification:
The classification of current job positions was likewise chosen according to common positions at German higher education institutions, including positions with teaching responsibility. Here, too, we included the option “other” for respondents who did not feel comfortable with the proposed classification:
We chose a free-text (numerical) field for the respondent’s year of birth because we did not want to pre-classify respondents into age intervals. This leaves us the option of running different analyses and correlating answers with respondents’ age. A question about country was omitted because the survey was designed for academics in Germany.
Remark on OER question
Data from earlier surveys revealed that academics are often confused about the proper definition of OER[2]. Some understand OER simply as free resources, or refer only to open source software (Allen & Seaman, 2016, p. 11). Allen and Seaman (2016) decided to give a broad explanation of OER, avoiding details so as not to tempt participants to claim awareness; giving an explanation thus risks introducing a bias. We decided not to give an explanation but to keep the question simple, assuming that someone either knows about OER or does not. Respondents who had not heard the term before probably do not use OER (at least not consciously) or create them.
Data collection
The target group of the survey was academics at German institutions of higher education, mainly universities and universities of applied sciences. To reach them we sent the survey to various internal and external institutional mailing lists and via personal contacts. The lists included discipline-based lists, lists from higher education and higher education didactics communities, and lists from open science and OER communities. Additionally, personal e-mails were sent to presidents and contact persons from those communities, and Twitter was used to spread the survey.
The survey was online from Feb 6th to March 3rd 2017; e-mails were sent mainly at the beginning and around mid-term.
Data clearance
We received 360 responses, of which Limesurvey counted 208 as complete and 152 as incomplete. Two responses were marked incomplete but turned out on inspection to be complete, so we added them to the complete responses; this data set therefore includes 210 complete responses. Of the 150 incomplete responses, 58 respondents did not answer the first question and a further 40 discontinued after it. The data show a constant decline in responses over the course of the survey; we did not detect any single question with a strikingly high dropout rate. Incomplete responses were deleted and are not included in this data set.
For data privacy reasons, we deleted seven variables automatically assigned by Limesurvey (submitdate, lastpage, startlanguage, startdate, datestamp, ipaddr, refurl) as well as the answers to question No. 24 (e-mail address).
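This clean-up step can be sketched as follows; the seven variable names are the Limesurvey defaults listed above, while treating the e-mail question as a key named "q24" is an assumption.

```python
# Limesurvey's automatically assigned variables (named in the text),
# plus the e-mail question; the "q24" key name is an assumption.
PRIVATE_VARS = {"submitdate", "lastpage", "startlanguage", "startdate",
                "datestamp", "ipaddr", "refurl", "q24"}

def drop_private(record):
    """Return a copy of one survey response with all
    privacy-sensitive fields removed."""
    return {k: v for k, v in record.items() if k not in PRIVATE_VARS}
```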
References
Allen, E., & Seaman, J. (2016). Opening the Textbook: Educational Resources in U.S. Higher Education, 2015-16.
First results of the survey are presented in the poster:
Heck, Tamara, Blümel, Ina, Heller, Lambert, Mazarakis, Athanasios, Peters, Isabella, Scherp, Ansgar, & Weisel, Luzian. (2017). Survey: Open Science in Higher Education. Zenodo. http://doi.org/10.5281/zenodo.400561
Contact:
Open Science in (Higher) Education working group, see http://www.leibniz-science20.de/forschung/projekte/laufende-projekte/open-science-in-higher-education/.
[1] https://www.limesurvey.org
[2] The survey question about the awareness of OER gave a broad explanation, avoiding details to not tempt the participant to claim “aware”.