Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Open Science in (Higher) Education – data of the February 2017 survey
This data set contains:
Survey structure
The survey includes 24 questions, and its structure can be divided into five major themes: materials used in courses (5), OER awareness, usage and development (6), collaborative tools used in courses (2), assessment and participation options (5), and demographics (4). The last two questions are an open-text question on general issues around these topics and on individual open education experiences, and a request for the respondent’s e-mail address for follow-up questions. The online survey was created with Limesurvey[1]. Several questions include filters, i.e. they were only shown if a participant had chosen a specific answer beforehand ([n/a] in the Excel file, [.] in SPSS).
Demographic questions
Demographic questions asked about the current position, the discipline, birth year and gender. The classification of research disciplines was adapted to general disciplines at German higher education institutions. As we wanted to have a broad classification, we summarised several disciplines and came up with the following list, including the option “other” for respondents who do not feel confident with the proposed classification:
The current job position classification was also chosen according to common positions in Germany, including positions with a teaching responsibility at higher education institutions. Here, we also included the option “other” for respondents who do not feel confident with the proposed classification:
We chose a free-text (numerical) field for the respondent’s year of birth because we did not want to pre-classify respondents into age intervals. This leaves open different analyses of the answers and of possible correlations with respondents’ age. A question about country was omitted because the survey was designed for academics in Germany.
Remark on OER question
Data from earlier surveys revealed that academics are often confused about the proper definition of OER[2]. Some seem to understand OER as any free resources, or refer only to open source software (Allen & Seaman, 2016, p. 11). Allen and Seaman (2016) decided to give a broad explanation of OER, avoiding details so as not to tempt participants to claim awareness. Thus, giving an explanation carries a risk of bias. We decided not to give an explanation but to keep the question simple. We assume that someone either knows about OER or does not; if they had not heard of the term before, they probably do not use OER (at least not consciously) or create them.
Data collection
The target group of the survey was academics at German institutions of higher education, mainly universities and universities of applied sciences. To reach them, we sent the survey to various internal institutional and external mailing lists and via personal contacts. Included were discipline-based lists, lists from higher education and higher education didactics communities, as well as lists from open science and OER communities. Additionally, personal e-mails were sent to presidents and contact persons in those communities, and Twitter was used to spread the survey.
The survey was online from February 6th to March 3rd, 2017; e-mails were mainly sent at the beginning and around the midpoint of this period.
Data clearance
We received 360 responses, of which Limesurvey counted 208 as complete and 152 as incomplete. Two responses were marked as incomplete but turned out, on inspection, to be complete, so we added them to the complete responses. Thus, this data set includes 210 complete responses. Of the remaining 150 incomplete responses, 58 respondents did not answer the first question and 40 discontinued after the first question. The data show a steady decline in answers; we did not detect any particular survey question with a high dropout rate. Incomplete responses were deleted and are not included in this data set.
For data privacy reasons, we deleted seven variables automatically assigned by Limesurvey: submitdate, lastpage, startlanguage, startdate, datestamp, ipaddr, and refurl. We also deleted answers to question no. 24 (e-mail address).
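As an illustration, this clearance step can be sketched in a few lines. The records and the "complete" flag below are a toy stand-in for the Limesurvey export (real export structures vary by version); the privacy variables are the ones listed above.

```python
# Toy stand-in for a Limesurvey export; the 'complete' flag and 'q1'
# field are hypothetical, the privacy variables are the ones listed above.
responses = [
    {"id": 1, "complete": True,  "ipaddr": "10.0.0.1", "refurl": "http://a", "q1": "yes"},
    {"id": 2, "complete": False, "ipaddr": "10.0.0.2", "refurl": "http://b", "q1": None},
    {"id": 3, "complete": True,  "ipaddr": "10.0.0.3", "refurl": "http://c", "q1": "no"},
]

PRIVACY_VARS = {"submitdate", "lastpage", "startlanguage", "startdate",
                "datestamp", "ipaddr", "refurl"}

# Keep only complete responses and drop the automatically assigned variables.
clean = [
    {k: v for k, v in r.items() if k not in PRIVACY_VARS}
    for r in responses
    if r["complete"]
]
print(len(clean))  # 2 complete responses kept, stripped of privacy variables
```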
References
Allen, E., & Seaman, J. (2016). Opening the Textbook: Educational Resources in U.S. Higher Education, 2015-16.
First results of the survey are presented in the poster:
Heck, Tamara, Blümel, Ina, Heller, Lambert, Mazarakis, Athanasios, Peters, Isabella, Scherp, Ansgar, & Weisel, Luzian. (2017). Survey: Open Science in Higher Education. Zenodo. http://doi.org/10.5281/zenodo.400561
Contact:
Open Science in (Higher) Education working group, see http://www.leibniz-science20.de/forschung/projekte/laufende-projekte/open-science-in-higher-education/.
[1] https://www.limesurvey.org
[2] The survey question about the awareness of OER gave a broad explanation, avoiding details so as not to tempt the participant to claim awareness.
https://www.icpsr.umich.edu/web/ICPSR/studies/33321/terms
The University of Washington - Beyond High School (UW-BHS) project surveyed students in Washington State to examine factors impacting educational attainment and the transition to adulthood among high school seniors. The project began in 1999 in an effort to assess the impact of I-200 (the referendum that ended Affirmative Action) on minority enrollment in higher education in Washington. The research objectives of the project were: (1) to describe and explain differences in the transition from high school to college by race and ethnicity, socioeconomic origins, and other characteristics, (2) to evaluate the impact of the Washington State Achievers Program, and (3) to explore the implications of multiple race and ethnic identities. Following a successful pilot survey in the spring of 2000, the project eventually included baseline and one-year follow-up surveys (conducted in 2002, 2003, 2004, and 2005) of almost 10,000 high school seniors in five cohorts across several Washington school districts. The high school senior surveys included questions that explored students' educational aspirations and future career plans, as well as questions on family background, home life, perceptions of school and home environments, self-esteem, and participation in school related and non-school related activities. To supplement the 2000, 2002, and 2003 student surveys, parents of high school seniors were also queried to determine their expectations and aspirations for their child's education, as well as their own educational backgrounds and fields of employment. Parents were also asked to report any financial measures undertaken to prepare for their child's continued education, and whether the household received any form of financial assistance. In 2010, a ten-year follow-up with the 2000 senior cohort was conducted to assess educational, career, and familial outcomes. 
The ten year follow-up surveys collected information on educational attainment, early employment experiences, family and partnership, civic engagement, and health status. The baseline, parent, and follow-up surveys also collected detailed demographic information, including age, sex, ethnicity, language, religion, education level, employment, income, marital status, and parental status.
The Education Experience Survey and Literacy Assessment was conducted in Shefa Province, Vanuatu in April 2011 among Ni-Vanuatu aged 15 to 60 years. The full report analyses the results of the survey and literacy assessment in detail and highlights correlations between respondents’ educational experience and their literacy levels, employment and income. The survey was aimed at rural Shefa Province and so did not cover the capital, Port Vila.
The survey and literacy assessment instrument and methodology were designed to collect accurate and statistically significant information about education and language experience, and also to assess actual literacy levels at the provincial, village and individual level.
The results provide accurate, statistically significant primary data about the education experience of Ni-Vanuatu in Shefa Province.
Shefa Province, Vanuatu, not including Port Vila. Eight of the Province’s 15 islands were randomly selected.
The villages surveyed were located on the islands of Efate, Lelepa, Nguna, Emau, Emae, Buninga, Tongoa, and Laman Island (Epi). The villages in which the survey took place were Mele, Emua, Takara/Sara, Ekipe, Pangpang, Eton, Teoma, Etas, Lelepa, Utanlang, Taloa, Marou, Wiana, Buninga, Tongamea, Sangava, Euta, Matangi/Itakoma, Lumbukiti and Laman.
Household; Individual (Ni-Vanuatu aged 15 to 60 years).
The survey covered all people who normally resided in a selected household, between the ages of 15 and 60 years (inclusive).
Sample survey data [ssd]
The survey was conducted in households in randomly selected rural communities across eight islands selected at random out of the 15 islands of Shefa Province. Port Vila was deliberately excluded, although a similar exercise in Port Vila would also be worthwhile. All people who normally resided in a selected household, between the ages of 15 and 60 years (inclusive), were invited to participate in the survey.
The literacy assessment questions were addressed only to respondents who declared an ability to read one of the official languages - English, French or Bislama. With regard to the sampling methodology, great care was taken to ensure that statistically significant results were obtained. The minimum required sample size was calculated using 2009 National Census population figures that indicated the total target population - those people between the ages of 15 to 60 - to be 57,174. The required sample size was 2.36% of the total population, meaning that the number of respondents required was 1,350 people.
This minimum sample size was then used to guide the number of households that needed to be surveyed. It was assumed that a household would typically contain at least three eligible people (15-60 years). As such, it was planned that 20 villages, with 30 households within each village, and an average of three people per household should be interviewed.
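The sampling arithmetic above can be checked directly; the figures below are the ones reported in the text.

```python
# Figures from the text: 2009 Census target population (ages 15-60)
# and the minimum required sample size.
target_population = 57_174
required_sample = 1_350

sample_fraction = required_sample / target_population
print(round(sample_fraction * 100, 2))  # 2.36 (percent), as stated

# Planned fieldwork: 20 villages x 30 households x ~3 eligible people each.
planned_interviews = 20 * 30 * 3
print(planned_interviews)  # 1800, comfortably above the 1,350 minimum
```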
There was no deviation from sample design.
Face-to-face [f2f]
The survey instrument contains five sections as follows:
1. Individual profile
2. Education experience
3. Language experience
4. Literacy assessment
5. Employment experience
The Individual Profile section of the survey was designed to capture information about the respondents’ gender and age, to allow disaggregation analysis. The first section of the survey also included questions relating to the respondents’ number of children, sources of information used in the previous month, and the respondents’ attitudes to literacy and education.
The second and third parts of the survey were designed to capture information about the respondents’ educational and language experience. The questions in the second part of the survey, explored the education history of the individual, including the highest level of schooling attended and attained, as well as reasons behind non-completion where appropriate.
The third part of the survey questionnaire explored respondents’ language preferences in different situations, and asked respondents to self-declare their literacy status.
The fourth part of the survey is the literacy assessment, which was administered to those participants who self-declared an ability to read one of the three official languages - English, French or Bislama. Therefore, those respondents who indicated in Part 3 that they could read easily, or read some of their preferred official language, participated in the literacy assessment. In contrast, those respondents who indicated that they could not read one of the official languages, did not undertake the literacy assessment and were classified as nonliterate.
The fifth part of the survey looked at the employment experience of respondents. It was designed to extract information about individuals’ participation in the formal economy through cash-paying employment.
The survey results were encoded using the Census & Survey Processing System (CSPro) and the data were analysed using the Statistical Package for the Social Sciences (SPSS). For further explanatory notes on the survey analysis, see Appendix C of the external resource entitled Vanuatu Rural Shefa Province Education Experience Survey and Literacy Assessment Report.
100% response rate.
The required sample size was 2.36% of the total population, i.e. 1,350 respondents. In the 1,350 households selected for the sample in Shefa Province, 1,475 interviews were conducted, which is above the minimum sample size of 1,350 people. The survey sample comprised 628 males (42.6%) and 846 females (57.4%). All respondents were between the ages of 15 and 60 years, so as to encompass both the youth and adult demographics.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Most social surveys collect data on respondents’ educational attainment. Current measurement practice involves a closed question with country-specific response options, which are needed because of the differences between educational systems. However, these are quite difficult to compare across countries. This is a challenge for both migrant and international surveys. Therefore, a measurement tool for educational attainment that was initially developed for German migrant surveys in the CAMCES project (Schneider, Briceno-Rosas, Herzing, et al. 2018; Schneider, Briceno-Rosas, Ortmanns, et al. 2018) was extended in the SERISS-project in work package 8, Task 8.3. In deliverable D8.8, we provide a database of educational qualifications and levels for 100 countries, including the definition of a search tree interface to facilitate the navigation of categories for respondents in computer-assisted surveys. All country-specific categories are linked to 3-digit codes of UNESCO's International Standard Classification of Education 2011 for Educational Attainment (ISCED-A), as well as to the education coding scheme used in the European Social Survey (ESS), "edulvlb". A live search of the database via two different interfaces, a search box (for a limited set of countries) and a search tree (for all countries), is available at the surveycodings website at https://surveycodings.org/articles/codings/levels-of-education. The search box and search tree can be implemented in survey questionnaires and thereby be used for respondents’ self-classification in computer-assisted surveys. The live search feature can also be used for post-coding open answers in already collected data.
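A minimal sketch of how such a search tree might be navigated in a computer-assisted survey: the respondent descends from broad categories to a specific qualification, and the chosen leaf carries the linked classification code. The categories and ISCED-A codes below are illustrative, not taken from the actual surveycodings database.

```python
# Hypothetical miniature of a qualifications search tree: nested
# categories whose leaves carry an ISCED-A-style 3-digit code.
# The labels and codes are illustrative only.
SEARCH_TREE = {
    "School education": {
        "Lower secondary school certificate": 244,
        "Upper secondary school certificate": 344,
    },
    "Higher education": {
        "Bachelor's degree": 660,
        "Master's degree": 760,
    },
}

def isced_code(path):
    """Walk the tree along a respondent's chosen path and return the leaf code."""
    node = SEARCH_TREE
    for step in path:
        node = node[step]
    return node

print(isced_code(["Higher education", "Master's degree"]))  # 760
```

In a real instrument the tree would be rendered as a collapsible interface, and the resulting code could be stored alongside the country-specific category label.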
Historical Census data (2006, 2011, 2016 and 2021) on highest certificate, diploma or degree of visible minority groups, including percentages.
https://www.icpsr.umich.edu/web/ICPSR/studies/4029/terms
The National Science Foundation (NSF) Surveys of Public Attitudes monitored the general public's attitudes toward and interest in science and technology. In addition, the survey assessed levels of literacy and understanding of scientific and environmental concepts and constructs, how scientific knowledge and information were acquired, attentiveness to public policy issues, and computer access and usage. Since 1979, the survey was administered at regular intervals (occurring every two or three years), producing 11 cross-sectional surveys through 2001. Data for Part 1 (Survey of Public Attitudes Multiple Wave Data) were comprised of the survey questionnaire items asked most often throughout the 22-year survey series and account for approximately 70 percent of the original questions asked. Data for Part 2, General Social Survey Subsample Data, combine the 1983-1999 Survey of Public Attitudes data with a subsample from the 2002 General Social Survey (GSS) (GENERAL SOCIAL SURVEYS, 1972-2002: [CUMULATIVE FILE] [ICPSR 3728]) and focus solely on levels of education and computer access and usage. Variables for Part 1 include the respondents' interest in new scientific or medical discoveries and inventions, space exploration, military and defense policies, whether they voted in a recent election, if they had ever contacted an elected or public official about topics regarding science, energy, defense, civil rights, foreign policy, or general economics, and how they felt about government spending on scientific research. Respondents were asked how they received information concerning science or news (e.g., via newspapers, magazines, or television), what types of television programming they watched, and what kind of magazines they read. Respondents were asked a series of questions to assess their understanding of scientific concepts like DNA, probability, and experimental methods. 
Respondents were also asked if they agreed with statements concerning science and technology and how they affect everyday living. Respondents were further asked a series of true and false questions regarding science-based statements (e.g., the center of the Earth is hot, all radioactivity is manmade, electrons are smaller than atoms, the Earth moves around the sun, humans and dinosaurs co-existed, and human beings developed from earlier species of animals). Variables for Part 2 include highest level of math attained in high school, whether the respondent had a postsecondary degree, field of highest degree, number of science-based college courses taken, major in college, household ownership of a computer, access to the World Wide Web, number of hours spent on a computer at home or at work, and topics searched for via the Internet. Demographic variables for Parts 1 and 2 include gender, race, age, marital status, number of people in household, level of education, and occupation.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
School enrollment data are used to assess the socioeconomic condition of school-age children. Government agencies also require these data for funding allocations and program planning and implementation.
Data on school enrollment and grade or level attending were derived from answers to Question 10 in the 2015 American Community Survey (ACS). People were classified as enrolled in school if they were attending a public or private school or college at any time during the 3 months prior to the time of interview. The question included instructions to “include only nursery or preschool, kindergarten, elementary school, home school, and schooling which leads to a high school diploma, or a college degree.” Respondents who did not answer the enrollment question were assigned the enrollment status and type of school of a person with the same age, sex, race, and Hispanic or Latino origin whose residence was in the same or nearby area.
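The allocation rule described above resembles hot-deck imputation: a non-respondent inherits the enrollment status of a donor matching on age, sex, race, and Hispanic origin. A minimal sketch follows, with hypothetical field names, and ignoring the "same or nearby area" geographic matching for brevity.

```python
# Hot-deck-style sketch of the allocation described above: a record with
# missing enrollment inherits the status of a matching donor. Field names
# are hypothetical; the real ACS procedure also matches on residence area.
def allocate_enrollment(records):
    donors = {}
    for r in records:
        if r["enrolled"] is not None:
            donors[(r["age"], r["sex"], r["race"], r["hispanic"])] = r["enrolled"]
    for r in records:
        if r["enrolled"] is None:
            key = (r["age"], r["sex"], r["race"], r["hispanic"])
            r["enrolled"] = donors.get(key)  # stays None if no matching donor
    return records

people = [
    {"age": 17, "sex": "F", "race": "White", "hispanic": False, "enrolled": True},
    {"age": 17, "sex": "F", "race": "White", "hispanic": False, "enrolled": None},
]
allocate_enrollment(people)
print(people[1]["enrolled"])  # True, inherited from the matching donor
```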
School enrollment is only recorded if the schooling advances a person toward an elementary school certificate, a high school diploma, or a college, university, or professional school (such as law or medicine) degree. Tutoring or correspondence schools are included if credit can be obtained from a public or private school or college. People enrolled in “vocational, technical, or business school,” such as postsecondary vocational, trade, or hospital schools and on-the-job training, were not reported as enrolled in school. Field interviewers were instructed to classify individuals who were home schooled as enrolled in private school. The guide sent out with the mail questionnaire includes instructions for how to classify home schoolers.
Enrolled in Public and Private School – Includes people who attended school in the reference period and indicated they were enrolled by marking one of the questionnaire categories for “public school, public college,” or “private school, private college, home school.” The instruction guide defines a public school as “any school or college controlled and supported primarily by a local, county, state, or federal government.” Private schools are defined as schools supported and controlled primarily by religious organizations or other private groups. Home schools are defined as “parental-guided education outside of public or private school for grades 1-12.” Respondents who marked both the “public” and “private” boxes are edited to the first entry, “public.”
Grade in Which Enrolled – From 1999-2007, in the ACS, people reported to be enrolled in “public school, public college” or “private school, private college” were classified by grade or level according to responses to Question 10b, “What grade or level was this person attending?” Seven levels were identified: “nursery school, preschool;” “kindergarten;” elementary “grade 1 to grade 4” or “grade 5 to grade 8;” high school “grade 9 to grade 12;” “college undergraduate years (freshman to senior);” and “graduate or professional school (for example: medical, dental, or law school).”
In 2008, the school enrollment questions had several changes. “Home school” was explicitly included in the “private school, private college” category. For question 10b the categories changed to the following “Nursery school, preschool,” “Kindergarten,” “Grade 1 through grade 12,” “College undergraduate years (freshman to senior),” “Graduate or professional school beyond a bachelor’s degree (for example: MA or PhD program, or medical or law school).” The survey question allowed a write-in for the grades enrolled from 1-12.
Question/Concept History – Since 1999, the ACS enrollment status question (Question 10a) refers to “regular school or college,” while the 1996-1998 ACS did not restrict reporting to “regular” school, and contained an additional category for the “vocational, technical or business school.” The 1996-1998 ACS used the educational attainment question to estimate level of enrollment for those reported to be enrolled in school, and had a single year write-in for the attainment of grades 1 through 11. Grade levels estimated using the attainment question were not consistent with other estimates, so a new question specifically asking grade or level of enrollment was added starting with the 1999 ACS questionnaire.
Limitation of the Data – Beginning in 2006, the population universe in the ACS includes people living in group quarters. Data users may see slight differences in levels of school enrollment in any given geographic area due to the inclusion of this population. The extent of this difference, if any, depends on the type of group quarters present and whether the group quarters population makes up a large proportion of the total population. For example, in areas that are home to several colleges and universities, the percent of individuals 18 to 24 who were enrolled in college or graduate school would increase, as people living in college dormitories are now included in the universe.
Description: The Educational Attainment Thematic Report is compiled using data from the Labour Force Survey (LFS). It is a household survey which replaced the Quarterly National Household Survey (QNHS) at the beginning of Q3 2017. The LFS is the official source of quarterly labour force estimates for Ireland, including the official rates of employment and unemployment. Questions on educational attainment are included in the core LFS questionnaire each quarter. The Educational Attainment Thematic Report presents the LFS data for adults between 18 and 64 years old with differing levels of educational attainment based on these questions. This data provides a summary of the annual results of educational attainment levels across the regional geographies in Ireland.
Geography available in RDM: State, Regional Assembly and Strategic Planning Area (SPA).
Source: CSO Educational Attainment Thematic Report
Weblink: https://www.cso.ie/en/releasesandpublications/ep/p-eda/educationalattainmentthematicreport2021/
Date of last source data update: February 2025
Update Schedule: Annual
https://lida.dataverse.lt/api/datasets/:persistentId/versions/2.2/customlicense?persistentId=hdl:21.12137/UP41CP
The purpose of the study: to investigate teachers' attitudes towards students' and bachelor's graduates' readiness for higher education and to identify the factors that influence the quality of that readiness. Major investigated questions: respondents were asked about the school they work in and whether they like being a teacher. Relationships with administrative staff, other teachers, students, and students' parents were assessed. Working conditions at the school and the work of the administrative staff were also evaluated. Questions were asked about how the public regards the teaching profession. In a further block of questions, respondents were asked to describe their school and to indicate which students they pay more attention to. Respondents were asked to rate their school's approach to learning this year and were given the opportunity to name what they would like to change first in their school. It was asked whether bullying was prevalent in the school. Furthermore, respondents were asked to evaluate the fact that students hire tutors or attend special classes to prepare for exams, and their attitudes towards such additional training in preparation for the matriculation exams were elicited. A block of the questionnaire was intended to clarify the importance of additional support from tutors or special courses for entering higher education. Views were sought on the proportion of students in this year's graduating class at the respondents' school who engage in additional learning, and on why students usually engage tutors or attend courses in preparation for matriculation. Respondents were asked whether they had been a tutor for high school graduates this school year and, if so, to indicate the number of students. Respondents were asked whether the preparation of high school graduates for the matriculation exam has changed in the last five years, and what is most important for a first-year student's readiness for higher education.
In the next block of questions, respondents were asked to rate the readiness of this year's graduates at their school for higher education. It was clarified how much attention respondents paid to students' career orientation in school and what students' learning outcomes depend on most. It was asked whether first-year students' motivation should be assessed when entering a higher education institution and how it should be assessed. Respondents also assessed the current situation in general education and higher education and commented on current and planned changes in higher education. Clarification was sought on how graduates should be admitted to humanities programmes and how many state matriculation examinations should be passed in order to enter a university. The goal was to find out what the minimum score in a state matriculation exam should be for graduates to be admitted to a college. Finally, it was asked who should receive public funding for college. Socio-demographic characteristics: age, gender, place of residence, whether the respondent has completed schooling, education, number of years working as a teacher, teacher qualification category, when the respondent last upgraded their qualification, number of hours worked per week, and income. This survey was conducted at the initiative of the Research and Higher Education Monitoring and Analysis Centre (MOSTA). On January 1, 2019, MOSTA was reorganized into the Government Strategic Analysis Center (STRATA).
Despite the many useful studies on the use of Open Educational Resources (OER) in higher education, most are focused on the activity of students and instructors in the Global North, who enjoy comparatively higher levels of economic development, educational provision, policy elaboration, and technological access than those in the Global South – the region where OER is touted as having its potentially greatest impact.
This dataset arises from a study focusing on higher education instructors and students in South America, Sub-Saharan Africa, and South and Southeast Asia. Based on a cross-regional survey of 295 instructors at 28 universities in nine countries, this research seeks to establish a baseline of empirical data for assessing OER awareness and use in the Global South.
The overarching research questions that this study set out to answer are:
1. What proportion of instructors in the Global South have ever used OER?
2. Which variables may account for different OER usage rates between respondents in the Global South?
In order to address these questions, survey responses were correlated against question 26 of the survey, which directly addresses OER usage: “Have you ever used OER that are available in the public domain or has an open license (e.g. Creative Commons) that allows it to be used and/or adapted by others?”
A core purpose of the overarching ROER4D project is the development of an empirical baseline of OER and Open Educational Practice (OEP) activity in the Global South. OER itself is a novel concept, and is tied to a broader spectrum of OEP that overlap with, but do not always exactly coincide with, formal OER practice.
As such, an investigation into the use, reuse, adaptation, and sharing practices performed by higher education instructors, and the digital infrastructure and foundational literacies that underpin these practices (regardless of their knowledge of formal OER activity), is integral to ascertaining baseline practice.
This dataset includes responses by instructors who engage in reuse and sharing activities, irrespective of whether they have consciously used OER in their practice. As such, it offers insights into the practices that exist outside of formally labelled OER production. Dimension 2 of the survey instrument, “Educational Resources”, is framed around general practice relating to sharing, use, reuse, creation, and licensing of educational materials, rather than OER per se. Data arising from these responses are to be treated with caution in terms of making inferences around OER, but remain useful in terms of gaining a more informed sense of instructors’ everyday practice.
The survey was conducted in four languages (English, Spanish, Portuguese, and Bahasa Melayu); as such, four research instruments were originally produced and four sets of microdata collected. The microdata have been translated into English, and only the English instrument and the aggregated, translated instructor-response microdata are included here. The student-response microdata are not part of this dataset.
The dataset is considered to be of potential interest to OER scholars, practitioners, and policy-makers, as it seeks to provide a useful cross-regional comparison of various aspects of OER adoption.
This dataset was first published by DataFirst.
https://www.icpsr.umich.edu/web/ICPSR/studies/2413/terms
The principal purposes of this national longitudinal study of the higher education system in the United States are to describe the characteristics of new college freshmen and to explore the effects of college on students. For each wave of this survey, each student completes a questionnaire during freshman orientation or registration that asks for information on academic skills and preparation, high school activities and experiences, educational and career plans, majors and careers, student values, and financing college. Other questions elicit demographic information, including sex, age, parental education and occupation, household income, race, religious preference, and state of birth. Specific questions asked of respondents in the 1979 survey included type of high school, total of expenses the students expected to receive from different sources, questions regarding the Basic Educational Opportunity Grant (BEOG) and Guaranteed Student Loan (GSL), students' life patterns, and the best estimate of students' parents' income during the past year.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset was collected as part of a research project on the use of SMART complexes in vocational and professional higher education institutions. The data were collected through an online survey of pedagogical staff conducted in November-December 2024. The questionnaire included questions about the experience of introducing digital tools (SMART complexes) into the educational process, as well as the specifics of their use in crisis conditions.
Research objectives:
To assess the level of SMART complexes integration into the educational process.
To identify key barriers to their implementation.
To identify the needs for professional development of teachers working with digital tools.
To determine the impact of SMART complexes on ensuring a personalised learning pace for students.
Methodology:
Data collection tool: Google Forms.
Type of survey: Online survey.
Target audience: Teachers of vocational and professional pre-higher education institutions of Ukraine.
Data collection period: 12 November - 23 December 2024.
Number of respondents: 4645 people.
The dataset includes:
Information on the work experience, type of educational institution and the need for additional training of pedagogical staff.
Respondents' answers to the question about the level of SMART complexes use.
Information on the impact on the learning pace and its stability during power outages or air raids.
A description of the main barriers to the implementation of digital tools.
List of aspects of the Vocational Education 4.0 concept implemented in the educational institution.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
In addition to respondents’ highest educational qualification, some surveys also collect data on their main field of education. Current measurement practice involves either a closed question with highly aggregated response categories, which are difficult to use for respondents, or an open question, requiring expensive post-coding. Therefore, a measurement tool for fields of education was developed in the SERISS-project in work package 8, Task 8.3. In deliverable D8.9 we provide a database of fields of education and training in 34 languages, including the definition of a search tree interface to facilitate navigation of categories for respondents. All 120 standard categories and classification codes are taken from UNESCO's International Standard Classification of Education for Fields of Education and Training (ISCED-F). For most languages, detailed 3-digit information is available. The database, including a live search feature, is available at the surveycodings website at https://surveycodings.org/articles/codings/fields-of-education. The search tree can be used for respondents’ self-identification of fields of education and training in computer-assisted surveys. The live search feature can also be used for post-coding open answers in already collected data.
Opportunity-focused, high-growth entrepreneurship and science-led innovation are crucial for continued economic growth and productivity. Working in these fields offers the opportunity for rewarding and high-paying careers. However, the majority of youth in developing countries do not consider either as job options, affecting their choices of what to study. Youth may not select these educational and career paths due to lack of knowledge, lack of appropriate skills, and lack of role models. We provide a scalable approach to overcoming these constraints through an online education course for secondary school students that covers entrepreneurial soft skills, scientific methods, and interviews with role models.
The study comprises three experimental trials conducted before and during the COVID-19 pandemic in different regions of Ecuador. This catalog entry includes data from Experiment 3: Coastal Educational Regime (Régimen Costa) 2020/2021. The data from the other two experiments are also available in the catalog.
Experiment 3: Coastal Educational Regime (Régimen Costa) 2020/2021
A randomized experiment conducted in high schools in Ecuador as a rapid-fire response to the hurdles of COVID-19 for schools in the Coastal Educational Regime (Régimen Costa; students finish the program in December 2020). The intervention is an online education course that covers entrepreneurial soft skills, scientific methods, and interviews with role models. The course is taken by students at home during the COVID-19 pandemic under teachers’ supervision. We work mostly with 14-22-year-old students (16,441 students) in 598 schools assigned to the program. We randomly assign schools either to a treatment group (receiving the entrepreneurship courses online) or a placebo-control group (receiving a placebo treatment of online courses from standard curricula). We also cross-randomize the role models and evaluate a set of nimble interventions to increase take-up. The details of the intervention can be found in the AEA registry: Asanov, Igor and David McKenzie. 2021. Scaling up virtual learning of online learning in high schools. AEA RCT Registry. March 23. Merged datasets from the baseline, midline, and endline surveys for each experiment, administered through the online learning platform in school during normal educational hours before the COVID-19 pandemic or at students’ homes during the pandemic, are documented here. Detailed information about the questionnaire and each item can be found in the codebooks (Baseline 1, Baseline 2, Midline, Endline 1, Endline 2) for the corresponding experiments.
Experiment 3: Coastal Educational Regime (Régimen Costa) 2020/2021
We cover students in the last year of education in K12 schools with a technical specialization (Bachillerato técnico) who study in the Coastal Educational Regime (Régimen Costa) 2020/2021, were supposed to finish their schooling in March 2021, and whom we were able to register on the online platform. The schools covered in this experiment are scattered across educational zones 1, 2, 3, 4, 5, 6, 7, 8 and 9.
Taken together, in Experiments 2 and 3 we offered the program across all of Ecuador to schools with a technical specialization track.
Student
Sample survey data [ssd]
All students in selected schools who were present in classes filled out the baseline questionnaire
Internet [int]
Questionnaires. We execute three main sets of questionnaires. A. Internet (Online-based survey)
The survey consists of a multi-topic questionnaire administered to the students through the online learning platform in school during normal educational hours before the COVID-19 pandemic, or at home during the pandemic. We collect the following information:
1. Subject specific knowledge tests. Spanish, English, Statistics, Personal Initiative (only endline), Negotiations (only endline).
2. Career intentions, preferences, beliefs, expectations, and attitudes. STEM and entrepreneurial intentions, preferences, beliefs, expectations, and attitudes.
3. Psychological characteristics. Personal Initiative, Negotiations, General Cognitions (General Self-Efficacy, Youth Self-Efficacy, Perceived Subsidiary Self-Efficacy Scale, Self-Regulatory Focus, Short Grit Scale), Entrepreneurial Cognitions (Business Self-Efficacy, Identifying Opportunities, Business Attitudes, Social Entrepreneurship Standards).
4. Behavior in (incentivized) games: Other-regarding preferences (dictator game), tendency to cooperate (Prisoner's Dilemma), perseverance (triangle game), preference for honesty, creativity (unscramble game).
5. Other background information. Socioeconomic level, language spoken, risk and time preferences, trust level, parents background, big-five personality traits of student, cognitive abilities.
Background information (5) was collected only at the baseline.
B. First follow-up Phone-based Survey Zone 2, Summer (Phone Based).
The survey replicates by phone a shorter version of the internet-based survey above. We collect the following information:
1. Subject specific knowledge tests.
2. Career intentions, preferences, beliefs, expectations, and attitudes.
3. Psychological characteristics
C. (Second) Follow-up Phone-Based Survey, Winter, Zone 2, Highlands Educational Regime.
We execute a multi-topic questionnaire by phone to capture the first life outcomes of students who finished school. We collect the following information:
Data Editing A. Internet, online-based surveys. We extracted the raw data generated on the online platform for each experiment and prepared it for research purposes. We applied several pre-processing steps: 1. We transformed the raw data generated on the platform into a format readable by standard statistical software (R/Stata). 2. We extracted the answer for each item for each student for each survey (Baseline, Midline, Endline). 3. We removed duplicated students and duplicated answers for each item in each survey, based on administrative data, performance, and information given by students on the platform. 4. In the case of the baseline survey, we standardized items/scales but also kept the raw items.
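The de-duplication and standardization steps described above might be sketched roughly as follows. This is an illustrative reconstruction, not the authors' actual code: the column names ('student_id', 'item', 'answer') and the z-score standardization are assumptions.

```python
import pandas as pd

def preprocess(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative pre-processing sketch for a long-format survey export.

    Assumes columns 'student_id', 'item', and numeric 'answer' --
    hypothetical names, not taken from the actual platform export.
    """
    # Step 3: drop duplicated student/item answers, keeping the first occurrence
    df = raw.drop_duplicates(subset=["student_id", "item"], keep="first").copy()
    # Step 4 (baseline only): standardize answers (here: z-scores)
    # while keeping the raw values alongside
    df["answer_raw"] = df["answer"]
    df["answer_std"] = (df["answer"] - df["answer"].mean()) / df["answer"].std()
    return df
```

In practice the cleaning also drew on administrative data and platform performance records to decide which duplicate to keep; the sketch above only shows the mechanical part.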
B. Phone-based surveys. The phone-based surveys were collected with the help of an advanced CATI kit. The data contain all cases (attempts to call) and an indication of whether the survey was effective. The data are cleaned to be ready for analysis, and anonymized, but contain a unique anonymous student id for merging across datasets.
Student Feedback Survey for Bachelor Graduates is a yearly study charting the respondents' experiences of university studies. The survey was nationwide and was sent to every student in Finland who had graduated with a bachelor's degree, or had completed three years of their studies in the fields of medicine or dentistry, during that year. The questions deal with the studies as a whole and subjects such as study-life balance, well-being, teaching methods and time management. The survey began with questions about working while studying and questions charting the financial situation of the respondents. The respondents were also asked about their hobbies and extracurricular activities. Next, the respondents were asked about their satisfaction with their education. This was followed by questions about the support they received from their friends and family. After this, the respondents were asked to evaluate their satisfaction with their friends, family and life in general. Further, the respondents were asked to evaluate their success during the first years of university and their likelihood of completing university studies. In the next part, the respondents were asked to assess the guidance received from the university on topics such as planning of studies, health-related problems, and motivation problems. The respondents were presented with statements about their studies dealing with group work, individual studying, internationality in their studies, and the support and availability of information regarding their studies. Finally, the respondents were presented with a large question battery containing 72 statements about their studies. The battery included questions about mental and physical welfare, personal abilities, social activity, relationship with their own university and university staff, goals, and motivation. The questionnaire included university-specific questions for some universities.
The background variables included the respondent's university, field of study, age, and gender. In addition, some universities had more detailed variables, charting, for example, the degree programme.
Abstract copyright UK Data Service and data collection copyright owner.
The purpose of this study was to investigate the factors which influence young people in their demand for higher education in its various forms - at universities, colleges of education (teacher training colleges), polytechnics and colleges of further education. Six of these eight surveys are the main study, which was carried out on (a) the schools and the fifth-formers and sixth-formers in them, and (b) the colleges of further education and their home students studying 'A' level subjects full-time.
The material from the young people includes that given by them at two stages: first from the main survey, which took place before they sat GCE examinations and before the results of higher education applications were available, and secondly from the follow-up survey, after the results of the GCE examinations were known and the young people had already embarked on courses the following session. For the fifth and sixth-form surveys (67001, 67002 and 68005) there is also incorporated the form teachers' broad assessment of ability (three categories), examination prospects, and higher education and career aspirations. For the schools, the main survey was carried out in the Spring term 1967, with the follow-up in the autumn. The equivalent dates in the colleges of further education were May 1967 and January 1968.
(The remaining two surveys are subsidiary to the project; 66023 is the pilot stage of the main survey part of 68004, i.e. home students studying 'A' levels full-time in the further education colleges, whilst 67005 (fifth-formers in the fast stream in schools) comprises a sub-set of material from the main fifth-form survey for an enlarged sample of those pupils in schools with fast streams).
The six surveys in the main study are interlinked, with information from the school or college complementing that from the pupil or student. In addition there is standardisation - as far as was practicable - between sections of the questionnaire used for the fifth-formers, lower and upper sixth-formers and students in further education (e.g. general background). The contents of the questionnaire for the upper sixth-formers and further education students corresponded particularly closely. Copies of all reports on the surveys are in the Library of the Royal Statistical Society. Mainly they deal with specific aspects of the data, e.g. 'Subject commitments and the demand for higher education', G. A. Barnard and M. D. McCreath (1970) Journal of the Royal Statistical Society Series A (General) 133 (3) 358 - 408, and 'Report of the surveys of full-time 'A' level students (home) in colleges of further education', by M. D. McCreath (1970). All the material which is available is listed in the most recent report, written in 1972: Factors influencing choice of higher education: surveys carried out by Margaret D McCreath under the direction of Professor G A Barnard, Department of Mathematics, University of Essex. This 1972 report includes data from both the school and further education surveys. The extensive tables are based on the following variables: social class, expectations about leaving school and reasons for doing so, source of the most useful discussion on what to do after school, family experience of higher education, 'O' and 'A' level attempts and passes, knowledge of higher education entry requirements and with whom these were discussed, as well as intended and actual destinations in higher education.
The technical note on the sample design by Judith Doherty was published in 1970 as Appendix 1 of Volume 1 of the Schools Council Sixth-Form Survey, Sixth-Form Pupils and Teachers. Details of the response rates are given in the 1972 report mentioned above.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This repository includes the questionnaire and dataset collected for the paper "DEI in Computing Higher Education: Survey and Analysis of Brazilian and Finnish University Practices," which was submitted to ESEM 2025.
Paper Abstract:
Background: Efforts have been made in STEM, for example, to encourage women to pursue careers in computing or to promote the importance of team diversity in the field. However, implementing Diversity, Equity, and Inclusion (DEI) in university-level computing education remains underexplored.
Aims: This study compares the current state of DEI in Brazilian and Finnish universities.
Method: We replicated in Brazil an online survey conducted in Finland.
Results: We received 68 responses from Brazilian teachers. We compared the Brazilian and Finnish scenarios for incorporating DEI aspects in the classes.
Conclusions: While the importance of DEI in education is recognized, the implementation of DEI practices varies significantly across institutions and countries. Several challenges hinder this implementation, including a lack of teaching materials, insufficient training, limited institutional support, and concerns about addressing these topics appropriately. Regarding countries' differences, Brazilian professors rate DEI more important but report lower satisfaction than Finns, highlighting cultural and demographic factors affecting DEI practices.
Files Description:
Replicated Study:
The data consist of two surveys aimed at young people not in employment, education or training (NEET). The surveys charted the young people's social participation, mental and physical well-being, and use of social and health services. Main themes in the surveys included living conditions, quality of life, social relationships, use of social and health services, lifestyle, and social media use. The baseline survey was conducted at the beginning of the research (questionnaire A, variables marked with a) and the second survey six months after the baseline survey (questionnaire B, variables marked with b). The respondents were randomly assigned into test and control groups based on the baseline survey. The test group participated in online group discussion that covered topics such as loneliness and hobbies. The group discussions were held on the heimo.co website and led by youth work professionals. The respondents in the control group were offered conventional targeted youth work services. All in all, four moderated discussion groups were organised for nine weeks in collaboration with the departments of youth services in three cities in Finland. The data were collected as part of the Inclusive Promotion of Health and Wellbeing (PROMEQ 2016-2019) research project, which studied population groups that need special support. The aim of the PROMEQ project was to develop and demonstrate novel models of promotion of health and wellbeing. Survey data from the other target groups of the project as well as combined data from all surveys are also available at FSD (FSD3433-FSD3436). The surveys included many scales and questions used in other studies. Questions were selected, for instance, from the Finnish Youth Surveys, as well as the Regional Health and Well-being Study (ATH) and Welfare and Services in Finland (HYPA) surveys conducted by the Finnish Institute for Health and Welfare (THL). Most questions included in the baseline survey were repeated in the follow-up survey. 
First, the surveys charted the respondents' living conditions, income, loans, and need for financial aid or food assistance. The respondents' health, well-being and quality of life were examined with questions on, for example, how satisfied the respondents were with their health, how much they had enjoyed life in the past two weeks, how safe and secure they felt in their everyday life, and whether they had enough energy and drive for their daily life. Satisfaction with different spheres of life was also surveyed with questions regarding, for example, quality of sleep, capacity to work, relationships, and support received from friends. Social relationships and trust were examined next. The respondents were asked whether they often felt lonely and whether they were often in contact with their friends, partner, or family. Questions also focused on the respondents' sociability and feelings of belonging (e.g. whether they felt they were a part of a friend group, had much in common with people around them, and could find company when they wanted to, or whether they felt left out and isolated). The respondents' participation in group activities and use of social media were charted. Questions on social media use included, for instance, how often the respondents read or commented on posts by other people and how often they posted on social media themselves. Trust in other people and various institutions, such as public health care, the judicial system, and municipal decision-making, was then examined. The respondents' opinions on their own opportunities in life were also surveyed (e.g. whether they thought they had good or bad opportunities to strive for happiness in life and to act according to their conscience). Next, the respondents' use of social and health services was surveyed. The respondents were asked whether they had visited a doctor or other health or social services professional or received services that promote employment in the past 12 months. 
Additionally, the respondents were asked whether they had bought medication, met with a youth employee, or been otherwise in contact with social and health care employees or youth employees in the past 12 months. Questions also focused on basic social assistance, the respondents' satisfaction with the availability of various social and public services (e.g. library, indoor exercise and youth services), and their participation in group activities promoting health and well-being (e.g. weight management groups, AA, NA). Finally, the respondents' lifestyle was examined with questions focusing on their exercise, eating, and drinking habits. Background variables included, among others, the respondent's gender, year of birth, marital status, household composition, housing tenure, highest level of education, economic activity and occupational status, and household income.
Since the beginning of the 1960s, Statistics Sweden, in collaboration with various research institutions, has carried out follow-up surveys in the school system. These surveys have taken place within the framework of the IS project (Individual Statistics Project) at the University of Gothenburg and the UGU project (Evaluation through follow-up of students) at the University of Teacher Education in Stockholm, which since 1990 have been merged into a research project called 'Evaluation through Follow-up'. The follow-up surveys are part of the central evaluation of the school and are based on large nationally representative samples from different cohorts of students.
Evaluation through follow-up (UGU) is one of the country's largest research databases in the field of education. UGU is part of the central evaluation of the school system and is based on large nationally representative samples from different cohorts of students. The longitudinal database contains information on nationally representative samples of school pupils from ten cohorts, born between 1948 and 2004. The sampling process was based on the student's birthday for the first two cohorts and on the school class for the other cohorts.
For each cohort, data of mainly two types are collected. School administrative data is collected annually by Statistics Sweden during the time that pupils are in the general school system (primary and secondary school), for most cohorts starting in compulsory school year 3. This information is provided by the school offices and, among other things, includes characteristics of school, class, special support, study choices and grades. Information obtained has varied somewhat, e.g. due to changes in curricula. A more detailed description of this data collection can be found in reports published by Statistics Sweden and linked to datasets for each cohort.
Survey data from the pupils are collected for the first time in compulsory school year 6 (for most cohorts). The year 6 questionnaire includes questions related to self-perception and interest in learning, attitudes to school, hobbies, school motivation and future plans. For some cohorts, questionnaire data are also collected in year 3 and year 9 of compulsory school and in upper secondary school.
Furthermore, results from various intelligence tests and standardized knowledge tests are included in the year 6 data collection. The intelligence tests have been identical for all cohorts (except the cohort born in 1987, from which questionnaire data were first collected in year 9). The intelligence test consists of a verbal, a spatial and an inductive test, each containing 40 tasks and specially designed for the UGU project. The verbal test is a vocabulary test of the opposites type. The spatial test is a so-called ‘sheet metal folding test’, and the inductive test is made up of number series. The reliability of the tests, intercorrelations and connection with school grades are reported by Svensson (1971).
For the first three cohorts (1948, 1953 and 1967), the standardized knowledge tests in year 6 consist of the standard tests in Swedish, mathematics and English that, up to and including the beginning of the 1980s, were offered to all pupils in compulsory school year 6. For the 1972 cohort, specially prepared tests in reading and mathematics were used. The test in reading consists of 27 tasks and aimed to identify students with reading difficulties. The mathematics test, which was also offered to the fifth cohort (1977), includes 19 tasks. A revised version of the test, introduced because the previously used test was judged to be somewhat too simple, was used for the cohort born in 1982. Results on the mathematics test are not available for the 1987 cohort. The mathematics test was not offered to the students in the 1992 cohort, as the test did not seem to fully correspond with current curriculum intentions in mathematics. For further information, see the description of the dataset for each cohort.
For several of the samples, questionnaires were also collected from the students' parents and teachers in year 6. The teacher questionnaire contains questions about the teacher, class size and composition, the teacher's assessment of the class's knowledge level, school resources, working methods, parental involvement, and the existence of evaluations. The questionnaire for the guardians includes questions about the child's upbringing conditions, ambitions and wishes regarding the child's education, views on the school's objectives, and the parents' own educational and professional situation.
The students are followed up even after they have left primary school. Among other things, data are collected while they are in upper secondary school: school administrative data such as choice of upper secondary school line/programme and grades after completing studies. For some of the cohorts, questionnaire data were also collected from the students in addition to the school administrative data.
The sample consisted of students born on the 5th, 15th and 25th of any month in 1953, a total of 10,723 students.
The data obtained in 1966 were: 1. School administrative data (school form, class type, year and grades). 2. Information about the parents' profession and education, number of siblings, the distance between home and school, etc.
This information was collected for 93% of all students born on those days; the shortfall was due to reduced resources at Statistics Sweden for follow-up work (reminders, etc.). Annual data for the 1953 cohort were collected by Statistics Sweden up to and including the academic year 1972/73.
The response rate for test and questionnaire data is 88%. Standard test results were received for just over 85% of those who took the tests.
The sample included a total of 9955 students, for whom some form of information was obtained.
Part of the "Individual Statistics Project" together with cohort 1953.
Every year, all parents, all teachers, and students take the NYC School Survey. The survey ranks among the largest surveys of any kind ever conducted nationally. Survey results provide insight into a school's learning environment and contribute a measure of diversification that goes beyond test scores on the Progress Report. Survey questions assess the community's opinions on academic expectations, communication, engagement, and safety and respect. School leaders can use survey results to better understand their own school's strengths and target areas for improvement.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Open Science in (Higher) Education – data of the February 2017 survey
This data set contains:
Survey structure
The survey includes 24 questions and its structure can be separated into five major themes: material used in courses (5), OER awareness, usage and development (6), collaborative tools used in courses (2), assessment and participation options (5), and demographics (4). The last two questions are an open-text question about general issues on the topics and singular open education experiences, and a request to forward the respondent’s e-mail address for further questioning. The online survey was created with Limesurvey[1]. Several questions include filters, i.e. these questions were only shown if a participant chose a specific answer beforehand ([n/a] in the Excel file, [.] in SPSS).
Demographic questions
Demographic questions asked about the current position, the discipline, birth year and gender. The classification of research disciplines was adapted to general disciplines at German higher education institutions. As we wanted to have a broad classification, we summarised several disciplines and came up with the following list, including the option “other” for respondents who do not feel confident with the proposed classification:
The current job position classification was also chosen according to common positions in Germany, including positions with a teaching responsibility at higher education institutions. Here, we also included the option “other” for respondents who do not feel confident with the proposed classification:
We chose a free-text (numerical) field for asking about a respondent’s year of birth because we did not want to pre-classify respondents into age intervals. This leaves us the option of running different analyses on the answers and exploring possible correlations with respondents’ age. A question about country was left out, as the survey was designed for academics in Germany.
Remark on OER question
Data from earlier surveys revealed that academics are often confused about the proper definition of OER[2]. Some seem to understand OER as free resources, or only refer to open source software (Allen & Seaman, 2016, p. 11). Allen and Seaman (2016) decided to give a broad explanation of OER, avoiding details so as not to tempt participants to claim awareness. Thus, there is a danger of introducing bias when giving an explanation. We decided not to give an explanation, but to keep this question simple. We assume that someone either knows about OER or not. If they had not heard of the term before, they probably do not use OER (at least not consciously) or create them.
Data collection
The target group of the survey was academics at German institutions of higher education, mainly universities and universities of applied sciences. To reach them, we sent the survey to diverse internal and external institutional mailing lists and via personal contacts. Included lists were discipline-based lists, lists from higher education and higher education didactics communities, as well as lists from open science and OER communities. Additionally, personal e-mails were sent to presidents and contact persons from those communities, and Twitter was used to spread the survey.
The survey was online from Feb 6th to March 3rd 2017, e-mails were mainly sent at the beginning and around mid-term.
Data clearance
We received 360 responses, of which Limesurvey counted 208 as complete and 152 as incomplete. Two responses were marked as incomplete but, after checking, turned out to be complete, and we added them to the complete responses. Thus, this data set includes 210 complete responses. Of the remaining 150 incomplete responses, 58 respondents did not answer the first question and 40 discontinued after the first question. The data show a steady decline in answers; we did not detect any particular survey question with a high dropout rate. We deleted the incomplete responses, and they are not included in this data set.
For data privacy reasons, we deleted seven variables automatically assigned by Limesurvey: submitdate, lastpage, startlanguage, startdate, datestamp, ipaddr, refurl. We also deleted answers to question no. 24 (e-mail address).
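The anonymisation step above can be sketched as a small pandas routine; this is an illustrative reconstruction, not the authors' actual workflow, and the e-mail column name ('q24_email') is a hypothetical placeholder for however question 24 appears in the export:

```python
import pandas as pd

# Variables automatically assigned by Limesurvey that were removed for privacy
PRIVACY_COLUMNS = ["submitdate", "lastpage", "startlanguage",
                   "startdate", "datestamp", "ipaddr", "refurl"]

def anonymise(responses: pd.DataFrame, email_column: str = "q24_email") -> pd.DataFrame:
    """Drop Limesurvey metadata columns and the e-mail answer (question 24).

    'q24_email' is an assumed column name used for illustration only.
    """
    # Only drop columns that are actually present in this export
    to_drop = [c for c in PRIVACY_COLUMNS + [email_column] if c in responses.columns]
    return responses.drop(columns=to_drop)
```

Guarding with a presence check keeps the routine usable on partial exports where some metadata columns were never enabled.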
References
Allen, E., & Seaman, J. (2016). Opening the Textbook: Educational Resources in U.S. Higher Education, 2015-16.
First results of the survey are presented in the poster:
Heck, Tamara, Blümel, Ina, Heller, Lambert, Mazarakis, Athanasios, Peters, Isabella, Scherp, Ansgar, & Weisel, Luzian. (2017). Survey: Open Science in Higher Education. Zenodo. http://doi.org/10.5281/zenodo.400561
Contact:
Open Science in (Higher) Education working group, see http://www.leibniz-science20.de/forschung/projekte/laufende-projekte/open-science-in-higher-education/.
[1] https://www.limesurvey.org
[2] The survey question about the awareness of OER gave a broad explanation, avoiding details to not tempt the participant to claim “aware”.